Doesn't work in Oobabooga.

#1
by RGTails - opened

I get this error:

Loading checkpoint shards:  16%|████████▉  | 4/25 [00:03<00:19, 1.05it/s]
22:47:01-428385 ERROR    Failed to load the model.
Traceback (most recent call last):
  File "I:\text-generation-webui\modules\ui_model_menu.py", line 205, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "I:\text-generation-webui\modules\models.py", line 43, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "I:\text-generation-webui\modules\models.py", line 95, in transformers_loader
    return load_model_HF(model_name)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "I:\text-generation-webui\modules\transformers_loader.py", line 259, in load_model_HF
    model = LoaderClass.from_pretrained(path_to_model, **params)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "I:\text-generation-webui\installer_files\env\Lib\site-packages\transformers\models\auto\auto_factory.py", line 604, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "I:\text-generation-webui\installer_files\env\Lib\site-packages\transformers\modeling_utils.py", line 288, in _wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "I:\text-generation-webui\installer_files\env\Lib\site-packages\transformers\modeling_utils.py", line 5179, in from_pretrained
    ) = cls._load_pretrained_model(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "I:\text-generation-webui\installer_files\env\Lib\site-packages\transformers\modeling_utils.py", line 5642, in _load_pretrained_model
    _error_msgs, disk_offload_index, cpu_offload_index = load_shard_file(args)
                                                         ^^^^^^^^^^^^^^^^^^^^^
  File "I:\text-generation-webui\installer_files\env\Lib\site-packages\transformers\modeling_utils.py", line 933, in load_shard_file
    state_dict = load_state_dict(
                 ^^^^^^^^^^^^^^^^
  File "I:\text-generation-webui\installer_files\env\Lib\site-packages\transformers\modeling_utils.py", line 508, in load_state_dict
    with safe_open(checkpoint_file, framework="pt") as f:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
safetensors_rust.SafetensorError: Error while deserializing header: incomplete metadata, file not fully covered
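This safetensors error typically means one of the downloaded shard files is truncated or corrupted, so the header's declared tensor offsets extend past the end of the file. A minimal sketch to find the bad shard by comparing each file's header against its actual size (the `check_shards` helper is illustrative, not part of the webui; re-downloading the flagged shard usually fixes it):

```python
# Sketch: check each .safetensors shard in a model directory.
# A safetensors file starts with an 8-byte little-endian header
# length, followed by a JSON header mapping tensor names to
# byte offsets within the data section.
import json
import struct
from pathlib import Path

def check_shards(model_dir):
    for shard in sorted(Path(model_dir).glob("*.safetensors")):
        with open(shard, "rb") as f:
            header_len = struct.unpack("<Q", f.read(8))[0]
            header = json.loads(f.read(header_len))
        # The largest end offset must not exceed the payload size;
        # a truncated download fails this check.
        end = max((v["data_offsets"][1] for k, v in header.items()
                   if k != "__metadata__"), default=0)
        payload = shard.stat().st_size - 8 - header_len
        status = "OK" if payload >= end else "TRUNCATED"
        print(f"{shard.name}: {status}")
```

Running this over the model folder prints one line per shard, so the shard that triggers the webui error stands out as `TRUNCATED`.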

What do you use to load the model when using SillyTavern?

I use llama.cpp.

yamatazen changed discussion status to closed
