TypeError: MistralModel.__init__() got an unexpected keyword argument 'safe_serialization'
asma-10 opened this issue · 1 comment
asma-10 commented
Hello, I am working on a RAG model and using `HuggingFaceEmbedding(model_name='BAAI/bge-m3')`.
It used to work perfectly, but now I am encountering this error:
```
TypeError Traceback (most recent call last)
Cell In[17], line 1
----> 1 Settings.embed_model = HuggingFaceEmbedding(model_name='BAAI/bge-m3')
File /home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/llama_index/embeddings/huggingface/base.py:83, in HuggingFaceEmbedding.__init__(self, model_name, tokenizer_name, pooling, max_length, query_instruction, text_instruction, normalize, model, tokenizer, embed_batch_size, cache_folder, trust_remote_code, device, callback_manager, safe_serialization)
77 if model is None: # Use model_name with AutoModel
78 model_name = (
79 model_name
80 if model_name is not None
81 else DEFAULT_HUGGINGFACE_EMBEDDING_MODEL
82 )
---> 83 model = AutoModel.from_pretrained(
84 model_name,
85 cache_dir=cache_folder,
86 trust_remote_code=trust_remote_code,
87 safe_serialization=safe_serialization,
88 )
89 elif model_name is None: # Extract model_name from model
90 model_name = model.name_or_path
File /home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:566, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
564 elif type(config) in cls._model_mapping.keys():
565 model_class = _get_model_class(config, cls._model_mapping)
--> 566 return model_class.from_pretrained(
567 pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
568 )
569 raise ValueError(
570 f"Unrecognized configuration class {config.__class__} for this kind of AutoModel: {cls.__name__}.\n"
571 f"Model type should be one of {', '.join(c.__name__ for c in cls._model_mapping.keys())}."
572 )
File /home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/transformers/modeling_utils.py:3594, in PreTrainedModel.from_pretrained(cls, pretrained_model_name_or_path, config, cache_dir, ignore_mismatched_sizes, force_download, local_files_only, token, revision, use_safetensors, *model_args, **kwargs)
3588 config = cls._autoset_attn_implementation(
3589 config, use_flash_attention_2=use_flash_attention_2, torch_dtype=torch_dtype, device_map=device_map
3590 )
3592 with ContextManagers(init_contexts):
3593 # Let's make sure we don't run the init function of buffer modules
-> 3594 model = cls(config, *model_args, **model_kwargs)
3596 # make sure we use the model's config since the __init__ call might have copied it
3597 config = model.config
TypeError: MistralModel.__init__() got an unexpected keyword argument 'safe_serialization'
```
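For what it's worth, the failure seems to come from `safe_serialization` being forwarded by `from_pretrained` into the model constructor, so I would expect a bare call like the one below (outside llama_index) to hit the same code path on this transformers version. This is only my guess at a minimal repro, not something from the library docs:

```python
from transformers import AutoModel

# safe_serialization is normally a save_pretrained() argument; passing it to
# from_pretrained() appears to get forwarded into the model's __init__ as an
# unexpected keyword argument on this transformers version (my assumption).
model = AutoModel.from_pretrained(
    "BAAI/bge-m3",
    safe_serialization=True,
)
```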
I think it has to do with the installed transformers version, so I tried upgrading it with

```
pip install transformers --upgrade
```

but that didn't fix it. If anyone can help with this, I would be more than grateful.
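I am also wondering whether constructing the model and tokenizer up front and passing them in would sidestep the failing call, since the traceback shows `HuggingFaceEmbedding` only calls `AutoModel.from_pretrained(..., safe_serialization=...)` when `model` is None. A rough, untested sketch of what I mean:

```python
from transformers import AutoModel, AutoTokenizer
from llama_index.core import Settings
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Load the model and tokenizer directly, without the safe_serialization kwarg.
model = AutoModel.from_pretrained("BAAI/bge-m3")
tokenizer = AutoTokenizer.from_pretrained("BAAI/bge-m3")

# Pass them in so HuggingFaceEmbedding skips its own from_pretrained() call
# (the `if model is None` branch shown in the traceback above).
Settings.embed_model = HuggingFaceEmbedding(
    model_name="BAAI/bge-m3",
    model=model,
    tokenizer=tokenizer,
)
```

I have no idea if that is the intended way to use the `model` and `tokenizer` parameters, though.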
julien-c commented
I think this is an issue for the transformers repo, or, better, the Forum.