Adapter Merge for Idefics2
alielfilali01 opened this issue · 2 comments
Feature request
I have fine-tuned the Idefics2 model and obtained an adapter. However, when attempting to use the adapter, I encounter the following error:
```
ValueError                                Traceback (most recent call last)
Cell In[15], line 5
      2 trainer.save_model("output_yalla")
      4 # Load the PEFT model on CPU
----> 5 model = AutoPeftModelForCausalLM.from_pretrained(
      6     "Ali-C137/idefics2-8b-yalla-finetuned-cutural",
      7     torch_dtype=torch.float16,
      8 )
     10 # Merge the adapter with the base model
     11 merged_model = model.merge_and_unload()

File ~/miniconda/lib/python3.9/site-packages/peft/auto.py:100, in _BaseAutoPeftModel.from_pretrained(cls, pretrained_model_name_or_path, adapter_name, is_trainable, config, **kwargs)
     98     target_class = getattr(parent_library, base_model_class)
     99 else:
--> 100     raise ValueError(
    101         "Cannot infer the auto class from the config, please make sure that you are loading the correct model for your task type."
    102     )
    104 base_model = target_class.from_pretrained(base_model_path, **kwargs)
    106 tokenizer_exists = False

ValueError: Cannot infer the auto class from the config, please make sure that you are loading the correct model for your task type.
```
Implement support for merging adapters with the Idefics2 model. This functionality should allow users to seamlessly load their fine-tuned adapters and merge them with the base model.

Motivation
The motivation for this proposal is to enable merging adapters for the Idefics2 model. This feature is crucial for effectively utilizing fine-tuned adapters in downstream tasks. Without this capability, users who have invested time in fine-tuning Idefics2 cannot fully leverage their adapters, limiting the model's practical utility.
Your contribution
I am unable to submit a PR at this time.

Thank you for considering this feature request. Implementing this capability would greatly enhance the usability of the Idefics2 model for the community.