Luodian/Otter

loading pretrained model

Closed this issue · 5 comments

I downloaded the Otter weights (OTTER-MPT7B-Init). Where should I put the downloaded files? (For now I put them in /Otter/pipeline/demos/interactive/.)
Also, when I try to run the demo (file path: /Otter/pipeline/demos/interactive/otter_image.ipynb), the following line reports an error:
model = OtterForConditionalGeneration.from_pretrained("luodian/OTTER-MPT7B-Init", device_map="sequential")
error message:
OSError: We couldn't connect to 'https://huggingface.co/' to load this file, couldn't find it in the cached files and it looks like mosaicml/mpt-7b-instruct is not the path to a directory containing a file named config.json. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

I wonder how I can use the downloaded weights. Can the OTTER-MPT7B-Init weights be loaded by OtterForConditionalGeneration, or should I load them with: from otter_ai.models.flamingo.modeling_flamingo import FlamingoForConditionalGeneration?
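For anyone hitting the same connection error: transformers can be forced into offline mode so that from_pretrained only consults local files, and a plain local directory path can be passed instead of the hub repo id. A minimal sketch (the checkpoint path below is a hypothetical example, not a path from the repo):

```python
import os

# Force the transformers library to use only locally available files,
# avoiding the connection attempt that produced the OSError above
# (both variables are honored by recent transformers releases):
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

# from_pretrained also accepts a local directory, so the downloaded
# checkpoint can be passed by path instead of by repo id:
# model = OtterForConditionalGeneration.from_pretrained(
#     "/Otter/checkpoints/OTTER-MPT7B-Init",  # hypothetical local path
#     device_map="sequential",
# )
```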


Or should I load it with: from otter_ai.models.flamingo.modeling_flamingo import FlamingoForConditionalGeneration?

Yes, you should load it via Flamingo; that's the better way. However, you can still load it via OtterForConditionalGeneration, since we've handled the logic of loading different models.

As for the connection problem, maybe you need a VPN?


I have already downloaded the weights. How can I use these weights instead of directly downloading them from huggingface?
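For what it's worth, from_pretrained treats any local directory as a checkpoint as long as it contains the config and weight files. A small sketch of checking that layout before loading (the directory name is an assumption, not from the repo):

```python
import os

# Hypothetical directory holding the downloaded OTTER-MPT7B-Init files
# (config.json, tokenizer files, and the *.bin / *.safetensors shards).
local_dir = os.path.expanduser("~/checkpoints/OTTER-MPT7B-Init")

def looks_like_hf_checkpoint(path):
    """A directory is loadable by from_pretrained when it contains a
    config.json next to the weight shards."""
    return os.path.isfile(os.path.join(path, "config.json"))

# Pass the directory instead of the hub repo id, e.g.:
# model = OtterForConditionalGeneration.from_pretrained(local_dir)
```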


I put the downloaded weights under the huggingface cache path and they seem to load successfully, but I've run into another problem. When I load the weights via:

```python
if args.model_name.lower() == "otter":
    model = OtterForConditionalGeneration.from_pretrained(
        args.pretrained_model_name_or_path,
        **kwargs,
    )
```

it reports an error:

```
Traceback (most recent call last):
  File "/data/data/project/LLMs/Otter/pipeline/train/instruction_following.py", line 536, in <module>
    main()
  File "/data/data/project/LLMs/Otter/pipeline/train/instruction_following.py", line 332, in main
    model = OtterForConditionalGeneration.from_pretrained(
  File "/data/ldata/anaconda3/envs/otter/lib/python3.9/site-packages/transformers/modeling_utils.py", line 3236, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/data/data/anaconda3/envs/otter/lib/python3.9/site-packages/deepspeed/runtime/zero/partition_parameters.py", line 503, in wrapper
    f(module, *args, **kwargs)
  File "/data/data/project/LLMs/Otter/src/otter_ai/models/otter/modeling_otter.py", line 751, in __init__
    text_tokenizer = AutoTokenizer.from_pretrained("luodian/OTTER-MPT7B-Init")
  File "/data/data/anaconda3/envs/otter/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 733, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "/data/data/anaconda3/envs/otter/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 1064, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/data/data/anaconda3/envs/otter/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 761, in __getitem__
    raise KeyError(key)
KeyError: 'otter'
```

Is it because "otter" can't be used as a model_type? I'm confused.
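The KeyError comes from inside AutoConfig: it reads "model_type" out of the checkpoint's config.json and looks it up in a registry of known architectures, and a transformers build that doesn't know the "otter" type fails exactly this way. A toy reproduction of that lookup (the registry contents here are illustrative, not the real transformers mapping):

```python
# Toy stand-in for transformers' CONFIG_MAPPING registry.
CONFIG_MAPPING = {"mpt": object, "llama": object}

def resolve_config_class(config_dict):
    # Mirrors CONFIG_MAPPING[config_dict["model_type"]] in
    # transformers/models/auto/configuration_auto.py: an unknown
    # model_type raises KeyError, as in the traceback above.
    return CONFIG_MAPPING[config_dict["model_type"]]

try:
    resolve_config_class({"model_type": "otter"})
except KeyError as err:
    print("unregistered model_type:", err)
```

In practice this suggests the installed transformers either needs to match the version the repo pins, or the repo's own model classes (which know the custom type) have to be used rather than a bare Auto* lookup.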

Can you check your transformers version via pip show transformers?

You may need to follow our pinned version.

transformers is constantly updating, which can cause many unexpected errors.
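For reference, the installed version can also be checked programmatically, equivalent to what pip show transformers reports (the helper name below is my own):

```python
from importlib import metadata

def installed_version(pkg):
    """Return the installed version string for a distribution, or
    None when it is not installed (mirrors `pip show <pkg>`)."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None

print(installed_version("transformers"))
```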


My version of transformers is the same as the one you provided.

I've solved this problem by re-downloading the model weights from a huggingface mirror site. Thank you for your reply.