RUCKBReasoning/codes

codes-7b-merged load error

Closed this issue · 2 comments

When I load the model:

model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto", torch_dtype=torch.float16)

I get the following error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/mnt/disk3/home/ls/miniconda3/envs/codes/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 561, in from_pretrained
    return model_class.from_pretrained(
  File "/mnt/disk3/home/ls/miniconda3/envs/codes/lib/python3.8/site-packages/transformers/modeling_utils.py", line 3118, in from_pretrained
    raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory /mnt/disk3/home/ls/project/text2sql-demo/model/seeklhy/codes.
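For context, a quick directory listing like the sketch below shows whether the weight files the loader expects (e.g. pytorch_model*.bin or *.safetensors plus the corresponding index JSON) are actually present; the path is the one from the traceback:

import os

model_dir = "/mnt/disk3/home/ls/project/text2sql-demo/model/seeklhy/codes"
for name in sorted(os.listdir(model_dir)):
    size_mb = os.path.getsize(os.path.join(model_dir, name)) / 1e6
    print(f"{name}\t{size_mb:.1f} MB")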

Some of the relevant pip packages:

torch              1.13.1+cu117
torchaudio         0.13.1
torchvision        0.14.1
tokenizers         0.15.2
transformers       4.38.2
flash-attn         2.5.6

What am I doing wrong?
I also found a related issue:
oobabooga/text-generation-webui#122 (comment)

Hi!

As far as I know, the most common cause of this problem is missing key files when downloading the model from Hugging Face.

Ensure you've downloaded all the files from codes-7b-merged and placed them in the /mnt/disk3/home/ls/project/text2sql-demo/model/seeklhy/codes directory.
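If it helps, one way to fetch every file of the repo in one go is huggingface_hub's snapshot_download; the repo id seeklhy/codes-7b-merged below is an assumption based on the model name in this issue:

from huggingface_hub import snapshot_download

# Downloads all files of the model repo (weight shards, index, tokenizer, config)
# into local_dir. repo_id is assumed from the model name mentioned above.
snapshot_download(
    repo_id="seeklhy/codes-7b-merged",
    local_dir="/mnt/disk3/home/ls/project/text2sql-demo/model/seeklhy/codes",
)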

OK, I redownloaded the model and now it works! However, I checked the SHA256 of the model I downloaded before and it is the same, and its location was not the problem either.
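(For reference, a SHA256 check can be scripted like this; the shard file names are only illustrative:)

import hashlib

def sha256sum(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the old copy against the re-downloaded one (example file names).
print(sha256sum("old/pytorch_model-00001-of-00003.bin"))
print(sha256sum("new/pytorch_model-00001-of-00003.bin"))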