Error when loading weights
pkuhsczy opened this issue · 2 comments
OSError: Unable to load weights from pytorch checkpoint file for '/home/albay/zhaoyang/neukg/TechGPT-7B/pytorch_model.bin' at '/home/albay/zhaoyang/neukg/TechGPT-7B/pytorch_model.bin'. If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.
The above is the error I get when running the example from the project page directly.
If I add from_tf=True as the message suggests, I get the following error instead:
Traceback (most recent call last):
File "/home/albay/zhaoyang/techgpt.py", line 13, in <module>
model = AutoModelForCausalLM.from_pretrained(ckpt_path, torch_dtype=load_type, config=model_config, from_tf=True)
File "/root/miniconda3/envs/nlp_techgpt1/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 493, in from_pretrained
return model_class.from_pretrained(
File "/root/miniconda3/envs/nlp_techgpt1/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2869, in from_pretrained
model, loading_info = load_tf2_checkpoint_in_pytorch_model(
File "/root/miniconda3/envs/nlp_techgpt1/lib/python3.10/site-packages/transformers/modeling_tf_pytorch_utils.py", line 442, in load_tf2_checkpoint_in_pytorch_model
tf_model_class = getattr(transformers, tf_model_class_name)
File "/root/miniconda3/envs/nlp_techgpt1/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1092, in __getattr__
raise AttributeError(f"module {self.__name__} has no attribute {name}")
AttributeError: module transformers has no attribute TFLlamaForCausalLM
My torch and transformers are both the latest versions, so I'm not sure where the problem is.
Please check whether the weights at this path are complete: '/home/albay/zhaoyang/neukg/TechGPT-7B'
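One way to check completeness is to compare the local file's size and SHA-256 hash against the values shown on the model's Hugging Face Hub file listing (LFS files display their SHA-256 there). This is a minimal stdlib-only sketch; the checkpoint path is taken from the traceback above, and the expected hash is a placeholder you would copy from the Hub page:

```python
import hashlib
import os


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so a multi-GB checkpoint
    never has to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


if __name__ == "__main__":
    # Path from the error message; EXPECTED_SHA256 is a placeholder --
    # copy the real value from the file's page on the Hub.
    ckpt = "/home/albay/zhaoyang/neukg/TechGPT-7B/pytorch_model.bin"
    EXPECTED_SHA256 = "<sha256 shown on the Hub>"
    if os.path.exists(ckpt):
        print("size (bytes):", os.path.getsize(ckpt))
        print("sha256 match:", sha256_of(ckpt) == EXPECTED_SHA256)
    else:
        print("file not found:", ckpt)
```

A truncated or partially downloaded file will show a smaller size and a mismatching hash, which is exactly the situation that makes torch fail with "Unable to load weights from pytorch checkpoint file".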
You're right: the weight file I downloaded (the one with "00" in its filename) was incomplete. With the complete weights it runs fine.