jianzhnie/LLamaTuner

baichuan-7B: AttributeError: 'CastOutputToFloat' object has no attribute 'weight'

Closed this issue · 1 comment

```
chatllms - INFO - Adding special tokens.
Using pad_token, but it is not set yet.
Traceback (most recent call last):
  File "/content/drive/MyDrive/Efficient-Tuning-LLMs/train_qlora.py", line 156, in
    main()
  File "/content/drive/MyDrive/Efficient-Tuning-LLMs/train_qlora.py", line 80, in main
    add_special_tokens_if_missing(tokenizer, model)
  File "/content/drive/MyDrive/Efficient-Tuning-LLMs/chatllms/utils/model_utils.py", line 47, in add_special_tokens_if_missing
    smart_tokenizer_and_embedding_resize(special_tokens_dict, tokenizer,
  File "/content/drive/MyDrive/Efficient-Tuning-LLMs/chatllms/utils/model_utils.py", line 77, in smart_tokenizer_and_embedding_resize
    model.resize_token_embeddings(len(tokenizer))
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 1395, in resize_token_embeddings
    model_embeds = self._resize_token_embeddings(new_num_tokens)
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 1416, in _resize_token_embeddings
    new_lm_head = self._get_resized_lm_head(old_lm_head, new_num_tokens)
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 1520, in _get_resized_lm_head
    old_lm_head.weight.size() if not transposed else old_lm_head.weight.t().size()
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1614, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'CastOutputToFloat' object has no attribute 'weight'
```

Could someone tell me what the problem is?
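The traceback suggests that by the time `resize_token_embeddings` runs, the model's `lm_head` has already been replaced by a `CastOutputToFloat` wrapper (a module QLoRA-style training setups put around the output head to cast logits to float32). `transformers` then looks for `old_lm_head.weight` on the wrapper, which doesn't expose one. Here is a minimal, framework-free sketch of the failure mode, using hypothetical stand-in classes (`Linear`, `CastOutputToFloat`) rather than the real `torch`/`transformers` objects:

```python
class Linear:
    """Stand-in for nn.Linear: the real lm_head, which does have a `weight`."""
    def __init__(self, vocab_size, hidden_size):
        self.weight = [[0.0] * hidden_size for _ in range(vocab_size)]


class CastOutputToFloat:
    """Stand-in for the float32-casting wrapper placed around lm_head.

    It holds the real head in `self.module` but defines no `weight`
    attribute of its own, so `head.weight` raises AttributeError —
    exactly what `_get_resized_lm_head` trips over.
    """
    def __init__(self, module):
        self.module = module


head = CastOutputToFloat(Linear(vocab_size=32000, hidden_size=16))

print(hasattr(head, "weight"))          # the wrapper has no weight
print(hasattr(head.module, "weight"))   # the real head, one level down, does
```

The usual remedy is to reorder the setup: call `add_special_tokens_if_missing` / `model.resize_token_embeddings(...)` on the bare model *before* the quantized-training preparation step wraps `lm_head`, so the resize code still sees a plain linear head with a `weight`.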

This bug has been solved.