OpenLMLab/LOMO

AdaLomo raises an error with the ChatGLM2 model

JorunoJobana opened this issue · 2 comments

Hello, and thank you for your excellent work. I ran into a problem when using AdaLomo with the ChatGLM2 model:

Traceback (most recent call last):
File "/home/pycharmProjcet/adalomo/instruction-tuning/train_dx_chatglm2.py", line 276, in <module>
train()
File "/home/pycharmProjcet/adalomo/instruction-tuning/train_dx_chatglm2.py", line 259, in train
trainer.train()
File "/home/anaconda3/envs/adalomo/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/controller/trainer.py", line 364, in train
loss = self.train_fn(self, batch, self.global_batch_idx)
File "/home/anaconda3/envs/adalomo/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/controller/trainer.py", line 434, in train_fn
outputs = trainer.engine(**batch)
File "/home/anaconda3/envs/adalomo/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1194, in _call_impl
return forward_call(*input, **kwargs)
File "/home/anaconda3/envs/adalomo/lib/python3.10/site-packages/deepspeed-0.11.1-py3.10.egg/deepspeed/utils/nvtx.py", line 15, in wrapped_fn
ret_val = func(*args, **kwargs)
File "/home/anaconda3/envs/adalomo/lib/python3.10/site-packages/deepspeed-0.11.1-py3.10.egg/deepspeed/runtime/engine.py", line 1807, in forward
loss = self.module(*inputs, **kwargs)
File "/home/anaconda3/envs/adalomo/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1212, in _call_impl
result = forward_call(*input, **kwargs)
File "/home/anaconda3/envs/adalomo/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/models/chatglm2/model.py", line 646, in forward
past_key_values = self._get_past_key_values(self.layers)
File "/home/anaconda3/envs/adalomo/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1269, in __getattr__
raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'ChatGLM2ForCausalLM' object has no attribute '_get_past_key_values'

I used this project's AdaLomo instruction-tuning train.py example and replaced the model-loading code with the corresponding code from collie/examples/finetune_chatglm2_for_summary.py.

Hi, line 646 of the chatglm2 model on the dev branch does not match your error message. Please update to the latest dev branch and try again:
https://github.com/OpenLMLab/collie/blob/3b35692e54bb81f1966d1c785e435e38dcb4a437/collie/models/chatglm2/model.py#L646

Hello, after updating to the latest dev branch and retraining ChatGLM2 with AdaLomo, I now get an error during the evaluation stage when I set the max_new_tokens parameter. The error message:
Traceback (most recent call last):
File "/home/pycharmProjcet/adalomo/instruction-tuning/train_dx_chatglm2.py", line 323, in <module>
train()
File "/home/pycharmProjcet/adalomo/instruction-tuning/train_dx_chatglm2.py", line 306, in train
trainer.train()
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/controller/trainer.py", line 383, in train
self.eval()
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/controller/trainer.py", line 405, in eval
results = evaluator.eval(dataloader)
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/controller/evaluator.py", line 173, in eval
result = self.eval_fn(self, batch)
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/controller/evaluator.py", line 251, in eval_fn
generated_ids = evaluator.engine.module.generate(
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/models/base.py", line 91, in generate
res = super().generate(*args, **kwargs)
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/transformers/generation/utils.py", line 1602, in generate
return self.greedy_search(
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/transformers/generation/utils.py", line 2450, in greedy_search
outputs = self(
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1212, in _call_impl
result = forward_call(*input, **kwargs)
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/models/chatglm2/model.py", line 778, in forward
output = self.model(
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1212, in _call_impl
result = forward_call(*input, **kwargs)
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/models/chatglm2/model.py", line 677, in forward
inputs.update(layer(inputs))
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1212, in _call_impl
result = forward_call(*input, **kwargs)
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/models/chatglm2/model.py", line 591, in forward
full_attention_mask = self.get_masks(
File "/home/anaconda3/envs/adalomo_dev/lib/python3.10/site-packages/collie_lm-1.0.5-py3.10.egg/collie/models/chatglm2/model.py", line 428, in get_masks
if past_key_values:
RuntimeError: Boolean value of Tensor with more than one value is ambiguous
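The RuntimeError comes from the bare `if past_key_values:` check in get_masks: once the KV cache is a multi-element tensor, Python's truthiness test on it raises. A minimal stand-in (FakeTensor is hypothetical, only mimicking torch.Tensor's `__bool__` behavior) reproduces the failure and shows the usual fix, testing for presence with `is not None` instead:

```python
class FakeTensor:
    """Mimics torch.Tensor.__bool__: truthiness is only defined for one element."""

    def __init__(self, values):
        self.values = values

    def __bool__(self):
        if len(self.values) != 1:
            raise RuntimeError(
                "Boolean value of Tensor with more than one value is ambiguous")
        return bool(self.values[0])


past_key_values = FakeTensor([1.0, 2.0, 3.0])  # stand-in for a cached KV tensor

# Buggy check: asks for the truth value of a multi-element "tensor"
try:
    if past_key_values:
        pass
except RuntimeError as e:
    print(e)  # prints the same message as the traceback above

# Safe check: test for presence, not truthiness
if past_key_values is not None:
    print("cache present")
```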

Here is my generation_config:
generation_config = GenerationConfig(
    eos_token_id=tokenizer.eos_token_id,
    pad_token_id=tokenizer.pad_token_id,
    max_new_tokens=5,
)
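Setting max_new_tokens is what exposes the bug: on the first (prefill) step, generate() calls forward() with past_key_values=None, but on every later decode step it feeds the cached key/value tensors back in, which is where the bare truthiness check fails. A toy sketch of that control flow (all names hypothetical, no real model involved):

```python
def get_masks(past_key_values):
    """Safe presence check instead of `if past_key_values:`."""
    if past_key_values is not None:
        return "decode-step mask"
    return "prefill mask"


def greedy_decode(max_new_tokens):
    """Mimics generate(): the cache is None only on the very first step."""
    past = None  # no KV cache on the prefill step
    masks = []
    for _ in range(max_new_tokens):
        masks.append(get_masks(past))
        past = [[0.1, 0.2]]  # stand-in for the multi-element KV-cache tensors
    return masks


print(greedy_decode(3))  # ['prefill mask', 'decode-step mask', 'decode-step mask']
```

With max_new_tokens=1 only the prefill step runs, so a bare `if past_key_values:` would never see a tensor; any value above 1 reaches the decode steps and triggers the error.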