lich99/ChatGLM-finetune-LoRA

Error report

Opened this issue · 0 comments

fxb392 commented

```python
return {'input_ids': torch.tensor(input_ids).long(),
        'attention_mask': attention_mask,
        'labels': torch.stack(labels),
        'position_ids': torch.stack(position_ids)}
```
Why does this use both torch.tensor() and torch.stack()?
If torch.stack() is not used, it sometimes raises ValueError: expected sequence of length 50 at dim 1 (got 49). Could you explain why?
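
For context, a minimal sketch of the behavior behind that error (the inputs here are illustrative; it assumes `input_ids` is a list of plain Python lists and `labels` is a list of tensors, which the mixed calls in the snippet suggest):

```python
import torch

# torch.tensor() builds one tensor from nested Python lists; every inner
# sequence must have the same length, or it raises the ValueError above.
torch.tensor([[1, 2, 3], [4, 5, 6]]).long()   # ok, shape (2, 3)

try:
    torch.tensor([[1, 2, 3], [4, 5]])         # ragged lengths
except ValueError as e:
    print(e)  # expected sequence of length 3 at dim 1 (got 2)

# torch.stack() takes a list of tensors (not plain Python lists) and also
# requires equal shapes; on a mismatch it raises a RuntimeError instead.
labels = [torch.tensor([1, 2, 3]), torch.tensor([4, 5, 6])]
torch.stack(labels)                            # ok, shape (2, 3)
```

In both cases the sequences in a batch must already be padded to a uniform length before conversion; the choice between the two calls mainly depends on whether the elements are plain lists or tensors.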