Error when running vllm_offline.py
zjjznw123 opened this issue · 3 comments
zjjznw123 commented
Environment: CUDA 12.1, torch 2.1.0, vllm 0.2.2. Following the steps for your version, this line fails: self.stop_words_ids=[self.tokenizer.im_start_id,self.tokenizer.im_end_id,self.tokenizer.eos_token_id]
AttributeError: 'Qwen2TokenizerFast' object has no attribute 'im_start_id'. What am I doing wrong?
XingYu-Zhong commented
I'm hitting the same problem. Any idea what's wrong?
rivergao87 commented
How did you solve it?
XingYu-Zhong commented
I was using the wrong Qwen version: mine was Qwen1.5. The Qwen1 tokenizer exposes im_start_id/im_end_id as attributes, but Qwen1.5's Qwen2TokenizerFast does not, which is why that line raises AttributeError.
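For anyone stuck on this, here is a minimal sketch of a workaround. It assumes the Qwen1.5 tokenizer still defines the ChatML special tokens `<|im_start|>` and `<|im_end|>` as vocabulary entries (they just aren't exposed as attributes), so they can be resolved by string via `convert_tokens_to_ids`. The helper name `get_stop_token_ids` is made up for illustration:

```python
def get_stop_token_ids(tokenizer):
    """Return stop token ids for both Qwen1- and Qwen1.5-style tokenizers.

    Qwen1's tokenizer has im_start_id / im_end_id attributes; Qwen1.5's
    Qwen2TokenizerFast does not, so we fall back to looking the ChatML
    special tokens up by their string form.
    """
    if hasattr(tokenizer, "im_start_id"):
        # Qwen1-style tokenizer: ids are exposed directly as attributes.
        return [tokenizer.im_start_id,
                tokenizer.im_end_id,
                tokenizer.eos_token_id]
    # Qwen1.5-style tokenizer: resolve the special tokens by name.
    return [tokenizer.convert_tokens_to_ids("<|im_start|>"),
            tokenizer.convert_tokens_to_ids("<|im_end|>"),
            tokenizer.eos_token_id]
```

Then `self.stop_words_ids = get_stop_token_ids(self.tokenizer)` should work with either model family. (Untested against every Qwen release; treat it as a starting point.)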