xusenlinzy/api-for-open-llm

Garbled replies when launching the baichuan2-7b model with vLLM

hzs0828 opened this issue · 2 comments

The following items must be checked before submission

  • Make sure you are using the latest code from the repository (git pull); some issues have already been addressed and fixed.
  • I have read the FAQ section of the project documentation and searched the existing issues/discussions; no similar problem or solution was found.

Type of problem

Model inference and deployment

Operating system

Linux

Detailed description of the problem

# Paste the runtime code here (delete this code block if you don't have it)

Model: baichuan2-7b
When the model is launched with vLLM enabled, the replies are garbled, as in the screenshot below:
[screenshot: garbled reply with vLLM enabled]
When vllm is set to false at launch, the replies look normal, as in the screenshot below:
[screenshot: normal reply with vLLM disabled]

What could be causing this?
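To narrow down whether the garbling comes from vLLM itself or from the server.py wrapper, it may help to query the model directly through vLLM's offline API and compare the output. A minimal sketch of such a check is below; the model path and sampling parameters are assumptions and should be adjusted to the local setup:

```python
# Minimal sketch: query Baichuan2-7B directly through vLLM's offline API,
# bypassing server.py, to see whether the garbled output reproduces in vLLM itself.
# The model path and sampling parameters are assumptions; adjust as needed.
from vllm import LLM, SamplingParams

llm = LLM(
    model="baichuan-inc/Baichuan2-7B-Chat",  # assumed local/HF path
    trust_remote_code=True,                   # Baichuan2 ships custom model code
)
sampling = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=256)

outputs = llm.generate(["你好，请介绍一下你自己。"], sampling)
for out in outputs:
    print(out.outputs[0].text)
```

If this standalone call also produces garbled text, the problem is in vLLM rather than in this project's serving code.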

Dependencies

# Please paste the dependencies here

Runtime logs or screenshots

# Please paste the run log here

The model is launched with the server.py script.

vllm-project/vllm#1085

This looks like a bug in vLLM.
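As a temporary workaround until the upstream issue is resolved, one option is to keep vLLM disabled and serve Baichuan2-7B through the plain transformers path, which the issue reporter already found to produce normal replies. A minimal sketch of that fallback is below; the model path is an assumption, and the chat() method comes from Baichuan2's published remote code, not from this project's API:

```python
# Minimal sketch of a non-vLLM fallback: load Baichuan2-7B with plain transformers.
# The model path is an assumption; model.chat() is provided by Baichuan2's remote code.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

path = "baichuan-inc/Baichuan2-7B-Chat"  # assumed model path
tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    path, torch_dtype=torch.float16, device_map="auto", trust_remote_code=True
)
model.generation_config = GenerationConfig.from_pretrained(path)

messages = [{"role": "user", "content": "你好，请介绍一下你自己。"}]
print(model.chat(tokenizer, messages))
```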