THUDM/ChatGLM-6B

[BUG/Help] ValueError: not enough values to unpack (expected 2, got 1)

ChenSir254 opened this issue · 2 comments

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

An error is raised when running the command-line chat demo.

Expected Behavior

No response

Steps To Reproduce

1. Clone the ChatGLM3 code from GitHub
2. Download the chatglm3-6b-base model from Hugging Face
3. Set up the environment
4. Run `python cli_demo.py` (a sketch of the model loading this performs follows this list)
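For reference, cli_demo.py loads the model with the standard `trust_remote_code` pattern used across the ChatGLM repos; below is a minimal sketch, assuming the weights were downloaded to a local directory (the path is a placeholder):

from transformers import AutoTokenizer, AutoModel

# Placeholder path: point this at the locally downloaded chatglm3-6b-base weights.
MODEL_PATH = "./chatglm3-6b-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True).half().cuda()
model = model.eval()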

Environment

- OS: Ubuntu 20.04
- Python: 3.7
- Transformers: 4.26.1
- PyTorch: 2.30
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`): True

Anything else?

No response

Is it solved?

The build_prompt method in cli_demo.py has a bug; it can be modified as follows:

def build_prompt(history):
    prompt = "欢迎使用 ChatGLM-6B 模型,输入内容即可进行对话,clear 清空对话历史,stop 终止程序"
    # The history is a list of {"role": ..., "content": ...} dicts (one per message),
    # so label each entry by its role instead of unpacking (query, response) pairs.
    for item in history:
        if item["role"] == "user":
            prompt += f"\n\n用户:{item['content']}"
        else:
            prompt += f"\n\nChatGLM-6B:{item['content']}"
    return prompt
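For a quick sanity check, the patched function can be called with a ChatGLM3-style history (a list of role/content dicts); the sample messages below are made up for illustration:

# Hypothetical history in the format assumed above: one dict per message.
sample_history = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi, how can I help you?"},
]

# Prints the welcome line followed by one 用户 turn and one ChatGLM-6B turn.
print(build_prompt(sample_history))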