THUDM/ChatGLM-6B

[BUG/Help] chat does not jump into the locally downloaded model's modeling_chatglm.py file


Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

Why does model.chat jump into .cache\huggingface\modules\transformers_modules\modeling_chatglm.py when executing, instead of into the modeling_chatglm.py of the model I downloaded locally?
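For reference, a minimal sketch of how the model is loaded (the local directory path is a placeholder). As far as I understand, with trust_remote_code=True, transformers copies modeling_chatglm.py into ~/.cache/huggingface/modules/transformers_modules and imports it from there, which would explain why stepping into model.chat lands in the cached copy rather than the local file:

```python
from transformers import AutoTokenizer, AutoModel

# Placeholder: local directory containing the downloaded ChatGLM-6B weights and code
local_model_dir = "./chatglm-6b"

# trust_remote_code=True makes transformers copy the custom modeling code
# into ~/.cache/huggingface/modules/transformers_modules and import it from there
tokenizer = AutoTokenizer.from_pretrained(local_model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(local_model_dir, trust_remote_code=True).half().cuda()
model = model.eval()

# Stepping into this call ends up in the cached modeling_chatglm.py
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```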

Expected Behavior

No response

Steps To Reproduce

Default setup

Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :

Anything else?

No response