Cannot connect to local LLM
Opened this issue · 1 comment
KANG-HYUNIL commented
Checked other resources
- I searched the Codefuse documentation with the integrated search.
- I used the GitHub search to find a similar question and didn't find it.
- I am sure that this is a bug in Codefuse-Repos rather than my code.
- I added a very descriptive title to this issue.
System Info
Windows
Code Version
Latest Release
Description
I have already changed VLLM_MODEL_DICT in model_config.py and downloaded chatglm-6b into the llm_models folder. The Docker image has some dependency errors, so I used the local services instead. Even though I modified the code as described in fastchat.md, I cannot connect to the local LLM, and nothing is recorded in llm_api.log or sdfile_api.log.
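Since nothing shows up in llm_api.log or sdfile_api.log, one quick sanity check is whether the services ever bound their ports. Below is a minimal sketch in Python; the host and port values are assumptions, not the project's documented defaults, and should be replaced with whatever your server configuration (e.g. server_config.py) actually uses.

import socket

# Assumed host/port pairs -- replace with the values from your configuration.
SERVICES = {
    "llm_api": ("127.0.0.1", 8888),     # assumed port for the LLM API service
    "sdfile_api": ("127.0.0.1", 7862),  # assumed port for the sdfile API service
}

def is_listening(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, (host, port) in SERVICES.items():
        status = "listening" if is_listening(host, port) else "NOT listening"
        print(f"{name} on {host}:{port} -> {status}")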
Example Code
# This is how I changed VLLM_MODEL_DICT
VLLM_MODEL_DICT = VLLM_MODEL_DICT or {
    'chatglm2-6b': "chatglm-6b",
}
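Note that the key and the value in the snippet above name different models ('chatglm2-6b' vs. "chatglm-6b"). If, as the snippet suggests, VLLM_MODEL_DICT maps a served model name to a local directory under llm_models, the value may need to match the folder that was actually downloaded; this is only a guess based on the code shown here, not a confirmed fix.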
Error Message and Stack Trace (if applicable)
No response
Dzz2004 commented
Check whether sdfile_api has been started; you can run it from a Git Bash terminal.
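One quick way to act on this suggestion is to check whether the expected log files exist and contain anything; a service that never started usually leaves them missing or empty. This is a minimal sketch, and the logs directory below is an assumption to adjust for your setup.

from pathlib import Path

# Assumed log directory; change LOG_DIR to wherever your services write logs.
LOG_DIR = Path("logs")

for name in ("llm_api.log", "sdfile_api.log"):
    path = LOG_DIR / name
    if not path.exists():
        print(f"{path}: missing -- the service probably never started")
    elif path.stat().st_size == 0:
        print(f"{path}: empty -- the service may have exited before logging anything")
    else:
        print(f"{path}: {path.stat().st_size} bytes logged")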