Support for local LLMs via the OpenAI API
Closed this issue · 2 comments
I started a qwen14b model locally and ran tests/chains_test.py; it fails with the following error:
2024-03-27 16:30:00.221 | WARNING | coagent.sandbox.pycodebox:_check_port:245 - Port is conflict, please check your codebox's port 5050
2024-03-27 16:30:00.224 | INFO | coagent.sandbox.pycodebox:start:311 - port_status: True, connect_status: True
2024-03-27 16:30:00,231 - openai.py[line:193] - WARNING: WARNING! stop is not default parameter.
stop was transferred to model_kwargs.
Please confirm that stop is what you intended.
Knowledge base default/general_planner/recall cache refreshed: 1
2024-03-27 16:30:15,264 - before_sleep.py[line:65] - WARNING: Retrying langchain.chat_models.openai.ChatOpenAI.completion_with_retry.._completion_with_retry in 4.0 seconds as it raised APIError: Invalid response object from API: 'upstream request timeout' (HTTP response code was 504).
2024-03-27 16:30:34,271 - before_sleep.py[line:65] - WARNING: Retrying langchain.chat_models.openai.ChatOpenAI.completion_with_retry.._completion_with_retry in 4.0 seconds as it raised APIError: Invalid response object from API: 'upstream request timeout' (HTTP response code was 504).
2024-03-27 16:30:53,279 - before_sleep.py[line:65] - WARNING: Retrying langchain.chat_models.openai.ChatOpenAI.completion_with_retry.._completion_with_retry in 4.0 seconds as it raised APIError: Invalid response object from API: 'upstream request timeout' (HTTP response code was 504).
2024-03-27 16:31:12,289 - before_sleep.py[line:65] - WARNING: Retrying langchain.chat_models.openai.ChatOpenAI.completion_with_retry.._completion_with_retry in 8.0 seconds as it raised APIError: Invalid response object from API: 'upstream request timeout' (HTTP response code was 504).
openai.error.APIError: Invalid response object from API: 'upstream request timeout' (HTTP response code was 504)
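The repeated 504 "upstream request timeout" errors suggest the client never gets a valid answer from the local model server. Before debugging the framework itself, it can help to hit the endpoint directly. A minimal sketch using only the standard library; the base URL `http://localhost:8888/v1` and the model name `qwen14b` are assumptions from this issue, adjust them to your deployment:

```python
import json
import urllib.request


def chat_completions_url(base_url: str) -> str:
    """Build the OpenAI-compatible chat endpoint from a base URL."""
    return base_url.rstrip("/") + "/chat/completions"


def ping_local_llm(base_url: str = "http://localhost:8888/v1",
                   model: str = "qwen14b") -> dict:
    """Send one tiny chat request to verify the local server answers."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
    }).encode("utf-8")
    req = urllib.request.Request(
        chat_completions_url(base_url),
        data=payload,
        headers={
            "Content-Type": "application/json",
            # Many local OpenAI-compatible servers accept any key.
            "Authorization": "Bearer EMPTY",
        },
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    print(ping_local_llm())
```

If this request also times out, the problem is in the model server (or the URL/port), not in the framework's OpenAI client.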
Add a log line in ~/llm_models/openai_model.py to confirm the config is correct; for a locally started service the url should be http://localhost:8888/v1:
def getChatModelFromConfig(llm_config: LLMConfig, callBack: AsyncIteratorCallbackHandler = None) -> Union[ChatOpenAI, LLM]:
    logger.debug(f"llm type is {llm_config}")
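A common pitfall with local OpenAI-compatible servers is a base URL missing the /v1 suffix, which produces exactly this kind of invalid-response error. A small, hypothetical helper (not part of the project, shown only to illustrate the check) that could normalize the URL before the client is constructed:

```python
def normalize_api_base(url: str) -> str:
    """Ensure an OpenAI-compatible base URL ends with /v1."""
    url = url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url
```

For example, `normalize_api_base("http://localhost:8888")` yields `"http://localhost:8888/v1"`, while an already-correct URL is left unchanged.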
If this resolves the problem, the issue will be closed this Friday; feel free to reopen it if anything else comes up.