openLLM configuration issues
yananchen1989 opened this issue · 2 comments
yananchen1989 commented
Hi there,
For open models such as Mistral, ChatGLM3-6B-32K, etc., in your code
https://github.com/OSU-NLP-Group/TravelPlanner/blob/main/agents/tool_agents.py#L127,
why is model_name="gpt-3.5-turbo"?
May I know the reason? Thanks.
hsaest commented
Hi Yanan,
Thank you for your interest in our work.
LangChain expects OpenAI model names by default, so we assign a faux OpenAI model name to our local models.
We use FastChat to deploy our local models; you can see the details here.
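To illustrate the pattern described above: FastChat exposes an OpenAI-compatible HTTP endpoint, and the `model` field in a request is just a string the server matches against its registered workers, so a locally deployed model can be served under a faux name like "gpt-3.5-turbo". The sketch below builds such a request with only the standard library; the local URL and port are assumptions about a typical FastChat setup, not values from this repository.

```python
import json
from urllib import request

# Assumed local endpoint: FastChat's OpenAI-compatible server
# (e.g. started with `python -m fastchat.serve.openai_api_server`)
# commonly listens on http://localhost:8000/v1.
API_BASE = "http://localhost:8000/v1"

def build_chat_request(prompt, model_name="gpt-3.5-turbo"):
    """Build an OpenAI-style chat completion request.

    The `model` field is an arbitrary string matched server-side, so a
    local model can be registered under a faux OpenAI name to satisfy
    client libraries (such as LangChain) that default to OpenAI names.
    """
    payload = {
        "model": model_name,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

# The request targets the local server but carries the faux model name.
req = build_chat_request("Plan a 3-day trip.")
```

Sending `req` with `urllib.request.urlopen` would reach the local FastChat worker, not OpenAI, despite the "gpt-3.5-turbo" label.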
Feel free to contact us if you have further questions.
Best,
Jian
yananchen1989 commented
thanks a lot.