haseeb-heaven/code-interpreter

Local model not working, as it expects an OpenAI API key.

bcosculluela opened this issue · 3 comments

Hello!
Regarding this issue, I am currently using LM Studio. When I use a local model, it does not work. As I can see in the code, in interpreter_lib.py, line 324, the variable custom_llm_provider is set to 'openai', so it expects the OpenAI API key. What should the value of this variable be when using open-source LLMs such as Mistral?
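
For context, here is a minimal sketch (not the repo's actual code) of why forcing custom_llm_provider to 'openai' in a LiteLLM call demands an OpenAI key, and one possible workaround that keeps the OpenAI-compatible route but points it at LM Studio's local server. The api_base URL, model name, and dummy key below are assumptions for illustration only:

```python
import litellm

messages = [{"role": "user", "content": "Hello"}]

# With custom_llm_provider forced to "openai" and no api_base, LiteLLM
# routes the request to api.openai.com and requires OPENAI_API_KEY,
# which is exactly the error seen with a purely local setup:
# litellm.completion(model="mistral", messages=messages,
#                    custom_llm_provider="openai")  # -> auth error

# Possible workaround: keep the OpenAI-compatible route but point it at
# LM Studio's local server (assumed default http://localhost:1234/v1)
# with a placeholder key, since the local server does not validate it.
response = litellm.completion(
    model="openai/local-model",           # hypothetical local model name
    messages=messages,
    api_base="http://localhost:1234/v1",  # assumed LM Studio default
    api_key="lm-studio",                  # dummy value, not a real key
)
print(response.choices[0].message.content)
```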

Okay, I will take a look at this issue today.

Thanks! In my case, the issue was solved by setting:
model = "ollama/llama2"
and removing the custom_llm_provider variable.
Just in case it helps! 😄
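
For anyone hitting the same problem, a minimal sketch of that fix as a standalone LiteLLM call (assuming a local Ollama server on its default port with the llama2 model pulled; this is not the exact change in interpreter_lib.py):

```python
import litellm

# Routing by model prefix: "ollama/llama2" tells LiteLLM to use the local
# Ollama provider, so no custom_llm_provider and no OpenAI API key are
# needed. Assumes Ollama is running on its default port (11434).
response = litellm.completion(
    model="ollama/llama2",
    messages=[{"role": "user", "content": "Write a hello-world in Python."}],
)
print(response.choices[0].message.content)
```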

Fixed this bug in this PR: #14