Possible to use with a local LLM?
chrisbward opened this issue · 4 comments
chrisbward commented
As titled, I would prefer to use a local LLM instead of OpenAI's GPT. I arrived here via this tutorial/introduction to RAG;
henomis commented
I suggest using LocalAI with a custom LLM, then connecting LinGoose to LocalAI through a custom OpenAI client (`WithClient()`) pointed at the local endpoint.
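A minimal sketch of that suggestion, assuming LinGoose's OpenAI LLM wrapper exposes a `WithClient` option as mentioned above, and that LocalAI is serving its OpenAI-compatible API at `http://localhost:8080/v1` (both the endpoint and the model name `ggml-gpt4all-j` are placeholder assumptions, not confirmed by this thread):

```go
package main

import (
	"context"
	"fmt"

	goopenai "github.com/sashabaranov/go-openai"

	"github.com/henomis/lingoose/llm/openai"
)

func main() {
	// Build a go-openai client whose base URL points at the local
	// LocalAI server instead of api.openai.com. LocalAI ignores the
	// API key, so any non-empty string works here.
	config := goopenai.DefaultConfig("local-ai-does-not-check-this")
	config.BaseURL = "http://localhost:8080/v1" // assumed LocalAI endpoint

	client := goopenai.NewClientWithConfig(config)

	// Hand the custom client to LinGoose via WithClient, and select
	// whichever model name LocalAI exposes for your local weights.
	llm := openai.NewCompletion().
		WithClient(client).
		WithModel("ggml-gpt4all-j") // assumed local model name

	response, err := llm.Completion(context.Background(), "Say hello!")
	if err != nil {
		panic(err)
	}
	fmt.Println(response)
}
```

The key idea is that LocalAI speaks the OpenAI wire protocol, so only the client's base URL changes; the rest of the LinGoose code stays the same as when talking to OpenAI.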
airtonix commented
henomis commented
@airtonix I will check this project and the possibility of integrating it into Lingoose. Thanks for the suggestion.
henomis commented
Ollama will be supported in the next lingoose version.