Ollama support
gregorym opened this issue · 4 comments
gregorym commented
Ollama support
buhe commented
Hey, Ollama focuses on local models, and this lib (https://github.com/buhe/langchain-swift) currently does not support it.
FYI, #40
buhe commented
Ollama does expose a remote API, though; maybe we could use that.
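For reference, Ollama's HTTP API listens on port 11434 by default and accepts JSON requests at `/api/generate`. A minimal sketch of calling it (shown in Python for brevity, since the endpoint is language-agnostic; a running local Ollama server and the `llama2` model are assumptions):

```python
import json
from urllib import request

# Default Ollama endpoint (assumes a local server is running).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model, prompt):
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of
    newline-delimited streaming chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt, url=OLLAMA_URL):
    """POST a prompt to Ollama and return the generated text."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a local Ollama server with the model pulled):
# print(generate("llama2", "Why is the sky blue?"))
```

The same request could be issued from Swift with `URLSession` and `Codable` structs, which is presumably how the lib would wrap it.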
rhx commented
Given that a lot of people are using Ollama, and that LM Studio cannot easily share models with Ollama, it might be worth reconsidering Ollama support. I have been working on this and will send a pull request soon.