goniszewski/grimoire

No way to choose model in Ollama

Closed this issue · 6 comments

There is nothing to choose in the model dropdown.

(Screenshot: the model dropdown is empty)

curl http://10.0.0.76:11434/
Ollama is running

I am not sure if Ollama is expected to be working in the current release.
If not, you can close the issue.

Hello @SoiledBrush! Integration with Ollama is an early preview. Nonetheless, it should fetch the tags of models available locally.

I have just tested it on a fresh Grimoire + Ollama installation, and it works just fine. Please keep in mind that you need to press the TEST button to make the first call to Ollama's API.
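To confirm that the models endpoint itself is reachable, you can query it directly; a quick check assuming the standard Ollama API and the host/port from your curl output above:

curl http://10.0.0.76:11434/api/tags

It should return a JSON object whose models array lists the locally installed models that the dropdown is populated from.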

(Screenshots: the model dropdown populated on a fresh installation)

OK, the TEST button works if I start Grimoire on the local machine, but it doesn't work if I run it on my "homelab".
Maybe it has something to do with the fact that I changed ORIGIN=http://localhost:5173 to http://10.0.0.254:5173 to avoid the "cross-site form submissions forbidden" error.

You should probably configure Ollama to accept incoming requests from Grimoire's origin. Please refer to this document for more information: https://github.com/jmorganca/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network
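If network exposure alone is not enough, the allowed origins may also need to include Grimoire's address. A minimal sketch, assuming a systemd-managed Ollama and the OLLAMA_ORIGINS variable that Ollama supports, using the origin you mentioned above:

[Service]
Environment="OLLAMA_ORIGINS=http://10.0.0.254:5173"

After editing the drop-in file, apply it with systemctl daemon-reload followed by systemctl restart ollama.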

cat /etc/systemd/system/ollama.service.d/environment.conf     
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"

I already have that set up.

OK, I see. I don't know the exact origin of the request that comes from Grimoire. The call to the Ollama API is made client-side, so you can inspect the request and its response in the Network tab of your browser.
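One way to narrow this down outside the browser is to replay the call with the Origin header set to Grimoire's address and inspect the CORS headers in the response; a sketch assuming the /api/tags endpoint and the origin from your earlier comment:

curl -i -H "Origin: http://10.0.0.254:5173" http://10.0.0.76:11434/api/tags

If the response does not include an Access-Control-Allow-Origin header matching that origin, the browser will block the client-side call even though curl itself succeeds.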

This should now be fully resolved with the fixes and improvements made for #71.