Ollama not Working
Opened this issue · 9 comments
How am I supposed to use Ollama with this?
No one has replied for 4 days. Have you had any luck getting it to work? I tried for a while today, but I just keep getting this error no matter what I try.
I confirmed my local model, Protocol, Hostname, Port, and Path, but it just won't connect. I'm not sure what else to try.
```
plugin:smart-connections:2476 Error: net::ERR_CONNECTION_REFUSED
    at SimpleURLLoaderWrapper.<anonymous> (node:electron/js2c/browser_init:2:108522)
    at SimpleURLLoaderWrapper.emit (node:events:517:28)
handle_error @ plugin:smart-connections:2476
complete @ plugin:smart-connections:2368
await in complete (async)
new_user_message @ plugin:smart-connections:4689
await in new_user_message (async)
new_user_message @ plugin:smart-connections:12969
handle_send @ plugin:smart-connections:12827
eval @ plugin:smart-connections:2849
```
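As far as I can tell, `ERR_CONNECTION_REFUSED` means nothing is answering on that host/port at all. The sanity check I tried from the terminal (assuming the default Ollama port, 11434):

```sh
# List locally installed models; a JSON response means the Ollama server
# is up and reachable, while "connection refused" means nothing is listening there.
curl http://localhost:11434/api/tags
```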
What configuration are you using? Can you share a screenshot? 🌴
Oh man, Ollama works differently from the other local LLM clients I know. Try one of the OpenAI-compatible-endpoint ones instead.
Did you set up `ollama serve`?
Also, the path should be `api/chat` (at least that solved my issue).
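For reference, here's roughly what worked for me, assuming the default host and port (your plugin field names may differ slightly):

```sh
# Start the Ollama server; by default it listens on http://localhost:11434.
ollama serve

# With Protocol=http, Hostname=localhost, Port=11434, and Path=api/chat,
# the plugin ends up calling http://localhost:11434/api/chat
```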
I am able to hit the Ollama URL using the following command in the terminal; however, I get a "No API key found for custom_local. Cannot retrieve models." error when calling from the chat.
The terminal command that runs and returns output (showing that the Ollama server is working fine):

```sh
curl http://localhost:11434/api/generate -d '{ "model": "llama3.2", "prompt": "How are you today?", "stream": false }'
```
You have to use the chat endpoint http://localhost:11434/api/chat
https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion
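Something like this should work against the chat endpoint (same model as above; note the `messages` array instead of `prompt`):

```sh
# Chat completion via Ollama's /api/chat endpoint.
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2",
  "messages": [
    { "role": "user", "content": "How are you today?" }
  ],
  "stream": false
}'
```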
Smart Chat v2 won't require any configuration for Ollama and will import your available models for easy selection (I just built this part in the past day) 🌴
@brianpetro I don't see the preconfigured Ollama integration with available models that you describe anywhere... and in general the current plugin settings options don't seem to work with Ollama as expected...
Can you give us more info on whether this is solved?
The next version of the Smart Chat makes using Ollama a lot easier, and even loads the models you have installed for easy configuration. You can see a demo in this video https://youtu.be/PUI4gAzLfaM?si=8L9auuPx_ZrDeRIL
This next version is currently in early-release testing with supporters 🌴