brianpetro/obsidian-smart-connections

Using Ollama as a server no longer works as of 3.2.42


I'm using the Custom API (OpenAI format).

I'm using the same configuration, with the same values, in both versions:
[screenshot of the Custom API (OpenAI format) configuration]

After I updated to 3.2.42 it no longer works, and I get this error when I try to send a message:

Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'startsWith')
at SmartChatModelRequestAdapter.to_openai (plugin:smart-connections:11045:20)
at SmartChatModelRequestAdapter.to_platform (plugin:smart-connections:11029:17)
at SmartChatModelCustomAdapter.complete (plugin:smart-connections:10787:33)
at SmartChatModel.invoke_adapter_method (plugin:smart-connections:9947:38)
at SmartChatModel.complete (plugin:smart-connections:10087:23)
at SmartThread.complete (plugin:smart-connections:15033:46)
at async SmartMessage.init (plugin:smart-connections:15632:7)
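
For reference, this kind of TypeError means the adapter calls `startsWith` on a value that is undefined, presumably a model or URL field coming out of the settings. A minimal sketch of the failure pattern (the field names here are hypothetical, not the plugin's actual internals):

```js
// Minimal sketch of the failure pattern, not the plugin's actual code:
// calling String.prototype.startsWith on a config field that was never set.
const settings = { hostname: "my-remote-ollama" }; // hypothetical: no model set

function to_openai(opts) {
  // opts.model is undefined here, so this throws:
  // TypeError: Cannot read properties of undefined (reading 'startsWith')
  return opts.model.startsWith("gpt-") ? "openai" : "custom";
}

// A defensive default avoids the crash:
function to_openai_safe(opts) {
  const model = opts.model ?? "";
  return model.startsWith("gpt-") ? "openai" : "custom";
}

console.log(to_openai_safe(settings)); // "custom" rather than a TypeError
```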

There is a new Ollama option that should work without all that configuration. Is there a reason you're not using it instead of the Custom API?

[screenshot of the new Ollama option in the model settings]

🌴

Yes, but I'm not running Ollama locally, so that option doesn't work for me (though if it had a custom hostname option, I would find that helpful).
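
For anyone else hitting this: Ollama does serve an OpenAI-compatible API under `/v1`, which is why the Custom API (OpenAI format) adapter can talk to a remote instance at all. A rough sketch of the request shape, with a placeholder hostname and model:

```js
// Sketch: chat completion against a remote Ollama server through its
// OpenAI-compatible /v1 endpoint. Hostname and model are placeholders.
async function askRemoteOllama(prompt) {
  const res = await fetch("http://my-remote-ollama:11434/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Ollama ignores the API key, but OpenAI-format clients often send one.
      "Authorization": "Bearer ollama",
    },
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

askRemoteOllama("Hello from Obsidian").then(console.log);
```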

@TheOnlyThing thanks for clarifying that.

This commit, brianpetro/jsbrains@6a488bb, will fix the error from your first post once the next version is released.

🌴