brianpetro/obsidian-smart-connections

WebGPU required even though I'm using a remote Ollama configuration (`Custom Local (OpenAI format)`)

Closed this issue · 3 comments

Hello, I'm using Smart Connections with a remote Ollama instance. I configured it as follows:

Using Custom Local (OpenAI format)

  • model name: qwen2.5:32b-instruct-q4_K_M
  • protocol: http
  • hostname: [my_ip_address]
  • port: 11434
  • path: /v1/chat/completions
  • streaming: activated
  • max input tokens: 8192
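For reference, the settings above correspond to an OpenAI-compatible chat request roughly like the sketch below. This is only an illustration: the IP address is a placeholder for `[my_ip_address]`, and the exact payload Smart Connections sends (e.g. how "max input tokens" is mapped) may differ.

```python
import json
import urllib.request

# Endpoint built from the settings above; 192.0.2.10 is a placeholder IP (RFC 5737).
BASE_URL = "http://192.0.2.10:11434/v1/chat/completions"

payload = {
    "model": "qwen2.5:32b-instruct-q4_K_M",
    "messages": [
        {"role": "user", "content": "According to my notes, what is Pixel's birthday?"}
    ],
    "stream": True,      # "streaming: activated"
    "max_tokens": 8192,  # assumption: "max input tokens" may not map to this field
}

# Build (but do not send) the POST request, since the host is a placeholder.
req = urllib.request.Request(
    BASE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.full_url)
```

Note this only covers chat completions; as the maintainer explains below in the thread, embeddings are handled by a separate model.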

It works fine for general questions:
(screenshot of a successful response)

But as soon as I ask a question that requires access to the notes, I get a popup showing an error:

According to my notes, what is Pixel's birthday?

(screenshot of the error popup)

I don't understand why I get this message, especially since I've configured a remote API endpoint, so why would it ask for WebGPU?

I have pretty much the same error, on Arch Linux.

It's because the embedding happens using a different type of model. Try turning on the legacy transformers option in the settings 🌴

I already had it checked. But I forced a full refresh (and re-imported, using the button just above that option), and everything worked fine! Thanks