brianpetro/obsidian-smart-connections

Context lookup requires tool support in chat models since v.2.3.37

Opened this issue · 4 comments

I use OpenRouter, and before SC 2.3.37 I could flip between models and try them out. Now, nearly every attempt to chat with a model results in an error saying the model does not support tools. When I do manage to get past that warning, I'm faced with this instead:

image

No idea what, if anything, I can do. Any hints on which models I should focus on? I use the SC sidebar as a writing coach to help with fiction.

Thank you.

Hi @EasternPA

Here is a link showing which OpenRouter models are supposed to support tools: https://openrouter.ai/models?supported_parameters=tools
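For anyone who wants to check programmatically rather than browse the list, a small sketch like the one below may help. It assumes OpenRouter's public models endpoint (`GET https://openrouter.ai/api/v1/models`) returns a `{ data: [...] }` array where each model object carries a `supported_parameters` list; verify that shape against OpenRouter's current API docs before relying on it.

```javascript
// Sketch (assumed response shape): list OpenRouter model ids that
// advertise tool support via their `supported_parameters` field.

function supportsTools(model) {
  // A model "supports tools" if its advertised parameters include 'tools'.
  return Array.isArray(model.supported_parameters)
    && model.supported_parameters.includes('tools');
}

async function listToolModels() {
  // Public, unauthenticated endpoint as of this writing.
  const res = await fetch('https://openrouter.ai/api/v1/models');
  const { data } = await res.json();
  return data.filter(supportsTools).map((m) => m.id);
}
```

Note that, as this thread shows, a model advertising `tools` is no guarantee it will actually use them reliably.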

Earlier versions of Smart Chat supported lookup for chat models that didn't use tools, but that was a bit of a hack job that I decided not to carry into the latest version, since tool support seems to be becoming ubiquitous among newly released models.

However, if enough people reply to this post saying they want tool-less model retrieval, I will consider re-adding the feature.

Lastly, while semantic lookup will not work without tools, other methods for including context, like specifying notes directly with @ to bring up the context selection UI, should still work 🌴

Thank you, but meta-llama-3.2-3b-instruct:free (which is on the list) is still returning this, even after updating to 2.3.40.

image

I will try @ and [[ to look for different results.

Edit: Same

image

@EasternPA yeah, in my testing I noticed that some models failed to use tools even when they claimed to support them.

In the second screenshot, the message uses the folder syntax, which still relies on the lookup tool.

Instead, you will have to mention files directly, like this (the screenshot uses the meta-llama-3.2-3b-instruct:free model):
image

It might still be possible to get tools working with that model, but it will take some time to experiment. To start, I would try removing the tool-related property from the request in the chat model adapter for open_router, though that may have unintended side effects. The logic should probably also be model-specific, since the existing adapter works with other OpenRouter models. In case anyone is interested, the relevant adapter file is https://github.com/brianpetro/jsbrains/blob/main/smart-chat-model/adapters/open_router.js
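For anyone who wants to experiment along those lines, here is a rough sketch of what model-specific stripping could look like. To be clear: the function name, the denylist, and even the exact model id are my assumptions, not code from the actual open_router.js adapter; the `tools`/`tool_choice` fields are the tool-related parameters used in OpenAI-style chat-completion requests.

```javascript
// Hypothetical sketch: drop tool parameters from the request body for
// models observed to fail with tools. Names and the model id are
// assumptions — adapt to the real adapter's request shape.

const NO_TOOL_MODELS = new Set([
  'meta-llama/llama-3.2-3b-instruct:free', // id assumed from this thread
]);

function stripToolsForModel(body) {
  if (!NO_TOOL_MODELS.has(body.model)) return body;
  // Copy the body, omitting the OpenAI-style tool fields.
  const { tools, tool_choice, ...rest } = body;
  return rest;
}
```

Keeping the denylist per-model avoids breaking the OpenRouter models where the existing adapter already works.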

🌴

I'm still trying models on the list and running into roadblocks. The first attempt was with ministral-3b (not mistral), and the second with Google Gemini 1.5 Flash 8B. The errors do not appear to be the same.

image

Edit: claude-3.5-haiku worked as expected

Edit 2: I updated a different vault and claude-3.5-haiku did not work there. It only shows lookup and context, with no results.

image