brianpetro/obsidian-smart-connections

Using external model, no smart connections?

Closed this issue · 2 comments

l-mb commented

Hey, sorry if this is obvious or I missed something in the docs.

I've set up Smart Connections 2.2.85 (with Obsidian 1.7.7).

The chat and semantic search work fine with OpenAI.

The connection discovery in Smart Connections itself, though, doesn't work. I can see that the API key I provided for it is being used, and it does report "100% embedded".

In OpenAI's usage reporting, I can see that text-embedding-3-large is indeed being used and has processed tons of tokens.

But the Smart Connections view pane remains empty (or sometimes shows leftover results from the previous search).

It seems to work only with local models. Are external models not fully supported, even though they can be configured?

brianpetro commented

There should be some error logs in the console when things aren't working as expected. If you screenshot those, I can take a look and see what might be happening.

Additionally, a new major release is in the works and is currently being tested with supporters via the early-release version. This new version fixes a lot of existing issues, and has already been tested as working with OpenAI embeddings 🌴

l-mb commented

I did not see any errors in the console; I should have mentioned that.

To be absolutely sure I didn't miss any during setup, I switched the model back to the local one, hit "Prune", and switched back over to text-embedding-3-large so I could share the logs.

I swear, I did that at least three times (plus Clear All & Re-import, and even resetting the plugin) prior to opening the issue.

This time, it didn't just complete without error again, but ... connections are showing up!

Weird glitch; perhaps there's a timing issue somewhere?
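If it is a race, I could imagine something along these lines (a purely hypothetical sketch, not the plugin's actual code): the connections pane reads the embedding index before the re-embed finishes, so it renders from an empty or stale index. Guarding the lookup on the in-flight rebuild would avoid that; all names here (ConnectionsIndex, rebuild, nearest) are made up for illustration.

```ts
// Hypothetical sketch of guarding a connections lookup on an in-flight
// re-embed, so the view never renders from an empty or half-built index.
type Embedded = { path: string; vector: number[] };

class ConnectionsIndex {
  private ready: Promise<void> | null = null;
  private items: Embedded[] = [];

  // A model switch or prune kicks off re-embedding; remember the promise.
  rebuild(embedAll: () => Promise<Embedded[]>): void {
    this.ready = embedAll().then((items) => {
      this.items = items;
    });
  }

  // The view awaits readiness instead of reading a possibly-empty index.
  async nearest(query: number[], k = 10): Promise<Embedded[]> {
    if (this.ready) await this.ready;
    return this.items
      .map((item) => ({ item, score: cosine(query, item.vector) }))
      .sort((a, b) => b.score - a.score)
      .slice(0, k)
      .map(({ item }) => item);
  }
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}
```

With a guard like that, the first refresh after switching models would simply wait for the rebuild to finish instead of showing an empty pane.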

Hopefully next time I'll have better logs, but even more hopefully, I'll never see the glitch again :-)

Thanks for the fast response! Back to exploring!