"Not implemented" error shown when using a local LLM for RAG prompts
Closed this issue · 3 comments
I am currently using LM Studio, which exposes an OpenAI-compatible API, as my backend. This works fine with my settings when the prompt is not note-related.
P.S. The online API works great; only the local one doesn't. Logs attached below.
BTW, every time I start up Obsidian, the importing process takes much longer than before, and there is a flash of a black screen. I don't know if this is another bug, but it doesn't affect much.
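For context, the local setup just points an OpenAI-format chat request at LM Studio's localhost server. A minimal sketch of the request shape, assuming the default port and a placeholder model name (both are assumptions about my setup, not the plugin's actual code):

```python
import json
from urllib import request

# Assumed local endpoint; LM Studio serves an OpenAI-compatible API here
# by default. The port and model name are placeholders for my setup.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt, model="local-model"):
    """Build an OpenAI-format chat completion payload for a local backend."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def send_chat(payload):
    """POST the payload to the local server (requires LM Studio running)."""
    req = request.Request(
        BASE_URL + "/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request("Summarize my note on RAG.")
```

A plain chat request like this works; it's only the note-related (RAG) prompts that hit the error.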
Update
It works after deleting the index files and reinstalling the plugin. Now it's working perfectly again.
In addition to the initial bug report: it wasn't just the local LLM that didn't work; the API couldn't work either. Any possible reasons? I will update again if this reproduces.
Thanks for the update 🌴
Hi,
I think I found the reason; maybe you can check it later when someone hits a similar problem. I changed my file and folder names after the first embedding, and the flash of a black screen when importing notes came back, but this time the chat window couldn't load. So I repeated the deleting and re-indexing. While deleting the vector storage, I found that the embeddings for the previous titles were still there. So those leftovers are most likely the reason.
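If the plugin keys embeddings by note path, a rename would indeed strand the old entries. A sketch of the kind of pruning pass that would clear them, where the index layout (path → vector) is my assumption, not the plugin's actual storage format:

```python
def prune_stale_embeddings(index, current_paths):
    """Drop index entries whose note path no longer exists in the vault.

    `index` maps a note path to its embedding vector (an assumed layout);
    entries still keyed by a pre-rename path are the "leftovers" above.
    """
    stale = [path for path in index if path not in current_paths]
    for path in stale:
        del index[path]
    return stale  # report what was removed, e.g. for a log line

# Example: "old-name.md" was renamed to "new-name.md" after embedding.
index = {"old-name.md": [0.1, 0.2], "notes/keep.md": [0.3, 0.4]}
removed = prune_stale_embeddings(index, {"new-name.md", "notes/keep.md"})
```

Running a pass like this on startup (or on Obsidian's rename event) would avoid the manual delete-and-reindex workaround.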