DropbaseHQ/dropbase

Local LLM support (e.g. Ollama)

sammcj opened this issue · 1 comment

The project looks neat, but I can't see a way to configure local LLM servers such as Ollama.

Is there somewhere you can set an OpenAI-compatible API endpoint?
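
For context, most tools that support custom endpoints just let you override the OpenAI client's base URL. A minimal sketch of what that usually looks like against a local Ollama server (the port is Ollama's default, and the model name is only an example of something pulled locally; nothing here is Dropbase-specific):

```python
# Minimal sketch: pointing the OpenAI Python client at a local Ollama
# server via its OpenAI-compatible /v1 endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",  # required by the client, but ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3",  # example model; use whatever you have pulled locally
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

If the endpoint and API key were configurable, swapping in a local server like this would presumably be enough.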

We have not integrated Ollama yet, but it's on our roadmap. We'll ping you once local LLM support is added.