aurelio-labs/semantic-router

Local embedding on Mac

netandreus opened this issue · 0 comments

If I have already deployed Ollama with an embedding model — the Ollama server is running and the model is already loaded in memory — how can I use it? I mean, how can I point semantic-router at that Ollama server?
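One possible approach — a sketch, not an official semantic-router API — is to call the running Ollama server's REST embeddings endpoint directly and wrap that in a custom encoder. The model name `nomic-embed-text`, the helper names, and the default host/port below are assumptions for illustration; Ollama's default listen address is `http://localhost:11434`.

```python
# Sketch: embedding texts via a locally running Ollama server.
# Assumptions (not from the repo): Ollama is listening on its default
# port 11434 and an embedding model (e.g. "nomic-embed-text") is pulled.
import json
import urllib.request
from typing import List

OLLAMA_URL = "http://localhost:11434/api/embeddings"  # default Ollama endpoint

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON payload Ollama's /api/embeddings endpoint expects."""
    return json.dumps({"model": model, "prompt": prompt}).encode("utf-8")

def ollama_embed(texts: List[str], model: str = "nomic-embed-text") -> List[List[float]]:
    """Return one embedding vector per input text from the local server."""
    vectors = []
    for text in texts:
        req = urllib.request.Request(
            OLLAMA_URL,
            data=build_request(model, text),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            vectors.append(json.loads(resp.read())["embedding"])
    return vectors
```

To plug this into semantic-router, you could wrap `ollama_embed` in a class exposing the call interface the library's encoder base class expects (a method taking a list of documents and returning a list of vectors) — check the library's encoder source for the exact signature, since it may change between versions.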