redis/agent-memory-server

Ollama Support


How can I enable Ollama models with the current implementation?

The server doesn't support Ollama models directly yet, although it's a worthy goal. The best option right now is to run LiteLLM, which can expose Ollama models behind an OpenAI-compatible interface: https://docs.litellm.ai/docs/providers/ollama
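
As a minimal sketch (not part of the server), the LiteLLM proxy could be set up roughly like this. It assumes a local Ollama instance on its default port 11434 with a model pulled as `llama3`; swap in whatever model you actually run:

```bash
# Sketch: expose a local Ollama model through LiteLLM's OpenAI-compatible proxy.
# Assumes Ollama is running on http://localhost:11434 with a model pulled as "llama3".
pip install 'litellm[proxy]'

cat > litellm-config.yaml <<'EOF'
model_list:
  - model_name: llama3              # name exposed on the OpenAI-compatible API
    litellm_params:
      model: ollama/llama3          # LiteLLM's Ollama provider route
      api_base: http://localhost:11434
EOF

litellm --config litellm-config.yaml --port 4000
# The proxy now serves an OpenAI-compatible API at http://localhost:4000
```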

Then point the server's OpenAI API base setting at the LiteLLM proxy: https://redis.github.io/agent-memory-server/configuration/#ai-model-configuration
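
A hedged example of what that could look like via environment variables, assuming the proxy above is running on port 4000; the exact setting names (e.g. `OPENAI_API_BASE`, `GENERATION_MODEL`) are assumptions here and should be checked against the configuration docs linked above:

```bash
# Sketch only -- verify the setting names against the configuration docs linked above.
export OPENAI_API_BASE=http://localhost:4000   # send OpenAI-compatible calls to the LiteLLM proxy
export OPENAI_API_KEY=placeholder              # the proxy accepts any key unless you configure auth on it
export GENERATION_MODEL=llama3                 # model name exposed by the proxy config above
```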