ollama-anywhere

Access an Ollama inference server running on your local machine from anywhere. Built with Next.js, LangChain JS (LCEL), and ngrok.
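A minimal sketch of the pattern, assuming the local Ollama server (default port 11434) is exposed through an ngrok tunnel (e.g. `ngrok http 11434`): an LCEL chain pipes a prompt into `ChatOllama` pointed at the tunnel's forwarding URL instead of `localhost`. The tunnel URL and model name below are placeholders, and `@langchain/ollama` is assumed as the package providing `ChatOllama`.

```ts
import { ChatOllama } from "@langchain/ollama";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";

// Point the model at the ngrok tunnel rather than http://localhost:11434.
// The URL is a placeholder; use the forwarding URL ngrok prints on startup.
const model = new ChatOllama({
  baseUrl: "https://your-tunnel.ngrok-free.app",
  model: "llama3", // any model already pulled on the Ollama host
});

// LCEL composition: prompt -> model -> string parser, chained with .pipe().
const chain = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  ["human", "{question}"],
])
  .pipe(model)
  .pipe(new StringOutputParser());

const answer = await chain.invoke({ question: "Why is the sky blue?" });
console.log(answer);
```

Because the chain only needs the tunnel URL, the same code runs unchanged from a Next.js API route, a server action, or any other machine with network access to the tunnel.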

Primary language: TypeScript
