ollama-webui-management
There are 2 repositories under the ollama-webui-management topic.
vam876/LocalAPI.AI
LocalAPI.AI is a local AI management tool for Ollama, offering Web UI management and compatibility with vLLM, LM Studio, llama.cpp, Mozilla-Llamafile, Jan AI, Cortex API, Local-LLM, LiteLLM, GPT4All, and more.
MillionthOdin16/Ollama-Web-Manager
Manage your Ollama server and models from a simple web page.