valiantlynx/ollama-docker
Welcome to the Ollama Docker Compose Setup! This project simplifies the deployment of Ollama using Docker Compose, making it easy to run Ollama with all its dependencies in a containerized environment.
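For orientation, a setup like this typically defines an `ollama` service alongside an Open WebUI front end on the same Compose network. The sketch below is illustrative only and is not the repository's exact `docker-compose.yml`; it assumes the public `ollama/ollama` and `ghcr.io/open-webui/open-webui:main` images with their default ports (11434 and 8080).

```yaml
# Minimal illustrative docker-compose.yml (assumption: not the repo's exact file).
# Uses the public ollama/ollama and Open WebUI images with their default ports.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"               # Ollama API
    volumes:
      - ollama_data:/root/.ollama   # persist downloaded models

  ollama-webui:
    image: ghcr.io/open-webui/open-webui:main
    depends_on:
      - ollama
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434   # reach Ollama over the Compose network
    ports:
      - "8080:8080"                 # web UI at http://localhost:8080
    volumes:
      - webui_data:/app/backend/data

volumes:
  ollama_data:
  webui_data:
```

With a file along these lines in place, `docker compose up -d` starts both containers, and a model can be pulled with `docker compose exec ollama ollama pull llama3.2`.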
Issues
- Not working with llama3.2 (#22, opened by HamzaAI9D, 3 comments)
- Changed password and username of webui account, stuck on "Signing in to valiantlynx AI (Open WebUI)" (#19, opened by WarmWelcome, 1 comment)
- [BUG] - USER_AGENT environment variable not set, consider setting it to identify your requests (#18, opened by pierrejo, 0 comments)
- How can I change the ollama-webui to run from a local server instead of localhost so I can access it from my laptop? (#17, opened by mcgaleti, 2 comments)
- Ollama-webui crashes on start (#13, opened by jereskuta, 3 comments)
- no work when changing ollama port (#6, opened by choigawoon, 3 comments)
- deploy to production (#5, opened by valiantlynx, 0 comments)
- make a docker hub image (#3, opened by valiantlynx, 3 comments)