ollama_experiments

Minimal webapp for experimenting with a locally running Ollama container.


Run LLM models locally in an Ollama Docker container with a custom frontend.

Ollama API docs: https://github.com/ollama/ollama/blob/main/docs/api.md

Set up and run the Ollama Docker container

Run the official Docker image (see https://ollama.com/blog/ollama-is-now-available-as-an-official-docker-image):

docker run -d -v ollama_storage:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

and use docker start ollama to restart the container afterwards.
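Once the container is up, the API listens on port 11434, and a GET on the root URL returns a plain-text "Ollama is running" message, which makes for a quick health check. A minimal TypeScript sketch (Node 18+ for built-in fetch; the function name is just illustrative):

// Quick health check: the Ollama server answers GET / with
// "Ollama is running" when it is up (assumes the port mapping above).
async function checkOllama(baseUrl = "http://localhost:11434"): Promise<boolean> {
  try {
    const res = await fetch(baseUrl);
    console.log(await res.text()); // "Ollama is running"
    return res.ok;
  } catch {
    return false; // container not started or port not mapped
  }
}

checkOllama().then((up) => console.log(up ? "server up" : "server down"));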

Then exec into the container and pull a model such as llama3.2:

docker exec ollama bash -c "ollama pull llama3.2"
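The same pull can be done through the API without exec'ing into the container, via the /api/pull endpoint described in the API docs linked above. A sketch (current docs use a model field; older versions used name; top-level await assumes an ES module):

// Pull a model via the API instead of docker exec; with stream: false
// the endpoint returns a single JSON object once the download finishes.
const res = await fetch("http://localhost:11434/api/pull", {
  method: "POST",
  body: JSON.stringify({ model: "llama3.2", stream: false }),
});
console.log(await res.json()); // { status: "success" } on completion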

To make a request:

curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the colour of the sea blue?"
}'
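Since this repo pairs the container with an HTML frontend, the same call can be made from browser JavaScript. By default /api/generate streams newline-delimited JSON chunks, each carrying part of the reply in its response field; a sketch of reading that stream (function names are illustrative, not from the repo):

// Stream a completion from /api/generate; each newline-delimited JSON
// chunk carries part of the reply in its "response" field.
async function generate(prompt: string, onToken: (t: string) => void) {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    body: JSON.stringify({ model: "llama3.2", prompt }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffered = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? ""; // keep any incomplete trailing line
    for (const line of lines) {
      if (line.trim()) onToken(JSON.parse(line).response);
    }
  }
}

// Usage (in the webapp you would append tokens to the DOM instead):
generate("Why is the colour of the sea blue?", (t) => console.log(t));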

Other useful commands

Get the Ollama logs from the container:

docker logs ollama

To run the latest Chroma container on port 8000:

docker run -d -p 8000:8000 -v chroma-data:/chromadb/data chromadb/chroma
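To talk to that container from code, the chromadb npm client can point at port 8000. A sketch, assuming the chromadb package and its default embedding function; the collection name and document are just examples:

import { ChromaClient } from "chromadb";

// Connect to the Chroma container started above (default port 8000).
const client = new ChromaClient({ path: "http://localhost:8000" });

// Create or fetch a collection and add one example document; Chroma
// embeds it with its default embedding function unless one is supplied.
const collection = await client.getOrCreateCollection({ name: "experiments" });
await collection.add({
  ids: ["doc1"],
  documents: ["Ollama runs LLM models locally behind an HTTP API."],
});
console.log(await collection.count()); // 1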

TODOs