ollama-runner

Some scripts and notes to help me run Ollama locally.

How to use Ollama

Stop and disable the system-wide service, because I want to run Ollama locally from this repo instead:

sudo systemctl stop ollama
sudo systemctl disable ollama

Then start the server, binding it to all interfaces so containers and other machines can reach it:

OLLAMA_HOST=0.0.0.0:11434 ./bin/ollama serve
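A quick sanity check once the server is up (my own addition; this uses the standard Ollama REST API version endpoint):

```shell
# Ask the local Ollama server for its version; a JSON reply means it is up.
curl -s http://localhost:11434/api/version
```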

To connect to this from anything-llm running in Docker, use this base URL: http://host.docker.internal:11434
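To verify a container can actually reach the host-side server, a hypothetical one-off check (assumes Docker Desktop, where host.docker.internal resolves automatically):

```shell
# From inside a throwaway container, probe Ollama on the Docker host.
docker run --rm curlimages/curl -s http://host.docker.internal:11434/api/version
```

On plain Linux Docker you may need to add `--add-host=host.docker.internal:host-gateway`, since that hostname is only built in on Docker Desktop.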

Create the model from the Modelfile (unfortunately, yes, it does copy the weights over 😢):

./bin/ollama create thebloke-llama2-13b-gguf -f modelfiles/thebloke-llama2-13b-gguf
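For reference, a Modelfile like the one above can be as small as a single FROM line pointing at the GGUF weights. This is only a sketch; the paths and parameter are made up, and the real file lives in modelfiles/:

```dockerfile
# Sketch of modelfiles/thebloke-llama2-13b-gguf (hypothetical path to the weights)
FROM ./models/llama-2-13b.Q4_K_M.gguf

# Optional: a sampling default
PARAMETER temperature 0.7
```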

Now run the model:

./bin/ollama run thebloke-llama2-13b-gguf
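Besides the interactive REPL, the server also answers one-off requests over its REST API. A sketch using the standard /api/generate endpoint; the prompt is just an example:

```shell
# Non-streaming completion request against the model created above.
curl -s http://localhost:11434/api/generate -d '{
  "model": "thebloke-llama2-13b-gguf",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```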

To list all locally-installed models:

./bin/ollama list
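The same list is also available over HTTP, which is handy from scripts (standard /api/tags endpoint; assumes the server is running as above):

```shell
# JSON listing of locally-installed models.
curl -s http://localhost:11434/api/tags
```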