These are the services I self-host on my local network:
- paperless - document manager
- uptime-kuma - health monitor for my other services
- Nginx Proxy Manager - reverse proxy that routes hostnames to the different services
- Starbase 80 - a nice dashboard
- Whisper ASR - API server exposing the Whisper speech-to-text model
- Ollama - API server for running open-source LLMs
Automatic backups are handled with docker-volume-backup.
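If docker-volume-backup is the offen/docker-volume-backup image, a backup can also be triggered manually on demand; a sketch, where `backup` is a placeholder for whatever the backup container is named in the compose file:

```shell
# Trigger an ad-hoc backup run inside the backup container
# ("backup" is a placeholder for the actual container name).
docker exec backup backup
```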
To copy data from one volume to another (stop any containers using the volumes first), use the following command:
docker run --rm -it -v uptime-kuma:/from -v uptime-kuma_uptime-kuma:/to alpine ash -c "cd /from ; cp -av . /to"
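To sanity-check the copy afterwards, a throwaway container can list the destination volume; a minimal sketch reusing the volume name from the command above:

```shell
# List the contents of the destination volume to confirm the copy worked.
docker run --rm -v uptime-kuma_uptime-kuma:/to alpine ls -la /to
```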
To update an image, we must first delete the local image (or hard-code an explicit image version in the compose file):
docker-compose down --rmi all
docker-compose up -d
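An alternative that avoids deleting images is the standard pull-then-recreate workflow (not specific to this setup):

```shell
# Fetch newer images for every service in the compose file,
# then recreate only the containers whose image changed.
docker-compose pull
docker-compose up -d
```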
To load the TS script as a launchd agent:
cp gg.neil.ts.plist ~/Library/LaunchAgents/gg.neil.ts.plist
launchctl load ~/Library/LaunchAgents/gg.neil.ts.plist
launchctl list | grep gg.neil
The last command should list the agent, confirming it is loaded and running.
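If the plist is later edited, the agent has to be unloaded and loaded again for the change to take effect (same launchctl commands, run in reverse then forward):

```shell
# Reload the agent after editing the plist.
launchctl unload ~/Library/LaunchAgents/gg.neil.ts.plist
launchctl load ~/Library/LaunchAgents/gg.neil.ts.plist
```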
After installing Ollama, download your preferred model (currently I use mistral, which fits in my 8 GB of VRAM):
ollama pull mistral
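A quick way to confirm the model downloaded and runs (the prompt text is arbitrary):

```shell
# Show locally available models, then run a one-off prompt.
ollama list
ollama run mistral "Say hello in one sentence."
```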
Also configure remote access to the server (safe to do only because my machine is not public-facing):
launchctl setenv OLLAMA_HOST "0.0.0.0"
launchctl setenv OLLAMA_ORIGINS "*"
Then restart Ollama so the new environment variables take effect.
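To verify remote access works, hit the API from another machine on the LAN. Ollama listens on port 11434 by default; `server.local` below is a placeholder for the actual hostname:

```shell
# Should return a JSON list of installed models if remote access is working.
curl http://server.local:11434/api/tags
```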