# A simple UI for chatting to Mistral AI
- Install Mistral with Ollama. Instructions here
- Run Mistral with Ollama. To allow the app to run on your local network, run the following command (replace the IP address with your own local IP):

      OLLAMA_ORIGINS=http://192.168.1.3:* OLLAMA_HOST=192.168.1.3:11435 ollama serve
  (see the Ollama docs)
- Edit `API_BASE_URL` in `index.html` to point to the URL that Ollama is running on.
- Open the `index.html` file in your browser and type your question. Alternatively, run the app with Serve.
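The steps above can be sanity-checked from the command line. This is a sketch, not part of the app: the IP and port match the example command above and should be adjusted to your own; `/api/tags` and `/api/generate` are Ollama's standard REST endpoints, and `npx serve` assumes the npm `serve` package.

```shell
# Base URL for the Ollama server (adjust to your own local IP and port).
OLLAMA_URL=http://192.168.1.3:11435

# Verify the server is reachable and that Mistral is installed
# (returns a JSON list of locally available models):
curl "$OLLAMA_URL/api/tags"

# Send a one-off prompt directly to the API:
curl "$OLLAMA_URL/api/generate" \
  -d '{"model": "mistral", "prompt": "Hello, Mistral!", "stream": false}'

# Serve the app over HTTP instead of opening index.html directly
# (uses the npm "serve" package):
npx serve .
```

If the `curl` calls succeed but the app's requests fail in the browser, check that `OLLAMA_ORIGINS` covers the origin the page is served from.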