# chat-ui-mistral

A simple HTML UI for chatting with Mistral AI.

  1. Install Mistral with Ollama. Instructions here
  2. Run Mistral with Ollama. To allow the app to connect from another device on your local network, start the server with the following command (replace the IP address with your own local IP): `OLLAMA_ORIGINS=http://192.168.1.3:* OLLAMA_HOST=192.168.1.3:11435 ollama serve` (see docs)
  3. Edit `API_BASE_URL` in `index.html` to point to the URL that Ollama is running on
  4. Open the `index.html` file in your browser and type your question. Alternatively, run the app with Serve
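As a rough sketch of how a UI like this talks to Ollama, the snippet below builds a request for Ollama's `/api/generate` endpoint and sends a question. The `API_BASE_URL` value and the `buildGenerateRequest`/`ask` helper names are assumptions for illustration; the base URL should match whatever you configured in step 3.

```javascript
// Base URL of the local Ollama server; assumed to match API_BASE_URL
// in index.html and the host/port used in step 2.
const API_BASE_URL = "http://192.168.1.3:11435";

// Build the URL and fetch options for Ollama's /api/generate endpoint.
function buildGenerateRequest(prompt) {
  return {
    url: `${API_BASE_URL}/api/generate`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false asks Ollama for one JSON response instead of
      // a stream of newline-delimited chunks.
      body: JSON.stringify({ model: "mistral", prompt, stream: false }),
    },
  };
}

// Send a question and return the model's reply.
// (Requires the Ollama server from step 2 to be running.)
async function ask(question) {
  const { url, options } = buildGenerateRequest(question);
  const res = await fetch(url, options);
  const data = await res.json();
  return data.response; // Ollama puts the completion in `response`
}
```

With the server running, `ask("Why is the sky blue?")` resolves to Mistral's answer as a plain string.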