This is a basic Spring AI project working with the Ollama Mistral model.

Steps to install Ollama locally:

1. Go to the website and download it: https://ollama.com/download
2. Open a terminal and run: `ollama run mistral`. This downloads the Mistral model to your local machine (and starts an interactive session).
3. To check that Mistral is installed on your system, run: `ollama list`. This shows the list of models available locally.

Spring AI reference doc: https://docs.spring.io/spring-ai/reference/api/chatmodel.html
Spring AI Ollama reference doc: https://docs.spring.io/spring-ai/reference/api/chat/ollama-chat.html
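Once the model has been pulled, a Spring Boot application can point the Ollama chat model at it through `application.properties`. A minimal sketch, assuming the Spring AI Ollama starter is on the classpath and Ollama is listening on its default port (11434):

```properties
# Base URL of the local Ollama server (11434 is Ollama's default port)
spring.ai.ollama.base-url=http://localhost:11434
# Use the locally pulled mistral model for chat requests
spring.ai.ollama.chat.options.model=mistral
```

If Ollama runs elsewhere (e.g. in a container), adjust `base-url` accordingly.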
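The ChatModel reference linked above describes how to call the model from code. As a rough sketch of what that looks like in this kind of project, here is an illustrative REST endpoint using the `ChatClient` fluent API; it assumes Spring AI auto-configures a `ChatClient.Builder` from the Ollama starter, and the class and endpoint names are hypothetical, not taken from the original project:

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
class ChatController {

    private final ChatClient chatClient;

    // Spring AI auto-configures a ChatClient.Builder backed by the
    // Ollama chat model declared in application.properties
    ChatController(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    @GetMapping("/chat")
    String chat(@RequestParam String message) {
        // Sends the user prompt to the local mistral model and returns its reply
        return chatClient.prompt()
                .user(message)
                .call()
                .content();
    }
}
```

With the application running, `curl "http://localhost:8080/chat?message=hello"` would return Mistral's response.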