Run large language models locally with Ollama, LangChain, and Streamlit.

Mistral 7B is used as the default model. You can switch to any other supported model; see the Ollama model library.
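For context on what "changing the model" means here: Ollama serves pulled models over a local REST API (by default at http://localhost:11434), and selecting a model is just a matter of the model tag sent in each request. A minimal sketch of the request body for Ollama's `POST /api/generate` endpoint, assuming you have pulled the tags you name (the helper function and the `llama3` tag are illustrative, not part of this repo):

```python
import json

def build_generate_payload(prompt: str, model: str = "mistral") -> dict:
    """Build the JSON body for Ollama's POST /api/generate endpoint.

    `model` is any tag pulled locally, e.g. "mistral" or "llama3".
    """
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for the full response in one JSON object
    }

# Same request shape, different model tag:
default_body = json.dumps(build_generate_payload("Hello!"))
swapped_body = json.dumps(build_generate_payload("Hello!", model="llama3"))
print(default_body)
```

Nothing else in the setup changes when you swap models; only the tag in the payload does.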
## Install Ollama

Install Ollama from the official site (https://ollama.com).
## Download Mistral 7B

    ollama pull mistral
## Clone this project

    git clone https://github.com/agnanp/Ollama-Streamlit.git
## Go to the project directory

    cd Ollama-Streamlit
## Install dependencies

    pip3 install -r requirements.txt
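If you prefer to keep the project's packages isolated, a standard (and optional) Python virtual-environment setup looks like this; it is ordinary Python practice, not something this repository requires:

```shell
# Create and activate a virtual environment (optional but recommended)
python3 -m venv .venv
source .venv/bin/activate    # on Windows: .venv\Scripts\activate

# Then install the dependencies inside it:
# pip3 install -r requirements.txt
```

Once activated, `pip3 install` affects only this environment rather than your system Python.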
## Start the Streamlit app

    streamlit run main.py

Streamlit prints a local URL (http://localhost:8501 by default); open it in your browser to chat with the model.