llm_chat

WebUI for chatting with LLMs.

Chat with any LLM through a web UI backed by Ollama.

Install

# clone repository
$ git clone https://github.com/kirillsaidov/llm_chat.git
$ cd llm_chat/

# install python dependencies
$ python3 -m venv venv && source ./venv/bin/activate
$ pip install -r requirements.txt

Run

# run in the foreground
$ streamlit run chatui/chatui.py --server.port=8501

# launch in background
$ nohup streamlit run chatui/chatui.py --server.port=8501 > streamlit.log 2>&1 &

Output:

You can now view your Streamlit app in your browser.

Local URL: http://localhost:8501
Network URL: http://xxx.xxx.xxx.xxx:8501
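Since the app chats with models through Ollama, the UI ultimately talks to a local Ollama server. The sketch below shows one way such a call can look, assuming Ollama's default REST endpoint (`http://localhost:11434/api/chat`) and an example model name (`llama3`); neither is mandated by this repo, and the actual request logic lives in `chatui/chatui.py`.

```python
# Minimal sketch: calling a local Ollama server directly.
# Assumptions: Ollama is installed and running on its default port (11434),
# and a model named "llama3" has been pulled; adjust both as needed.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default chat endpoint

def build_chat_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/chat."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(model: str, prompt: str) -> str:
    """Send one prompt and return the assistant's reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_chat_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

if __name__ == "__main__":
    # Requires a running Ollama server; prints the model's reply.
    print(chat("llama3", "Hello!"))
```

A streaming variant would set `"stream": True` and read the response line by line, which is what makes token-by-token display in a chat UI possible.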

LICENSE

This project is released under the Unlicense: you can do whatever you want with the files in this repository.