A web UI for chatting with LLMs.
# clone repository
$ git clone https://github.com/kirillsaidov/llm_chat.git
$ cd llm_chat/
# install python dependencies
$ python3 -m venv venv && source ./venv/bin/activate
$ pip install -r requirements.txt
# run in the foreground
$ streamlit run chatui/chatui.py --server.port=8501
# launch in background
$ nohup streamlit run chatui/chatui.py --server.port=8501 > streamlit.log 2>&1 &
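When launching in the background, it helps to record the PID so the app can be stopped cleanly later. A minimal sketch of the pattern, where `sleep 300` stands in for the `streamlit run` command above and the `streamlit.pid` filename is an assumption, not part of the repo:

```shell
# Launch in the background and save the PID for later shutdown.
# ('sleep 300' stands in here for: streamlit run chatui/chatui.py --server.port=8501)
nohup sleep 300 > streamlit.log 2>&1 &
echo $! > streamlit.pid

# Later, stop the background app using the saved PID.
kill "$(cat streamlit.pid)"
rm -f streamlit.pid
```

Without a saved PID, `pkill -f "streamlit run"` is a common alternative, though it matches any process whose command line contains that string.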
Output:
You can now view your Streamlit app in your browser.
Local URL: http://localhost:8501
Network URL: http://xxx.xxx.xxx.xxx:8501
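Once the app is up, a quick HTTP check confirms the server is answering, which is handy after a background launch where there is no terminal output to watch. A sketch assuming the default port 8501 used above:

```shell
# Print the HTTP status code returned by the local Streamlit server.
# A running app should answer 200 on its root URL.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8501/
```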
Released under the Unlicense: you may do whatever you want with the repository files.