rag-chat-with-pdf-local-llm


Simple demo for chatting with your PDF documents using RAG - optionally pointing the implementation at a local LLM.

Installation

pip install -r requirements.txt

pip install -U pydantic==1.10.9

(The explicit pin keeps pydantic on a 1.x release; pydantic 2.x introduced breaking API changes that not all of the dependencies handle.)

Run it

streamlit run chat.py
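
Streamlit will print a local URL to the terminal (typically http://localhost:8501); open it in your browser to start chatting.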

Running a local LLM

The easiest way to run a local LLM is to use LM Studio: https://lmstudio.ai/
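
LM Studio can expose whatever model you load through an OpenAI-compatible local server (http://localhost:1234/v1 by default). As a rough sketch of what "pointing the RAG implementation to a local LLM" looks like - this is not this repo's chat.py, and the model name is a placeholder - you can talk to that server with the standard openai client:

```python
# Minimal sketch: querying LM Studio's OpenAI-compatible local server.
# Assumes the LM Studio server is running on its default port (1234)
# and that a model is already loaded; "local-model" is a placeholder.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's default endpoint
    api_key="lm-studio",                  # any non-empty string works locally
)

response = client.chat.completions.create(
    model="local-model",  # LM Studio routes this to the loaded model
    messages=[
        {"role": "system", "content": "You answer questions about a PDF."},
        {"role": "user", "content": "What is this document about?"},
    ],
)
print(response.choices[0].message.content)
```

A RAG app would build the prompt from retrieved PDF chunks instead of a hard-coded question, but the connection to the local model works the same way.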

The LLM I use in my conference talks (works fine on an MBP M1 Max with 64GB RAM):