local_pdf_rag

A demo Jupyter Notebook showcasing a simple local RAG (Retrieval Augmented Generation) pipeline to chat with your PDFs.


Chat with PDF locally with Ollama demo 🚀

If you have any questions or suggestions, please feel free to create an issue in this repository; I will do my best to respond.

If you like this repo, give it a star :) Shoutout to the repository this was forked from — please check out the original: https://github.com/tonykipkemboi/ollama_pdf_rag

Running the Streamlit application

  1. Clone the repository

  2. Install dependencies: Run the following to install the required packages

    pip install -r requirements.txt
  3. Pull Ollama models: Get the nomic-embed-text embedding model and a chat model

    ollama pull nomic-embed-text
    ollama pull llama3.1 # or any other model you want to test
  4. Launch the app: Run the following to start the Streamlit interface on localhost

    streamlit run streamlit_app.py
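The retrieval step at the heart of a RAG pipeline like this one can be sketched in a few lines. This is a minimal, runnable illustration only: the toy bag-of-words `embed` function below is a stand-in assumption, whereas the actual app embeds PDF chunks with nomic-embed-text through Ollama and answers with the pulled chat model.

```python
# Minimal sketch of retrieval augmented generation: rank document chunks
# by similarity to the question, then feed the best match into a prompt.
from collections import Counter
import math

def embed(text):
    """Toy embedding: bag-of-words term counts (stand-in for nomic-embed-text)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(chunks, question, k=1):
    """Return the k chunks most similar to the question."""
    q = embed(question)
    ranked = sorted(chunks, key=lambda c: cosine(embed(c), q), reverse=True)
    return ranked[:k]

# Hypothetical PDF chunks for illustration.
chunks = [
    "Ollama runs large language models locally.",
    "Streamlit builds interactive web apps in Python.",
    "Retrieval augmented generation grounds answers in documents.",
]
context = retrieve(chunks, "How does retrieval augmented generation work?")[0]
prompt = f"Answer using this context:\n{context}\n\nQuestion: ..."
print(context)
```

In the real pipeline the retrieved context is sent to the local LLM (e.g. llama3.1) along with the user's question, which is what keeps the answers grounded in the uploaded PDF.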