easy-local-rag

SuperEasy 100% Local RAG with Ollama

YouTube Tutorial

(Video walkthrough available on the YouTube channel linked below.)

Setup

  1. git clone https://github.com/AllAboutAI-YT/easy-local-rag.git
  2. cd easy-local-rag
  3. pip install -r requirements.txt
  4. Install Ollama (https://ollama.com/download)
  5. Run upload.py to add your documents (.pdf, .txt, or .json); see the ingestion sketch below
  6. Run localrag.py to start chatting with your documents
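
For orientation, here is a minimal, hypothetical sketch of what a document-ingestion step like upload.py might do: read a plain-text file, split it into chunks, and append them to a local vault file for later embedding. The file names, chunk size, and sentence-based splitting are illustrative assumptions, not the repository's exact implementation.

```python
# Hypothetical ingestion sketch: split a .txt file into chunks and append them
# to a local "vault.txt" that a RAG script can embed later. The vault name,
# chunk size, and splitting strategy are assumptions for illustration.

def chunk_text(text: str, max_chars: int = 1000) -> list[str]:
    """Split text into chunks of roughly max_chars, breaking on sentence ends."""
    chunks, current = [], ""
    for sentence in text.replace("\n", " ").split(". "):
        if len(current) + len(sentence) < max_chars:
            current += sentence + ". "
        else:
            chunks.append(current.strip())
            current = sentence + ". "
    if current.strip():
        chunks.append(current.strip())
    return chunks

def ingest_txt(path: str, vault_path: str = "vault.txt") -> None:
    """Append chunks from a plain-text file to the local vault, one per line."""
    with open(path, encoding="utf-8") as f:
        text = f.read()
    with open(vault_path, "a", encoding="utf-8") as vault:
        for chunk in chunk_text(text):
            vault.write(chunk + "\n")

if __name__ == "__main__":
    ingest_txt("my_notes.txt")  # hypothetical input file
```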

Latest Updates V1.1

  • Pick your model from the CLI (see the sketch below)
    • python localrag.py --model mistral (llama3 is the default)
  • Talk in a true loop with conversation history
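
A minimal sketch of these two V1.1 features, assuming the `ollama` Python package: an argparse --model flag defaulting to llama3, and a chat loop that resends the full conversation history each turn. localrag.py itself may be wired differently (for example, it also injects retrieved context).

```python
# Sketch: --model flag with llama3 as default, plus a conversation-history loop.
import argparse

import ollama

parser = argparse.ArgumentParser(description="Chat with a local Ollama model")
parser.add_argument("--model", default="llama3", help="Ollama model name, e.g. mistral")
args = parser.parse_args()

history = []  # keeps every turn so the model sees prior context
while True:
    user_input = input("You: ")
    if user_input.strip().lower() in ("quit", "exit"):
        break
    history.append({"role": "user", "content": user_input})
    response = ollama.chat(model=args.model, messages=history)
    answer = response["message"]["content"]
    history.append({"role": "assistant", "content": answer})
    print(f"{args.model}: {answer}")
```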

My YouTube Channel

https://www.youtube.com/c/AllAboutAI

What is RAG?

RAG (Retrieval-Augmented Generation) enhances the capabilities of LLMs by combining their language understanding with targeted retrieval of relevant information from external sources, often by storing and searching embeddings in a vector database. The result is more accurate, trustworthy, and versatile AI-powered applications.
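
To make the retrieval step concrete, here is a minimal sketch using the `ollama` Python package: embed each document and the question locally, pick the most similar chunk by cosine similarity, and pass it to the LLM as context. The model names (mxbai-embed-large, llama3) are assumptions; substitute whatever models you have pulled with `ollama pull`.

```python
# Minimal RAG retrieval sketch: embed documents and question, pick the closest
# chunk by cosine similarity, and answer with that chunk as context.
import ollama

documents = [
    "Ollama runs large language models locally on your machine.",
    "RAG retrieves relevant documents and adds them to the prompt.",
    "Paris is the capital of France.",
]

def embed(text: str) -> list[float]:
    # Embedding via the local Ollama server (model name is an assumption)
    return ollama.embeddings(model="mxbai-embed-large", prompt=text)["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)

doc_vectors = [embed(d) for d in documents]

question = "What does RAG do?"
q_vector = embed(question)
best = max(range(len(documents)), key=lambda i: cosine(q_vector, doc_vectors[i]))

prompt = f"Answer using this context:\n{documents[best]}\n\nQuestion: {question}"
response = ollama.chat(model="llama3", messages=[{"role": "user", "content": prompt}])
print(response["message"]["content"])
```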

What is Ollama?

Ollama is an open-source platform that simplifies the process of running powerful LLMs locally on your own machine, giving users more control and flexibility in their AI projects. https://www.ollama.com