localama

A document chat engine built entirely on open-source models that runs 100% locally.

Primary language: Python · License: MIT

This module is a working template to start building your own 100% local, document-augmented Q&A application.

Prerequisites

An Ollama server must be running locally. Pull the Llama 3.1 model (`ollama pull llama3.1`), or change the model name in the program as needed.
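As a quick connectivity check, a minimal sketch of talking to the local Ollama server might look like this. It assumes the default Ollama endpoint at http://localhost:11434 and the `llama3.1` model tag; the function names (`build_chat_request`, `ask`) are illustrative, not part of this repo.

```python
import json
import urllib.request

# Default endpoint of a locally running Ollama server (assumption).
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(model: str, question: str) -> dict:
    # Build the JSON payload for Ollama's /api/chat endpoint.
    return {
        "model": model,
        "messages": [{"role": "user", "content": question}],
        "stream": False,  # request one complete response instead of a stream
    }

def ask(question: str, model: str = "llama3.1") -> str:
    # POST the chat request to the local Ollama server and
    # return the assistant's reply text.
    payload = json.dumps(build_chat_request(model, question)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

If the call fails with a connection error, the Ollama server is not running; if it reports an unknown model, pull the model first or adjust the model name.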