local_llama

This repo showcases how to run a large language model locally and offline, with no OpenAI dependencies.
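As a rough illustration of what local, offline inference can look like, here is a minimal sketch using the llama-cpp-python package with a GGUF model file already downloaded to disk. The package choice, model path, and prompt are assumptions for this example, not necessarily what this repo uses.

```python
# Minimal sketch of offline inference with llama-cpp-python.
# Assumptions (not from the repo): the model path and prompt are
# placeholders; point model_path at a GGUF file on your machine.
from llama_cpp import Llama

# Load a local GGUF model from disk -- no network calls, no API keys.
llm = Llama(model_path="./models/llama-2-7b.Q4_K_M.gguf", n_ctx=2048)

# Run a single completion entirely on the local machine.
output = llm(
    "Q: What is the capital of France? A:",
    max_tokens=32,
    stop=["Q:", "\n"],
)
print(output["choices"][0]["text"].strip())
```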

Primary language: Python · License: Apache-2.0
