minillm

MiniLLM is a minimal system for running modern LLMs on consumer-grade GPUs

Primary language: Python · License: MIT
