exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
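As a quick orientation, here is a minimal sketch of loading a local model and generating text with the library's Python API, modeled on the project's quickstart; the model directory path is a placeholder, and exact class names and defaults may differ across versions.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

# Point at a local directory containing an EXL2-quantized model (hypothetical path)
config = ExLlamaV2Config("/models/Llama-3-8B-exl2")
model = ExLlamaV2(config)

# A lazy cache lets load_autosplit spread the weights across available GPU memory
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)

print(generator.generate(prompt="Why run LLMs locally?", max_new_tokens=100))
```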

Primary language: Python · License: MIT
