exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs

Primary language: Python · License: MIT
