exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs

Primary language: Python
License: MIT
