GPTQ-for-LLaMa

4-bit quantization of LLaMA using GPTQ, with easy model loading
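To illustrate what 4-bit weight quantization means, here is a minimal round-to-nearest sketch with per-group scales. This is only the storage idea (int4 values plus a float scale per group); the actual GPTQ algorithm additionally uses second-order (Hessian) information to choose quantized values that minimize each layer's output error. All names below are illustrative, not this repository's API.

```python
import numpy as np

def quantize_4bit(w, group_size=128):
    """Round-to-nearest 4-bit quantization with one scale per group.

    Simplified sketch only: real GPTQ quantizes columns sequentially and
    compensates the remaining weights using Hessian information.
    """
    w = w.reshape(-1, group_size)
    # Symmetric int4 range is -8..7; map the group's max magnitude to 7.
    scale = np.abs(w).max(axis=1, keepdims=True) / 7.0
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize_4bit(q, scale):
    """Recover approximate float weights from int4 values and scales."""
    return q.astype(np.float32) * scale

# Example: quantize a random weight vector and measure reconstruction error.
w = np.random.randn(256).astype(np.float32)
q, s = quantize_4bit(w)
w_hat = dequantize_4bit(q, s).reshape(-1)
print("max abs error:", np.abs(w - w_hat).max())
```

Each group stores 4 bits per weight plus one scale, roughly a 4x memory reduction versus fp16, at the cost of the rounding error printed above.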

Primary language: Python. License: Apache-2.0.
