GPTQ-for-LLaMa

4-bit quantization of LLaMA using GPTQ

Primary language: Python. License: Apache-2.0.

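GPTQ itself quantizes weights layer by layer using approximate second-order (Hessian) information to minimize output error. As a rough illustration of what reducing weights to 4 bits means, here is a plain round-to-nearest sketch; this is not the GPTQ algorithm, and the function names are hypothetical.

```python
import numpy as np

def quantize_4bit(w, axis=-1):
    """Asymmetric round-to-nearest 4-bit quantization per row (illustrative only)."""
    wmin = w.min(axis=axis, keepdims=True)
    wmax = w.max(axis=axis, keepdims=True)
    scale = (wmax - wmin) / 15.0  # 4 bits -> 16 levels (0..15)
    q = np.clip(np.round((w - wmin) / scale), 0, 15).astype(np.uint8)
    return q, scale, wmin

def dequantize_4bit(q, scale, wmin):
    """Map 4-bit integer codes back to approximate float weights."""
    return q.astype(np.float32) * scale + wmin

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 8)).astype(np.float32)
q, scale, zero = quantize_4bit(w)
w_hat = dequantize_4bit(q, scale, zero)
max_err = np.abs(w - w_hat).max()
```

Round-to-nearest keeps the per-weight error within half a quantization step; GPTQ improves on this by compensating each rounding decision using the remaining unquantized weights.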