/GPTQ-for-LLaMa

4-bit quantization of LLaMA using GPTQ
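
Below is a minimal sketch of the core GPTQ idea this repo applies to LLaMA: quantize a layer's weights column by column and use the inverse Hessian of the calibration inputs to push each column's quantization error onto the columns not yet quantized. It assumes PyTorch and is deliberately simplified (no act-order, no group-wise quantization, no lazy batch updates); the function name and the per-row scaling scheme are illustrative, not the repo's actual code.

```python
import torch

def gptq_quantize_layer(W, X, wbits=4, percdamp=0.01):
    """Illustrative GPTQ-style quantization of one linear layer.

    W: (out_features, in_features) weight matrix.
    X: (in_features, n_samples) calibration inputs for this layer.
    Returns a dequantized copy of W restricted to a 4-bit grid.
    """
    rows, cols = W.shape

    # Hessian of the layer-wise squared-error objective, with damping
    # on the diagonal for numerical stability.
    H = 2 * X @ X.T
    damp = percdamp * torch.mean(torch.diag(H))
    H += damp * torch.eye(cols, dtype=W.dtype, device=W.device)
    Hinv = torch.cholesky_inverse(torch.linalg.cholesky(H))

    # Simple per-row scale and midpoint zero (the repo uses finer,
    # group-wise quantization parameters).
    maxq = 2**wbits - 1
    scale = (2 * W.abs().max(dim=1).values / maxq).clamp(min=1e-8)
    zero = maxq // 2

    W = W.clone()
    Q = torch.zeros_like(W)
    for i in range(cols):
        w = W[:, i]
        # Round the column onto the 4-bit grid, then dequantize.
        q = torch.clamp(torch.round(w / scale) + zero, 0, maxq)
        dq = (q - zero) * scale
        Q[:, i] = dq
        # Propagate the quantization error to the remaining columns,
        # weighted by the inverse Hessian (the key GPTQ step).
        err = (w - dq) / Hinv[i, i]
        W[:, i + 1:] -= err.unsqueeze(1) * Hinv[i, i + 1:].unsqueeze(0)
    return Q
```

In practice the repository operates on every linear layer of the LLaMA model in sequence, feeding a small calibration set (e.g. C4) through the network to collect the inputs `X` for each layer before quantizing it.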

Primary Language: Python
