GPTQ-for-LLaMa

4-bit quantization of LLaMa using GPTQ

Primary language: Python
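
GPTQ quantizes weights using approximate second-order information to compensate for quantization error layer by layer. As a point of reference only, the sketch below shows the much simpler round-to-nearest (RTN) baseline that GPTQ is usually compared against; it illustrates what storing weights in 4 bits means, but it is not the algorithm this repository implements, and the function names and per-row grouping are illustrative assumptions.

```python
# Illustrative only: plain round-to-nearest (RTN) 4-bit quantization.
# This is NOT GPTQ; GPTQ additionally uses Hessian-based error compensation.
import numpy as np

def quantize_rtn_4bit(w: np.ndarray):
    """Quantize a float weight matrix to 4-bit codes, with per-row scale/zero."""
    qmax = 2**4 - 1                        # 16 levels: integers 0..15
    w_min = w.min(axis=1, keepdims=True)
    w_max = w.max(axis=1, keepdims=True)
    scale = (w_max - w_min) / qmax         # per-row quantization step
    scale[scale == 0] = 1.0                # guard against constant rows
    zero = np.round(-w_min / scale)        # per-row zero point
    q = np.clip(np.round(w / scale + zero), 0, qmax).astype(np.uint8)
    return q, scale, zero

def dequantize(q, scale, zero):
    """Recover an approximate float matrix from the 4-bit codes."""
    return (q.astype(np.float32) - zero) * scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.standard_normal((8, 64)).astype(np.float32)
    q, s, z = quantize_rtn_4bit(w)
    w_hat = dequantize(q, s, z)
    print("max abs reconstruction error:", np.abs(w - w_hat).max())
```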
