Code for the ICLR 2023 paper "GPTQ: Accurate Post-Training Quantization for Generative Pre-trained Transformers".
Primary language: Python · License: Apache-2.0