tomasz-kielbasa/GPTQ-for-LLaMa
4-bit quantization of LLaMA using GPTQ, with easy model loading
Python · Apache-2.0
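To illustrate the idea behind the repository's headline feature, here is a minimal sketch of 4-bit weight quantization with per-group scales and zero-points. Note this is a plain round-to-nearest scheme for illustration only: GPTQ itself additionally corrects quantization error column by column using second-order (Hessian) information, and the function names and `group_size` parameter below are hypothetical, not this repository's API.

```python
import numpy as np

def quantize_4bit(W, group_size=128):
    """Round-to-nearest 4-bit quantization of a weight matrix,
    with one scale/zero-point pair per group of input columns.
    (Illustrative sketch, not the GPTQ algorithm itself.)"""
    rows, cols = W.shape
    Q = np.empty((rows, cols), dtype=np.uint8)
    scales, zeros = [], []
    for start in range(0, cols, group_size):
        g = W[:, start:start + group_size]
        wmin = g.min(axis=1, keepdims=True)
        wmax = g.max(axis=1, keepdims=True)
        scale = (wmax - wmin) / 15.0      # 4 bits -> 16 levels (0..15)
        scale[scale == 0] = 1.0           # guard against flat groups
        zero = np.round(-wmin / scale)    # integer zero-point
        q = np.clip(np.round(g / scale + zero), 0, 15)
        Q[:, start:start + group_size] = q.astype(np.uint8)
        scales.append(scale)
        zeros.append(zero)
    return Q, scales, zeros

def dequantize_4bit(Q, scales, zeros, group_size=128):
    """Reconstruct an approximate float matrix from 4-bit codes."""
    W_hat = np.empty(Q.shape, dtype=np.float32)
    for i, start in enumerate(range(0, Q.shape[1], group_size)):
        g = Q[:, start:start + group_size].astype(np.float32)
        W_hat[:, start:start + group_size] = (g - zeros[i]) * scales[i]
    return W_hat

# Demo: quantize a random weight matrix and measure reconstruction error.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 256)).astype(np.float32)
Q, s, z = quantize_4bit(W)
W_hat = dequantize_4bit(Q, s, z)
err = np.abs(W - W_hat).max()
```

Storing only the `uint8` codes (packed two per byte in a real implementation) plus one scale/zero pair per group is what yields the roughly 4x memory saving over fp16 weights.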