An LLM quantization tool. AutoAWQ implements the AWQ algorithm for 4-bit quantization, with a 2x speedup during inference. Documentation:
Primary language: Python · License: MIT
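As a quick orientation, here is a minimal sketch of the typical quantize-and-save flow with AutoAWQ's high-level Python API. The model name, output path, and quant_config values are illustrative assumptions, not details taken from this description.

```python
# Minimal AWQ 4-bit quantization sketch using AutoAWQ's high-level API.
# Model path, output directory, and config values are assumed examples.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "mistralai/Mistral-7B-Instruct-v0.2"  # assumed source model
quant_path = "mistral-7b-instruct-awq"             # assumed output directory

# 4-bit AWQ settings: zero-point quantization, group size 128, GEMM kernels.
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# Load the FP16 model and its tokenizer.
model = AutoAWQForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Run AWQ calibration and quantize the weights to 4 bits.
model.quantize(tokenizer, quant_config=quant_config)

# Save the quantized model and tokenizer for later inference.
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```

The saved directory can then be loaded for inference with `AutoAWQForCausalLM.from_quantized(quant_path)`.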