[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
Primary language: Python · License: MIT