llm-awq

[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration

Primary Language: Python · License: MIT
