llm-awq

AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
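The core idea named in the title is that weight quantization error matters most on the input channels where activations are large, so those channels are rescaled before quantizing. A minimal NumPy sketch of that idea (the scaling heuristic, group layout, and function name here are illustrative assumptions, not this repository's actual implementation):

```python
import numpy as np

def awq_quantize_sketch(W, X, n_bits=4):
    """Illustrative activation-aware quantization of a weight matrix.

    W: (out_features, in_features) weights; X: (n_samples, in_features)
    calibration activations. Returns a dequantized approximation of W.
    """
    # Per-input-channel activation magnitude as a salience proxy.
    act_scale = np.abs(X).mean(axis=0)                    # (in_features,)
    # Heuristic per-channel scale: boost salient channels (assumption).
    s = np.sqrt(act_scale / act_scale.mean()) + 1e-8
    W_scaled = W * s                                      # scale before quantizing
    # Symmetric per-output-row quantization to n_bits.
    qmax = 2 ** (n_bits - 1) - 1
    step = np.abs(W_scaled).max(axis=1, keepdims=True) / qmax
    Wq = np.clip(np.round(W_scaled / step), -qmax - 1, qmax)
    # Dequantize and fold the channel scales back out; in AWQ the 1/s
    # factor is absorbed into the preceding operator at no runtime cost.
    return (Wq * step) / s

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))
X = rng.normal(size=(32, 16))
W_hat = awq_quantize_sketch(W, X)
mean_err = np.abs(W - W_hat).mean()
```

Because the scales cancel exactly at dequantization, the transform is lossless in float; the benefit is that rounding error after scaling is smaller on the activation-salient channels that dominate the layer's output.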

Primary language: Python · License: MIT
