Pinned Repositories
bit
Code repo for the paper "BiT: Robustly Binarized Multi-distilled Transformer"
LLM-QAT
Code repo for the paper "LLM-QAT Data-Free Quantization Aware Training for Large Language Models"
MobileLLM
MobileLLM: Optimizing Sub-billion Parameter Language Models for On-Device Use Cases. In ICML 2024.
SpinQuant
Code repo for the paper "SpinQuant LLM quantization with learned rotations"
AdamBNN
How Do Adam and Training Strategies Help BNNs Optimization? In ICML 2021.
Bi-Real-net
Bi-Real Net: Enhancing the Performance of 1-bit CNNs with Improved Representational Capability and Advanced Training Algorithm. In ECCV 2018 and IJCV.
Data-Free-NAS
Data-Free Neural Architecture Search via Recursive Label Calibration. In ECCV 2022.
MetaPruning
MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning. In ICCV 2019.
Nonuniform-to-Uniform-Quantization
Nonuniform-to-Uniform Quantization: Towards Accurate Quantization via Generalized Straight-Through Estimation. In CVPR 2022.
ReActNet
ReActNet: Towards Precise Binary Neural Networks with Generalized Activation Functions. In ECCV 2020.
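Several of the pinned repositories above (Bi-Real Net, ReActNet, Nonuniform-to-Uniform Quantization, BiT) center on binary or low-bit networks trained with straight-through gradient estimation. The snippet below is a minimal, generic sketch of 1-bit weight binarization with a straight-through estimator in PyTorch; it only illustrates the general idea and does not reproduce the specific formulations used in any of these repositories.

```python
import torch


class BinarizeSTE(torch.autograd.Function):
    """Sign-based 1-bit quantization with a straight-through gradient.

    Forward: w_b = alpha * sign(w), where alpha is the mean absolute weight
    (one common scaling choice; the repos above each use their own variants).
    Backward: pass the gradient through unchanged where |w| <= 1.
    """

    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        alpha = w.abs().mean()
        return alpha * torch.sign(w)

    @staticmethod
    def backward(ctx, grad_output):
        (w,) = ctx.saved_tensors
        # Straight-through estimator: block gradients where |w| > 1.
        return grad_output * (w.abs() <= 1).to(grad_output.dtype)


if __name__ == "__main__":
    w = torch.randn(4, 4, requires_grad=True)
    loss = BinarizeSTE.apply(w).sum()
    loss.backward()
    print(w.grad)  # gradient of 1 inside [-1, 1], 0 outside
```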