network-quantization
There are 10 repositories under the network-quantization topic.
quic/aimet
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
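The "quantization" these libraries perform maps floating-point weights and activations to low-precision integers. As a minimal illustration (not AIMET's API), here is a sketch of uniform affine quantization in numpy; the function names and the 8-bit default are assumptions for the example:

```python
import numpy as np

def quantize_uniform(x, num_bits=8):
    """Uniform affine quantization: map a float tensor to unsigned ints.

    Hypothetical helper for illustration; real toolkits also calibrate
    the range and fold quantization into the model graph.
    """
    qmin, qmax = 0, 2 ** num_bits - 1
    # Scale maps the observed float range onto the integer grid.
    scale = (x.max() - x.min()) / (qmax - qmin)
    # Zero point is the integer that represents float 0.0 exactly.
    zero_point = int(np.round(qmin - x.min() / scale))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.int64)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    # Recover an approximation of the original floats.
    return scale * (q - zero_point)
```

After a round trip through `quantize_uniform` and `dequantize`, each value differs from the original by at most one quantization step (`scale`), which is the error budget these toolkits then try to minimize per layer.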
sony/model_optimization
Model Compression Toolkit (MCT) is an open-source project for optimizing neural network models to run on efficient, constrained hardware. It provides researchers, developers, and engineers with advanced quantization and compression tools for deploying state-of-the-art neural networks.
NJU-Jet/SR_Mobile_Quantization
Winning solution of the Mobile AI challenge (CVPRW 2021).
1adrianb/binary-networks-pytorch
Binarize convolutional neural networks using PyTorch.
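Binarization is the extreme 1-bit case of quantization: each weight is reduced to its sign, usually with one per-tensor scaling factor to preserve magnitude (the XNOR-Net-style scheme). A minimal numpy sketch, with names chosen for the example rather than taken from the repo:

```python
import numpy as np

def binarize(w):
    """Binarize a weight tensor to {-alpha, +alpha}.

    alpha = mean(|w|) is the optimal per-tensor scale under an
    L2 reconstruction criterion (XNOR-Net-style); zeros map to +alpha.
    """
    alpha = np.abs(w).mean()
    # np.where instead of np.sign so that exact zeros become +1, not 0.
    signs = np.where(w >= 0, 1.0, -1.0)
    return alpha * signs
```

During training, binarized networks typically keep full-precision shadow weights and pass gradients through the sign with a straight-through estimator; the sketch above covers only the forward mapping.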
lmbxmu/RBNN
PyTorch implementation of the paper "Rotated Binary Neural Network" (NeurIPS 2020).
tajanthan/pmf
Proximal Mean-field for Neural Network Quantization
CAS-CLab/Label-free-Network-Compression
Caffe implementation of "Learning Compression from Limited Unlabeled Data" (ECCV 2018).
quic/aimet-pages
AIMET GitHub pages documentation
xuxw98/Quantformer
[T-PAMI 2022] Quantformer: Learning Extremely Low-precision Vision Transformers