LRQ_MindSpore

MindSpore implementation of the Neural Networks 2023 (CCF-B) paper "Long-range zero-shot generative deep network quantization".

Requirements

conda create -n mindspore python=3.8
conda activate mindspore
pip install mindspore
pip install -r requirement.txt
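
To confirm the installation before moving on, a quick sanity check can be run from Python. This is only an illustrative snippet (not part of this repo) and uses the public MindSpore API:

```python
# Sanity check for the MindSpore install (illustrative, not part of this repo).
import numpy as np
import mindspore as ms
from mindspore import Tensor, ops

print("MindSpore version:", ms.__version__)

# Run a tiny op to make sure the backend is functional.
x = Tensor(np.ones((2, 2), dtype=np.float32))
y = Tensor(np.ones((2, 2), dtype=np.float32))
print(ops.add(x, y))
```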

Evaluate pre-trained models

The pre-trained models and corresponding logs can be downloaded here

Please make sure that the "qw" and "qa" settings in the corresponding *.hocon file, as well as the "--model_name" and "--model_path" arguments, are correct.
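
To double-check the bit-width settings before launching evaluation, the config can be parsed directly. The sketch below assumes the *.hocon files are standard HOCON readable by the pyhocon package and that "qw" and "qa" are top-level keys; the actual key nesting in this repo's configs may differ:

```python
# Illustrative config check (assumes pyhocon and top-level "qw"/"qa" keys;
# adjust the file name and key paths to match the repo's actual configs).
from pyhocon import ConfigFactory

conf = ConfigFactory.parse_file("cifar10_resnet20.hocon")
print("weight bit-width (qw):", conf.get("qw", None))
print("activation bit-width (qa):", conf.get("qa", None))
```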

For CIFAR-10 / CIFAR-100

python test.py --model_name resnet20_cifar10 --model_path path_to_pre-trained_model --conf_path cifar10_resnet20.hocon
or
python test.py --model_name resnet20_cifar100 --model_path path_to_pre-trained_model --conf_path cifar100_resnet20.hocon

For ImageNet

python test.py --model_name resnet18/mobilenet_w1/mobilenetv2_w1 --model_path path_to_pre-trained_model --conf_path imagenet.hocon

where "--model_name" is one of resnet18, mobilenet_w1, or mobilenetv2_w1.
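
If an evaluation run does not reproduce the numbers below, one quick thing to verify is that the downloaded weights load at all. The sketch below assumes the pre-trained models are standard MindSpore .ckpt files and uses a hypothetical file name; the real loading and evaluation logic is in test.py:

```python
# Illustrative checkpoint inspection (hypothetical file name; assumes a MindSpore .ckpt file).
import mindspore as ms

param_dict = ms.load_checkpoint("resnet20_cifar10.ckpt")
print("number of parameters:", len(param_dict))
for name in list(param_dict)[:5]:
    print(name, tuple(param_dict[name].shape))
```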

Results of pre-trained models are shown below:

| Model       | Bit-width | Dataset   | Top-1 Acc. |
|-------------|-----------|-----------|------------|
| resnet18    | W4A4      | ImageNet  | 66.47%     |
| resnet18    | W5A5      | ImageNet  | 69.94%     |
| mobilenetv1 | W4A4      | ImageNet  | 51.36%     |
| mobilenetv1 | W5A5      | ImageNet  | 68.17%     |
| mobilenetv2 | W4A4      | ImageNet  | 65.10%     |
| mobilenetv2 | W5A5      | ImageNet  | 71.28%     |
| resnet20    | W3A3      | CIFAR-10  | 77.07%     |
| resnet20    | W4A4      | CIFAR-10  | 91.49%     |
| resnet20    | W3A3      | CIFAR-100 | 64.98%     |
| resnet20    | W4A4      | CIFAR-100 | 48.25%     |