This repository contains the code for our paper *Few Shot Network Compression via Cross Distillation* (Haoli Bai, Jiaxiang Wu, Michael Lyu, Irwin King), AAAI 2020.
- python: 3.6+
- torch: 1.1+
- torchvision: 0.2.2+
- numpy: 1.14+
All the scripts to run the algorithm are in `./scripts`. Please make the necessary argument changes, e.g. `data_path`, `save_path`, and `load_path`. Please prepare the CIFAR-10 and ImageNet datasets yourself.
The algorithm is based on a pre-trained model. For CIFAR-10 experiments, you can run `sh scripts/start_vanilla.sh` to train a new model from scratch. For ImageNet experiments, you can download the pretrained models from the official website.
Run `sh scripts/start_chnl.sh ${gpu_id}` for structured pruning, or `sh scripts/start_prune.sh ${gpu_id}` for unstructured pruning. The default parameters are already set in the scripts; you can uncomment other configurations for different experiments.
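As a rough illustration of the difference between the two modes (this is a generic magnitude-pruning sketch, not the cross-distillation procedure implemented in this repo): unstructured pruning zeroes out individual weights, while structured pruning removes whole output channels. A minimal NumPy sketch, with the sparsity ratio chosen arbitrarily:

```python
import numpy as np

def unstructured_mask(w, sparsity):
    """Keep only the largest-magnitude entries of w (element-wise pruning)."""
    k = int(sparsity * w.size)  # number of weights to prune
    if k == 0:
        return np.ones(w.shape, dtype=bool)
    thresh = np.sort(np.abs(w), axis=None)[k - 1]
    return np.abs(w) > thresh

def structured_mask(w, sparsity):
    """Keep the output channels (first axis) with the largest L2 norm."""
    norms = np.linalg.norm(w.reshape(w.shape[0], -1), axis=1)
    k = int(sparsity * w.shape[0])  # number of channels to prune
    mask = np.zeros(w.shape[0], dtype=bool)
    mask[np.argsort(norms)[k:]] = True  # indices of channels to keep
    return mask

rng = np.random.default_rng(0)
w = rng.standard_normal((8, 3, 3, 3))  # toy conv weight: (out, in, kh, kw)
m_u = unstructured_mask(w, 0.5)        # ~50% of individual weights pruned
m_s = structured_mask(w, 0.5)          # 4 of 8 output channels pruned
print(m_u.mean(), m_s.sum())
```

Applying `w * m_u` (or indexing the kept channels with `m_s`) gives the pruned weight tensor; the repo's scripts additionally recover accuracy with few-shot cross distillation.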
- The code automatically generates `./few_shot_ind/`, which stores the indices of the data sampled for few-shot training in step 3.
- Please read the argument descriptions in `main.py` to learn more about the meanings of the hyper-parameters.
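For intuition, the stored indices amount to a small per-class sample of the training set. A minimal sketch of such class-balanced sampling (illustrative only; the file format and naming under `./few_shot_ind/` are hypothetical, see the actual code for the exact layout):

```python
import numpy as np

def sample_few_shot_indices(labels, k_shot, seed=0):
    """Sample k_shot example indices per class from a 1-D label array."""
    rng = np.random.default_rng(seed)
    indices = []
    for c in np.unique(labels):
        cls_idx = np.flatnonzero(labels == c)          # all examples of class c
        indices.extend(rng.choice(cls_idx, size=k_shot, replace=False))
    return np.sort(np.asarray(indices))

# Toy example: 10 classes with 10 examples each, 5 shots per class
labels = np.repeat(np.arange(10), 10)
ind = sample_few_shot_indices(labels, k_shot=5)
print(ind.shape)
# np.save("./few_shot_ind/cifar10_5shot.npy", ind)  # hypothetical filename
```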
If you find the code helpful for your research, please star this repo and cite our paper:
@inproceedings{bai2019few,
title={Few Shot Network Compression via Cross Distillation},
author={Bai, Haoli and Wu, Jiaxiang and King, Irwin and Lyu, Michael},
booktitle={Proceedings of the 34th AAAI Conference on Artificial Intelligence},
year={2020}
}