Self-KD-Lib

[ECCV-2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition, plus PyTorch implementations of several self-knowledge distillation and data augmentation methods.

This project provides implementations of several data augmentation, regularization, online knowledge distillation, and self-knowledge distillation methods.
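The sketch below illustrates the common core of many of the self-KD methods listed later in this README: standard cross-entropy training plus a term that matches the network's own softened predictions across two forward passes (e.g. two augmented views of the same image). It is a minimal illustrative example with assumed hyperparameters (temperature, alpha), not code taken from this repository.

```python
import torch.nn.functional as F

def self_distillation_loss(logits_a, logits_b, labels, temperature=3.0, alpha=0.1):
    """Illustrative self-KD objective: cross-entropy on both views plus a
    softened KL term that encourages consistent predictions between them."""
    ce = F.cross_entropy(logits_a, labels) + F.cross_entropy(logits_b, labels)
    log_p_a = F.log_softmax(logits_a / temperature, dim=1)
    p_b = F.softmax(logits_b.detach() / temperature, dim=1)  # treat view b as the "teacher"
    kd = F.kl_div(log_p_a, p_b, reduction="batchmean") * (temperature ** 2)
    return ce + alpha * kd
```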

Installation

Requirements

Ubuntu 18.04 LTS

Python 3.8 (Anaconda is recommended)

CUDA 11.1

PyTorch 1.12 + torchvision 0.13
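Assuming PyTorch and torchvision are already installed, a quick sanity check from Python confirms that the versions roughly match the requirements above and that CUDA is visible:

```python
import torch, torchvision

print(torch.__version__, torchvision.__version__)     # expect roughly 1.12.x / 0.13.x
print(torch.cuda.is_available(), torch.version.cuda)  # expect True / 11.x on a matching setup
```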

Perform experiments on the CIFAR-100 dataset

Dataset

CIFAR-100: download

Unzip it to the ./data folder
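Alternatively, torchvision can download CIFAR-100 into ./data directly; the snippet below is a minimal example, and the transform shown is only illustrative (it is not necessarily the pipeline used by the training scripts):

```python
from torchvision import datasets, transforms

train_set = datasets.CIFAR100(
    root="./data", train=True, download=True,
    transform=transforms.Compose([
        transforms.RandomCrop(32, padding=4),
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
    ]),
)
```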

The commands for running the various methods can be found in main.sh.

Top-1 accuracy (%) of Self-KD and Data Augmentation (DA) methods on ResNet-18

| Type | Method | Venue | Accuracy (%) |
| --- | --- | --- | --- |
| Baseline | Cross-entropy | - | 76.24 |
| Self-KD | DDGSD [1] | AAAI-2019 | 76.61 |
| | DKS [2] | CVPR-2019 | 78.64 |
| | SAD [3] | ICCV-2019 | 76.40 |
| | BYOT [4] | ICCV-2019 | 77.88 |
| | Tf-KD-reg [5] | CVPR-2020 | 76.61 |
| | CS-KD [6] | CVPR-2020 | 78.66 |
| | FRSKD [7] | CVPR-2021 | 77.71 |
| | PS-KD [8] | ICCV-2021 | 79.31 |
| | BAKE [9] | arXiv:2104.13298 | 76.93 |
| | MixSKD [10] | ECCV-2022 | 80.32 |
| DA | Label Smoothing [1] | CVPR-2016 | 78.72 |
| | Virtual Softmax [2] | NeurIPS-2018 | 78.54 |
| | Focal Loss [3] | ICCV-2017 | 76.19 |
| | Maximum Entropy [4] | ICLR Workshops-2017 | 76.50 |
| | Cutout [5] | arXiv:1708.04552 | 76.66 |
| | Random Erase [6] | AAAI-2020 | 76.75 |
| | Mixup [7] | ICLR-2018 | 78.68 |
| | CutMix [8] | ICCV-2019 | 80.17 |
| | AutoAugment [9] | CVPR-2019 | 77.97 |
| | RandAugment [10] | CVPR Workshops-2020 | 76.86 |
| | AugMix [11] | arXiv:1912.02781 | 76.22 |
| | TrivialAugment [12] | ICCV-2021 | 76.03 |
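Several of the DA entries above, as well as MixSKD itself, build on Mixup (Zhang et al., ICLR 2018). The sketch below shows the standard Mixup recipe for reference; it is illustrative only and not necessarily identical to the implementation invoked from main.sh:

```python
import numpy as np
import torch
import torch.nn.functional as F

def mixup(images, labels, alpha=0.2):
    """Convexly combine a batch with a shuffled copy of itself."""
    lam = float(np.random.beta(alpha, alpha))
    index = torch.randperm(images.size(0), device=images.device)
    mixed = lam * images + (1.0 - lam) * images[index]
    return mixed, labels, labels[index], lam

# Usage during training: the loss is the same convex combination of the
# two cross-entropy terms.
# mixed, y_a, y_b, lam = mixup(x, y)
# logits = model(mixed)
# loss = lam * F.cross_entropy(logits, y_a) + (1 - lam) * F.cross_entropy(logits, y_b)
```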

Some implementations are adapted from the corresponding official code releases. We thank the papers' authors for making their code available.

If you find this repository useful, please consider citing the following paper:

@inproceedings{yang2022mixskd,
  title={MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition},
  author={Yang, Chuanguang and An, Zhulin and Zhou, Helong and Cai, Linhang and Zhi, Xiang and Wu, Jiwen and Xu, Yongjun and Zhang, Qian},
  booktitle={European Conference on Computer Vision},
  year={2022}
}