This repository contains models trained with knowledge distillation, implemented in PyTorch and trained using PyTorch Lightning.
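For context, knowledge distillation trains a compact student network to match the softened output distribution of a larger teacher. Below is a minimal sketch of the classic distillation objective in plain PyTorch; the function name and hyperparameter defaults are illustrative assumptions, not this repository's exact implementation.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hypothetical sketch of the classic KD objective: a KL term on
    temperature-softened distributions plus cross-entropy on hard labels."""
    # Soften both distributions with temperature T; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```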
- Clone the repository:

```bash
git clone <repository-url>  # replace <repository-url> with this repository's URL
```

- Install requirements:

```bash
cd Distilled-Models
pip install -r requirements.txt
```

- Setup CIFAR-100:

```bash
cd datasets/cifar-100
./setup.sh
```
To reproduce the results of an experiment listed in the table below, run the command given at the top of the corresponding Python script.
| Paper | Dataset | Teacher | Student | Top-1 Accuracy (%) | Top-5 Accuracy (%) |
|---|---|---|---|---|---|
| Baseline | CIFAR-100 | ResNet-32 | - | 69.59 | 91.39 |
| Deep Mutual Learning (Redo) | CIFAR-100 | ResNet-32 | ResNet-32 | 69.31 | 91.86 |
| Online Knowledge Distillation with Diverse Peers | CIFAR-100 | ResNet-32 | ResNet-32 | N/A | N/A |
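Since the Deep Mutual Learning row reports a peer-training setup, a rough sketch of the two-peer DML objective may help: each student minimizes cross-entropy on the labels plus a KL term toward its peer's (detached) predictions. The function below is a hypothetical illustration under those assumptions, not the code this repository uses.

```python
import torch.nn.functional as F

def dml_losses(logits_a, logits_b, labels):
    """Hypothetical per-peer Deep Mutual Learning losses for two students."""
    ce_a = F.cross_entropy(logits_a, labels)
    ce_b = F.cross_entropy(logits_b, labels)
    # KL(peer || self): detach the target so each network is updated
    # only through its own logits.
    kl_a = F.kl_div(F.log_softmax(logits_a, dim=1),
                    F.softmax(logits_b.detach(), dim=1),
                    reduction="batchmean")
    kl_b = F.kl_div(F.log_softmax(logits_b, dim=1),
                    F.softmax(logits_a.detach(), dim=1),
                    reduction="batchmean")
    return ce_a + kl_a, ce_b + kl_b
```

In training, each loss is backpropagated through its own network every step, so the two peers improve together rather than one distilling into the other.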
This repository is still under development, so if you encounter a bug or would like to request a feature, please feel free to open an issue here.