
FastKD

Introduction

FastKD is a PyTorch framework for knowledge distillation: training a compact student network to mimic the softened outputs of a larger pretrained teacher network.

Features

Datasets:

  • ImageNet (a sample configuration is provided; see Configuration below)

Sample Models:

  • Teacher: ResNet50 (from torchvision)
  • Student: ResNet18 (from torchvision)
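
For reference, both models can be instantiated directly from torchvision. A minimal sketch (assuming torchvision >= 0.13 for the weights API; not this repository's exact code):

import torch
import torchvision.models as models

# Teacher: pretrained ResNet50, frozen and kept in eval mode.
teacher = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
teacher.eval()
for p in teacher.parameters():
    p.requires_grad_(False)

# Student: smaller ResNet18, trained from scratch under the teacher's guidance.
student = models.resnet18(weights=None)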

KD Methods:
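
As background, most KD methods build on the classic logit-matching objective of Hinton et al. (2015). A minimal sketch of that loss, for illustration only (not necessarily one of this framework's implementations):

import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Softened KL divergence between teacher and student distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard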

Features coming soon:

Methods Comparison

Coming Soon...

Configuration

Create a configuration file in configs. A sample configuration for the ImageNet dataset is included. Edit the fields as needed; the same configuration file is used by both the training and evaluation scripts.
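
The exact schema is defined by this repository; a configuration along these lines would be typical (every key and value below is an illustrative assumption, not the framework's actual schema):

# configs/example_imagenet.yaml -- hypothetical file name and keys, for illustration
dataset: imagenet            # assumed key: which dataset to use
data_root: /path/to/imagenet # assumed key: dataset location
teacher: resnet50            # assumed key: torchvision teacher model
student: resnet18            # assumed key: torchvision student model
temperature: 4.0             # softmax temperature for distillation
alpha: 0.9                   # weight of the distillation term in the loss
epochs: 100
batch_size: 256
lr: 0.1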

Training

$ python train.py --cfg configs/CONFIG_FILE_NAME.yaml
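
Under the hood, a distillation training step typically looks like the following sketch, which reuses the models and kd_loss sketched above; train_loader stands in for a standard DataLoader, and none of this is train.py's actual code:

import torch

optimizer = torch.optim.SGD(student.parameters(), lr=0.1, momentum=0.9)

for images, labels in train_loader:
    with torch.no_grad():        # the teacher only provides targets
        t_logits = teacher(images)
    s_logits = student(images)
    loss = kd_loss(s_logits, t_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()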

Evaluation

$ python val.py --cfg configs/CONFIG_FILE_NAME.yaml
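
For reference, evaluating the distilled student usually amounts to a top-1 accuracy loop over the validation split; a minimal sketch (val_loader stands in for a standard DataLoader; not val.py's actual code):

import torch

student.eval()
correct = total = 0
with torch.no_grad():
    for images, labels in val_loader:
        preds = student(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.numel()
print(f"top-1 accuracy: {correct / total:.4f}")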