PyTorch Knowledge Distillation Using PSPNet

Introduction

This repository is a PyTorch implementation of knowledge distillation for semantic segmentation. The code is adapted from PSPNet. Experiments are run on the Cityscapes dataset.
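The repository's exact loss lives in the training code, but the standard distillation objective (soft teacher targets blended with hard labels) can be sketched as follows. This is a minimal illustration for dense prediction; the function name and the `temperature`/`alpha` defaults are illustrative, not the repository's actual API:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=4.0, alpha=0.5):
    """Blend soft-target KL distillation with hard-label cross-entropy.

    Logits are (N, C, H, W) segmentation score maps; targets are (N, H, W)
    integer class labels. Hyperparameter values here are placeholders.
    """
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```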

Usage

  1. Highlight:

    • All initialization models, trained models and predictions are available.
  2. Requirements:

  3. Clone the repository:

    git clone https://github.com/LiuZhenshun/Knowledge-Distillation.git
  4. Train:

    • Download related datasets and symlink the paths to them as follows (you can alternatively modify the relevant paths specified in folder config):

      cd Knowledge-Distillation
      mkdir -p dataset
      ln -s /path_to_cityscapes_dataset dataset/
      
    • Download the initmodel/resnet50_v2 and exp/cityscapes/pspnet50 pre-trained models and put them under the initmodel folder for weight initialization. Remember to use the dataset format detailed in FAQ.md.

    • Specify the GPUs to use in the config file, then start training.
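    • Assuming the config layout follows the upstream PSPNet repository (a YAML file under the config folder), GPU selection might look like the fragment below; the train_gpu key and the file path are assumptions based on that repo, not verified against this one:

      ```yaml
      # config/cityscapes/pspnet50.yaml (path assumed from the upstream PSPNet layout)
      TRAIN:
        train_gpu: [0, 1, 2, 3]   # indices of the GPUs to train on
      ```

      The upstream repository then launches training via a shell script under tool/; if this repository keeps that layout, the invocation would be similar.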

  5. Visualization: tensorboardX is incorporated for training visualization.

    tensorboard --logdir=exp/cityscapes
  6. Results:

    • PSPNet Result
    • Knowledge Distillation Result