This repository is a PyTorch implementation of knowledge distillation for semantic segmentation. The code is modified from PSPNet. The sample dataset used in the experiments is Cityscapes.
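The README does not spell out the distillation objective itself. A common choice is the softened-softmax KL loss of Hinton et al., sketched below in plain Python for clarity; the function names, the temperature value, and the exact loss form are assumptions for illustration, not taken from this repository:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a flat list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between the softened teacher and student
    distributions, scaled by T^2 as in Hinton et al.
    (sketch only; the loss this repo actually uses may differ)."""
    p = softmax(teacher_logits, T)  # teacher = target distribution
    q = softmax(student_logits, T)  # student distribution
    return T * T * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

In a real training loop the per-pixel logits are PyTorch tensors and the same quantity is usually computed with `F.kl_div`; the plain-Python version above only shows the arithmetic.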
Highlight:
- All initialization models, trained models and predictions are available.
Requirements:
- PyTorch>=1.1.0, Python3, tensorboardX
Clone the repository:
git clone https://github.com/LiuZhenshun/Knowledge-Distillation.git
Train:
Download the related datasets and symlink their paths as follows (alternatively, modify the relevant paths specified in the folder config):
cd Knowledge-Distillation
mkdir -p dataset
ln -s /path_to_cityscapes_dataset dataset/
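Before training, it can help to verify that the symlinked dataset actually has the layout the code expects. The standard Cityscapes release ships images under leftImg8bit/ and labels under gtFine/; whether this repo expects exactly those folders is an assumption, so treat this as a sanity-check sketch:

```python
import os

def check_cityscapes(root):
    """Return the list of expected top-level Cityscapes folders that
    are missing under root (empty list means the layout looks right).
    The folder names leftImg8bit/gtFine follow the standard Cityscapes
    release; this repo's exact expectations are assumed, not confirmed."""
    expected = ["leftImg8bit", "gtFine"]
    return [d for d in expected
            if not os.path.isdir(os.path.join(root, d))]
```

For example, `check_cityscapes("dataset/cityscapes")` should return `[]` once the symlink points at a correctly extracted dataset.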
Download the initmodel/resnet50_v2 and exp/cityscapes/pspnet50 pre-trained models and put them under the folder initmodel for weight initialization. Remember to use the right dataset format detailed in FAQ.md.
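Weight initialization from a pre-trained checkpoint typically copies only the parameters whose name and shape match the new model. The plain-dict sketch below mimics what PyTorch's `model.load_state_dict(ckpt, strict=False)` does; tensors are stood in by (shape, values) tuples, and the loading code in this repo is assumed, not quoted:

```python
def init_from_pretrained(model_state, pretrained_state):
    """Copy entries whose key and shape match from a pretrained
    checkpoint into the model's state dict, returning the copied keys.
    Plain-dict sketch of PyTorch's load_state_dict(..., strict=False);
    how this repo loads initmodel/resnet50_v2 is an assumption."""
    copied = []
    for key, (shape, values) in pretrained_state.items():
        if key in model_state and model_state[key][0] == shape:
            model_state[key] = (shape, values)
            copied.append(key)
    return copied
```

Parameters with mismatched shapes (e.g. a classifier head sized for a different number of classes) are left at their fresh initialization, which is the usual behavior when fine-tuning on a new dataset.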
Specify the GPUs to use in config, then start training:
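The README does not show the config file itself. PSPNet-style repositories keep YAML configs under the config folder; a hypothetical excerpt showing where the GPU list would live might look like the following (the section and field names are assumptions, not taken from this repo):

```yaml
# hypothetical excerpt from a file under config/ (field names assumed)
TRAIN:
  train_gpu: [0, 1, 2, 3]   # ids of the GPUs to train on
```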
Visualization: tensorboardX is incorporated for better visualization.
tensorboard --logdir=exp/cityscapes
Results: