mmdetection-distiller

This is a knowledge distillation toolbox based on mmdetection.


If you want to distill models in OpenMMLab-related repos, please use MMRazor!

If you are interested in KD, you can also contact me on WeChat, and I will invite you to the KD group.


This project is based on mmdetection (v2.9.0); all usage is the same as mmdetection, including training, testing, and so on.

Distiller Zoo

Installation

  • Set up a new conda environment: conda create -n distiller python=3.7

  • Install pytorch

  • Install mmcv-full (1.2.4 <= mmcv-full < 1.3)

  • Install mmdetection-distiller

    git clone https://github.com/pppppM/mmdetection-distiller.git
    cd mmdetection-distiller
    pip install -r requirements/build.txt
    pip install -v -e .
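Since the toolbox requires a fairly narrow mmcv-full range (1.2.4 <= mmcv-full < 1.3), a small check like the one below can catch a mismatched environment before training starts. This is a hypothetical helper, not part of the repo; it only parses a version string and compares it against the documented bounds.

```python
def mmcv_version_ok(version: str) -> bool:
    """Return True if `version` satisfies 1.2.4 <= version < 1.3.

    Hypothetical helper: parses a dotted version string into a tuple
    and compares it against the bounds stated in the install notes.
    """
    parts = tuple(int(p) for p in version.split(".")[:3])
    parts = parts + (0,) * (3 - len(parts))  # pad "1.2" -> (1, 2, 0)
    return (1, 2, 4) <= parts < (1, 3, 0)


# Example: check the installed mmcv before launching a job.
# import mmcv
# assert mmcv_version_ok(mmcv.__version__), "incompatible mmcv-full"
```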

Train

#single GPU
python tools/train.py configs/distillers/cwd/cwd_retina_rx101_64x4d_distill_retina_r50_fpn_2x_coco.py

#multi GPU
bash tools/dist_train.sh configs/distillers/cwd/cwd_retina_rx101_64x4d_distill_retina_r50_fpn_2x_coco.py 8
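The `cwd` configs above apply channel-wise distillation (CWD), which softmax-normalizes each channel's spatial activations into a distribution and matches the student to the teacher with a KL divergence. The NumPy sketch below illustrates that loss under stated assumptions (the actual repo implements it in PyTorch inside the distiller modules; the function name, temperature handling, and epsilon are mine, not the repo's API).

```python
import numpy as np


def _softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)


def cwd_loss(feat_s, feat_t, tau=1.0):
    """Channel-wise distillation loss (illustrative sketch).

    For each channel, the H*W activations are turned into a
    probability distribution with a temperature-scaled softmax,
    and the student distribution is pulled toward the teacher's
    via KL divergence, averaged over batch and channels.
    """
    n, c, h, w = feat_s.shape
    s = feat_s.reshape(n, c, h * w) / tau
    t = feat_t.reshape(n, c, h * w) / tau
    p_t = _softmax(t, axis=-1)
    log_p_t = np.log(p_t + 1e-12)
    log_p_s = np.log(_softmax(s, axis=-1) + 1e-12)
    kl = (p_t * (log_p_t - log_p_s)).sum(axis=-1)  # shape (n, c)
    return (tau ** 2) * kl.mean()
```

When the student features equal the teacher features the loss is zero, and it grows as the per-channel spatial distributions diverge.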

Test

#single GPU
python tools/test.py configs/distillers/cwd/cwd_retina_rx101_64x4d_distill_retina_r50_fpn_2x_coco.py $CHECKPOINT --eval bbox

#multi GPU
bash tools/dist_test.sh configs/distillers/cwd/cwd_retina_rx101_64x4d_distill_retina_r50_fpn_2x_coco.py $CHECKPOINT 8 --eval bbox

License

This project is released under the Apache 2.0 license.