This repository implements the distillation methods compared in the paper. With this code you can train a lightweight face-recognition network for embedded devices. The following methods are included:
- Angular distillation
- Triplet distillation L2
- Triplet distillation Cos
- Margin based with T
- MarginDistillation (ours)
Data preparation:
- Download the dataset from https://github.com/deepinsight/insightface/wiki/Dataset-Zoo
- Extract images using: data_prepare/bin_get_images.ipynb
- Save embedding vectors from ResNet100 using: data_prepare/save_embedings.ipynb
- Prepare a list for conversion to a .bin file using: data_prepare/save_lst.ipynb
- Convert to a .bin file using: insightface/blob/master/src/data/dir2rec.py
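The embedding-saving step boils down to running every image through the teacher and storing L2-normalised vectors, which later serve as soft targets. A minimal sketch with numpy; `teacher_forward` is a stand-in for a forward pass through the pretrained ResNet100, and the output filename is illustrative, not the notebook's actual path:

```python
import numpy as np

def l2_normalize(x, axis=1, eps=1e-10):
    """L2-normalise embeddings row-wise, as ArcFace-style pipelines expect."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def teacher_forward(images):
    """Stand-in for the teacher network; in the real notebook this is a
    forward pass through the pretrained ResNet100 producing 512-d vectors."""
    rng = np.random.default_rng(0)
    return rng.standard_normal((len(images), 512)).astype(np.float32)

images = [f"img_{i}.jpg" for i in range(4)]       # placeholder image list
embeddings = l2_normalize(teacher_forward(images))
np.save("teacher_embeddings.npy", embeddings)     # consumed later as soft targets
```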
Download the pretrained model from Google Drive, or train ResNet100 with ArcFace:
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network r100 --loss arcface --dataset emore
Performance:
| LFW | CFP-FP | AgeDB-30 | MegaFace |
|---|---|---|---|
| 99.76% | 98.38% | 98.25% | 98.35% |
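The `--loss arcface` flag refers to the standard additive angular margin: the target-class logit becomes `s*cos(theta + m)` instead of `s*cos(theta)`. A minimal numpy sketch of that logit transform (the repo's actual implementation lives in the training code; `s=64`, `m=0.5` are the usual ArcFace defaults, assumed here):

```python
import numpy as np

def arcface_logits(embeddings, weights, labels, s=64.0, m=0.5):
    """Additive angular margin (ArcFace): cos(theta + m) on the target class.
    `embeddings` and `weights` are assumed row-wise L2-normalised."""
    cos = np.clip(embeddings @ weights.T, -1.0, 1.0)  # cosine similarities
    theta = np.arccos(cos)
    margin = np.zeros_like(cos)
    margin[np.arange(len(labels)), labels] = m        # margin only on ground truth
    return s * np.cos(theta + margin)

# Toy check: samples sitting exactly on their class centres
emb = np.eye(3)[:2]                                   # two unit embeddings
w = np.eye(3)                                         # three unit class centres
logits = arcface_logits(emb, w, labels=np.array([0, 1]))
```

The margin shrinks the target-class logit, forcing the network to pull each embedding tighter to its class centre before the softmax is satisfied.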
Download the pretrained model from Google Drive, or train MobileFaceNet with ArcFace:
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss arcface --dataset emore
Performance:
| LFW | CFP-FP | AgeDB-30 | MegaFace |
|---|---|---|---|
| 99.51% | 92.68% | 96.13% | 90.62% |
Download the pretrained model from Google Drive, or train MobileFaceNet with Angular distillation:
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss angular_distillation --dataset emore_soft
Performance:
| LFW | CFP-FP | AgeDB-30 | MegaFace |
|---|---|---|---|
| 99.55% | 91.90% | 96.01% | 90.73% |
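Angular distillation penalises the angle between student and teacher embeddings rather than matching raw vectors. One common formulation is the squared distance between the normalised embeddings, which equals `2 - 2*cos(angle)` per pair; a sketch (the repo's exact loss may differ in detail):

```python
import numpy as np

def angular_distillation_loss(student, teacher):
    """Squared distance between direction vectors: 2 - 2*cos per pair,
    so only the angle between student and teacher embeddings matters."""
    s = student / np.linalg.norm(student, axis=1, keepdims=True)
    t = teacher / np.linalg.norm(teacher, axis=1, keepdims=True)
    return np.mean(np.sum((s - t) ** 2, axis=1))
```

Because both sides are normalised first, the student is free to produce embeddings of any magnitude as long as they point the same way as the teacher's.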
Download the pretrained model from Google Drive, or fine-tune MobileFaceNet with Triplet distillation L2:
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss triplet_distillation_L2 --dataset emore_soft --pretrained ./models/y1-arcface-emore/model
Performance:
| LFW | CFP-FP | AgeDB-30 | MegaFace |
|---|---|---|---|
| 99.56% | 93.30% | 96.23% | 89.10% |
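Triplet distillation replaces the fixed triplet-loss margin with a dynamic one derived from how far apart the teacher already places the positive and negative. A hedged sketch covering both the L2 and cosine-distance variants; the clipping bounds `m_min`/`m_max` and the exact margin formula are assumptions, not the repo's verified constants:

```python
import numpy as np

def dynamic_margin(t_ap, t_an, m_min=0.2, m_max=0.5):
    """Teacher-informed margin: larger when the teacher separates the
    negative far from the positive (clipped to a fixed range)."""
    return np.clip(t_an - t_ap, m_min, m_max)

def triplet_distillation_loss(a, p, n, t_ap, t_an, metric="L2"):
    """Triplet loss on student embeddings (anchor/positive/negative) with a
    per-triplet margin taken from the teacher's distances t_ap, t_an."""
    if metric == "L2":
        d_ap = np.linalg.norm(a - p, axis=1)
        d_an = np.linalg.norm(a - n, axis=1)
    else:  # "cos": distance = 1 - cosine similarity
        def cos_dist(x, y):
            x = x / np.linalg.norm(x, axis=1, keepdims=True)
            y = y / np.linalg.norm(y, axis=1, keepdims=True)
            return 1.0 - np.sum(x * y, axis=1)
        d_ap, d_an = cos_dist(a, p), cos_dist(a, n)
    m = dynamic_margin(t_ap, t_an)
    return np.mean(np.maximum(d_ap - d_an + m, 0.0))

# Toy triplet: anchor equals positive, negative is well separated
a = np.array([[1.0, 0.0]])
p = a.copy()
n = np.array([[0.0, 3.0]])
loss = triplet_distillation_loss(a, p, n, t_ap=np.array([0.0]), t_an=np.array([1.0]))
```

The `triplet_distillation_cos` run below uses the same idea with `metric="cos"`, which is why the two commands differ only in the loss name.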
Download the pretrained model from Google Drive, or fine-tune MobileFaceNet with Triplet distillation cos:
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss triplet_distillation_cos --dataset emore_soft --pretrained ./models/y1-arcface-emore/model
Performance:
| LFW | CFP-FP | AgeDB-30 | MegaFace |
|---|---|---|---|
| 99.55% | 93.30% | 95.60% | 86.52% |
Download the pretrained model from Google Drive, or train MobileFaceNet with Margin based distillation with T:
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss margin_base_with_T --dataset emore_soft
Performance:
| LFW | CFP-FP | AgeDB-30 | MegaFace |
|---|---|---|---|
| 99.41% | 92.40% | 96.01% | 90.77% |
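One plausible reading of "margin based with T" is classic Hinton-style distillation applied to the margin-softmax logits: both teacher and student logits are softened at temperature `T` and matched with cross-entropy. This is a sketch under that assumption, not the repo's verified formulation:

```python
import numpy as np

def softmax(z, axis=1):
    z = z - z.max(axis=axis, keepdims=True)   # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def kd_loss_with_T(student_logits, teacher_logits, T=4.0):
    """Cross-entropy between temperature-softened teacher and student
    distributions, scaled by T^2 as in standard knowledge distillation."""
    p_t = softmax(teacher_logits / T)
    log_p_s = np.log(softmax(student_logits / T) + 1e-12)
    return -np.mean(np.sum(p_t * log_p_s, axis=1)) * T * T
```

A higher `T` flattens the teacher distribution, exposing the relative similarities between non-target classes that hard labels discard.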
Download the pretrained model from Google Drive, or train MobileFaceNet with MarginDistillation:
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss margin_distillation --dataset emore_soft
Performance:
| LFW | CFP-FP | AgeDB-30 | MegaFace |
|---|---|---|---|
| 99.61% | 92.01% | 96.55% | 91.70% |
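The core idea of MarginDistillation is to make the ArcFace margin per-sample, driven by the teacher: samples the teacher places confidently near their class centre get a larger student margin. A hedged sketch of that mechanism; the margin range and the linear mapping from teacher cosine to margin are illustrative assumptions, and the paper's exact scheme may differ:

```python
import numpy as np

def margin_from_teacher(teacher_cos, m_min=0.2, m_max=0.7):
    """Per-sample margin scaled by the teacher's cosine to the class centre:
    easy samples for the teacher receive a larger margin for the student."""
    t = np.clip(teacher_cos, 0.0, 1.0)
    return m_min + (m_max - m_min) * t

def margin_distillation_logits(cos_student, labels, teacher_cos, s=64.0):
    """ArcFace-style logits where the target-class margin comes from the
    teacher instead of being a single global constant."""
    theta = np.arccos(np.clip(cos_student, -1.0, 1.0))
    m = np.zeros_like(cos_student)
    m[np.arange(len(labels)), labels] = margin_from_teacher(teacher_cos)
    return s * np.cos(theta + m)

# Sample 0: teacher is confident (cos=1.0); sample 1: teacher is not (cos=0.0)
cos_s = np.array([[1.0, 0.0], [0.8, 0.1]])
labels = np.array([0, 0])
logits = margin_distillation_logits(cos_s, labels, teacher_cos=np.array([1.0, 0.0]))
```

Compared with the fixed-margin ArcFace baseline above, this lets the lightweight student spend its limited capacity on hard samples instead of enforcing one uniform margin everywhere.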