Code for Privileged Modality Learning via Multimodal Hallucination (TMM)
- Download link: https://sites.google.com/view/face-anti-spoofing-challenge/welcome/challengecvpr2019
- Requires signing an agreement before download.
Parameters
- data_root: path to your dataset, e.g. '/home/bbb/shicaiwei/data/liveness_data/CASIA-SURF'
- modal: the modality to train on: rgb, depth, or ir
- backbone: name of the backbone; this is only a label describing the backbone you use and cannot be used to select a different backbone
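For illustration, the parameters above could be wired up as command-line flags. A minimal sketch assuming an argparse interface; the flag names follow the list above, but the actual scripts in src/ may parse them differently:

```python
import argparse

def build_parser():
    # Hypothetical parser mirroring the parameters described above;
    # not necessarily how the repo's scripts read their arguments.
    parser = argparse.ArgumentParser(description="Train a face anti-spoofing model")
    parser.add_argument("--data_root", type=str, required=True,
                        help="path to the dataset, e.g. .../CASIA-SURF")
    parser.add_argument("--modal", type=str, default="rgb",
                        choices=["rgb", "depth", "ir"],
                        help="modality used for training")
    parser.add_argument("--backbone", type=str, default="resnet18",
                        help="label for the backbone; does not switch architectures")
    return parser

args = build_parser().parse_args(
    ["--data_root", "/data/CASIA-SURF", "--modal", "depth"])
print(args.modal)  # depth
```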
Single baseline

```bash
cd src
bash baseline_depth.sh  # depth-only baseline
bash baseline_ir.sh     # IR-only baseline
bash baseline_rgb.sh    # RGB-only baseline
```
Multimodal teacher model

```bash
cd src
bash baseline_multi.sh       # CASIA-SURF
bash cefa_baseline_multi.sh  # CASIA-CeFA
```
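Conceptually, the multimodal teacher combines features from the rgb, depth, and ir branches before a shared classification head. A minimal sketch assuming simple concatenation-based late fusion; `fuse_features` is hypothetical, and the repo's actual teacher may combine modalities differently:

```python
def fuse_features(feat_rgb, feat_depth, feat_ir):
    # Hypothetical late fusion: concatenate the per-modality feature
    # vectors into one joint representation for a shared classifier.
    return feat_rgb + feat_depth + feat_ir  # list concatenation

# Toy 4-d embeddings from each modality branch for one sample.
rgb   = [0.1, 0.2, 0.3, 0.4]
depth = [0.5, 0.6, 0.7, 0.8]
ir    = [0.9, 1.0, 1.1, 1.2]
fused = fuse_features(rgb, depth, ir)
print(len(fused))  # 12
```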
Multimodal model hallucination
- RAD

  ```bash
  cd src
  bash surf_patch_spp.sh     # CASIA-SURF
  bash cefa_patch_kd_spp.sh  # CASIA-CeFA
  ```

- DAD

  ```bash
  cd src
  bash surf_patch_feature.sh     # CASIA-SURF
  bash cefa_patch_kd_feature.sh  # CASIA-CeFA
  ```
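The hallucination step distills the multimodal teacher into a single-modality student. A minimal sketch of a generic feature-distillation term; this is not the repo's exact RAD/DAD objective, and the function name is hypothetical:

```python
def feature_distillation_loss(student_feat, teacher_feat):
    # Generic mean-squared-error distillation term: the single-modality
    # student is trained to reproduce ("hallucinate") the multimodal
    # teacher's intermediate features. The repo's RAD/DAD losses are
    # more elaborate; this only sketches the core idea.
    assert len(student_feat) == len(teacher_feat)
    return sum((s - t) ** 2
               for s, t in zip(student_feat, teacher_feat)) / len(student_feat)

teacher = [1.0, 1.0, 1.0, 1.0]
student = [0.0, 0.0, 0.0, 0.0]
print(feature_distillation_loss(student, teacher))  # 1.0
```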
- RAD
  - multimodal teacher model: see code in https://github.com/shicaiwei123/pytorch-i3d
  - multimodal model hallucination
    - UCF101

      ```bash
      cd ucf101/src
      bash ucf101_patch_spp.sh
      ```

    - HMDB51

      ```bash
      cd hmdb51/src
      bash hmdb51_patch_spp.sh
      ```

    - NWUCLA

      ```bash
      cd nw_ucla_cv
      bash ucla_patch_spp.sh
      ```

    - NTUD60

      ```bash
      cd ntud60_cs
      bash ntud60_patch_spp.sh
      ```