attndistill

Code for our BMVC 2022 paper "Attention Distillation: Self-supervised Vision Transformer Students Need More Guidance".


Kai Wang, Fei Yang and Joost van de Weijer

Requirements

Please check the packages in your environment against "requirements.txt". These scripts normally do not depend heavily on specific package versions.

Reproducing

Please modify the data paths in "dino_teacher.sh" and "mugs_teacher.sh", then run the scripts with "bash dino_teacher.sh" or "bash mugs_teacher.sh".
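The workflow above can be sketched as follows. The variable names and paths here are placeholders for illustration, not the actual defaults; check the headers of "dino_teacher.sh" and "mugs_teacher.sh" for the real variable names used by the scripts.

```shell
# Hypothetical sketch of the reproduction setup; edit these to your own paths.
DATA_PATH=/path/to/imagenet        # dataset location expected by the scripts
OUTPUT_DIR=./output/dino_student   # where student checkpoints should be written

echo "data:   $DATA_PATH"
echo "output: $OUTPUT_DIR"

# After updating the corresponding variables inside the scripts, launch e.g.:
# bash dino_teacher.sh
# bash mugs_teacher.sh
```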

Download

The teacher model checkpoints can be downloaded from the GitHub repositories of DINO (https://github.com/facebookresearch/dino) and Mugs (https://github.com/sail-sg/mugs).

Others

If you have any questions, do not hesitate to contact me or open an issue.