szq0214/FKD

self-supervised learning to train a teacher model and to generate pseudo-labels

Xglbrilliant opened this issue · 1 comment

Hi author, thank you for your great work! I am intrigued by it and very much looking forward to your self-supervised learning method for training a teacher model and generating pseudo-labels. Thanks a lot!

Hi @Xglbrilliant, thanks for your interest in our work! Our FKD-SSL is based on our previous work, S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021) (Code), which is a pure distillation-based self-supervised learning framework. The self-supervised teachers can be pre-trained with any strong SSL method, such as MoCo, SwAV, DINO, etc., or you can directly use their off-the-shelf pre-trained models.
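If you want to experiment with the general idea before the code is available, below is a minimal sketch of distillation-based SSL with a frozen off-the-shelf teacher. This is not our actual FKD-SSL pipeline: the `torch.hub` DINO ResNet-50 entry, the 2048-d feature dimension, the temperature value, and the `distill_step` helper are all assumptions for illustration, and the temperature-softmax over the teacher's raw features is only a stand-in for the distribution the real framework distills.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

# Frozen off-the-shelf SSL teacher; any strong SSL model (MoCo, SwAV, DINO, ...) works.
# (Assumption: the DINO hub entry below returns 2048-d ResNet-50 features.)
teacher = torch.hub.load('facebookresearch/dino:main', 'dino_resnet50')
teacher.eval()
for p in teacher.parameters():
    p.requires_grad_(False)

# Student whose output dimension matches the teacher's features (assumed 2048).
student = resnet18(num_classes=2048)
optimizer = torch.optim.SGD(student.parameters(), lr=0.1, momentum=0.9)
tau = 0.1  # distillation temperature (illustrative value)

def distill_step(images):
    """One step: the student matches the teacher's softened output distribution."""
    with torch.no_grad():
        teacher_logits = teacher(images) / tau   # soft pseudo-labels from the SSL teacher
    student_logits = student(images) / tau
    loss = F.kl_div(F.log_softmax(student_logits, dim=1),
                    F.softmax(teacher_logits, dim=1),
                    reduction='batchmean')
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage on a dummy unlabeled batch:
if __name__ == '__main__':
    dummy = torch.randn(4, 3, 224, 224)
    print(distill_step(dummy))
```

Please refer to the S2-BNN paper and code for the distribution that is actually distilled; the sketch above only illustrates the overall teacher-to-student flow.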