HARD🏋️: Hard Augmentations for Robust Distillation

This is the repository for our work on HARD augmentations for knowledge distillation.
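The paper's details are not reproduced in this README, so purely as a rough illustration of the general idea — distilling a teacher into a student on aggressively augmented inputs — here is a minimal PyTorch sketch. The specific augmentations, the temperature, and the `distill_step` helper are assumptions made for this example only; they are not the HARD method itself.

```python
# Minimal, illustrative sketch of knowledge distillation on augmented inputs.
# The augmentation pipeline and loss weighting below are assumptions for
# illustration only; they are NOT the HARD method from the paper.
import torch
import torch.nn.functional as F
import torchvision.transforms as T


def distill_step(student, teacher, images, optimizer, temperature=4.0):
    """One distillation step: the student matches the teacher's soft targets
    on a (hypothetical) strongly augmented view of the batch."""
    # Example "hard" augmentation: strong color jitter plus random erasing.
    augment = T.Compose([
        T.ColorJitter(brightness=0.8, contrast=0.8, saturation=0.8),
        T.RandomErasing(p=1.0),
    ])
    augmented = torch.stack([augment(img) for img in images])

    # Teacher provides soft targets; no gradients flow through it.
    with torch.no_grad():
        teacher_logits = teacher(augmented)
    student_logits = student(augmented)

    # Standard KL-based distillation loss (Hinton et al.), scaled by T^2.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```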

🚧 Check back here later for the full code and models. 🚧