/torchdistill

PyTorch-based modular, configuration-driven framework for knowledge distillation. 🏆 18 methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, and other venues are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility.
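The core idea behind knowledge distillation is training a small student model to match the temperature-softened output distribution of a larger teacher. The sketch below is a minimal, dependency-free illustration of that loss (after Hinton et al., 2015) in plain Python; it is not torchdistill's API, and the function names and logits are made up for the example.

```python
import math

def softmax(logits, temperature=1.0):
    # Soften logits with a temperature before normalizing.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence between the softened teacher and student
    # distributions, scaled by T^2 as in Hinton et al. (2015).
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = sum(p * math.log(p / q) for p, q in zip(p_t, p_s) if p > 0)
    return temperature ** 2 * kl

teacher = [5.0, 1.0, -2.0]  # hypothetical teacher logits
student = [3.0, 2.0, 0.0]   # hypothetical student logits
print(distillation_loss(student, teacher))
```

In practice the distillation term is combined with the ordinary cross-entropy on ground-truth labels; torchdistill lets such training recipes be declared in YAML configuration files rather than hard-coded.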

Primary language: Python. License: MIT.
