knowledge-distillation-pytorch

A PyTorch implementation for flexibly exploring deep and shallow knowledge distillation (KD) experiments.

Primary Language: Python · License: MIT
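
As a rough illustration of what such KD experiments involve, below is a minimal sketch of the standard Hinton-style distillation loss: a weighted combination of softened KL divergence against the teacher and ordinary cross-entropy against the labels. The function name `kd_loss` and the `alpha`/`temperature` hyperparameters are illustrative assumptions, not this repository's actual API.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, alpha=0.9, temperature=4.0):
    """Sketch of a standard KD loss (names and defaults are assumptions).

    Combines the KL divergence between temperature-softened teacher and
    student distributions with the usual cross-entropy on hard labels.
    """
    # Soft targets: KL divergence between temperature-scaled distributions.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard


# Example usage for one training step (student/teacher are hypothetical
# nn.Module classifiers; x, y are a batch of inputs and labels).
# with torch.no_grad():
#     teacher_logits = teacher(x)
# loss = kd_loss(student(x), teacher_logits, y)
# loss.backward()
```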
