knowledge-distillation-pytorch

A flexible PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments.

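The core idea of knowledge distillation is to train a (usually smaller) student network against a mixture of the ground-truth labels and the softened output distribution of a teacher network. Below is a minimal sketch of the standard Hinton-style KD loss in PyTorch; the function name, temperature `T`, and mixing weight `alpha` are illustrative choices, not necessarily the defaults used in this repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style KD loss: soft-target KL term plus hard-label cross entropy.

    Note: T and alpha are illustrative hyperparameters, not this repo's defaults.
    """
    # Soft targets: KL divergence between the temperature-softened teacher and
    # student distributions, scaled by T^2 so gradient magnitudes stay comparable
    # across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In a typical training loop the teacher runs in `eval()` mode under `torch.no_grad()`, and only the student's parameters are updated with this combined loss.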
Primary language: Python · License: MIT