Employ knowledge distillation to compress a large deep model (the teacher) into a lightweight version (the student).
minyang-chen/Knowledge_Distillation_Training
Jupyter Notebook · Apache-2.0
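The teacher/student compression described above typically trains the student to match the teacher's temperature-softened output distribution. Below is a minimal sketch of that standard distillation loss (KL divergence with temperature scaling, per Hinton et al.), written with NumPy for illustration; it is an assumption about the general technique, not this repository's actual implementation, and the function names are hypothetical.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-softened softmax; higher T flattens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

In a full training loop this term is usually combined with the ordinary cross-entropy on hard labels, weighted by a hyperparameter (often called alpha), so the student learns from both the ground truth and the teacher's soft targets.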