MNIST-Knowledge-Distillation-TF2.x-Example

This project demonstrates how to use knowledge distillation to train a student model that learns from a pre-trained teacher model, using the MNIST dataset as the example task.
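The repository's training code is not shown here, but the core idea can be sketched as follows: the student is trained on a weighted combination of the usual hard-label cross-entropy and a soft-label term that matches the teacher's temperature-softened outputs (Hinton et al.'s formulation). This is an illustrative NumPy sketch, not the project's actual implementation; the function names, `temperature`, and `alpha` values are assumptions chosen for demonstration.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Soften logits by dividing by the temperature before normalizing.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.1):
    """Weighted sum of hard-label and soft-label (teacher) cross-entropy.

    alpha weights the hard-label term; (1 - alpha) weights the soft term.
    The soft term is scaled by T^2 so its gradient magnitude stays
    comparable across temperatures, as in the original distillation paper.
    """
    n = len(labels)
    # Hard-label cross-entropy: student predictions vs. ground-truth labels.
    p_student = softmax(student_logits)
    hard = -np.log(p_student[np.arange(n), labels] + 1e-12).mean()
    # Soft-label cross-entropy at temperature T: student vs. teacher.
    p_teacher_t = softmax(teacher_logits, temperature)
    log_p_student_t = np.log(softmax(student_logits, temperature) + 1e-12)
    soft = -(p_teacher_t * log_p_student_t).sum(axis=-1).mean() * temperature ** 2
    return alpha * hard + (1 - alpha) * soft
```

In a TF2.x training loop, the same combination would typically be computed with `tf.keras.losses` inside a custom `train_step`, with the teacher's logits produced in inference mode.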

Primary language: Python
