Experiments-with-Distilling-Knowledge

Knowledge distillation for neural networks, using a self-trained model trained on 10 classes of ImageNet.

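Knowledge distillation trains a smaller student network to match the temperature-softened output distribution of a larger teacher, in addition to the usual hard labels. Below is a minimal sketch of the standard distillation loss (Hinton et al., 2015) for illustration only; it assumes PyTorch, and the names `student_logits`, `teacher_logits`, `T`, and `alpha` are arbitrary placeholders, not code from this repository.

```python
# Illustrative sketch of the standard knowledge-distillation loss,
# not this repository's actual training code.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft-target term: KL divergence between temperature-softened
    # student and teacher distributions, scaled by T^2 so its gradient
    # magnitude stays comparable to the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    # Weighted mix of the two objectives.
    return alpha * soft + (1.0 - alpha) * hard
```
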
Primary Language: Shell