torchdistill

A modular, configuration-driven framework for knowledge distillation. Trained models, training logs, and configurations are available to ensure reproducibility.
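For context, the core objective such a framework trains against is the knowledge distillation loss of Hinton et al.: the student matches the teacher's temperature-softened output distribution. The sketch below illustrates that objective in plain NumPy; it is a general illustration of the technique, not torchdistill's API (the function names here are hypothetical).

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-softened softmax; higher T flattens the distribution.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradients keep the same magnitude as the hard-label loss.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = (p * (np.log(p) - np.log(q))).sum(axis=-1)
    return float(kl.mean() * temperature ** 2)
```

When the student's logits equal the teacher's, the loss is zero; any mismatch yields a positive penalty that pulls the student toward the teacher's softened predictions.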

