# PyTorch Distillation

This package is a small extension library for Knowledge Distillation in PyTorch.
## Installation

### From source
PyTorch Distillation requires PyTorch (>= 1.4.0) to be installed. Please refer to the PyTorch installation page for the specific install command for your platform. Then clone the repository and install from source:

```sh
git clone https://github.com/sgraaf/pytorch_distillation.git
cd pytorch_distillation
python setup.py install
```
## Example

Please refer to the `examples/` directory for working examples of Knowledge Distillation.
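As a quick orientation before diving into the examples, the core idea of Knowledge Distillation (Hinton et al.) is to train a student model against the temperature-softened output distribution of a teacher model. The sketch below is illustrative only and not this library's API; the function name and defaults are hypothetical:

```python
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Illustrative soft-target distillation loss (not this library's API).

    Both logits tensors have shape (batch_size, num_classes).
    """
    # Soften the teacher's distribution with the temperature
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    # Student log-probabilities at the same temperature
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence, scaled by T^2 to keep gradient magnitudes
    # comparable across temperatures (as in Hinton et al.)
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2
```

In practice this soft-target loss is usually combined with the ordinary cross-entropy loss on the ground-truth labels via a weighted sum.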
## Acknowledgements

PyTorch Distillation adapts parts of the example code of the Transformers library by 🤗 Hugging Face, Inc.
## License

PyTorch Distillation is open-source and licensed under the GNU GPL, Version 3.