haitongli/knowledge-distillation-pytorch
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
Python · MIT license
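For context on what the repo explores: the standard (Hinton-style) knowledge distillation objective trains a shallow student on a weighted sum of a softened KL term against the deep teacher's logits and the usual hard-label cross-entropy. The sketch below is a minimal, generic version of that loss, not the repository's exact code; the function name `kd_loss` and the default `alpha`/`T` values are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, alpha=0.9, T=4.0):
    """Hinton-style KD loss (illustrative sketch, not the repo's exact code).

    alpha weights the soft-target term; T is the softmax temperature.
    """
    # Soft-target term: KL divergence between temperature-softened
    # student and teacher distributions. Scaling by T*T keeps gradient
    # magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label term: ordinary cross-entropy on the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example usage: a batch of 8 samples over 10 classes.
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = kd_loss(student, teacher, labels)
```

With `alpha=1.0` and identical student and teacher logits, the loss is zero, since the KL divergence between identical distributions vanishes.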
Stargazers
- akarle (@PocusHQ)
- BUPTLdy (CAS)
- crcrpar (NVIDIA)
- crystalmch
- dkozlov
- einolghozati
- fishelegs
- flygyyy (CAS)
- gmayday1997 (South China University of Technology)
- gujiuxiang (Adobe Research)
- HisiFish
- jihaonew (CUHK, MMLab)
- kaidic (Stanford University)
- kli-casia (CASIA)
- lbin
- memoiry (Apple, previously Zhejiang University, CAD&CG)
- narumiruna (Taipei, Taiwan)
- onlywe0620
- QianZhang007 (Alibaba Group)
- r9y9 (@line)
- ruotianluo (Waymo)
- ShethR (New York, New York)
- shubhampachori12110095 (Somewhere in India)
- simplysimleprob (USA)
- sriharsha0806 (fractal)
- taozeyi (US)
- tonylins (MIT, EECS)
- tonysy (Shanghai AI Lab)
- VibAltekar
- wolegechu (StepFUN)
- xizero00 (www.ilovepose.com)
- xmfbit (Bytedance)
- ykoneee (Guangzhou, China)
- youngfly11 (ShanghaiTech)
- YuliangXiu (Max Planck Institute for Intelligent Systems)
- yunglechao