# megvii-research/mdistiller

The official implementation of Decoupled Knowledge Distillation ([CVPR 2022](https://arxiv.org/abs/2203.08679)) and DOT: A Distillation-Oriented Trainer ([ICCV 2023](https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf)).
Primary language: Python
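The DKD paper linked above decouples classical knowledge distillation into a target-class term (TCKD) and a non-target-class term (NCKD), weighted separately by two coefficients. As a rough plain-Python illustration of that decomposition (not the official mdistiller implementation; the `alpha`, `beta`, and `T` defaults are placeholders, not tuned settings):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over a list of logits.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl(p, q):
    # KL(p || q); terms with p_i == 0 contribute nothing.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def dkd_loss(z_student, z_teacher, target, alpha=1.0, beta=8.0, T=4.0):
    # Illustrative sketch of the DKD decomposition; hyperparameters are assumptions.
    p_s = softmax(z_student, T)
    p_t = softmax(z_teacher, T)
    # TCKD: binary KL over {target class, all other classes}.
    tckd = kl([p_t[target], 1.0 - p_t[target]],
              [p_s[target], 1.0 - p_s[target]])
    # NCKD: KL over the non-target classes only, renormalized.
    nt_s = softmax([z for i, z in enumerate(z_student) if i != target], T)
    nt_t = softmax([z for i, z in enumerate(z_teacher) if i != target], T)
    nckd = kl(nt_t, nt_s)
    return (alpha * tckd + beta * nckd) * T ** 2
```

When student and teacher logits coincide, both terms vanish and the loss is zero; otherwise it is strictly positive, since KL divergence is non-negative.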
## Stargazers
- aashiqmuhamed (Carnegie Mellon University)
- AllenKaiChen (Nanjing University)
- August0424
- Binyr
- cheekyshibe
- DarrenTitor
- DTennant (Shanghai)
- dw763j
- fei-aiart (Xidian University)
- fly51fly (PRIS)
- flymin (The Chinese University of Hong Kong)
- Fushier
- jklf5 (Wuxi, China)
- KevinDocel (Tsinghua University)
- kumamonatseu
- LetheSec (University of Science and Technology of China)
- lwxfight
- Nandan91 (New York University)
- onlyonewater
- roger1993 (Hong Kong)
- scarlettliu920
- shangwei5 (HIT)
- SoftwareGift
- soloist97 (Beijing, China)
- Tanzichang
- Vitvicky (University of Texas at Dallas)
- wanganzhi (Chengdu)
- WenkeHuang (Wuhan University)
- wm901115nwpu
- wnma3mz
- wusaifei
- XiaoBuL (Zhejiang University)
- youtang1993 (Electronics and Information Technology, Sun Yat-sen University)
- yzd-v (Tsinghua University)
- zhhao1
- zhouchunpong (ZJU)