HobbitLong/RepDistiller
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
Python · BSD-2-Clause
Stargazers
- alexandonian (MIT CSAIL)
- asetsuna
- botsman11
- bourbakis (Shanghai)
- chaoso
- dodler
- eugenelawrence
- fly51fly (PRIS)
- franciszzj (King's College London)
- gsx0
- guanfuchen (Zhejiang University)
- haoransh (Carnegie Mellon University, Peking University)
- HHHedo
- huangzehao (Beijing)
- Hucley
- ilkarman (Microsoft)
- jfzhang95 (National University of Singapore)
- Jieeee
- jindongwang (@microsoft)
- Leesoon1984
- lzrobots (Fudan University)
- madebyollin (Redmond, WA)
- michalwols (New York)
- msalvaris
- phillipi
- rodrigob (Zürich, Switzerland)
- sailordiary (Chinese Academy of Sciences)
- SaintLogos1234
- sufeidechabei (Foxconn)
- SunAriesCN (Beijing, China)
- tejaskhot (Abnormal Security)
- tim5go (Hong Kong)
- vlad3996 (Minsk, Belarus)
- wm901115nwpu
- wyb15
- xmfbit (Bytedance)