haitongli/knowledge-distillation-pytorch
A PyTorch implementation for flexibly exploring deep and shallow knowledge distillation (KD) experiments
Python · MIT license
Issues
- no module named torch._dynamo (#52, opened by xpzwzwz, 5 comments)
- Requirements.txt is outdated? (#38, opened by Deso2121, 0 comments)
- About "reduction" built in KLDivLoss (#48, opened by junfish, 3 comments; see the loss sketch below)
- boxed folder (#26, opened by mlszy928, 6 comments)
- Box folder (#14, opened by zihaozhang9, 2 comments)
- Box Folder (#21, opened by LesiChou, 21 comments)
- kd loss (#2, opened by PaTricksStar, 0 comments; see the loss sketch below)
- An issue on loss function (#10, opened by lhyfst, 1 comment)
- How to train my own dataset (#19, opened by RamatovInomjon, 17 comments)
- experiment result (#7, opened by MrLinNing, 2 comments)
- How to improve the student model's accuracy? (#5, opened by RichardMrLu, 1 comment)
- Computing teacher outputs is called only once? (#23, opened by ssqfs, 3 comments)
- In the mnist folder, why do teacher_mnist and student_mnist not contain the softmax? (#27, opened by parquets, 1 comment)
- I see the fitnets for reference (#15, opened by chenguanfu511, 1 comment)
- Can this method be used for a regression problem? (#8, opened by door5719, 11 comments)
- CUDA error (#12, opened by youyeg, 8 comments)
- missing training log for base cnn (#13, opened by hughperkins, 1 comment)
- 'Tensor' object is not callable (#11, opened by youyeg, 2 comments)
- Can't get the pretrained model (#1, opened by flygyyy)
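
Several of the issues above (kd loss #2, the KLDivLoss "reduction" question #48, and the missing-softmax question #27) revolve around how the distillation loss is computed. The snippet below is a minimal sketch of the standard Hinton-style KD loss that repositories like this one typically implement, not necessarily the exact code in this repo; the function name `kd_loss` and the `alpha`/`T` hyperparameters are illustrative.

```python
import torch.nn as nn
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, alpha=0.9, T=4.0):
    """Blend temperature-softened KL divergence with hard-label cross-entropy."""
    # KLDivLoss expects log-probabilities as the input and probabilities as the
    # target; reduction='batchmean' matches the mathematical definition of KL
    # divergence (a common point of confusion around the 'reduction' argument).
    soft_loss = nn.KLDivLoss(reduction='batchmean')(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
    ) * (alpha * T * T)  # T^2 keeps soft-target gradients on the hard-label scale
    hard_loss = F.cross_entropy(student_logits, labels) * (1.0 - alpha)
    return soft_loss + hard_loss
```

Because F.log_softmax, F.softmax, and F.cross_entropy apply the normalization internally, the teacher and student networks can return raw logits without a final softmax layer, which is the usual answer to the question raised in issue #27.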