szagoruyko/attention-transfer
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
Jupyter Notebook
Issues
Table 1: Experiments on CIFAR-10
#41 opened by xlzhou01 - 0
RuntimeError: Input type (torch.cuda.FloatTensor) and weight type (torch.FloatTensor) should be the same
#39 opened by toseattle - 0
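This mismatch error (issue #39) usually means the input tensor was moved to the GPU but the model's weights were not, or vice versa. A minimal sketch of the usual fix, assuming a CUDA-capable setup is optional — move both the model and the input to the same device before calling forward:

```python
import torch
import torch.nn as nn

# Pick one device and use it for both weights and inputs.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Conv2d(3, 8, kernel_size=3, padding=1).to(device)  # moves the weights
x = torch.randn(1, 3, 32, 32).to(device)                      # moves the input

y = model(x)  # no FloatTensor/cuda.FloatTensor mismatch now
print(y.shape)  # torch.Size([1, 8, 32, 32])
```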
My Imagenet replication results are poor
#38 opened by somone23412 - 2
Why not use bn for teacher net in imagenet.py
#37 opened by cheerss - 1
how to resolve this error
#36 opened by manza-ari - 3
Any experiment results updated with AT on imagenet?
#27 opened by apli - 0
Got error when using 2 GPUs.
#35 opened by gtxjinx - 0
Loss function problems
#34 opened by jacky4323 - 1
Setting of β
#32 opened by tangbohu - 0
Strategy of α and β decay during training
#33 opened by d-li14 - 0
How to get the attention map with input model.parameters() or model.named_parameters() in pytorch
#31 opened by tangbohu - 2
KL div vs. cross-entropy
#12 opened by arunmallya - 1
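The KL-vs-cross-entropy question (issue #12) comes up because knowledge distillation is usually written as a temperature-scaled KL divergence between softened teacher and student distributions. A hedged sketch of that standard term (Hinton-style; the repo's exact scaling may differ):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # Soften both distributions with temperature T, then take KL(teacher || student).
    p_s = F.log_softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)
    # T^2 keeps gradient magnitudes comparable to the hard-label loss.
    return F.kl_div(p_s, p_t, reduction="batchmean") * T * T

logits = torch.randn(4, 10)
loss_same = distillation_loss(logits, logits)  # identical distributions -> ~0
```

Since KL differs from cross-entropy only by the teacher's (constant) entropy, the two give the same gradients with respect to the student; KL is simply the conventional way to write it.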
AT+KD Code
#23 opened by mehak2393 - 1
Question on KL loss
#7 opened by wentianli - 0
Question about KL_loss average
#18 opened by Lan1991Xu - 5
how to install cvtransforms?
#28 opened by nobody-cheng - 1
question about cifar.py
#26 opened by MrLinNing - 1
How to visualize the attention maps?
#19 opened by jimmy-dq - 1
Attention map
#22 opened by vkadykova - 6
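Several issues above ask how the attention maps are computed and visualized. A minimal sketch of the paper's activation-based attention mapping, assuming it mirrors the `at` helper in this repo: square the feature map, average over channels, and L2-normalize the flattened result:

```python
import torch
import torch.nn.functional as F

def attention_map(x: torch.Tensor) -> torch.Tensor:
    # x: (batch, channels, height, width) activations from some layer.
    # Square, mean over channels, flatten spatially, L2-normalize per sample.
    return F.normalize(x.pow(2).mean(1).view(x.size(0), -1))

feats = torch.randn(2, 64, 8, 8)   # hypothetical feature map
amap = attention_map(feats)
print(amap.shape)  # torch.Size([2, 64])
```

To visualize, reshape each row back to `(height, width)` and plot it as a heatmap over the input image.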
invalid variables
#20 opened by vkadykova - 0
how to do the interpolation?
#16 opened by DragonFive - 0
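On the interpolation question (issue #16): when teacher and student attention maps have different spatial sizes, one plausible approach is to resize one map to the other's resolution with bilinear interpolation (an assumption for illustration — the repo's CIFAR setup instead pairs layers whose resolutions already match):

```python
import torch
import torch.nn.functional as F

a = torch.randn(1, 1, 8, 8)  # hypothetical 8x8 attention map
# Upsample to 16x16 so it can be compared against a larger map.
a_up = F.interpolate(a, size=(16, 16), mode="bilinear", align_corners=False)
print(a_up.shape)  # torch.Size([1, 1, 16, 16])
```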
What is this model's final purpose?
#15 opened by bemoregt - 2
question about "params.itervalues()"
#13 opened by Fzz123 - 6
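The `params.itervalues()` error (issue #13) is a Python 2 vs 3 issue: `dict.itervalues()` was removed in Python 3, where `dict.values()` already returns a lazy view. A minimal sketch of the replacement, using a toy dict in place of the repo's parameter dict:

```python
# Hypothetical stand-in for the repo's flat parameter dict.
params = {"conv0.weight": 1, "conv0.bias": 2}

# Python 2: sum(params.itervalues())
# Python 3: .values() is the direct replacement.
total = sum(params.values())
print(total)  # 3
```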
Question on Code
#8 opened by zhenxing1992 - 1
Crossing computer vision boundaries
#1 opened by TheodoreGalanos