Note: PyTorch has since added built-in learning rate schedulers. Implementing SGDR with them would be simpler than the from-scratch approach used in this repository, although the difference in lines of code is small. This repository is redundant and is left up just for interest.
Built from kuangliu's great, simple pytorch-cifar repository.
Switches out the manual learning rate scheduling for SGDR, using the anytime schedule reported best in the paper.
Model | Baseline Acc. | SGDR Acc.
---|---|---
VGG16 | 92.64% | ?
ResNet18 | 93.02% | 93.99%
ResNet50 | 93.62% | 94.25%
ResNet101 | 93.75% | ?
ResNeXt29(32x4d) | 94.73% | ?
ResNeXt29(2x64d) | 94.82% | ?
DenseNet121 | 95.04% | ?
ResNet18(pre-act) | 95.11% | ?
DPN92 | 95.16% | ?
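For reference, the SGDR schedule (cosine annealing with warm restarts, Loshchilov & Hutter) can be sketched as a plain function of the epoch. This is a minimal sketch, not the code in this repository: the defaults `lr_max=0.1`, `t0=10`, and `t_mult=2` are assumed illustrative values, not necessarily the settings used for the results above.

```python
import math

def sgdr_lr(epoch, lr_max=0.1, lr_min=0.0, t0=10, t_mult=2):
    """Learning rate at a given epoch under SGDR.

    t0:     length (in epochs) of the first cycle (assumed value).
    t_mult: factor by which each successive cycle grows (assumed value).
    """
    # Walk through completed cycles to find the offset within the current one.
    t_i, t_cur = t0, epoch
    while t_cur >= t_i:
        t_cur -= t_i
        t_i *= t_mult
    # Cosine annealing within the current cycle.
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * t_cur / t_i))
```

The rate starts at `lr_max` at the beginning of each cycle (epochs 0, 10, 30, 70, ... with these defaults) and anneals toward `lr_min`; in modern PyTorch the same schedule is available as `torch.optim.lr_scheduler.CosineAnnealingWarmRestarts`.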