This repository checks FReLU (arXiv:2007.11824) on CIFAR-10. I trained ResNet18 three times each with ReLU, Swish, and FReLU. The results are shown below.
Activation Function | Minimum Validation Loss
---|---
ReLU | 0.764 ± 0.009 |
Swish | 0.763 ± 0.008 |
FReLU | 0.743 ± 0.006 |
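For reference, here is a minimal PyTorch sketch of the FReLU (funnel activation) module as described in the paper: `y = max(x, T(x))`, where `T` is a depthwise 3x3 convolution followed by batch normalization. This is an illustrative implementation, not necessarily the exact code used for the results above.

```python
import torch
import torch.nn as nn


class FReLU(nn.Module):
    """Funnel activation (arXiv:2007.11824): y = max(x, T(x)),
    where T(x) is a depthwise 3x3 conv + batch norm."""

    def __init__(self, channels: int):
        super().__init__()
        # groups == channels makes the conv depthwise, so the spatial
        # condition T(x) is computed independently per channel.
        self.conv = nn.Conv2d(channels, channels, kernel_size=3,
                              padding=1, groups=channels, bias=False)
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Elementwise max of the input and its spatial condition.
        return torch.max(x, self.bn(self.conv(x)))
```

Unlike `nn.ReLU`, FReLU takes the channel count as a constructor argument, so swapping it into ResNet18 requires passing each block's channel width to the activation.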