
activation-functions-comparison-pytorch

Comparison of common activation functions on the MNIST dataset using PyTorch.

Activation functions:

  • ReLU
  • Sigmoid
  • Tanh

Best result: ReLU
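A minimal sketch of how such a comparison can be set up: one small MLP per activation, so each variant differs only in the nonlinearity. The architecture (784 → 128 → 10) and hidden width are illustrative assumptions, not necessarily the ones used in this repository.

```python
import torch
import torch.nn as nn

# The three activations compared; ReLU(x) = max(0, x),
# Sigmoid(x) = 1 / (1 + e^-x), Tanh(x) = tanh(x).
ACTIVATIONS = {
    "ReLU": nn.ReLU(),
    "Sigmoid": nn.Sigmoid(),
    "Tanh": nn.Tanh(),
}

def make_mlp(activation: nn.Module) -> nn.Sequential:
    """Two-layer MLP for flattened 28x28 MNIST images: 784 -> 128 -> 10."""
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, 128),
        activation,
        nn.Linear(128, 10),  # logits for the 10 digit classes
    )

# One model per activation; each would then be trained and evaluated
# on MNIST under identical settings to make the comparison fair.
models = {name: make_mlp(act) for name, act in ACTIVATIONS.items()}
```

Keeping everything except the activation fixed (same layers, optimizer, and epochs) is what makes the resulting accuracy differences attributable to the activation function itself.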