deeplearning-activation-functions

A simple implementation of famous activation functions using NumPy

In this project, the following activation functions are implemented with Python's NumPy (a minimal sketch follows the list):

  • sigmoid
  • softmax
  • tanh
  • ReLU
  • Leaky ReLU
  • SELU
  • ELU
  • Maxout
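
For reference, a minimal NumPy sketch of some of these functions might look like the following. Function names and the `alpha`/`scale` defaults are illustrative and not necessarily identical to the code in Q1_ActivationFunctions.py:

```python
import numpy as np

def sigmoid(x):
    # 1 / (1 + e^(-x)): squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # subtract the row-wise max for numerical stability before exponentiating
    shifted = x - np.max(x, axis=-1, keepdims=True)
    exps = np.exp(shifted)
    return exps / np.sum(exps, axis=-1, keepdims=True)

def relu(x):
    # zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # small negative slope alpha instead of a hard zero
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # smooth exponential curve for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # scaled ELU with the standard self-normalizing constants
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```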

Notes

1- The original code was written in a Jupyter notebook (Q1_ActivationFunctions.ipynb) and then converted to a .py file, as that was the file format specified in the assignment.

2- Plots of each function are included to help visualize them (see the sketch below).
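
Assuming matplotlib is among the dependencies in requirements.txt, the plots can be reproduced along these lines (a standalone sketch, not the project's exact plotting code):

```python
import numpy as np
import matplotlib.pyplot as plt

# evaluate a few activations on a shared input range
x = np.linspace(-5.0, 5.0, 200)
plt.plot(x, 1.0 / (1.0 + np.exp(-x)), label="sigmoid")
plt.plot(x, np.tanh(x), label="tanh")
plt.plot(x, np.maximum(0.0, x), label="ReLU")

plt.xlabel("x")
plt.title("Activation functions")
plt.legend()
plt.show()
```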

How to Run

```bash
pip3 install -r requirements.txt  # install the dependencies
python Q1_ActivationFunctions.py
```