In this project, the following activation functions are implemented using Python's NumPy (a minimal sketch follows the list):
- sigmoid
- softmax
- tanh
- ReLU
- Leaky ReLU
- SELU
- ELU
- Maxout
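
Below is a minimal sketch of how these functions can be written with NumPy. The names, signatures, and the Maxout parameter shapes here are illustrative assumptions and may not match the actual definitions in `Q1_ActivationFunctions.py`:

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x, axis=-1):
    """Softmax; subtracting the max first keeps exp() from overflowing."""
    shifted = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(shifted)
    return e / np.sum(e, axis=axis, keepdims=True)

def tanh(x):
    """Hyperbolic tangent (NumPy provides this directly)."""
    return np.tanh(x)

def relu(x):
    """Rectified linear unit: max(0, x), applied elementwise."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: small slope alpha for negative inputs."""
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    """ELU: alpha * (e^x - 1) for negative inputs, x otherwise."""
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    """SELU: scaled ELU with the fixed constants from Klambauer et al. (2017)."""
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def maxout(x, w, b):
    """Maxout: max over k affine pieces.

    Assumed shapes (illustrative): x is (d,), w is (k, d), b is (k,).
    """
    return np.max(w @ x + b, axis=0)
```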
1- The original code was written in a Jupyter notebook (`Q1_ActivationFunctions.ipynb`) and then converted to a `.py` file, as that was the file format specified in the assignment.
2- To visualize the functions, plots of each one are included (a minimal example is sketched below).
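
For illustration, a short plotting sketch in the same spirit as the included plots. It assumes matplotlib is among the dependencies in `requirements.txt` (an assumption; check the file):

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-5, 5, 200)

# Plot a few of the activations on one axis for comparison.
for name, fn in [("sigmoid", lambda z: 1 / (1 + np.exp(-z))),
                 ("tanh", np.tanh),
                 ("ReLU", lambda z: np.maximum(0, z))]:
    plt.plot(x, fn(x), label=name)

plt.legend()
plt.title("Activation functions")
plt.xlabel("x")
plt.ylabel("f(x)")
plt.show()
```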
pip3 install -r requirements.txt # to install the dependencies
python Q1_ActivationFunctions.py