python activation.py <function_name> [relu_variant]
python activation.py linear
python activation.py sigmoid
python activation.py softmax
python activation.py tanh
python activation.py relu 0 # normal relu
python activation.py relu 1 # noisy relu
python activation.py relu 2 # leaky relu
Each graph is drawn by plotting inputs over range(-100, 100) against the corresponding output of the selected activation function.
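For reference, here is a minimal sketch of how such a script could be structured, assuming NumPy and matplotlib. The noise parameters for the noisy ReLU and the 0.01 leaky slope are illustrative assumptions, not necessarily the values used in activation.py:

```python
import sys

import matplotlib.pyplot as plt
import numpy as np


def linear(x):
    return x

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - np.max(x))  # subtract max for numerical stability
    return e / e.sum()

def tanh(x):
    return np.tanh(x)

def relu(x, variant=0):
    if variant == 1:
        # noisy ReLU: add Gaussian noise to the input before thresholding
        # (unit variance is an assumption for this sketch)
        x = x + np.random.normal(0.0, 1.0, size=x.shape)
    if variant == 2:
        # leaky ReLU: small slope for negative inputs (0.01 assumed here)
        return np.where(x > 0, x, 0.01 * x)
    return np.maximum(0.0, x)

if __name__ == "__main__":
    name = sys.argv[1]
    x = np.arange(-100, 100, dtype=float)  # matches range(-100, 100)
    if name == "relu":
        # optional second argument selects the ReLU variant (0/1/2)
        variant = int(sys.argv[2]) if len(sys.argv) > 2 else 0
        y = relu(x, variant)
    else:
        funcs = {"linear": linear, "sigmoid": sigmoid,
                 "softmax": softmax, "tanh": tanh}
        if name not in funcs:
            sys.exit(f"unknown function: {name}")
        y = funcs[name](x)
    plt.plot(x, y)
    plt.title(name)
    plt.show()
```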