
Neural Network Hyperparameter Implementation and Activation Function Analysis

This project implements combinations of hyperparameters on a neural network to determine which activation function (Logistic, ReLU, or Tanh) performs best, and which hyperparameter settings work best with each activation.
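As a rough illustration of this kind of search (not the notebook's actual code), the sketch below compares the three activations across a small hyperparameter grid using scikit-learn's MLPClassifier and GridSearchCV. The dataset, layer widths, learning rates, and CV settings are placeholder assumptions, not the project's.

```python
# Minimal sketch of the activation/hyperparameter comparison described above.
# Assumptions: scikit-learn's MLPClassifier stands in for the notebook's network;
# the dataset, grid values, and CV settings are illustrative only.
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

param_grid = {
    "activation": ["logistic", "relu", "tanh"],  # the three activations compared
    "hidden_layer_sizes": [(32,), (64,)],        # assumed layer widths
    "learning_rate_init": [0.001, 0.01],         # assumed learning rates
}

search = GridSearchCV(
    MLPClassifier(max_iter=300, random_state=0),
    param_grid,
    cv=3,
    n_jobs=-1,
)
search.fit(X, y)

print("Best hyperparameters:", search.best_params_)
print("Best CV accuracy: %.3f" % search.best_score_)
```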

Below is a loss (error/cost) curve graph for each of the three activation functions.

Note: There are more graphs shown in the code file.
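For reference, loss curves like the ones below can be produced from a fitted scikit-learn MLP via its `loss_curve_` attribute, which records the training loss at each iteration. The sketch below is an assumed reconstruction of that plotting step, reusing the placeholder dataset from the grid-search sketch above; it is not the notebook's plotting code.

```python
# Sketch of plotting per-activation loss curves, assuming scikit-learn MLPs
# trained on the same (placeholder) dataset as in the sketch above.
import matplotlib.pyplot as plt
from sklearn.datasets import load_digits
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

for activation in ["logistic", "relu", "tanh"]:
    model = MLPClassifier(activation=activation, max_iter=300, random_state=0)
    model.fit(X, y)
    # loss_curve_ holds the training loss recorded at each iteration
    plt.plot(model.loss_curve_, label=activation)

plt.xlabel("Iteration")
plt.ylabel("Training loss")
plt.title("Loss curves by activation function")
plt.legend()
plt.show()
```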

Logistic (Sigmoid) Loss Curves

[Figure: loss curves for the logistic (sigmoid) activation]

ReLU Loss Curves

[Figure: loss curves for the ReLU activation]

Tanh Loss Curves

[Figure: loss curves for the tanh activation]