Neural-Network-Activation-Functions

This project implements combinations of hyperparameters on a neural network to determine which activation function (Logistic, ReLU, or Tanh), and which hyperparameter settings for each of those activation functions, performs the best.
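The idea above can be sketched in a few lines. This is a minimal illustration, not the project's actual notebook code: the grid values and layer sizes are hypothetical, and it only shows how each activation slots into a hidden-layer forward pass that a hyperparameter sweep would iterate over.

```python
import numpy as np

# The three activation functions compared in this project.
def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def tanh(x):
    return np.tanh(x)

# Hypothetical hyperparameter grid; the values actually swept in the
# notebook may differ.
activations = {"logistic": logistic, "relu": relu, "tanh": tanh}
learning_rates = [0.01, 0.1]

# One forward pass through a single hidden layer, to show where the
# activation function enters the network.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))   # 4 samples, 3 features
W = rng.normal(size=(3, 5))   # weights for a 5-unit hidden layer
b = np.zeros(5)

for name, act in activations.items():
    hidden = act(X @ W + b)   # shape (4, 5) regardless of activation
    print(name, hidden.shape)
```

A full experiment would wrap this in a training loop and score each (activation, learning rate) pair on a validation set to pick the best combination.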

Primary language: Jupyter Notebook. License: GNU General Public License v3.0 (GPL-3.0).
