This notebook implements an L-layer deep neural network (DNN) for binary classification and tests it on the Iris dataset. You can reuse it as a generic L-layer DNN to train on different datasets with different architectures: fit a training set of shape (dimensions, examples) and a target set of shape (dimensions, examples) to the model, and the learned parameters are returned. Hyperparameters can be tuned to your preference, and predictions can then be made.
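As a sketch of the expected input layout (the notebook's own loading code may differ), here is one way to bring the Iris dataset into the (dimensions, examples) shape, keeping only two classes so the task is binary:

```python
import numpy as np
from sklearn.datasets import load_iris

iris = load_iris()
# Keep only two of the three classes (setosa vs. versicolor) for binary classification
mask = iris.target < 2
# Transpose so columns are examples: shape (dimensions, examples) = (4, 100)
X = iris.data[mask].T
# Targets as a row vector: shape (1, 100)
Y = iris.target[mask].reshape(1, -1)
```

Each column of `X` is one example and each row is one feature, which is the orientation the model expects.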
- Numpy
- Pandas
- Scikit-learn
- Scipy
- Matplotlib
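The dependencies above can be installed with pip (assuming a working Python environment; package names are the standard PyPI ones):

```shell
pip install numpy pandas scikit-learn scipy matplotlib
```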
- Download the repo
- Open the 'Binary-classification DNN' IPython notebook in Jupyter Notebook
- Run the notebook cells step-by-step
- layer_dims: Dimensions of your network (according to your dataset)
- num_epochs: Number of epochs to train
- learning_rate: Learning rate alpha for gradient descent
- optimizer: Optimization method ("gd" for mini-batch gradient descent, "adam" for the Adam optimizer)
- lambd: Regularization parameter
- mini_batch_size: Size of minibatch for minibatch gradient descent
- beta1, beta2: Exponential decay rates for Adam's first- and second-moment estimates
- epsilon: Small constant in the Adam parameter update that prevents division by zero
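To illustrate how beta1, beta2, epsilon, and learning_rate interact, here is a minimal sketch of the standard Adam update for a single parameter array (the notebook's actual update function may be organized differently):

```python
import numpy as np

def adam_update(param, grad, v, s, t,
                learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-8):
    """One Adam step for a single parameter array."""
    # beta1 controls the moving average of the gradient (momentum term)
    v = beta1 * v + (1 - beta1) * grad
    # beta2 controls the moving average of the squared gradient
    s = beta2 * s + (1 - beta2) * grad ** 2
    # Bias correction for the zero-initialized averages (t = step count, starting at 1)
    v_hat = v / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    # epsilon prevents division by zero in the update
    param = param - learning_rate * v_hat / (np.sqrt(s_hat) + epsilon)
    return param, v, s
```

The same update is applied to every weight and bias matrix, with `v` and `s` maintained per parameter across epochs.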