This project implements a two-layer artificial neural network (ANN) for binary classification using NumPy, with training visualizations powered by Matplotlib and Seaborn. Whether you prefer static plots or live updates, this code tracks loss, accuracy, and even throws in a confusion matrix to show off its skills. The dataset, featuring adorable cats and dogs, is loaded from HDF5 files via a handy function in utilities.py.
The neural network rocks:
- An Input Layer (sized by the dataset features)
- A Hidden Layer (you pick the neuron count!)
- An Output Layer (cat or dog? Binary magic!)
- Forward & backward propagation for training
- Log-loss cost function
- Gradient descent optimization
- Accuracy tracking + confusion matrix fun
- Loss & accuracy visuals over epochs (static or live!)
- Dataset preprocessing (reshaping & normalizing)
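For reference, the log-loss cost listed above is the standard binary cross-entropy averaged over the m training examples, where a_i is the output-layer activation for example i and y_i the true label:

```
L = -\frac{1}{m} \sum_{i=1}^{m} \left[ y_i \log(a_i) + (1 - y_i) \log(1 - a_i) \right]
```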
You'll need Python 3.6+ installed. Check requirements.txt for the full list of goodies.
- Clone or grab this repo:
git clone https://github.com/Adamo08/TwoLayer-PropNet.git
cd TwoLayer-PropNet
- Install the dependencies:
pip install -r requirements.txt
- Make sure datasets/ has trainset.hdf5 and testset.hdf5 ready to roll!
Here's what's in requirements.txt:
numpy
matplotlib
seaborn
scikit-learn
tqdm
h5py
Get them with:
pip install matplotlib seaborn scikit-learn tqdm numpy h5py

├── datasets/
│   ├── trainset.hdf5      # Training dataset (cat & dog pix!)
│   └── testset.hdf5       # Test dataset (more furry friends!)
├── utilities.py           # Helper functions (e.g., load_data)
├── app.py                 # Main script with ANN awesomeness
├── requirements.txt       # Required Python modules
└── README.md              # You're here!
The dataset is stored in HDF5 format and packed with images of cats and dogs for binary classification (cat = 0, dog = 1, or vice versa, your call!). It includes:
- Training Set: X_train (images) & y_train (labels)
- Test Set: X_test (images) & y_test (labels)
Loaded via load_data() in utilities.py.
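The exact signature of load_data() isn't shown in this README; here's a minimal sketch of what such a loader typically looks like with h5py, assuming the files store X_train/y_train and X_test/y_test datasets (the real utilities.py may use different key names):

```python
import h5py
import numpy as np

def load_data():
    """Hypothetical loader sketch; file key names are assumptions."""
    with h5py.File("datasets/trainset.hdf5", "r") as f:
        X_train = np.array(f["X_train"])
        y_train = np.array(f["y_train"])
    with h5py.File("datasets/testset.hdf5", "r") as f:
        X_test = np.array(f["X_test"])
        y_test = np.array(f["y_test"])
    return X_train, y_train, X_test, y_test
```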
- Check that all files are in place.
- Tweak hyperparameters in app.py if you're feeling fancy:
  - n1: Hidden layer neurons (default: 3)
  - alpha: Learning rate (default: 0.01)
  - epochs: Training rounds (default: 100)
- Fire it up:
python app.py
- N2N_NORMAL: Trains and shows static plots (loss, accuracy, and a confusion matrix) when done.
- N2N_LIVE: Trains with live-updating loss & accuracy plots, plus a confusion matrix at the end.
Switch it up in app.py:
# Static vibes
params = N2N_NORMAL(X_train_reshape, y_train, 3, 0.01, 100)
# Live action
params = N2N_LIVE(X_train_reshape, y_train, 3, 0.01, 100)

- Console: Final model accuracy
- Plots:
  - Loss & accuracy curves (saved during training)
  - Confusion matrix (saved as confusion_matrix.png)
- Sample Images: A cute grid of cat & dog pics with labels before training starts!
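The repo's own confusion-matrix helper isn't reproduced here; a rough sketch of how such a plot can be built and saved with scikit-learn and Seaborn (the function name and class labels below are illustrative, not the repo's exact code):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen so this also works headless
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.metrics import confusion_matrix

def plot_confusion_matrix(y_true, y_pred, path="confusion_matrix.png"):
    # Rows = actual class, columns = predicted class.
    cm = confusion_matrix(np.ravel(y_true), np.ravel(y_pred))
    sns.heatmap(cm, annot=True, fmt="d", cmap="Blues",
                xticklabels=["cat", "dog"], yticklabels=["cat", "dog"])
    plt.xlabel("Predicted")
    plt.ylabel("Actual")
    plt.savefig(path)
    plt.close()
```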
- initialize: Sets up weights & biases
- Forward_Propagation: Runs data through the network
- log_loss: Measures the cost
- Back_Propagation: Calculates gradients
- update: Tweaks params with gradient descent
- predict: Makes cat-or-dog calls
- Confusion_Matrix: Plots the results
- N2N_NORMAL / N2N_LIVE: Trains with style!
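The actual implementations live in app.py; the core of a two-layer network like the one described can be sketched in NumPy as follows. Sigmoid activations in both layers and the lowercase function names are assumptions for this sketch, not necessarily the repo's exact choices:

```python
import numpy as np

def initialize(n0, n1, n2):
    # Small random weights, zero biases, for hidden (n1) and output (n2) layers.
    return {
        "W1": np.random.randn(n1, n0) * 0.01, "b1": np.zeros((n1, 1)),
        "W2": np.random.randn(n2, n1) * 0.01, "b2": np.zeros((n2, 1)),
    }

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagation(X, params):
    # X has shape (features, samples); activations keep samples in columns.
    A1 = sigmoid(params["W1"] @ X + params["b1"])
    A2 = sigmoid(params["W2"] @ A1 + params["b2"])
    return {"A1": A1, "A2": A2}

def log_loss(A2, y, eps=1e-15):
    # Binary cross-entropy, averaged over the m examples.
    m = y.shape[1]
    return -np.sum(y * np.log(A2 + eps) + (1 - y) * np.log(1 - A2 + eps)) / m

def back_propagation(X, y, params, acts):
    m = y.shape[1]
    A1, A2 = acts["A1"], acts["A2"]
    dZ2 = A2 - y                      # sigmoid + log-loss simplification
    dW2 = dZ2 @ A1.T / m
    db2 = np.sum(dZ2, axis=1, keepdims=True) / m
    dZ1 = (params["W2"].T @ dZ2) * A1 * (1 - A1)
    dW1 = dZ1 @ X.T / m
    db1 = np.sum(dZ1, axis=1, keepdims=True) / m
    return {"dW1": dW1, "db1": db1, "dW2": dW2, "db2": db2}

def update(params, grads, alpha):
    # Plain gradient descent step.
    for k in ("W1", "b1", "W2", "b2"):
        params[k] -= alpha * grads["d" + k]
    return params

def predict(X, params):
    # Threshold the output activation at 0.5 for the cat-or-dog call.
    return (forward_propagation(X, params)["A2"] >= 0.5).astype(int)
```

A training loop then just cycles forward propagation, loss, back propagation, and update for the chosen number of epochs.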
- Reshaping: Flattens images (e.g., (samples, height, width) → (samples, features))
- Normalization: Scales pixels from [0, 255] to [0, 1]
Say your dataset has 1000 cat & dog pics (64x64 grayscale):
- X_train shape: (1000, 64, 64)
- After reshaping: (1000, 4096)
- After transposition: (4096, 1000)
- y_train shape: (1, 1000)
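Under those assumptions (1000 grayscale 64x64 images), the reshape, transpose, and normalize steps boil down to one line of NumPy (X_train here is a random stand-in for the real data):

```python
import numpy as np

# Dummy stand-in for X_train: 1000 grayscale 64x64 images, pixels in [0, 255].
X_train = np.random.randint(0, 256, size=(1000, 64, 64))

# Flatten each image to a feature vector, put examples in columns, scale to [0, 1].
X_train_reshape = X_train.reshape(X_train.shape[0], -1).T / 255.0

print(X_train_reshape.shape)  # (1000, 64, 64) -> (4096, 1000)
```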
Run:
python app.py

Trains with 3 hidden neurons, a 0.01 learning rate, and 100 epochs.
- Built for binary classification (cat vs. dog, n2 = 1)
- For bigger datasets, bump up n1, alpha, or epochs
- Live plotting (N2N_LIVE) might lag a bit due to real-time updates
Unlicensed: free to use or tweak for fun & learning!
Inspired by classic neural network ideas and powered by Python's awesome data science tools.