This project implements a neural network for multiclass classification: recognizing handwritten digits (0-9). The model is built with TensorFlow, uses the ReLU activation in its hidden layers, and applies Softmax through the loss function for numerical stability.
- Packages: NumPy, Matplotlib, TensorFlow.
- Activation Functions: ReLU, Softmax.
- Model Architecture: Three-layer neural network with ReLU activation in hidden layers and linear activation in the output layer.
- Size: 5000 training examples.
- Input: 20x20 grayscale images unrolled into a 400-dimensional vector.
- Labels: 5000x1 vector indicating the digit (0-9) for each image.
- Architecture: Input layer (400 units), Hidden layers (25, 15 units with ReLU activation), Output layer (10 units with linear activation).
- Training: Softmax grouped with the loss function, SparseCategoricalCrossentropy loss, Adam optimizer, 40 epochs (see the model sketch after this list).
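A minimal sketch of this architecture in Keras is shown below. The layer names, the random seed, and the learning rate are illustrative choices and are not taken from the original code; the layer sizes, activations, loss, and optimizer follow the description above.

```python
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.losses import SparseCategoricalCrossentropy
from tensorflow.keras.optimizers import Adam

tf.random.set_seed(1234)  # illustrative: makes weight initialization reproducible

model = Sequential(
    [
        Input(shape=(400,)),                       # 20x20 image unrolled into 400 features
        Dense(25, activation="relu", name="L1"),   # first hidden layer
        Dense(15, activation="relu", name="L2"),   # second hidden layer
        Dense(10, activation="linear", name="L3"), # output layer: 10 logits, one per digit
    ],
    name="digit_recognizer",
)

# Softmax is folded into the loss via from_logits=True for numerical stability.
model.compile(
    loss=SparseCategoricalCrossentropy(from_logits=True),
    optimizer=Adam(learning_rate=0.001),
)
```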
- Import required packages.
- Load the dataset using `load_data()`.
- Build the model using the Keras Sequential API.
- Compile the model with the specified loss function and optimizer.
- Train the model using `model.fit(X, y, epochs=40)`.
- Make predictions using `model.predict(image)`.
- Evaluate accuracy and visualize results (see the usage sketch below).
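The end-to-end workflow might look like the following sketch. It assumes `load_data()` returns the 5000x400 image matrix `X` and the 5000x1 label vector `y` described above, and that `model` is the compiled Sequential model from the earlier sketch; the accuracy calculation at the end is an illustrative addition, not part of the original write-up.

```python
import numpy as np
import tensorflow as tf

# Load the 5000-example dataset (X: 5000x400, y: 5000x1), per the description above.
X, y = load_data()

# Train for 40 epochs with the compiled loss and optimizer.
history = model.fit(X, y, epochs=40)

# Predict a single image: the model outputs logits, so apply softmax explicitly
# to get probabilities, then take the argmax as the predicted digit.
image = X[0].reshape(1, 400)  # keep the batch dimension
logits = model.predict(image)
probabilities = tf.nn.softmax(logits).numpy()
predicted_digit = np.argmax(probabilities)
print(f"predicted: {predicted_digit}, actual: {y[0, 0]}")

# Rough training-set accuracy (illustrative check).
all_logits = model.predict(X)
predictions = np.argmax(all_logits, axis=1)
accuracy = np.mean(predictions == y.flatten())
print(f"training-set accuracy: {accuracy:.3f}")
```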
This project demonstrates a working neural network for handwritten digit recognition and provides a reusable template for similar multiclass classification tasks.