CIFAR10

The CIFAR10 dataset is a popular dataset containing coloured images of 10 different object classes in different states, along with their labels. This project solves CIFAR10 with a shallow CNN and a ResNet, trying out different architectures, and analyses accuracy and loss over the epochs under controlled variations in the learning rate.

Frameworks used: PyTorch, NumPy, Pandas, Matplotlib, Torchvision.

Preprocessing:

The data preprocessing, done via Torchvision transforms, involves normalizing the images, followed by randomly cropping, flipping and reflecting them in order to introduce variation into the dataset and prevent overfitting to the training data.
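A minimal sketch of how this augmentation pipeline could look with Torchvision transforms; the normalization statistics, crop padding and padding mode below are illustrative assumptions, not necessarily the exact values used in the notebook:

```python
import torchvision.transforms as T
from torchvision.datasets import CIFAR10

# Commonly quoted CIFAR10 channel statistics (assumed here, not taken from the notebook).
mean = (0.4914, 0.4822, 0.4465)
std = (0.2470, 0.2435, 0.2616)

# Augmentation is applied only to the training split.
train_tfms = T.Compose([
    T.RandomCrop(32, padding=4, padding_mode='reflect'),  # random crop over a reflected border
    T.RandomHorizontalFlip(),                             # random left-right flip
    T.ToTensor(),
    T.Normalize(mean, std),
])
test_tfms = T.Compose([T.ToTensor(), T.Normalize(mean, std)])

train_ds = CIFAR10(root='data', train=True, download=True, transform=train_tfms)
test_ds = CIFAR10(root='data', train=False, download=True, transform=test_tfms)
```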


Based on a preliminary shallow CNN architecture, we get an accuracy of roughly 65-67%, which leaves plenty of room for improvement.
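The exact layers of the preliminary network are not listed here, but a shallow CNN in this accuracy range typically looks something like the following sketch (the layer sizes are illustrative assumptions):

```python
import torch.nn as nn

# A small two-block CNN for 32x32 RGB inputs; sizes are illustrative, not the
# notebook's exact architecture.
class ShallowCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16x16 -> 8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 256), nn.ReLU(),
            nn.Linear(256, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))
```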


Simply switching to a ResNet architecture elevates the training accuracy to 86% and the validation accuracy to 84%, getting closer to the end goal of 90%.
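The ResNet variant is not specified here; one common way to apply a ResNet to CIFAR10 is to take torchvision's resnet18 (trained from scratch) and adapt its stem for 32x32 inputs, as in this sketch:

```python
import torch.nn as nn
from torchvision.models import resnet18

def cifar_resnet(num_classes=10):
    # resnet18 without pretrained weights, with a 10-way classifier head.
    model = resnet18(num_classes=num_classes)
    # The ImageNet stem (7x7 stride-2 conv followed by a maxpool) downsamples
    # 32x32 images too aggressively; replace it with a 3x3 stride-1 conv and
    # drop the maxpool.
    model.conv1 = nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1, bias=False)
    model.maxpool = nn.Identity()
    return model
```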


Using a triangular cyclic learning rate with a decrement factor of 0.5 increases the accuracy to 91%; however, it is observed that the period of increasing learning rate results in a periodically increasing loss and decreasing accuracy.
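A per-cycle decrement factor of 0.5 corresponds to the built-in 'triangular2' mode of torch.optim.lr_scheduler.CyclicLR, which halves the peak learning rate every cycle. The sketch below only visualises the schedule shape; the LR bounds and cycle length are assumptions, and in actual training scheduler.step() is called once per batch:

```python
import matplotlib.pyplot as plt
import torch
from torch.optim.lr_scheduler import CyclicLR

# Dummy parameter and optimizer, used only to trace the schedule.
opt = torch.optim.SGD([torch.zeros(1, requires_grad=True)], lr=1e-3, momentum=0.9)

# 'triangular2' halves the peak LR every cycle, i.e. a decrement factor of 0.5.
sched = CyclicLR(opt, base_lr=1e-4, max_lr=1e-2, step_size_up=400, mode='triangular2')

lrs = []
for _ in range(2400):   # one scheduler step per training batch
    opt.step()
    sched.step()
    lrs.append(sched.get_last_lr()[0])

plt.plot(lrs)
plt.xlabel('batch')
plt.ylabel('learning rate')
plt.show()
```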


The same learning rate schedule with a decrement factor of 0.25 results in only a marginally better accuracy, but it smooths out the loss curve to a decent extent and lowers the fluctuations in the accuracy.
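CyclicLR has no built-in 0.25 mode, but the decrement factor can be expressed through its scale_fn argument; a sketch, reusing the illustrative bounds from above:

```python
import torch
from torch.optim.lr_scheduler import CyclicLR

opt = torch.optim.SGD([torch.zeros(1, requires_grad=True)], lr=1e-3, momentum=0.9)

# Scale the peak of cycle number x by 0.25 ** (x - 1), i.e. a 0.25 decrement per
# cycle (the built-in 'triangular2' mode uses 0.5 ** (x - 1)).
sched = CyclicLR(opt, base_lr=1e-4, max_lr=1e-2, step_size_up=400,
                 scale_fn=lambda x: 0.25 ** (x - 1), scale_mode='cycle')
```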


Using a cyclic learning rate with a decrement factor of 0.5 and a triangular shape where the learning rate is increased immediately rather than gradually gives results similar to the case with a gradual increase. However, it causes a sharp increase in the loss rather than a gradual one, which cuts down on the number of epochs over which the loss increases.
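One way to approximate this sharp-increase shape with CyclicLR is to make the upward phase a single step, so the learning rate jumps to its peak and then decays linearly over the rest of the cycle; the notebook may implement this differently, so treat this as a sketch:

```python
import torch
from torch.optim.lr_scheduler import CyclicLR

opt = torch.optim.SGD([torch.zeros(1, requires_grad=True)], lr=1e-3, momentum=0.9)

# Jump to max_lr in one step, then decay back to base_lr over the rest of the
# cycle; 'triangular2' still halves the peak each cycle (decrement factor 0.5).
# For the 0.25 variant, swap in the scale_fn from the previous sketch.
sched = CyclicLR(opt, base_lr=1e-4, max_lr=1e-2,
                 step_size_up=1, step_size_down=799, mode='triangular2')
```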


The same triangular learning rate with a sharp increase and a decrement factor of 0.25 gives similar results, with marginally lower accuracies; the loss and accuracy curves are smoother in this case.

IMPROVEMENTS:

  1. Trying out decrement factors other than 0.5 and 0.25.
  2. Using another type of learning rate variation.