Deep Learning Nanodegree

My Projects from the Deep Learning Nanodegree

About

This repository documents my education in deep learning. It includes the projects and tutorials I completed successfully as part of Udacity's Deep Learning Nanodegree program. I built convolutional networks for image recognition, recurrent neural networks for sequence generation, and generative adversarial networks for image generation, and learned to deploy models that are accessible from a website.

Contents

Exercises

  • Introduction to Neural Networks: Implemented gradient descent and applied it to predict patterns in student admissions data (see the gradient descent sketch after this list).
  • Sentiment Analysis with NumPy: Followed Andrew Trask's lesson to build a sentiment analysis model that predicts whether some text is positive or negative.
  • Convolutional Neural Networks: Visualized the output of the layers that make up a CNN. Learned to define and train a CNN for classifying MNIST data, a handwritten-digit database that is well known in machine learning and deep learning. Also defined and trained a CNN for classifying images in the CIFAR-10 dataset (a minimal CNN sketch follows this list).
  • Transfer Learning: Used pre-trained networks such as VGGnet to classify images of flowers without training an end-to-end network from scratch (see the transfer learning sketch after this list).
  • Autoencoders: Built models for image compression and de-noising, using feedforward and convolutional networks in PyTorch.
  • Style Transfer: Extracted style and content features from images using a pre-trained network. Implemented style transfer according to the paper Image Style Transfer Using Convolutional Neural Networks by Gatys et al. Defined appropriate losses for iteratively creating a target, style-transferred image of my own design.
  • Intro to Recurrent Networks (Time series & Character-level RNN): Implemented recurrent neural networks, which use information about the sequence of data, such as the sequence of characters in text, for time-series prediction and character-level text generation.
  • Embeddings (Word2Vec): Implemented the Word2Vec model to find semantic representations of words for use in natural language processing.
  • Sentiment Analysis RNN: Implemented a recurrent neural network that can predict whether the text of a movie review is positive or negative.
  • Generative Adversarial Network on MNIST: Trained a simple generative adversarial network on the MNIST dataset (a minimal GAN training loop sketch follows this list).
  • Deep Convolutional GAN (DCGAN): Implemented a DCGAN to generate new images based on the Street View House Numbers (SVHN) dataset.
  • CycleGAN: Implemented a CycleGAN that is designed to learn from unpaired and unlabeled data; used trained generators to transform images from summer to winter and vice versa.
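
The sketches below illustrate a few of the techniques from these exercises. They are minimal examples written for this README rather than code copied from the notebooks; the names, hyperparameters, and data in them are placeholders. First, a single-unit gradient descent update in NumPy, of the kind used in the student-admissions exercise:

```python
import numpy as np

def sigmoid(x):
    """Logistic activation used as the output non-linearity."""
    return 1 / (1 + np.exp(-x))

def gradient_descent_step(features, targets, weights, bias, learnrate=0.1):
    """One batch update of the weights and bias of a single sigmoid unit.

    features: (n_records, n_features) array
    targets:  (n_records,) array of 0/1 labels
    """
    output = sigmoid(features @ weights + bias)
    error = targets - output
    # For a sigmoid output with log-loss, the gradient w.r.t. the weights
    # reduces to error * input, averaged over the batch.
    weights += learnrate * features.T @ error / len(features)
    bias += learnrate * error.mean()
    return weights, bias

# Tiny synthetic example; the actual exercise uses student admissions data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
w, b = np.zeros(3), 0.0
for _ in range(1000):
    w, b = gradient_descent_step(X, y, w, b)
```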
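
Next, a small PyTorch CNN of the general shape used for MNIST classification; the notebook's actual architecture and hyperparameters differ:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleCNN(nn.Module):
    """Small convolutional classifier for 28x28 grayscale digit images."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
        self.pool = nn.MaxPool2d(2, 2)
        self.fc = nn.Linear(32 * 7 * 7, 10)  # 10 digit classes

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))   # -> 16 x 14 x 14
        x = self.pool(F.relu(self.conv2(x)))   # -> 32 x 7 x 7
        x = x.view(x.size(0), -1)              # flatten
        return self.fc(x)

model = SimpleCNN()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
```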
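
A sketch of the transfer learning setup: load a VGG16 pre-trained on ImageNet, freeze its convolutional features, and replace the last classifier layer. The number of flower classes below is an assumption, not the notebook's value:

```python
import torch
import torch.nn as nn
from torchvision import models

# Pre-trained VGG16; its convolutional features are kept fixed.
vgg = models.vgg16(pretrained=True)
for param in vgg.features.parameters():
    param.requires_grad = False

# Swap the final fully connected layer for one sized to the flower classes.
num_classes = 5  # placeholder; depends on the dataset split used
vgg.classifier[6] = nn.Linear(vgg.classifier[6].in_features, num_classes)

# Only the new layer is trained.
optimizer = torch.optim.Adam(vgg.classifier[6].parameters(), lr=1e-3)
```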
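
Finally, a minimal GAN training step on flattened, Tanh-scaled images, showing the alternating discriminator and generator updates; the tiny fully connected networks here stand in for the notebook's models:

```python
import torch
import torch.nn as nn

latent_dim = 100
G = nn.Sequential(nn.Linear(latent_dim, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1))

criterion = nn.BCEWithLogitsLoss()
d_opt = torch.optim.Adam(D.parameters(), lr=2e-4)
g_opt = torch.optim.Adam(G.parameters(), lr=2e-4)

def train_step(real_images):
    """One adversarial update on a batch of (batch, 784) images in [-1, 1]."""
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # Discriminator: push real images toward 1 and generated images toward 0.
    d_opt.zero_grad()
    fake_images = G(torch.randn(batch, latent_dim))
    d_loss = (criterion(D(real_images), real_labels) +
              criterion(D(fake_images.detach()), fake_labels))
    d_loss.backward()
    d_opt.step()

    # Generator: try to make the discriminator label its samples as real.
    g_opt.zero_grad()
    g_loss = criterion(D(fake_images), real_labels)
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()
```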

Projects Completed

  • Predicting Bike-Sharing Patterns: Implemented a neural network in NumPy to predict bike rentals (see the sketch after this list).
  • Dog Breed Classifier: Built a convolutional neural network with PyTorch to classify any image (even an image of a face) as a specific dog breed.
  • TV Script Generation: Trained a recurrent neural network to generate scripts in the style of dialogue from Seinfeld.
  • Face Generation: Used a DCGAN on the CelebA dataset to generate images of new and realistic human faces.
  • Deploying a Model (with AWS SageMaker): Deployed pre-trained models using AWS SageMaker. Constructed a recurrent neural network to determine the sentiment of a movie review using the IMDB dataset. Deployed the model and built a simple web app that interacts with the deployed model.
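
As a rough illustration of the bike-sharing project's approach, here is a minimal NumPy network with one sigmoid hidden layer and a linear output for regression; the class name, initialization, and hyperparameters are placeholders rather than the project's tuned values:

```python
import numpy as np

class TwoLayerRegressor:
    """One sigmoid hidden layer feeding a linear output unit."""

    def __init__(self, n_inputs, n_hidden, lr=0.1):
        rng = np.random.default_rng(0)
        self.w_ih = rng.normal(0.0, n_inputs ** -0.5, (n_inputs, n_hidden))
        self.w_ho = rng.normal(0.0, n_hidden ** -0.5, (n_hidden, 1))
        self.lr = lr

    @staticmethod
    def sigmoid(x):
        return 1 / (1 + np.exp(-x))

    def train_batch(self, X, y):
        hidden = self.sigmoid(X @ self.w_ih)        # (n, n_hidden)
        output = hidden @ self.w_ho                 # (n, 1), linear output
        error = y.reshape(-1, 1) - output           # regression error
        # Backpropagate through the linear output, then the sigmoid layer.
        hidden_error = (error @ self.w_ho.T) * hidden * (1 - hidden)
        self.w_ho += self.lr * hidden.T @ error / len(X)
        self.w_ih += self.lr * X.T @ hidden_error / len(X)

    def predict(self, X):
        return self.sigmoid(X @ self.w_ih) @ self.w_ho
```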

Installation

Please install the dependencies from the requirements file before running the code.

Certificate

certificate