udacity-aind

Udacity Artificial Intelligence Nanodegree - May 2017

Artificial Intelligence Engineer Projects:

Term 1: Foundations of AI

  • Constraint Satisfaction: build an AI to solve Sudoku using constraint propagation and search (see the elimination sketch after this list)
  • Deterministic AI: build an AI agent that plays the board game Isolation against humans and other AIs (a minimax sketch also follows this list)
  • Search: build a Pac-Man AI that finds the most efficient path through its world
  • Simulated Annealing: explore large state spaces using simulated annealing, a randomized optimization technique inspired by annealing in metallurgy
  • Logic and Planning: find the most efficient plan to move a set of cargo items from their origins to their respective destinations
  • Probabilistic AI: use Probabilistic Inference to calculate the probability of certain events occurring
  • Hidden Markov Models: use HMMs to translate sign language into English-language characters
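
A minimal sketch of the constraint-propagation step from the Sudoku project, assuming the usual encoding of the grid as a dict from box names ('A1'..'I9') to candidate-digit strings; the names `boxes`, `peers`, and `eliminate` here are illustrative, not necessarily the project's exact API:

```python
# Sketch of the "eliminate" strategy for Sudoku constraint propagation.
rows, cols = 'ABCDEFGHI', '123456789'

def cross(a, b):
    return [s + t for s in a for t in b]

boxes = cross(rows, cols)
unitlist = ([cross(r, cols) for r in rows] +            # rows
            [cross(rows, c) for c in cols] +            # columns
            [cross(rs, cs) for rs in ('ABC', 'DEF', 'GHI')
                           for cs in ('123', '456', '789')])  # 3x3 squares
units = {b: [u for u in unitlist if b in u] for b in boxes}
peers = {b: set(sum(units[b], [])) - {b} for b in boxes}

def eliminate(values):
    """Remove each solved box's digit from the candidates of its peers."""
    for box, digits in values.items():
        if len(digits) == 1:               # box already solved
            for peer in peers[box]:
                values[peer] = values[peer].replace(digits, '')
    return values
```

For the Isolation agent, the core search is adversarial; a plain depth-limited minimax might look like the following, where the `game` interface (`get_legal_moves`, `forecast_move`) and `score_fn` are hypothetical names for this sketch:

```python
def minimax(game, depth, maximizing, score_fn):
    """Depth-limited minimax: returns the best achievable heuristic score.
    game.get_legal_moves() and game.forecast_move(m) are assumed
    methods (hypothetical interface, for illustration only)."""
    moves = game.get_legal_moves()
    if depth == 0 or not moves:
        return score_fn(game)
    child_scores = (minimax(game.forecast_move(m), depth - 1,
                            not maximizing, score_fn) for m in moves)
    return max(child_scores) if maximizing else min(child_scores)
```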

Term 2: Deep Learning and Applications

  • Sentiment Analysis with Numpy: Andrew Trask leads you through building a sentiment analysis model, predicting whether a piece of text is positive or negative (a toy numpy version appears after this list).
  • Intro to TensorFlow: Start building neural networks with TensorFlow.
  • Weight Initialization: Explore how initializing network weights affects performance.
  • Autoencoders: Build models for image compression and denoising, using feed-forward and convolutional networks in TensorFlow.
  • Transfer Learning (ConvNet): In practice, most people don't train their own large networks on huge datasets; instead, they use pretrained networks such as VGGNet. Here you'll use VGGNet to classify images of flowers without training a full network on the images themselves.
  • Intro to Recurrent Networks (Character-wise RNN): Recurrent neural networks are able to use information about the sequence of data, such as the sequence of characters in text.
  • Embeddings (Word2Vec): Implement the Word2Vec model to find semantic representations of words for use in natural language processing.
  • Sentiment Analysis RNN: Implement a recurrent neural network that can predict if a text sample is positive or negative.
  • Tensorboard: Use TensorBoard to visualize the network graph, as well as how parameters change through training.
  • Reinforcement Learning (Q-Learning): Implement a deep Q-learning network to play a simple game from OpenAI Gym (a tabular Q-learning sketch follows this list).
  • Sequence to sequence: Implement a sequence-to-sequence recurrent network.
  • Batch normalization: Learn how to improve training rates and network stability with batch normalization.
  • Generative Adversarial Network on MNIST: Train a simple generative adversarial network on the MNIST dataset.
  • Deep Convolutional GAN (DCGAN): Implement a DCGAN to generate new images based on the Street View House Numbers (SVHN) dataset.
  • Intro to TFLearn: A couple introductions to a high-level library for building neural networks.
  • Image classification: Build a convolutional neural network with TensorFlow to classify CIFAR-10 images.
  • Text Generation: Train a recurrent neural network on scripts from The Simpsons (copyright Fox) to generate new scripts.
  • Machine Translation: Train a sequence-to-sequence network for English-to-French translation (on a simple dataset).
  • Face Generation: Use a DCGAN on the CelebA dataset to generate images of novel and realistic human faces.
  • CNN: Dog Breed Image Classifier
  • RNN: Apple Stock & Sherlock Holmes Text Generation
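
As mentioned in the sentiment analysis item above, the numpy project builds a classifier from scratch; here is a toy bag-of-words logistic regression in the same spirit (the vocabulary and data are invented for illustration):

```python
import numpy as np

vocab = ['good', 'great', 'bad', 'awful']      # toy vocabulary (made up)

def featurize(text):
    words = text.lower().split()
    return np.array([words.count(w) for w in vocab], dtype=float)

X = np.array([featurize(t) for t in
              ['good great good', 'bad awful', 'great', 'awful bad bad']])
y = np.array([1.0, 0.0, 1.0, 0.0])             # 1 = positive, 0 = negative

w, b = np.zeros(len(vocab)), 0.0
for _ in range(500):                            # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))      # sigmoid predictions
    grad = p - y                                # per-sample log-loss gradient
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

print(1.0 / (1.0 + np.exp(-(featurize('good great') @ w + b))))  # ~1.0
```

And the tabular update at the heart of Q-learning, referenced from the reinforcement learning item; the deep version in the project replaces the table `Q` with a neural network:

```python
import numpy as np

def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    """One tabular Q-learning step:
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
    return Q

# Toy usage: 5 states, 2 actions, one observed transition (invented numbers).
Q = np.zeros((5, 2))
Q = q_update(Q, s=0, a=1, r=1.0, s_next=3)
```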

Term 3: AI Concentrations

CV (Computer Vision)

NLP (Natural Language Processing)

VUI (Voice User Interfaces)

Deeper Readings