T81-558: Applications of Deep Neural Networks

Washington University in St. Louis

Instructor: Jeff Heaton

The content of this course changes as technology evolves. To keep up to date with changes, follow me on GitHub.

  • Section 1. Spring 2023, Monday, 2:30 PM, Location: Eads / 216
  • Section 2. Spring 2023, Online

Course Description

Deep learning is a group of exciting new technologies for neural networks. Through a combination of advanced training techniques and neural network architectural components, it is now possible to create neural networks that can handle tabular data, images, text, and audio as both input and output. Deep learning allows a neural network to learn hierarchies of information in a way that is similar to the function of the human brain. This course introduces the student to classic neural network structures, Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRUs), Generative Adversarial Networks (GANs), and reinforcement learning. Application of these architectures to computer vision, time series, security, natural language processing (NLP), and data generation will be covered. High-Performance Computing (HPC) aspects will demonstrate how deep learning can be leveraged both on graphics processing units (GPUs) and on grids. The focus is primarily on the application of deep learning to problems, with some introduction to the mathematical foundations. Students will use the Python programming language to implement deep learning using Google TensorFlow and Keras. It is not necessary to know Python prior to this course; however, familiarity with at least one programming language is assumed. This course will be delivered in a hybrid format that includes both classroom and online instruction.
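As a taste of the style of code used throughout the course, the following is a minimal sketch of a Keras model in Python. It is illustrative only and not part of the official course materials; the synthetic data, layer sizes, and training settings are arbitrary placeholder choices.

import numpy as np
from tensorflow import keras

# Synthetic tabular data: 100 rows, 4 features, binary labels (placeholders only).
x = np.random.rand(100, 4)
y = np.random.randint(0, 2, size=(100,))

# A small feed-forward network defined with the Keras Sequential API.
model = keras.Sequential([
    keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    keras.layers.Dense(1, activation="sigmoid"),
])

# Compile and train for a few epochs, then display the network structure.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=5, verbose=0)
model.summary()

The course notebooks develop this basic pattern into the more specialized architectures listed in the syllabus below.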

Textbook

The complete text for this course is here on GitHub. This same material is also available in book format. The course textbook is “Applications of Deep Neural Networks with Keras”, ISBN 9798416344269.

If you would like to cite the material from this course/book, please use the following BibTeX citation:

@misc{heaton2020applications,
    title={Applications of Deep Neural Networks},
    author={Jeff Heaton},
    year={2020},
    eprint={2009.05673},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

Objectives

  1. Explain how neural networks (deep and otherwise) compare to other machine learning models.
  2. Determine when a deep neural network would be a good choice for a particular problem.
  3. Demonstrate your understanding of the material through a final project uploaded to GitHub.

Syllabus

This syllabus presents the expected class schedule, due dates, and reading assignments. Download the current syllabus.

Module 1: Python Preliminaries (meet on 01/23/2023)
  • Part 1.1: Course Overview
  • Part 1.2: Introduction to Python
  • Part 1.3: Python Lists, Dictionaries, Sets & JSON
  • Part 1.4: File Handling
  • Part 1.5: Functions, Lambdas, and Map/Reduce
  • We will meet on campus this week! (first meeting)

Module 2: Python for Machine Learning (week of 01/30/2023)
  • Part 2.1: Introduction to Pandas for Deep Learning
  • Part 2.2: Encoding Categorical Values in Pandas
  • Part 2.3: Grouping, Sorting, and Shuffling
  • Part 2.4: Using Apply and Map in Pandas
  • Part 2.5: Feature Engineering in Pandas
  • Module 1 Program due: 01/31/2023
  • Icebreaker due: 01/31/2023

Module 3: TensorFlow and Keras for Neural Networks (week of 02/06/2023)
  • Part 3.1: Deep Learning and Neural Network Introduction
  • Part 3.2: Introduction to TensorFlow & Keras
  • Part 3.3: Saving and Loading a Keras Neural Network
  • Part 3.4: Early Stopping in Keras to Prevent Overfitting
  • Part 3.5: Extracting Keras Weights and Manual Neural Network Calculation
  • Module 2 Program due: 02/07/2023

Module 4: Training for Tabular Data (week of 02/13/2023)
  • Part 4.1: Encoding a Feature Vector for Keras Deep Learning
  • Part 4.2: Keras Multiclass Classification for Deep Neural Networks with ROC and AUC
  • Part 4.3: Keras Regression for Deep Neural Networks with RMSE
  • Part 4.4: Backpropagation, Nesterov Momentum, and ADAM Training
  • Part 4.5: Neural Network RMSE and Log Loss Error Calculation from Scratch
  • Module 3 Program due: 02/14/2023

Module 5: Regularization and Dropout (meet on 02/20/2023)
  • Part 5.1: Introduction to Regularization: Ridge and Lasso
  • Part 5.2: Using K-Fold Cross Validation with Keras
  • Part 5.3: Using L1 and L2 Regularization with Keras to Decrease Overfitting
  • Part 5.4: Drop Out for Keras to Decrease Overfitting
  • Part 5.5: Bootstrapping and Benchmarking Hyperparameters
  • Module 4 Program due: 02/21/2023
  • We will meet on campus this week! (second meeting)

Module 6: CNN for Vision (week of 02/27/2023)
  • Part 6.1: Image Processing in Python
  • Part 6.2: Using Convolutional Networks with Keras
  • Part 6.3: Using Pretrained Neural Networks
  • Part 6.4: Looking at Keras Generators and Image Augmentation
  • Part 6.5: Recognizing Multiple Images with YOLOv5
  • Module 5 Program due: 02/28/2023

Module 7: Generative Adversarial Networks (GANs) (week of 03/06/2023)
  • Part 7.1: Introduction to GANs for Image and Data Generation
  • Part 7.2: Train StyleGAN3 with your Own Images
  • Part 7.3: Exploring the StyleGAN Latent Vector
  • Part 7.4: GANs to Enhance Old Photographs with DeOldify
  • Part 7.5: GANs for Tabular Synthetic Data Generation
  • Module 6 Assignment due: 03/07/2023

Module 8: Kaggle (week of 03/20/2023)
  • Part 8.1: Introduction to Kaggle
  • Part 8.2: Building Ensembles with Scikit-Learn and Keras
  • Part 8.3: How Should You Architect Your Keras Neural Network: Hyperparameters
  • Part 8.4: Bayesian Hyperparameter Optimization for Keras
  • Part 8.5: Current Semester's Kaggle
  • Module 7 Assignment due: 03/21/2023

Module 9: Transfer Learning (meet on 03/27/2023)
  • Part 9.1: Introduction to Keras Transfer Learning
  • Part 9.2: Keras Transfer Learning for Computer Vision
  • Part 9.3: Transfer Learning for NLP with Keras
  • Part 9.4: Transfer Learning for Facial Feature Recognition
  • Part 9.5: Transfer Learning for Style Transfer
  • We will meet on campus this week! (third meeting)
  • Module 8 Assignment due: 03/28/2023

Module 10: Time Series in Keras (week of 04/03/2023)
  • Part 10.1: Time Series Data Encoding for Deep Learning, Keras
  • Part 10.2: Programming LSTM with Keras and TensorFlow
  • Part 10.3: Text Generation with Keras
  • Part 10.4: Introduction to Transformers
  • Part 10.5: Transformers for Timeseries
  • Module 9 Assignment due: 04/04/2023

Module 11: Natural Language Processing (week of 04/10/2023)
  • Part 11.1: Hugging Face Introduction
  • Part 11.2: Hugging Face Tokenizers
  • Part 11.3: Hugging Face Datasets
  • Part 11.4: Training a Model in Hugging Face
  • Part 11.5: What are Embedding Layers in Keras
  • Module 10 Assignment due: 04/11/2023

Module 12: Reinforcement Learning (week of 04/17/2023)
  • Kaggle Assignment due: 04/18/2023 (approx 4-6PM, due to Kaggle GMT timezone)
  • Part 12.1: Introduction to the OpenAI Gym
  • Part 12.2: Introduction to Q-Learning for Keras
  • Part 12.3: Keras Q-Learning in the OpenAI Gym
  • Part 12.4: Atari Games with Keras Neural Networks
  • Part 12.5: Application of Reinforcement Learning

Module 13: Deployment and Monitoring (meet on 04/24/2023)
  • Part 13.1: Flask and Deep Learning Web Services
  • Part 13.2: Interrupting and Continuing Training
  • Part 13.3: Using a Keras Deep Neural Network with a Web Application
  • Part 13.4: When to Retrain Your Neural Network
  • Part 13.5: Tensor Processing Units (TPUs)
  • Final Project due: 05/08/2023
  • We will meet on campus this week! (fourth meeting)

Datasets