Explore the cutting edge of data science: deep learning. Begin by understanding the basic computational unit of artificial neural networks: the perceptron. Study its neurological inspiration, its mathematical basis in linear regression, and the graphical interpretation of its weights and threshold to build an intuition for its power. Build a perceptron from scratch and train it using the perceptron learning algorithm. Explore the limitations of perceptrons and how non-linear activation functions enhance their power. Combine many perceptrons to construct feed-forward neural networks, and program a training algorithm using error back-propagation and gradient descent. Compare and contrast several neural network architectures for solving problems across different domains. Use cutting-edge software libraries and tools, including Keras and TensorFlow, to construct large-scale networks with relatively little code. Train networks on large datasets to solve hard problems like image classification, face recognition, content generation, and style transfer. Finally, apply these deep learning techniques to an original project and dataset.
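The description above mentions building a perceptron from scratch and training it with the perceptron learning algorithm. A minimal sketch of that idea in plain NumPy might look like the following (the function names and the AND-gate training data are illustrative, not from the course materials):

```python
import numpy as np

def predict(weights, bias, x):
    """Step activation: fire (1) if the weighted sum crosses the threshold."""
    return 1 if np.dot(weights, x) + bias > 0 else 0

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Perceptron learning rule: nudge weights toward misclassified points."""
    weights = np.zeros(X.shape[1])
    bias = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            error = target - predict(weights, bias, xi)
            weights += lr * error * xi   # update only when the prediction is wrong
            bias += lr * error
    return weights, bias

# Learn the linearly separable AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print([predict(w, b, xi) for xi in X])  # → [0, 0, 0, 1]
```

Because AND is linearly separable, the learning rule converges; XOR, famously, is not, which motivates the multi-layer networks covered later in the course.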
Deep learning has shaped our world. It is the science of getting computers to learn and act like humans do, and to improve their learning over time in an autonomous fashion, by feeding them data and information in the form of observations and real-world interactions. Its applications include natural-language processing (NLP), image classification and segmentation, voice recognition, and deep reinforcement learning.
By the end of the course, students will be able to:
- Describe neural networks and deep learning models
- Use deep learning models for prediction or classification problems
- Compare and contrast MLP, CNN and LSTM neural networks and identify when to use each
- Shape data to use appropriate deep learning models
- Practice tuning deep learning hyperparameters
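One objective above is shaping data for the appropriate model. As a hedged illustration (the array sizes are made up), the same batch of 28×28 grayscale images must be reshaped differently for an MLP, a CNN, and an RNN/LSTM:

```python
import numpy as np

images = np.random.rand(64, 28, 28)          # a batch of 64 raw 28x28 images

mlp_input = images.reshape(64, 28 * 28)      # MLP: flatten each image to a 784-vector
cnn_input = images.reshape(64, 28, 28, 1)    # CNN: add an explicit channel axis
rnn_input = images                           # LSTM: each row is one of 28 timesteps with 28 features

print(mlp_input.shape, cnn_input.shape, rnn_input.shape)
# → (64, 784) (64, 28, 28, 1) (64, 28, 28)
```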
NOTE: Due to the shorter summer sessions, for some class sessions you will see multiple topics covered. This is to ensure that we cover the same material that we normally would in non-summer terms.
Course Dates: Monday, January 21 – Wednesday, March 6, 2019 (7 weeks)
Class Times: Monday and Wednesday, 3:30–5:20pm (13 class sessions; see schedule below)
| Class | Date | Topics |
|---|---|---|
| - | Monday, January 21 | MLK Jr. Day |
| 1 | Wednesday, January 23 | Array and Matrix Manipulation |
| 2 | Monday, January 28 | What Is a Neural Network? |
| 3 | Wednesday, January 30 | Introduction to Keras |
| 4 | Monday, February 4 | Deep Learning Glossary |
| 5 | Wednesday, February 6 | Convolutional Neural Networks |
| 6 | Monday, February 11 | Recurrent Neural Networks |
| 7 | Wednesday, February 13 | Keras for Large Datasets |
| - | Monday, February 18 | Presidents' Day (Observed) |
| 8 | Tuesday, February 19 | Deep Learning Model Evaluation |
| 9 | Wednesday, February 20 | Introduction to TensorFlow |
| 10 | Monday, February 25 | Hyperparameter Optimization |
| 11 | Wednesday, February 27 | Autoencoders |
| 12 | Monday, March 4 | Final Class (presentations, etc.) |
| 13 | Wednesday, March 6 | Final Exams/Presentations |
- Build linear and logistic regression models with Keras
- Apply an MLP to a churn dataset using Keras
- Apply an MLP to the MNIST dataset
- Build and train a combined CNN + MLP deep learning model with Keras on the MNIST dataset
- Projects should be linked to a project page that has a description and requirements.
- We'll explore Keras on the CIFAR dataset
- https://www.makeschool.com/academy/track/standalone/keras-for-image-classification-pfw/keras-and-neural-networks
- You will choose your own dataset to clean and investigate, then perform prediction, classification, or clustering on it
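The first assignment above builds logistic regression with Keras. As a frame of reference, here is the computation that a one-unit sigmoid Keras layer learns, sketched in plain NumPy with gradient descent on synthetic data (all names and data here are illustrative, not the assignment's starter code):

```python
import numpy as np

# Synthetic, linearly separable binary-classification data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(500):
    z = X @ w + b
    p = 1.0 / (1.0 + np.exp(-z))          # sigmoid activation
    grad_w = X.T @ (p - y) / len(y)       # gradient of binary cross-entropy
    grad_b = np.mean(p - y)
    w -= lr * grad_w                      # gradient-descent step
    b -= lr * grad_b

preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
print("accuracy:", np.mean(preds == y))   # should be close to 1.0
```

In Keras the same model would be a single `Dense` layer with a sigmoid activation, compiled with binary cross-entropy; the library then handles the gradient computation shown here by hand.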
To pass this course you must meet the following requirements:
- Pass all required tutorials and projects (see associated rubrics)
- Pass the final summative assessment with a score of at least 75%
- Actively participate in class and abide by the attendance policy
- Make up all classwork from all absences