
PadhAI: Deep Learning Course

  • I am still working on this notebook.

This repository contains theoretical summaries of the concepts and practical implementations of the topics covered in the Deep Learning course from PadhAI - One Fourth Labs, taught by IIT Madras faculty.

Course Instructors:

Completion Certificate

Overview of Lab Session Modules with Python:

M1: Python Basics I - Data Types, Lists, Tuples, Sets, Dictionaries, Packages, File Handling, Classes, NumPy, Plotting.
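
A small taste of the NumPy and plotting topics from this module; a minimal sketch with illustrative values, not the course's own notebook code:

```python
import numpy as np
import matplotlib.pyplot as plt

# Basic NumPy: build an array and compute simple statistics
x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x)
print("mean:", y.mean(), "max:", y.max())

# Basic plotting with matplotlib
plt.plot(x, y, label="sin(x)")
plt.xlabel("x")
plt.ylabel("sin(x)")
plt.legend()
plt.show()
```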

M2: Expert Systems - Introduction, Why do we care about Binary Classification?, How do humans make decisions?, Limitations.

M3: Say Hi to Machine Learning - 6 Jars of Machine Learning: Data, Task, Model, Loss, Learning Algorithm, Evaluation.

M4: Vectors and Matrices - Introduction to Vectors, Dot product of vectors, Unit Vectors, Projection of one vector onto another, Angle between two vectors, Why do we care about vectors?, Introduction to Matrices, Multiplying a vector by a matrix, Multiplying a matrix by another matrix, An alternate way of multiplying two matrices, Why do we care about matrices?
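
A quick sketch of these vector and matrix operations in NumPy; the vectors and matrix below are invented for illustration:

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([1.0, 2.0])

# Dot product, and the angle between the vectors: a . b = |a||b| cos(theta)
dot = np.dot(a, b)
theta = np.arccos(dot / (np.linalg.norm(a) * np.linalg.norm(b)))

# Projection of a onto b: (a . b / |b|^2) * b
proj = (dot / np.dot(b, b)) * b
print(dot, np.degrees(theta), proj)

# Multiplying a vector by a matrix, and a matrix by another matrix
M = np.array([[1.0, 2.0], [3.0, 4.0]])
print(M @ a)
print(M @ M)
```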

M5: Python Basics II - Linear Algebra, Pandas, Python Debugger, Plotting Vectors, Vector Addition and Subtraction, Vector Dot Product.

M6: McCulloch-Pitts (MP) Neuron - Introduction, MP Neuron Model, MP Neuron Data Task, MP Neuron Loss, MP Neuron Learning, MP Neuron Evaluation, MP Neuron Geometry Basics, MP Neuron Geometric Interpretation.
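
A minimal sketch of the MP neuron on binarised inputs, with learning as a brute-force search over the threshold b and accuracy as the evaluation metric; the toy data here is invented:

```python
import numpy as np

def mp_neuron(x, b):
    """MP neuron: fires (returns 1) when the sum of binary inputs reaches the threshold b."""
    return int(np.sum(x) >= b)

# Toy binary features and labels (invented)
X = np.array([[1, 0, 1], [0, 0, 1], [1, 1, 1], [0, 1, 0]])
Y = np.array([1, 0, 1, 0])

# Learning: try every possible threshold and keep the most accurate one
best_b, best_acc = None, -1.0
for b in range(X.shape[1] + 1):
    preds = np.array([mp_neuron(x, b) for x in X])
    acc = np.mean(preds == Y)   # evaluation: accuracy
    if acc > best_acc:
        best_b, best_acc = b, acc
print(best_b, best_acc)
```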

M7: Perceptron - Introduction, Perceptron Data Task, Perceptron Model, Geometric Interpretation, Perceptron Loss Function, Perceptron Learning - General Recipe, Algorithm, Why It Works?, Will It Always Work?, Perceptron Evaluation, Summary.
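
A minimal sketch of the perceptron learning algorithm: add x to w on a missed positive, subtract x on a missed negative. The AND-gate toy data below is linearly separable, so the loop converges:

```python
import numpy as np

def perceptron_train(X, Y, epochs=10):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(X, Y):
            y_hat = int(np.dot(w, x) + b >= 0)   # model: sign of w.x + b
            if y == 1 and y_hat == 0:
                w, b = w + x, b + 1              # missed positive: move w toward x
            elif y == 0 and y_hat == 1:
                w, b = w - x, b - 1              # missed negative: move w away from x
    return w, b

# Toy AND-gate data (invented for illustration)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([0, 0, 0, 1])
w, b = perceptron_train(X, Y)
print(w, b)
```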

M8: Python: MP Neuron, Perceptron, Test/Train - Perceptron: Toy Example, Loading Data, Train-Test Split, Binarisation, Inference And Search, Inference, Class, Perceptron Class, Epochs, Checkpointing, Learning Rate, Weight Animation, Exercises.

M9: Contest 1.1 - Contests Intro, Creating a Kaggle account, Data preprocessing, Submitting Entries, Clarifications, Mobile phone like/dislike predictor.

M10, M11, M12: Sigmoid Neuron and Gradient Descent - Sigmoid Model (Part - I, II, III, IV), Sigmoid: Data and Tasks, Loss Function, Dealing with more than 2 parameters, Evaluation. Learning: Introduction to learning algorithm, Learning by guessing, Error surfaces for learning, Mathematical setup for the learning algorithm, Math free version of learning algorithm, Taylor Series, Deriving the gradient descent update rule, The complete learning algorithm, Computing partial derivatives, Writing the code. Mathematics behind the parameters update rule. Summary and Takeaways.
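
A minimal sketch of a single sigmoid neuron trained with gradient descent on squared error. The toy data and hyperparameters are illustrative, and the constant factor from the squared-error derivative is folded into the learning rate:

```python
import numpy as np

def sigmoid(w, b, x):
    return 1.0 / (1.0 + np.exp(-(w * x + b)))

# Partial derivatives of the squared error loss (f(x) - y)^2
def grad_w(w, b, x, y):
    fx = sigmoid(w, b, x)
    return (fx - y) * fx * (1 - fx) * x

def grad_b(w, b, x, y):
    fx = sigmoid(w, b, x)
    return (fx - y) * fx * (1 - fx)

# Gradient descent: repeatedly step against the gradient of the total loss
X, Y = np.array([0.5, 2.5]), np.array([0.2, 0.9])
w, b, eta = -2.0, -2.0, 1.0
for epoch in range(1000):
    dw = sum(grad_w(w, b, x, y) for x, y in zip(X, Y))
    db = sum(grad_b(w, b, x, y) for x, y in zip(X, Y))
    w, b = w - eta * dw, b - eta * db   # update rule: theta <- theta - eta * grad
print(w, b)
```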

M13: Python: Sigmoid, Gradient Descent - Plotting Sigmoid 2D & 3D, Plotting Loss, Contour Plot, Class, Toy Data Fit and Plot. Loading Data, Standardisation, Test/Train Split, Fitting Data, Loss Plot, Progress Bar, Exercises.
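
A minimal sketch of the standardisation and train/test split steps, here using scikit-learn (not necessarily the course's exact approach) on invented data:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(5, 10, size=(100, 3))          # invented feature matrix
Y = (X.sum(axis=1) > 15).astype(int)          # invented binary labels

# Train/test split; stratify keeps the class balance in both splits
X_train, X_test, Y_train, Y_test = train_test_split(
    X, Y, test_size=0.25, stratify=Y, random_state=0)

# Standardisation: fit on the training split only, then apply to both
scaler = StandardScaler().fit(X_train)
X_train_std = scaler.transform(X_train)
X_test_std = scaler.transform(X_test)
print(X_train_std.mean(axis=0).round(2))      # ~0 for each feature
```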

M14: Basics: Probability Theory - Introduction, Random Variable: Intuition, Formal Definition, Continuous and Discrete, Probability Distribution, True and Predicted Distribution, Certain Events, Why Do We Care About Distributions?

M15: Information Theory - Expectation, Information Content, Entropy, Relation To Number Of Bits, KL-Divergence and Cross Entropy.
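
A small numeric sketch of these quantities, including the identity H(p, q) = H(p) + KL(p || q) that ties them together; the two distributions below are invented:

```python
import numpy as np

p = np.array([0.5, 0.25, 0.25])   # "true" distribution (illustrative)
q = np.array([0.7, 0.2, 0.1])     # "predicted" distribution (illustrative)

entropy = -np.sum(p * np.log2(p))        # H(p): expected information content, in bits
cross_entropy = -np.sum(p * np.log2(q))  # H(p, q)
kl = np.sum(p * np.log2(p / q))          # KL(p || q)

# Cross entropy = entropy + KL divergence
assert np.isclose(cross_entropy, entropy + kl)
print(entropy, cross_entropy, kl)
```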

M16: Sigmoid Neuron and Cross Entropy - Using Cross Entropy With Sigmoid Neuron, Learning Algorithm for Cross Entropy loss function, Computing partial derivatives with cross entropy loss, Code for Cross Entropy Loss function.
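
A minimal sketch of the cross entropy loss with a sigmoid neuron and its simplified gradient; the toy numbers are illustrative:

```python
import numpy as np

def sigmoid(w, b, x):
    return 1.0 / (1.0 + np.exp(-(w * x + b)))

def cross_entropy_loss(w, b, x, y):
    fx = sigmoid(w, b, x)
    return -(y * np.log(fx) + (1 - y) * np.log(1 - fx))

# With cross entropy the f(x)(1 - f(x)) factor cancels in the chain rule,
# leaving dL/dw = (f(x) - y) * x and dL/db = f(x) - y.
def grad_w(w, b, x, y):
    return (sigmoid(w, b, x) - y) * x

def grad_b(w, b, x, y):
    return sigmoid(w, b, x) - y

# One gradient descent step on a single (x, y) pair (toy numbers)
w, b, eta = 0.1, 0.1, 0.5
x, y = 1.5, 1.0
dw, db = grad_w(w, b, x, y), grad_b(w, b, x, y)
w, b = w - eta * dw, b - eta * db
print(w, b, cross_entropy_loss(w, b, x, y))
```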