MachineLearning_DeepLearning

I will share about Machine Learning and Deep Learning.

MIT License

MACHINE LEARNING & DEEP LEARNING

Image

Books and Resources
1. Deep Learning for Computer Vision with Python I
2. Pattern Recognition and Machine Learning - Bishop
3. Hugging Face
4. Transformers for Natural Language Processing
5. Natural Language Processing with Transformers
Projects and Notebooks
1. Image Classification
2. Linear Classifier
3. Gradient Descent
4. Stochastic Gradient Descent
5. Neural Networks
6. Convolutional Layers II
7. LeNet Architecture
8. VGGNet Architecture
9. Pretrained CNNs
10. Object Detection
11. DCGANs
12. Hugging Face: Transformer Models
13. Hugging Face: Pipeline Function
14. Hugging Face: Models & Tokenizers
15. Hugging Face: Pretrained Models
16. Fine-Tuning BERT Model
17. Machine Translation
18. Text Classification
19. Named Entity Recognition
20. Text Generation
21. Transformers & Production

Day1 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading about History of Deep Learning, Image Fundamentals, Pixels, Scaling and Aspect Ratios, Image Classification, Semantic Gap, Feature Extraction, Viewpoint Variation, Scale Variation, Deformation, Occlusions, Illumination, Background Clutter, Intra-class Variation, Supervised and Unsupervised Learning and few more topics related to the same. I have presented the notes about Image Classification, Semantic Gap, Feature Extraction, Supervised and Unsupervised Learning here in the snapshot. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day2 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have read about Image Classification, OpenCV, Animals Dataset, Raw Pixel Intensities, Convolutional Neural Networks, Dataset Loader and Preprocessing Modules, Aspect Ratio, Resizing and Scaling and few more topics related to the same from here. I have presented the implementation of Image Preprocessor and Dataset Loader here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day3 of MachineLearningDeepLearning

  • K-Nearest Neighbor: K-Nearest Neighbor Classifier doesn’t actually learn anything, but it directly relies on the distance between feature vectors. On my Journey of Machine Learning and Deep Learning, I have been reading the book "Deep Learning for Computer Vision with Python". Here, I have been reading about Image Classification, K-Nearest Neighbor Classifier, Partitioning Dataset, Preprocessing Images, Model Evaluation and Classification Report, Label Encoder, Hyperparameters and few more topics related to the same from here. I have presented the implementation of K-Nearest Neighbor Classifier and Model Evaluation here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
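
To make the k-NN workflow concrete, here is a minimal scikit-learn sketch under illustrative assumptions: the random vectors stand in for flattened 32x32x3 image features from the book's Animals dataset, and the class count is arbitrary.

```python
# A minimal k-NN sketch with scikit-learn: synthetic vectors stand in for
# the flattened raw pixel intensities used in the book's example.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 3072))   # 300 "images" of 32x32x3 raw pixels
y = rng.integers(0, 3, size=300)   # 3 hypothetical classes

# Partition the dataset: 75% for training, 25% for evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# k (n_neighbors) is the classifier's main hyperparameter.
model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)   # k-NN "training" just stores the data
print(classification_report(y_test, model.predict(X_test)))
```

Since k-NN stores the training set rather than learning weights, fitting is trivially fast while prediction pays the full cost of the distance computations.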

Image

Day4 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have read about Parameterized Learning, Cross Entropy Loss and Softmax Classifiers, Weights and Biases, Squared Hinge Loss, Scoring Function and Optimization, Linear Classification and few more topics related to the same from here. I have presented the notes about K-Nearest Neighbor and Parameterized Learning here in the snapshot. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day5 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I am reading about Optimization Methods and Regularization, Parameterized Learning and Optimization, Gradient Descent, Loss Landscape and Optimization Surface, Local and Global Minimum, Loss Function, Partial Derivative, Classification Accuracy and few more topics related to the same from here. I have also read about Data Pipeline, Meta-Data, Data Provenance and Lineage and Label Consistency from Introduction to Machine Learning in Production course of Coursera. I have presented the notes about Gradient Descent and Optimization here in the snapshot. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day6 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Optimization Methods and Regularization, Gradient Descent, Sigmoid Activation Function, Weights and Learning Rate, Iterative Algorithm, Classification Report, Stochastic Gradient Descent, Mini-batch SGD and few more topics related to the same from here. I have presented the implementation of Sigmoid Activation Function and Gradient Descent here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
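
As a companion to the notes above, here is a small NumPy sketch of vanilla gradient descent through a sigmoid with a squared loss; the toy data, learning rate, and epoch count are my own illustrative choices, not the book's exact example.

```python
# Vanilla gradient descent with a sigmoid activation on toy data.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
X = np.hstack([X, np.ones((100, 1))])       # bias trick: append a column of 1s
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # toy linearly separable labels

W = rng.normal(size=(3,))
alpha = 0.1                                  # learning rate
for epoch in range(100):
    preds = sigmoid(X @ W)                   # forward pass
    error = preds - y
    loss = np.sum(error ** 2)
    # chain rule through the sigmoid for the squared loss
    gradient = X.T @ (error * preds * (1 - preds))
    W -= alpha * gradient                    # step against the gradient
    if epoch % 20 == 0:
        print(f"epoch={epoch}, loss={loss:.4f}")
```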

Image

Day7 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Stochastic Gradient Descent, Mini-batch SGD, Sigmoid Activation Function, Weight Matrix and Losses, Momentum, Nesterov's Acceleration, Regularization, Overfitting and Underfitting and few more topics related to the same from here. I have presented the implementation of Sigmoid Activation Function and Stochastic Gradient Descent and notes about Regularization here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python

Image Image

Day8 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Regularization, Cross Entropy Loss Function, Updating Loss and Weight, L2 Regularization and Weight Decay, Elastic Net Regularization, Image Classification, SGD Classifier, Label Encoder and few more topics related to the same from here. I have presented the implementation of Preprocessing Dataset, Encoding Labels, SGD Classifier and Regularization here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
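
Here is one way to reproduce the entry's pipeline with scikit-learn; the synthetic features, class names, and hyperparameters are illustrative assumptions. Note that loss="log_loss" requires a recent scikit-learn (older versions spell it "log").

```python
# An SGD-trained linear classifier with L2 regularization (weight decay),
# plus label encoding, mirroring the entry above on synthetic data.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.preprocessing import LabelEncoder

rng = np.random.default_rng(7)
X = rng.normal(size=(200, 3072))                  # stand-in image features
labels = rng.choice(["cat", "dog", "panda"], size=200)

le = LabelEncoder()                               # encode string labels as ints
y = le.fit_transform(labels)

# loss="log_loss" gives a logistic-style classifier; penalty="l2" with
# alpha controls the strength of the regularization term.
model = SGDClassifier(loss="log_loss", penalty="l2", alpha=1e-4,
                      max_iter=100, random_state=7)
model.fit(X, y)
print(model.score(X, y))
```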

Image

Day9 of MachineLearningDeepLearning

  • Neural Networks: Neural networks are the building blocks of deep learning systems. A system is called a neural network if it contains a labeled, directed graph structure where each node in the graph performs some computation. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have read about Neural Networks, Human Neuron Anatomy, Artificial Models, Weights and Gradients and few more topics related to the same from here. I have also spent time using Fastai on Sequences of Images & Video. I have presented the implementation of Preparing Dataset, Decoding Videos and Extracting Images using Fastai & PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day10 of MachineLearningDeepLearning

  • Neural Networks: Neural networks are the building blocks of deep learning systems. A system is called a neural network if it contains a labeled, directed graph structure where each node in the graph performs some computation. Rectified Linear Unit: ReLU is zero for negative inputs but increases linearly for positive inputs. The ReLU function is not saturable and is also extremely computationally efficient. ReLU is the most popular activation function used in deep learning and has stronger biological motivations. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have read about Activation Functions: Sigmoid, Tanh, ReLU, Feedforward Neural Network Architecture, Neural Learning, The Perceptron Algorithm, AND, OR and XOR Datasets, Perceptron Training Procedure and Delta Rule and few more topics related to the same from here. I have presented the notes about Sigmoid Function, ReLU and Feedforward Networks here in the snapshot, and a small NumPy sketch of these activation functions follows below. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
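
For reference, the three activation functions discussed above can be written in a few lines of NumPy:

```python
# Sigmoid, tanh, and ReLU in plain NumPy.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))   # saturates toward 0/1 for large |x|

def tanh(x):
    return np.tanh(x)                 # zero-centered, still saturating

def relu(x):
    return np.maximum(0, x)           # zero for negatives, linear for positives

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
for fn in (sigmoid, tanh, relu):
    print(fn.__name__, fn(x))
```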

Image

Day11 of MachineLearningDeepLearning

  • Rectified Linear Unit: ReLU is zero for negative inputs but increases linearly for positive inputs. The ReLU function is not saturable and is also extremely computationally efficient. ReLU is the most popular activation function used in deep learning and has stronger biological motivations. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Neural Networks, Perceptron Algorithm, Learning Rate, Weight Matrix and Bias, Dot Product, Linear and Non-linear Datasets, Backpropagation and Multilayer Networks, Forward Pass and Backward Pass and few more topics related to the same from here. I have presented the implementation of Perceptron Algorithm here in the snapshot, and a compact sketch of the same idea follows below. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
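
A compact sketch of the Perceptron training procedure with the delta rule; the class layout and hyperparameters here are my own simplification rather than the book's exact code.

```python
# Perceptron with a step activation and mistake-driven updates.
import numpy as np

class Perceptron:
    def __init__(self, n_features, alpha=0.1):
        self.W = np.random.randn(n_features + 1) / np.sqrt(n_features)
        self.alpha = alpha                          # learning rate

    def step(self, x):
        return 1 if x > 0 else 0

    def fit(self, X, y, epochs=20):
        X = np.c_[X, np.ones(X.shape[0])]           # bias trick
        for _ in range(epochs):
            for xi, target in zip(X, y):
                pred = self.step(np.dot(xi, self.W))
                if pred != target:                  # update only on mistakes
                    self.W += -self.alpha * (pred - target) * xi

    def predict(self, X):
        X = np.c_[X, np.ones(X.shape[0])]
        return np.array([self.step(np.dot(xi, self.W)) for xi in X])

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])                          # the OR dataset
p = Perceptron(n_features=2)
p.fit(X, y)
print(p.predict(X))                                 # expected: [0 1 1 1]
```

On the OR dataset the weights converge because the data is linearly separable; on XOR they never do, which motivates the multilayer networks of the next entries.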

Image

Day12 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Nonlinear XOR Dataset, Learning Rate and Weight Initializations, Neural Networks Architecture, Squared Loss, Backpropagation, Sigmoid Activation Function, Gradient Descent and Weight Updates, Derivatives and Chain Rule, Dot Product and few more topics related to the same from here. I have presented the implementation of Neural Network and Backpropagation here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
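
The XOR experiment above can be condensed into a NumPy sketch of backpropagation with one hidden layer; the hidden width, learning rate, and epoch count are illustrative assumptions.

```python
# Backpropagation on the XOR dataset with one hidden layer.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)     # XOR targets

Xb = np.hstack([X, np.ones((4, 1))])                # bias trick on the input
W1 = rng.normal(size=(3, 4))                        # input -> 4 hidden units
W2 = rng.normal(size=(5, 1))                        # hidden (+bias) -> output
alpha = 0.5

for epoch in range(10000):
    h = sigmoid(Xb @ W1)                            # forward pass
    hb = np.hstack([h, np.ones((4, 1))])
    out = sigmoid(hb @ W2)
    d_out = (out - y) * out * (1 - out)             # backward pass (chain rule)
    d_h = (d_out @ W2[:-1].T) * h * (1 - h)
    W2 -= alpha * hb.T @ d_out                      # gradient descent updates
    W1 -= alpha * Xb.T @ d_h

print(np.round(out.ravel(), 2))                     # should approach [0, 1, 1, 0]
```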

Image Image

Day13 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been learning about Multi-layer Neural Networks, Backpropagation, Min-max Normalization, One Hot Encoding and Feature Vectors, Probabilities, Gradient Descent, Label Binarizer, Classification Report, SGD Optimizer and Cross Entropy Loss Function and few more topics related to the same from here. I have presented the implementation of Neural Network and Backpropagation using Keras here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
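
A minimal Keras rendition of the kind of fully connected network the entry describes; the layer sizes, MNIST-style input shape, and random stand-in data are assumptions for illustration.

```python
# A small fully connected network trained with SGD and cross-entropy in Keras.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.optimizers import SGD

X = np.random.rand(1000, 784).astype("float32")   # stand-in for flattened images
y = np.eye(10)[np.random.randint(0, 10, 1000)]    # one-hot encoded labels

model = Sequential([
    Dense(256, input_shape=(784,), activation="sigmoid"),
    Dense(128, activation="sigmoid"),
    Dense(10, activation="softmax"),              # class probabilities
])
model.compile(optimizer=SGD(learning_rate=0.01),
              loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=1)
```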

Image

Day14 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been learning about Convolutional Neural Networks, Convolutions versus Cross-correlation, Kernels, CNN Building Blocks, Layer Types, Depth, Stride, Zero-padding, Filters and Receptive Field and few more topics related to the same from here. I have presented the notes about Backpropagation and Convolutional Layers here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
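
One detail worth pinning down from the entry above is how kernel size, stride, and zero-padding determine a convolutional layer's output size:

```python
# Output spatial size of a convolutional layer, given input size i,
# kernel size k, padding p, and stride s: o = floor((i - k + 2p) / s) + 1.
def conv_output_size(i, k, p, s):
    return (i - k + 2 * p) // s + 1

# Example: a 32x32 input with a 3x3 kernel and zero-padding of 1 preserves
# the spatial dimensions at stride 1, and halves them at stride 2.
print(conv_output_size(32, 3, 1, 1))  # 32
print(conv_output_size(32, 3, 1, 2))  # 16
```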

Image

Day15 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Activation Layers, Pooling Layers, RELU, Fully Connected Layers, Batch Normalization Layer, Dropout Layer, Convolutional Neural Networks Patterns, Image To Array Preprocessor, Resizing and Shallow Network, Sequential Model and few more topics related to the same from here. I have presented the implementation of Image to Array Preprocessor and Shallow Network here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day16 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about LeNet Architecture, Convolutional Layers, RELU Activation Function, Max Pooling Layer, Fully Connected Dense Layer, Softmax Activation Function, Input Data Format and Channels, Flatten Layer, Label Binarizer and Encoding, SGD, Classification Report and few more topics related to the same from here. I have presented the implementation of LeNet Architecture, Training and Model Evaluation here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
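
For reference, a LeNet-style model in Keras following the CONV => RELU => POOL pattern described above; the filter counts and input shape follow common LeNet variants and are assumptions here.

```python
# A LeNet-style CNN: two CONV => RELU => POOL blocks, then FC and softmax.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

def build_lenet(width=28, height=28, depth=1, classes=10):
    model = Sequential([
        Conv2D(20, (5, 5), padding="same", activation="relu",
               input_shape=(height, width, depth)),
        MaxPooling2D(pool_size=(2, 2), strides=(2, 2)),
        Conv2D(50, (5, 5), padding="same", activation="relu"),
        MaxPooling2D(pool_size=(2, 2), strides=(2, 2)),
        Flatten(),
        Dense(500, activation="relu"),
        Dense(classes, activation="softmax"),   # class probabilities
    ])
    return model

build_lenet().summary()
```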

Image Image

Day17 of MachineLearningDeepLearning

  • Logistic Regression: When an unnecessary or excessive number of variables is used in a logistic regression model, peculiarities, i.e. special attributes of the underlying dataset, disproportionately affect the coefficients of the model, a phenomenon commonly known as overfitting. So, it is most important that the logistic regression model doesn't start training more variables than is justified for the given number of observations. On my Journey of Machine Learning and Deep Learning, I have been reading the book "Deep Learning for Computer Vision with Python". Here, I have been reading about VGG Networks, Batch Normalization, Max Pooling and Activations, Fully Connected Layers, Classification Report, Learning Rate and Decay Parameters and few more topics related to the same from here. I have presented the implementation of VGGNet Architecture here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python

Image

Day18 of MachineLearningDeepLearning

  • Logistic Regression: In case of logistic regression, the response variable is the log of odds of being classified in a group of binary or multi-class responses. This definition essentially demonstrates that odds can take the form of a vector. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Learning Rate Schedulers, Step-based Decay, Spotting Overfitting and Underfitting, Training Error and Generalization Error, Effects of Learning Rates, Loss and Accuracy Curves, VGG Network Architectures and few more topics related to the same from here. I have presented the notes of VGGNet Architecture here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
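
The step-based decay mentioned above can be sketched as a small scheduling function; the initial rate, drop factor, and drop interval are illustrative.

```python
# Step-based learning rate decay: the rate drops by `factor`
# every `drop_every` epochs.
import numpy as np

def step_decay(epoch, init_alpha=0.01, factor=0.25, drop_every=5):
    exponent = np.floor((1 + epoch) / drop_every)
    return float(init_alpha * (factor ** exponent))

for epoch in range(0, 21, 5):
    print(epoch, step_decay(epoch))

# In Keras this plugs into training via a callback, e.g.
# from tensorflow.keras.callbacks import LearningRateScheduler
# callbacks = [LearningRateScheduler(lambda epoch: step_decay(epoch))]
```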

Image

Day19 of MachineLearningDeepLearning

  • Convolutional Neural Networks: Convolutions are just a type of matrix multiplication with two constraints on the weight matrix: some elements are always zero and some elements are tied or forced to always have the same value. Batch Normalization adds some extra randomness to the training process. Larger batches have gradients that are more accurate since they are calculated from more data. But larger batch size means fewer batches per epoch which means fewer opportunities for the model to update weights. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Pretrained Convolutional Neural Networks for Classification, VGG Neural Networks, ResNet Architectures, Inception V3 and GoogLeNet, Xception, Processing Images and ImageNet and few more topics related to the same from here. I have presented the implementation of pretrained VGGNet and Xception modules here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
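
A hedged sketch of single-image classification with a pretrained VGG16 in Keras; "example.jpg" is a placeholder path.

```python
# Classifying one image with a pretrained VGG16 (ImageNet weights).
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input, decode_predictions
from tensorflow.keras.preprocessing.image import load_img, img_to_array

model = VGG16(weights="imagenet")           # downloads ImageNet weights

image = load_img("example.jpg", target_size=(224, 224))   # VGG16 input size
image = img_to_array(image)
image = np.expand_dims(image, axis=0)       # add the batch dimension
image = preprocess_input(image)             # mean subtraction, channel order

preds = model.predict(image)
for (_, label, prob) in decode_predictions(preds, top=3)[0]:
    print(f"{label}: {prob:.4f}")
```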

Image

Day20 of MachineLearningDeepLearning

  • Object Detection: Object Detection is a computer technology related to computer vision and image processing that deals with detecting instances of semantic objects of a certain class. On my Journey of Machine Learning and Deep Learning, I have been reading the book Deep Learning for Computer Vision with Python. Here, I have been reading about Object Detection with Pretrained Networks, COCO Dataset, Preprocessing Images and Video, Real Time Object Detection and few more topics related to the same from here. I have presented the implementation of Object Detection here in the snapshot, and a short detection sketch follows below. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning for Computer Vision with Python
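
As a rough stand-in for the entry's detection code, here is a pretrained Faster R-CNN (trained on COCO) from torchvision; the image path and confidence threshold are assumptions.

```python
# Object detection with a COCO-pretrained Faster R-CNN from torchvision.
import torch
from torchvision.io import read_image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import convert_image_dtype

model = fasterrcnn_resnet50_fpn(pretrained=True).eval()

image = convert_image_dtype(read_image("street.jpg"), torch.float)
with torch.no_grad():
    outputs = model([image])[0]     # boxes, labels, scores for one image

for box, label, score in zip(outputs["boxes"], outputs["labels"], outputs["scores"]):
    if score > 0.8:                 # keep only confident detections
        print(int(label), [round(float(v), 1) for v in box], float(score))
```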

Image Image

Day21 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have started reading the book Deep Learning - Ian Goodfellow. Here, I have read about Introduction to DL, Scalars, Vectors, Matrices and Tensors, Random Variables and Probability Distributions, Overflow and Underflow, Gradient Based Optimization, Learning Algorithms and many topics related to the same. I have presented the implementation The Generator and The Discriminator using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the Book mentioned below. Excited about the days ahead !!
  • Book:
    • Deep Learning - Ian Goodfellow

Image Image

Day22 of MachineLearningDeepLearning

  • Pattern Recognition: The field of pattern recognition is concerned with the automatic discovery of regularities in data through the use of computer algorithms and with the use of these regularities to take actions such as classifying the data into different categories. Generalization, the ability to correctly categorize new examples that differ from those used for training, is a central goal in pattern recognition. On my Journey of Machine Learning and Deep Learning, I have started reading the book Pattern Recognition and Machine Learning - Bishop. Here, I have read about Training and Learning Phase, Generalization, Pattern Recognition and Feature Extraction, Supervised Learning, Unsupervised Learning, Clustering and Density Estimation, Reinforcement Learning and few more topics related to the same. I have shared the notes about Pattern Recognition, Supervised and Unsupervised Learning and Reinforcement Learning here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Pattern Recognition and Machine Learning
    • DCGANs

Image

Day23 of MachineLearningDeepLearning

  • Reinforcement Learning: The technique of reinforcement learning is concerned with the problem of finding suitable actions to take in a given situation in order to maximize a reward. A general feature of reinforcement learning is the trade-off between exploration, in which the system tries out new kinds of actions to see how effective they are, and exploitation, in which the system makes use of actions that are known to yield a high reward. On my Journey of Machine Learning and Deep Learning, I have read about Deep Convolutional Generative Adversarial Networks, Image Segmentation, The Generator and The Discriminator, Weights Initialization, RELU Function, Convolutional Layers, Batch Normalization Layer, Cross Entropy Loss Function, Data Loader, Gradients and few more topics related to the same from here. I have presented the implementation of Training DCGANs here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Pattern Recognition and Machine Learning
    • DCGANs
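
The custom weight initialization mentioned above is conventionally written as a function applied to both networks; this follows the widely used DCGAN recipe (normal initialization with standard deviation 0.02), not necessarily the exact code in the snapshot.

```python
# DCGAN-style weight initialization: conv weights from N(0, 0.02),
# batch-norm weights from N(1, 0.02) with zero bias.
import torch.nn as nn

def weights_init(m):
    classname = m.__class__.__name__
    if classname.find("Conv") != -1:
        nn.init.normal_(m.weight.data, 0.0, 0.02)
    elif classname.find("BatchNorm") != -1:
        nn.init.normal_(m.weight.data, 1.0, 0.02)
        nn.init.constant_(m.bias.data, 0.0)

# usage: netG.apply(weights_init); netD.apply(weights_init)
```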

Image

Day24 of MachineLearningDeepLearning

  • Unsupervised Learning: The pattern recognition problems in which the training data consists of a set of input vectors x without any corresponding target values are called unsupervised learning problems. The goal in such unsupervised learning problems may be to discover groups of similar examples within the data, where it is called clustering, or to determine the distribution of data within the input space, known as density estimation, or to project the data from a high-dimensional space down to two or three dimensions for the purpose of visualization. On my Journey of Machine Learning and Deep Learning, I have started reading the book Pattern Recognition and Machine Learning - Bishop. Here, I have read about Probability Theory, The Rules of Probability, Bayes' Theorem, Probability Densities, Expectations and Covariances, Bayesian Probabilities, The Gaussian Distribution, Maximum Likelihood and few more topics related to the same from here. I have shared the notes about Probability Theory, Bayes' Theorem and The Rules of Probability here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Pattern Recognition and Machine Learning
    • DCGANs
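
In Bishop's notation, the rules of probability and Bayes' theorem referenced above are:

```latex
\text{sum rule:} \quad p(X) = \sum_{Y} p(X, Y)
\qquad
\text{product rule:} \quad p(X, Y) = p(Y \mid X)\, p(X)

\text{Bayes' theorem:} \quad
p(Y \mid X) = \frac{p(X \mid Y)\, p(Y)}{p(X)},
\qquad p(X) = \sum_{Y} p(X \mid Y)\, p(Y)
```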

Image

Day25 of MachineLearningDeepLearning

  • On my Journey of Machine Learning and Deep Learning, I have started reading the book "Pattern Recognition and Machine Learning - Bishop". Here, I am reading about Model Selection, The Curse of Dimensionality, Decision Theory, Minimizing the Misclassification Rate, Minimizing the Expected Loss, The Reject Option, Joint Probability Distribution and few more topics related to the same from here. I have shared the notes about Model Selection, Decision Theory and Loss Function here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Pattern Recognition and Machine Learning
    • DCGANs

Image

Day26 of MachineLearningDeepLearning

  • Natural Language Processing: NLP is a field of linguistics and machine learning focused on understanding everything related to human language. The aim of NLP tasks is not only to understand single words individually, but to be able to understand the context of those words. On my journey of Machine Learning and Deep Learning, I have started reading about Hugging Face. I have read about Natural Language Processing and Challenges, Sentiment Analysis, Zero Shot Classification, Text Generation, Mask Filling, Named Entity Recognition, Summarization, and Translation and few more topics related to the same from here. I have presented the implementation of Transformer Models here in the snapshot, and a small pipeline sketch follows below. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Hugging Face
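
A few of the tasks listed above, expressed with the pipeline function; each call downloads a default checkpoint on first use, so the outputs shown in comments are indicative only.

```python
# Sentiment analysis, zero-shot classification, and text generation
# via Hugging Face pipelines.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("I love reading about deep learning!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

zero_shot = pipeline("zero-shot-classification")
print(zero_shot("This course is about transformers",
                candidate_labels=["education", "politics", "business"]))

generator = pipeline("text-generation")
print(generator("In this course, we will teach you how to", max_length=30))
```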

Image

Day27 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have started reading about Hugging Face. I have read about Transformers, Transfer Learning, Attention Layers, Encoder Models, Decoder Models, Sequence2Sequence Models, Bias & Limitations, Pipeline Function, Tokenizer, Model Head and few more topics related to the same from here. I have presented the implementation of Pipeline Function here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Hugging Face

Image

Day28 of MachineLearningDeepLearning

  • Sequential Learning: If the data set is sufficiently large, it may be worthwhile to use sequential algorithms, also known as on-line algorithms, in which the data points are considered one at a time, and the model parameters are updated after each such presentation. Sequential learning is also appropriate for applications in which the data observations arrive in a continuous stream. It can be achieved by applying the technique of stochastic gradient descent, also known as sequential gradient descent. On my journey of Machine Learning and Deep Learning, I am reading the book Pattern Recognition and Machine Learning - Bishop. Here, I have read about Sequential Learning, Geometry of Least Squares, Maximum Likelihood, Linear Basis Function Models, Linear Regression, Binary Variables and Multinomial Variables and few more topics related to the same from here.
  • Book:
    • Pattern Recognition and Machine Learning

Image

Day29 of MachineLearningDeepLearning

  • Attention Masks: Attention masks are tensors with the same shapes as the input IDs tensors, filled with 0s and 1s: 1s indicate the corresponding tokens should be attended to, and 0s indicate the corresponding tokens should not be attended to i.e. they should be ignored by the attention layers of the model. On my journey of Machine Learning and Deep Learning, I have started reading from Hugging Face. Here, I have read about Creating a Transformer, Tokenizers, Word-based, Character-based and Subword Tokenization, Encoding, Decoding, Padding and Attention Masks and few more topics related to the same from here. I have presented the implementation of Tokenizers, Padding and Attention Masks here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Hugging Face
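
A short sketch of padding and attention masks with a tokenizer; the bert-base-uncased checkpoint and the example sentences are illustrative choices.

```python
# Padding and attention masks: shorter sequences are padded, and the mask
# marks real tokens with 1 and padding with 0.
from transformers import AutoTokenizer

checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

batch = tokenizer(
    ["A short sentence.",
     "A noticeably longer sentence that needs no padding."],
    padding=True, truncation=True, return_tensors="pt",
)
print(batch["input_ids"])
print(batch["attention_mask"])   # 1 = attend, 0 = ignore (padding)
```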

Image

Day30 of MachineLearningDeepLearning

  • Subword Tokenization: Subword tokenization relies on the principle that frequently used words should not be split into smaller subwords, but rare words should be decomposed into meaningful subwords. On my journey of Machine Learning and Deep Learning, I have started reading from Hugging Face. I have read about Configurable Tokenizer Methods, Attention Masks, Tokenization Pipeline, Vocabulary, Tensors and Arrays, Padding and Truncation and few more topics related to the same from here. I have presented the implementation of Tokenization Pipeline here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Hugging Face

Image

Day31 of MachineLearningDeepLearning

  • Dynamic Padding: The function that is responsible for putting together samples inside a batch is called a collate function. Dynamic Padding means the samples in the batch should all be padded to the maximum length inside the batch. On my journey of Machine Learning and Deep Learning, I have been reading from Hugging Face. I have read about Loading & Processing the Data, Tokenization, Dynamic Padding, Collate Function, Fine Tuning and Trainer API, Training Arguments, Transformer Model and few more topics related to the same from here. I have presented the implementation of Tokenization Pipeline & Training here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Hugging Face
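
Dynamic padding in a few lines, using DataCollatorWithPadding as the collate function; the checkpoint and sentences are again illustrative.

```python
# Dynamic padding: the collator pads each batch only to the longest
# sample inside that batch, rather than to a global maximum length.
from transformers import AutoTokenizer, DataCollatorWithPadding

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
data_collator = DataCollatorWithPadding(tokenizer=tokenizer)

samples = [tokenizer(s) for s in
           ["Short.", "A somewhat longer example sentence."]]
batch = data_collator(samples)
print({k: v.shape for k, v in batch.items()})   # padded to the batch maximum
```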

Image Image

Day32 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have been reading the book Transformers for Natural Language Processing. Here, I have read about Transformers, Encoder & Decoder, Positional Encoding, Multi-head Attention, BERT Architecture, Fine-tuning BERT, Optimizer & Hyperparameters, Masked Language Modeling, Next Sentence Prediction, Matthews Correlation Coefficient and few more topics related to the same from here. I have presented the implementation of Training BERT Model here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Transformers for Natural Language Processing

Image

Day33 of MachineLearningDeepLearning

  • Machine Translation: Machine translation is the process of reproducing human translation by machine transductions and outputs. The transduction process of the original Transformer architecture uses the encoder, the decoder stack, and all of the model's parameters to represent a reference sequence. On my journey of Machine Learning and Deep Learning, I have been reading the book Transformers for Natural Language Processing. Here, I have read about Pretraining RoBERTa Model, Machine Transduction & Transformers, Machine Translation, BLEU and Trax and few more topics related to the same from here. I have presented the implementation of Processing WMT Dataset here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Transformers for Natural Language Processing

Image

Day34 of MachineLearningDeepLearning

  • Encoder-Decoder Architectures: The numerical representation computed for a given token in an encoder-only transformer architecture depends on both the left (before the token) and the right (after the token) context, which is called bidirectional attention. The numerical representation computed for a given token in a decoder-only transformer architecture depends only on the left context, which is called autoregressive attention. On my journey of Machine Learning and Deep Learning, I have been reading the book Natural Language Processing with Transformers. Here, I have read about Encoder-Decoder Architectures, Attention Mechanisms, Transfer Learning, 🤗 Ecosystem, Text Classification, Class Distributions, Tokenization, Fine-Tuning Transformers and Feature Extraction and many more topics related to Transformers.
  • Book:
    • Natural Language Processing with Transformers

Image

Day35 of MachineLearningDeepLearning

  • Named Entity Recognition: NER is a common NLP task that identifies entities like people, organizations or locations in text. These entities can be used for various applications such as gaining insights from documents, augmenting the quality of search engines, or building a structured database from a corpus. On my journey of Machine Learning and Deep Learning, I have been reading the book Natural Language Processing with Transformers. Here, I have read about Multilingual Named Entity Recognition, Cross-Lingual Transfer, Text Generation, Greedy Search Decoding, Beam Search Decoding, Sampling Methods, Top-k and Nucleus Sampling, Fine-Tuning XLM-RoBERTa, Error Analysis and many more topics related to the same from here. I have presented the implementation of Text Generation here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Natural Language Processing with Transformers

Image

Day36 of MachineLearningDeepLearning

  • Knowledge Distillation: Knowledge distillation is a general-purpose method for training a smaller student model to mimic the behavior of a slower, larger, but better performing teacher model. The KL divergence expects the inputs in the form of log probabilities and the labels as normal probabilities. So, we have used log softmax to normalize the student's logits, while the teacher's logits are converted to probabilities with a standard softmax. On my journey of Machine Learning and Deep Learning, I have been reading the book Natural Language Processing with Transformers. Here, I have read about Performance Benchmarking, Knowledge Distillation for Fine-Tuning, Distillation Trainer, Text Summarization and Question Answering Pipelines and many more topics related to the same. I have presented the implementation of Distillation Training Arguments, Trainer and Computing metrics using Transformers here in the snapshot, and a sketch of the distillation loss follows below. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Natural Language Processing with Transformers
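
The loss the entry describes can be sketched as follows; the temperature and the toy logits are assumptions, and this is a simplification of the book's DistillationTrainer rather than its exact code.

```python
# Distillation loss: KL divergence between the student's log-probabilities
# and the teacher's probabilities, both softened by a temperature T.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KLDivLoss expects log-probabilities for the input and plain
    # probabilities for the target, hence log_softmax vs. softmax below.
    loss_fct = torch.nn.KLDivLoss(reduction="batchmean")
    loss = loss_fct(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
    )
    return loss * temperature ** 2   # rescale gradients to match hard-label loss

student = torch.randn(4, 10)   # toy logits: batch of 4, 10 classes
teacher = torch.randn(4, 10)
print(distillation_loss(student, teacher))
```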

Image

Day37 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have been reading the book Natural Language Processing with Transformers. Here, I have read about Dealing with Few Labels, Zero-Shot Classification, Multilabel Text Classification & Multilabel Binarizer, Training Slices, Naive Bayes Classifier, Natural Language Inference, Data Augmentation and Embeddings, Mean Pooling and many topics related to the same from here. I have presented the implementation of Training Naive Bayes Classifier and Mean Pooling using Transformers here in the snapshot. I hope you will gain some insights and you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead !!
  • Book:
    • Natural Language Processing with Transformers

Image

Day38 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have started taking the course Stanford CS224N: NLP with Deep Learning. Here, I am learning about Word Vectors, Conditional Probability Distribution, Distributional Semantics, Word2Vec Embedding Model, Softmax Function, Human Language and Word meaning, and many more topics related to the same. I have shared the notes about Word Vectors, Distributional Semantics, and Word2Vec Model here in the snapshot. I hope you will gain some insights and spend time learning the topics from the course mentioned below. I am excited about the days ahead.
  • Stanford CS224N: NLP with Deep Learning

Image

Day39 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have been taking the course Stanford CS224N: NLP with Deep Learning. Here, I am learning about Word Vectors, Word2Vec Model, Training Methods and Algorithms, Skip Grams and Continuous Bag of Words Models, Gradient Descent and Problems, Stochastic Gradient Descent, and many more topics related to the same. I have shared the notes about Word2Vec Model, and Gradient Descent here in the snapshot. I hope you will gain some insights and spend time learning the topics from the course mentioned below. I am excited about the days ahead.
  • Stanford CS224N: NLP with Deep Learning
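
The softmax-based skip-gram probability at the heart of Word2Vec, as covered in the lectures:

```latex
% Probability of an outside word o given a center word c, using
% "outside" vectors u and "center" vectors v over vocabulary V:
P(o \mid c) = \frac{\exp(u_o^\top v_c)}{\sum_{w \in V} \exp(u_w^\top v_c)}
```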

Image

Day40 of MachineLearningDeepLearning

  • On my journey of Machine Learning and Deep Learning, I have been taking the course UMass CS685: Advanced Natural Language Processing. Here, I am learning about Natural Language Processing, Supervised & Self-supervised Learning, Representation Learning, Sentiment Analysis, Language Modeling & its Importance, Chain Rule & Markov Assumption, Unigram & Bigram Models, Log Probabilities, Perplexity, and many more topics related to the same. I have shared the notes about Language Modeling & Perplexity here in the snapshot, and the perplexity formula is written out below. I hope you will gain some insights and spend time learning the topics from the course mentioned below. I am excited about the days ahead.
  • UMass CS685: Advanced Natural Language Processing
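
The perplexity measure mentioned above, written out:

```latex
% Perplexity of a language model over a held-out sequence w_1, ..., w_N:
% the inverse probability of the corpus, normalized by its length.
\mathrm{PP}(W) = P(w_1, \ldots, w_N)^{-\frac{1}{N}}
             = \left( \prod_{i=1}^{N} \frac{1}{P(w_i \mid w_1, \ldots, w_{i-1})} \right)^{\frac{1}{N}}
```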

Image

Day41 of MachineLearningDeepLearning

  • Batch size generally tells us how many training examples we use to estimate the derivative of the loss with respect to the parameters before taking a step. On my journey of Machine Learning and Deep Learning, I have been taking the course UMass CS685: Advanced Natural Language Processing. Here, I am learning about N-gram Models, Recurrent Neural Networks, Batch Size, Cross-entropy Loss Function, Gradient Descent & Backpropagation, Composition Functions, Forward Propagation, One-hot Vectors & Vocabulary, and many more topics related to the same. I have shared the notes about Language Modeling & Gradient Descent here in the snapshot. I hope you will gain some insights and spend time learning the topics from the course mentioned below. I am excited about the days ahead.
  • UMass CS685: Advanced Natural Language Processing

Image

Day42 of MachineLearningDeepLearning

  • Text to Speech: Text-to-speech (TTS), also known as speech synthesis, aims to synthesize intelligible and natural speech from a given text and has broad applications in human communication. Developing a TTS system requires knowledge of languages and human speech production. Transformer-based Acoustic Models: TransformerTTS leverages a Transformer-based encoder-attention-decoder architecture to generate mel-spectrograms from phonemes. TransformerTTS adopts the basic model structure of the Transformer and absorbs some designs from Tacotron 2, such as the decoder pre-net and post-net and stop token prediction. It achieves similar voice quality to Tacotron 2 but enjoys faster training time. I have shared the notes about Text-to-Speech & Text Analysis here in the snapshot. I hope you will gain some insights and spend time learning the topics mentioned below. I am excited about the days ahead.
  • A Survey on Neural Speech Synthesis

Image

Day43 of MachineLearningDeepLearning

  • The Module class contains the following methods: the __init__ method stores the learnable parameters; the training_step method accepts a data batch and returns the loss value; the configure_optimizers method returns the optimization method that is used to update the learnable parameters; and the validation_step method reports the evaluation measures. On my journey of Machine Learning and Deep Learning, I am reading the book "Dive into Deep Learning". I have read about Object Oriented Design Implementation & Modules, Linear Regression, Vectorization, Normal Distribution & Squared Loss, Linear Algebra & Calculus, Probability & Statistics, Data Manipulation, Data Preprocessing, and many more topics related to the same. I have shared the implementation of the Module here in the snapshot, and a minimal sketch of the same design follows below. I hope you will gain some insights and spend time learning the topics from the book mentioned below. I am excited about the days ahead.
  • Dive into Deep Learning
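
A minimal PyTorch sketch of that object-oriented design; the method names follow the book's API, but the placeholder model, loss, and optimizer settings are my own illustrative choices.

```python
# A Module that bundles a model with its training and evaluation logic.
import torch
from torch import nn

class Module(nn.Module):
    def __init__(self, lr=0.01):
        super().__init__()              # __init__ stores the learnable parameters
        self.lr = lr
        self.net = nn.Linear(2, 1)      # placeholder model, e.g. linear regression

    def loss(self, y_hat, y):
        return ((y_hat - y) ** 2 / 2).mean()    # squared loss

    def training_step(self, batch):
        X, y = batch                    # accepts a data batch, returns the loss
        return self.loss(self.net(X), y)

    def validation_step(self, batch):
        X, y = batch                    # reports the evaluation measure
        print("validation loss:", float(self.loss(self.net(X), y)))

    def configure_optimizers(self):
        # returns the optimization method used to update the parameters
        return torch.optim.SGD(self.parameters(), lr=self.lr)

# one manual training step, the way a Trainer class would drive it
model = Module()
optimizer = model.configure_optimizers()
X, y = torch.randn(8, 2), torch.randn(8, 1)
loss = model.training_step((X, y))
loss.backward()
optimizer.step()
optimizer.zero_grad()
```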

Image

Day44 of MachineLearningDeepLearning

  • DataLoaders: DataLoaders are a convenient way of abstracting out the process of loading and manipulating data so that the same machine-learning algorithm is capable of processing many different types and sources of data without the need for modification. On my journey of Machine Learning and Deep Learning, I am reading the book "Dive into Deep Learning". I have read about Data Module & DataLoader, Trainer Class & Training, Synthetic Regression Data, Object Oriented Design, Loss & Minibatches, and I have also read about Greedy Algorithms, Dynamic Programming, Backtracking, and many other topics related to the same. I have shared the implementation of the Data Module here in the snapshot. I hope you will gain some insights and spend time learning the topics from the book mentioned below. I am excited about the days ahead.
  • Dive into Deep Learning

Image

Day45 of MachineLearningDeepLearning

  • Attention & Transformers: The idea behind the Transformer model is the attention mechanism, an innovation that was originally envisioned as an enhancement for encoder-decoder RNNs applied to sequence-to-sequence models. The intuition behind attention is that rather than compressing the input, it might be better for the decoder to revisit the input sequence at every step. On my journey of Machine Learning and Deep Learning, I continued reading the book Dive into Deep Learning. I have read about Linear Regression, Training Errors & Generalization Errors, Normalization & Weight Decay, Attention Mechanisms & Transformers, Queries, Keys & Values, Attention Pooling & Nadaraya Watson Regression, and many other topics related to the same. I have shared the implementation of the Nadaraya Watson and Kernels here in the snapshot. I hope you will gain some insights and spend time learning the topics from the book mentioned below. I am excited about the days ahead.
  • Dive into Deep Learning

Image

Day46 of MachineLearningDeepLearning

  • Attention & Transformers: The idea behind the Transformer model is the attention mechanism, an innovation that was originally envisioned as an enhancement for encoder-decoder RNNs applied to sequence-to-sequence models. The intuition behind attention is that rather than compressing the input, it might be better for the decoder to revisit the input sequence at every step. On my journey of Machine Learning and Deep Learning, I continued reading the book Dive into Deep Learning. I have been reading Attention Mechanisms & Transformers, Attention Scoring Functions, Dot Product Attention, Masked Softmax Operation, Batch Matrix Multiplication, Scaled Dot Product Attention, Additive Attention, and many other topics related to the same. I have shared the implementation of the Scaled Dot Product Attention here in the snapshot. I hope you will gain some insights and spend time learning the topics from the book mentioned below. I am excited about the days ahead.
  • Dive into Deep Learning
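
Scaled dot-product attention from the entry above, in a few lines of PyTorch; the tensor shapes are illustrative, and masking is omitted for brevity.

```python
# Scaled dot-product attention: scores are dot products of queries and keys,
# scaled by sqrt(d), softmaxed, then used to weight the values.
import math
import torch

def scaled_dot_product_attention(Q, K, V):
    d = Q.shape[-1]
    scores = Q @ K.transpose(-2, -1) / math.sqrt(d)   # (batch, n_q, n_k)
    weights = torch.softmax(scores, dim=-1)           # attention pooling weights
    return weights @ V

Q = torch.randn(2, 1, 8)    # batch of 2, one query each, dimension 8
K = torch.randn(2, 10, 8)   # ten key-value pairs
V = torch.randn(2, 10, 4)
print(scaled_dot_product_attention(Q, K, V).shape)   # torch.Size([2, 1, 4])
```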

Image

Day47 of MachineLearningDeepLearning

  • GPT-based Transformers: GPT-based Transformers are pre-trained on a large corpus of text data, which gives them a general understanding of natural language and allows them to perform well on various downstream tasks with minimal fine-tuning. They predict the next token in a sequence based on the previous tokens, which enables them to generate coherent and fluent text. On my journey of Machine Learning and Deep Learning, I am following the materials of Andrej Karpathy on implementing GPT (ChatGPT) from scratch. I learned about character level tokenization, encoding and decoding characters, vocabulary tables, data loaders, the Bigram Language Model, training pipelines and architecture implementation, optimization, and many other topics related to the same. I have presented the implementation of the Bigram Language Model here in the snapshot, and it is sketched below. I hope you will gain some insights and spend time learning the topics from these materials. I am excited about the days ahead.
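
Here is the Bigram Language Model in the spirit of that lecture; the vocabulary size of 65 matches the character-level Shakespeare example, and the rest is a condensed sketch.

```python
# A bigram language model: each token's embedding directly reads off
# the logits for the next token.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLanguageModel(nn.Module):
    def __init__(self, vocab_size):
        super().__init__()
        # each token maps to a row of next-token logits
        self.token_embedding_table = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx, targets=None):
        logits = self.token_embedding_table(idx)   # (B, T, vocab_size)
        loss = None
        if targets is not None:
            B, T, C = logits.shape
            loss = F.cross_entropy(logits.view(B * T, C), targets.view(B * T))
        return logits, loss

    def generate(self, idx, max_new_tokens):
        for _ in range(max_new_tokens):
            logits, _ = self(idx)
            probs = F.softmax(logits[:, -1, :], dim=-1)   # last time step only
            idx_next = torch.multinomial(probs, num_samples=1)
            idx = torch.cat((idx, idx_next), dim=1)
        return idx

model = BigramLanguageModel(vocab_size=65)   # 65 = character vocab in the lecture
out = model.generate(torch.zeros((1, 1), dtype=torch.long), max_new_tokens=10)
print(out.shape)   # torch.Size([1, 11])
```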

Image

Day48 of MachineLearningDeepLearning

  • GPT-based Transformers: GPT-based Transformers are pre-trained on a large corpus of text data, which gives them a general understanding of natural language and allows them to perform well on various downstream tasks with minimal fine-tuning. They predict the next token in a sequence based on the previous tokens, which enables them to generate coherent and fluent text. On my journey of Machine Learning and Deep Learning, I am following the materials of Andrej Karpathy on implementing GPT (ChatGPT) from scratch. I continued working on GPT and learned about Multihead self-attention, Feedforward networks with nonlinearity, Token and positional embeddings, Layer normalization, Dropout, and many other topics related to the same. I have presented the implementation of the GPTLanguageModel, i.e. the pre-training stage of ChatGPT, here in the snapshot. I hope you will gain some insights and spend time learning the topics from these materials. I am excited about the days ahead.

Image

Day49 of MachineLearningDeepLearning

  • Stable Diffusion: Stable Diffusion is a deep learning text-to-image model which is primarily used to generate detailed images conditioned on text descriptions, though it can also be applied to other tasks such as inpainting, outpainting, and generating image-to-image translations guided by a text prompt. It is a latent diffusion model, which is a kind of deep generative neural network. On my journey of Machine Learning and Deep Learning, I am learning about GAN Models, Large Language Models, Stable Diffusion, and Whisper. In Stable Diffusion, I learned about the Stable Diffusion Pipeline, Classifier-Free Guidance, Negative Prompts, Image to Image Diffusion, and many other topics related to the same. I have presented the implementation of Image to Image Diffusion here in the snapshot, and a sketch of it follows below. I hope you will gain some insights and spend time learning these topics. I am excited about the days ahead.
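
A hedged sketch of image-to-image diffusion with the diffusers library; the model id, input image, prompts, and parameters are illustrative, and a CUDA device is assumed.

```python
# Image-to-image diffusion: an input image plus a text prompt guide generation.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

init_image = Image.open("sketch.png").convert("RGB").resize((768, 512))

# strength balances fidelity to the input image against the text prompt;
# guidance_scale implements classifier-free guidance.
result = pipe(
    prompt="a fantasy landscape, detailed, trending digital art",
    negative_prompt="blurry, low quality",
    image=init_image, strength=0.75, guidance_scale=7.5,
)
result.images[0].save("fantasy_landscape.png")
```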

Image