
#100DaysofMLCode Challenge


DAY 1: Today I've learned about Data Preprocessing.
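
Two staples of preprocessing are missing-value imputation and feature scaling; a minimal plain-Python sketch (the function name and data are my own illustration, not from the course):

```python
def preprocess(values):
    # Fill missing entries (None) with the mean of the observed values.
    observed = [v for v in values if v is not None]
    fill = sum(observed) / len(observed)
    filled = [fill if v is None else v for v in values]
    # Standardize: zero mean, unit (population) standard deviation.
    mu = sum(filled) / len(filled)
    std = (sum((v - mu) ** 2 for v in filled) / len(filled)) ** 0.5
    return [(v - mu) / std for v in filled]

scaled = preprocess([1.0, 2.0, None, 4.0])  # impute the None, then scale
```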

DAY 2: Today I've learned and implemented Simple Linear Regression. Code
LinkedIn Post
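
Simple linear regression has a closed-form solution; a from-scratch sketch (separate from the linked notebook, example data is mine):

```python
def fit_simple_linear_regression(xs, ys):
    # Ordinary least squares for y = slope * x + intercept.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Data generated from y = 2x + 1, so the fit should recover those values.
slope, intercept = fit_simple_linear_regression([0, 1, 2, 3], [1, 3, 5, 7])
```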

DAY 3: Started reading a book called "Naked Statistics: Stripping the Dread from the Data" by Charles Wheelan.
Today I've learned about

  • Multiple Linear Regression. Code
  • Backward Elimination
  • Forward Selection
  • Bidirectional Elimination
    LinkedIn Post
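
Backward elimination can be sketched as a greedy loop: fit the full model, drop the least useful feature, repeat. This sketch uses the increase in squared error as the drop criterion rather than the p-values used in the course; all names are illustrative:

```python
import numpy as np

def sse(X, y):
    # Sum of squared residuals of a least-squares fit on the given columns.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def backward_elimination(X, y, tol=1e-6):
    # Greedily drop the column whose removal increases the error least,
    # while that increase stays below tol (a stand-in for a p-value test).
    cols = list(range(X.shape[1]))
    while len(cols) > 1:
        base = sse(X[:, cols], y)
        inc, worst = min(
            (sse(X[:, [k for k in cols if k != c]], y) - base, c) for c in cols
        )
        if inc >= tol:
            break
        cols.remove(worst)
    return cols
```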

DAY 4: Today I've learned and implemented Polynomial Regression. Code
LinkedIn Post
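
Polynomial regression is just linear regression on polynomial features; NumPy's `polyfit` recovers the coefficients directly (the example data is mine):

```python
import numpy as np

# Noiseless samples from y = 2x^2 + 0.5x + 1.
x = np.linspace(-1, 1, 20)
y = 2 * x**2 + 0.5 * x + 1
coeffs = np.polyfit(x, y, deg=2)  # highest-degree coefficient first
```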

DAY 5: Today I've learned and implemented SVR model. Code
LinkedIn Post
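
The key idea behind SVR is the epsilon-insensitive loss: errors inside a tube of width epsilon cost nothing. A minimal sketch of that loss (not the kernelized solver itself):

```python
def epsilon_insensitive_loss(y_true, y_pred, eps=0.1):
    # Errors smaller than eps are ignored; larger ones cost linearly.
    return sum(max(0.0, abs(t - p) - eps) for t, p in zip(y_true, y_pred))

inside = epsilon_insensitive_loss([1.0, 2.0], [1.05, 1.95])  # within the tube
outside = epsilon_insensitive_loss([1.0, 2.0], [1.5, 2.0])   # 0.4 past the tube
```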

DAY 6: Today I've learned and implemented:

  • Decision Tree Regression. Code
  • Random Forest Regression. Code
  • R-Squared and Adjusted R-Squared
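
The R-Squared formulas are short enough to sketch directly (n is the sample count, p the number of predictors excluding the intercept):

```python
def r_squared(y, y_hat):
    # 1 - (residual sum of squares) / (total sum of squares).
    mean = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))
    ss_tot = sum((a - mean) ** 2 for a in y)
    return 1 - ss_res / ss_tot

def adjusted_r_squared(r2, n, p):
    # Penalizes adding predictors that don't improve the fit.
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)
```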

DAY 7: Today I've learned and implemented Logistic Regression. Code
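
A from-scratch logistic regression trained by batch gradient descent on the log-loss (a sketch, separate from the linked notebook; the toy data is mine):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, steps=2000):
    # X is assumed to include a column of ones for the intercept.
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)
        w -= lr * grad
    return w

X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = fit_logistic(X, y)
preds = (sigmoid(X @ w) > 0.5).astype(float)
```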

DAY 8: Today I've learned and implemented:

  • K-Nearest Neighbors (K-NN) Code
  • Support Vector Machine (SVM) Code
  • Kernel SVM Code
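
K-NN has no training step at all; a plain-Python sketch of the majority vote (names and data are illustrative):

```python
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    # Majority vote among the k nearest neighbors (squared Euclidean distance).
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), label)
        for x, label in zip(train_X, train_y)
    )
    return Counter(label for _, label in dists[:k]).most_common(1)[0][0]

points = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
labels = ["a", "a", "a", "b", "b", "b"]
```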

DAY 9: Today I've learned and implemented:

  • Naive Bayes Classification Code
  • Decision Tree Classification Code
  • Random Forest Classification Code
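
Decision tree classification (and each tree in a random forest) grows by choosing the split that minimizes impurity; a sketch of a single Gini-based split on one feature (helper names are mine):

```python
def gini(labels):
    # Gini impurity: 0 for a pure node, higher for mixed classes.
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(xs, ys):
    # Try each midpoint between sorted feature values; pick the threshold
    # minimizing the weighted Gini impurity of the two child nodes.
    pairs = sorted(zip(xs, ys))
    best = (float("inf"), None)
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [y for x, y in pairs if x <= thr]
        right = [y for x, y in pairs if x > thr]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(pairs)
        if score < best[0]:
            best = (score, thr)
    return best[1]
```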

DAY 10: Today I've learned about False Positives & False Negatives, Confusion Matrix, Accuracy Paradox, CAP Curve.
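
The accuracy paradox is easy to demonstrate with a confusion matrix: on imbalanced data, a model that always predicts the majority class scores high accuracy while catching nothing. A sketch with made-up counts:

```python
def confusion(y_true, y_pred):
    # Counts for the four cells of a binary confusion matrix.
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, tn, fp, fn

# 95 negatives, 5 positives; the "always predict 0" model looks great...
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100
tp, tn, fp, fn = confusion(y_true, y_pred)
accuracy = (tp + tn) / len(y_true)  # high accuracy, yet every positive is missed
```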

DAY 11: Today I've learned and implemented K-Means Clustering. Code
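
Lloyd's algorithm behind K-Means fits in a few NumPy lines; a sketch (no empty-cluster handling, toy data is mine):

```python
import numpy as np

def kmeans(X, k, steps=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(steps):
        # Assign each point to its nearest center, then move each center
        # to the mean of the points assigned to it.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
              [10.0, 10.0], [10.0, 11.0], [11.0, 10.0]])
labels, centers = kmeans(X, k=2)
```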

DAY 12: Today I've learned and implemented Hierarchical Clustering. Code

DAY 13-20: I've completed the Machine Learning A-Z course on Udemy. I've also completed reading the book "Naked Statistics" by Charles Wheelan.
LinkedIn Post

DAY 21-25: I've learned and practiced NumPy, Pandas, and TensorFlow. I've also done some basic ML projects, like image classification (Fashion MNIST) and text classification (IMDB reviews), based on some YouTube tutorials.

DAY 26: Resuming after a break. Revised the ML course by Andrew Ng on Coursera. Completed two weeks' content and assignments. Working on the optional assignments. Course Link

DAY 27, 28: Completed the optional assignments on the multivariate cost function, gradient descent, and the normal equation. Learned and implemented vectorization of the cost function and gradient descent for easier coding in Octave.
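
The same vectorization carries over from Octave to NumPy; a sketch comparing vectorized gradient descent with the normal equation on toy data (all names are mine):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.5, iters=2000):
    # Fully vectorized update: theta -= alpha/m * X^T (X theta - y).
    m = len(y)
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        theta -= (alpha / m) * (X.T @ (X @ theta - y))
    return theta

def normal_equation(X, y):
    # Closed form: theta = (X^T X)^(-1) X^T y (pinv for numerical stability).
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Data from y = 1 + 2x; both methods should agree on [1, 2].
x = np.linspace(0, 1, 20)
X = np.column_stack([np.ones_like(x), x])
y = 1 + 2 * x
```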

DAY 29: Revised concepts of deep learning. Understood them through an example of demand prediction for a product.

DAY 30: Revised concepts of Gradient Descent and Back Propagation.

DAY 30: Learned how to implement forward propagation in neural networks using NumPy.

DAY 31: Implemented forward propagation using TensorFlow and NumPy.
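
A single forward pass through one hidden layer in NumPy (a sketch with hand-picked weights so the result is checkable; ReLU hidden layer, linear output):

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    # Hidden layer: ReLU(x W1 + b1); output layer: linear.
    h = np.maximum(0.0, x @ W1 + b1)
    return h @ W2 + b2

x = np.array([1.0, 2.0])
W1 = np.array([[1.0, 0.0], [0.0, 1.0]])  # identity weights for a simple trace
b1 = np.array([0.0, -3.0])               # drives the second unit negative; ReLU clips it
W2 = np.array([[1.0], [1.0]])
b2 = np.array([0.5])
out = forward(x, W1, b1, W2, b2)         # hidden = [1, 0], output = 1 + 0.5
```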

DAY 32: Introduction to multi-class classification and the softmax function.
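
The softmax function itself is two lines; subtracting the maximum before exponentiating is the standard numerical-stability trick:

```python
import numpy as np

def softmax(z):
    # Shift by max(z) so exp never overflows; the output is unchanged
    # and forms a probability distribution over the classes.
    e = np.exp(z - np.max(z))
    return e / e.sum()

p = softmax(np.array([1.0, 2.0, 3.0]))
```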

DAY 33: Got a better and deeper understanding of how Back Propagation works and why it's very important in neural networks.

DAY 34: Started the GenAI with LLMs course on Coursera. Revised the workings of the transformer.

DAY 35: Learned about bias and variance in ML models.

DAY 36: Revised the concept of text generation with transformers.

DAY 37: Revised the concept of prompt engineering.

DAY 38: Intro to decision trees.

DAY 39: Intro to clustering.

DAY 40: Learned about K-means clustering.

DAY 41: Learned about optimising the K-means algorithm.

DAY 42: Experimented with prompt engineering.

DAY 43: Introduction to anomaly detection.

DAY 44: Revised the concept of the normal distribution.

DAY 45: Learned to build an anomaly detection algorithm.

DAY 46: Learned the difference between anomaly detection and supervised learning, and how to choose features.
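
These lectures boil down to a simple recipe: fit a Gaussian to normal data, then flag points whose density falls below a threshold epsilon. A 1-D sketch (epsilon and the data are illustrative):

```python
import math

def gaussian_pdf(x, mu, var):
    # Density of the normal distribution N(mu, var) at x.
    return math.exp(-((x - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

def flag_anomalies(train, points, eps=0.01):
    # Fit mean and variance on normal data; low-density points are anomalies.
    mu = sum(train) / len(train)
    var = sum((v - mu) ** 2 for v in train) / len(train)
    return [gaussian_pdf(p, mu, var) < eps for p in points]

flags = flag_anomalies([0.1, -0.1, 0.05, -0.05, 0.0], [0.0, 100.0])
```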

DAY 47: Introduction to Recommender Systems.

DAY 48: Learned about collaborative filtering and implementing recommender systems using TensorFlow.

DAY 49: Learned the differences between collaborative filtering and content-based filtering.

DAY 50: Learned the implementation of collaborative filtering using neural networks with TensorFlow.
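
At its core, collaborative filtering is low-rank matrix factorization: learn user and item vectors so their dot products match the observed ratings. A gradient-descent sketch in NumPy (not the course's TensorFlow implementation; all names and data are illustrative):

```python
import numpy as np

def factorize(R, observed, k=2, lr=0.02, steps=5000, seed=0):
    # Minimize squared error on observed entries of R ≈ U @ V.T.
    rng = np.random.default_rng(seed)
    U = rng.normal(scale=0.1, size=(R.shape[0], k))
    V = rng.normal(scale=0.1, size=(R.shape[1], k))
    for _ in range(steps):
        E = observed * (U @ V.T - R)  # error on observed entries only
        U, V = U - lr * (E @ V), V - lr * (E.T @ U)
    return U, V

R = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # a rank-1 toy ratings matrix
observed = np.ones_like(R)
observed[0, 2] = 0.0                 # pretend this rating is unknown
U, V = factorize(R, observed)
loss = float((observed * (U @ V.T - R) ** 2).sum())
```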

DAY 51: Learned the concept and implementation of PCA.
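
PCA in a few lines: center the data, eigendecompose the covariance matrix, and project onto the top eigenvectors. A sketch on toy data of my own:

```python
import numpy as np

def pca(X, n_components):
    # Center, take the covariance, keep the largest-eigenvalue directions.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:n_components]
    return Xc @ vecs[:, top]

# Points lying exactly on the line y = 2x: one component captures everything,
# so the second projected coordinate should be ~0 for every point.
X = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0], [-1.0, -2.0]])
Z = pca(X, 2)
```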

DAY 52: Reviewed reinforcement learning and learned about the state-action value function, with the example of a rover landing on the Moon.
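
The state-action value function satisfies the Bellman equation Q(s, a) = R(s) + γ·max over a' of Q(s', a'). A sketch of value iteration on a tiny deterministic MDP (my own two-state toy, not the lunar-rover example):

```python
# States 0 and 1; action 0 stays put, action 1 moves to the other state.
# Reward 1 for being in state 1, else 0. Discount gamma = 0.5.
def q_iteration(gamma=0.5, sweeps=100):
    next_state = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
    reward = {0: 0.0, 1: 1.0}
    Q = {sa: 0.0 for sa in next_state}
    for _ in range(sweeps):
        # Bellman backup: Q(s,a) = R(s) + gamma * max_a' Q(s', a').
        Q = {
            (s, a): reward[s] + gamma * max(Q[(s2, b)] for b in (0, 1))
            for (s, a), s2 in next_state.items()
        }
    return Q

Q = q_iteration()
```

The fixed point works out by hand: V(1) = 1 + 0.5·V(1) gives V(1) = 2, so staying in state 1 is worth Q(1, 0) = 2 and moving there from state 0 is worth Q(0, 1) = 1.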

DAY 53: Introduction to reinforcement learning with human feedback for LLMs.

DAY 54: Learned about variational graph autoencoders.