ML-Journey

Tracking my progress and projects while learning from Google's Machine Learning Crash Course. Follow along as I implement concepts, tackle exercises, and share insights gained from this foundational ML resource.


Progress

Prerequisites and Prework

Before starting the Google Machine Learning Crash Course, it's essential to complete the following prerequisites and prework tasks to ensure a solid foundation.

Prerequisites

Prework

  • Watch Introductory Videos: Completed introductory videos on machine learning.

  • Set Up Python Environment: Installed necessary software for machine learning.

    • Tools:
      • Python, Jupyter Notebook, NumPy, Pandas, Matplotlib, TensorFlow.
    • Completed: Set up the development environment and ran test scripts (see the quick import check after this list).
  • Review Linear Algebra: Brushed up on vectors, matrices, and linear transformations.
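
A quick sanity check I use for the environment above; it only confirms that the listed packages import cleanly and prints their versions.

```python
# Confirm the core tooling imports and report versions.
import sys

import matplotlib
import numpy as np
import pandas as pd
import tensorflow as tf

print("Python:", sys.version.split()[0])
for name, module in [("NumPy", np), ("Pandas", pd),
                     ("Matplotlib", matplotlib), ("TensorFlow", tf)]:
    print(f"{name}: {module.__version__}")
```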

Course Progress

Week 1: Introduction to Machine Learning

  • Watch "Introduction to Machine Learning" Video
  • Complete "ML Introduction Quiz"
    • Result: 100%
  • Practical Exercise: Implement a simple classification model.
    • Project: Built a simple spam classifier using Python.
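
The project notebook isn't reproduced here; the sketch below shows one way to build such a classifier with a scikit-learn bag-of-words pipeline, using a few toy messages as placeholders for the actual data.

```python
# Toy spam classifier: bag-of-words features + multinomial Naive Bayes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = [
    "Win a free prize now", "Lowest prices, click this link",
    "Are we still meeting for lunch?", "Please review the attached report",
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = ham

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)
print(model.predict(["Claim your free prize", "Lunch at noon?"]))
```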

Week 2: Framing

  • Read "Framing" Article
    • Link: Framing
    • Complete: Learned how to frame machine learning problems effectively.
  • Complete "Framing Quiz"
    • Result: 90%
  • Practical Exercise: Define the problem for a real-world dataset.
    • Project: Framed a problem for predicting house prices based on various features.

Week 3: Descending into Machine Learning

  • Watch "Descending into Machine Learning" Video
  • Complete "Gradient Descent Exercise"
    • Result: Successfully implemented gradient descent.
  • Practical Exercise: Apply gradient descent to a dataset.
    • Project: Used gradient descent to optimize a linear regression model for predicting sales.
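
A minimal sketch of the gradient-descent loop in plain NumPy, on synthetic data standing in for the sales dataset from the project.

```python
# Gradient descent for least-squares linear regression on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)            # e.g. advertising spend
y = 3.0 * X + 5.0 + rng.normal(0, 1, 100)   # e.g. sales with noise

w, b = 0.0, 0.0
learning_rate = 0.02
for step in range(2000):
    error = w * X + b - y
    # Gradients of the mean squared error with respect to w and b.
    grad_w = 2 * np.mean(error * X)
    grad_b = 2 * np.mean(error)
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"learned w = {w:.2f}, b = {b:.2f}")  # should approach 3 and 5
```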

Week 4: First Steps with TensorFlow

  • Read "First Steps with TensorFlow" Article
  • Complete "TensorFlow Basics Exercise"
    • Result: Built and trained a linear regression model.
  • Practical Exercise: Create a model to predict house prices using TensorFlow.
    • Project: Developed and trained a model with TensorFlow to predict house prices.
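
Roughly what the TensorFlow setup looks like; the feature columns and prices below are placeholders, not the dataset used in the exercise.

```python
# Linear regression in Keras: normalize features, fit a single Dense unit.
import numpy as np
import tensorflow as tf

# Placeholder data: [square_feet, bedrooms] -> price in thousands of dollars.
X = np.array([[1400, 3], [1600, 3], [1700, 4], [1875, 4], [2350, 5]], dtype="float32")
y = np.array([245, 312, 279, 308, 405], dtype="float32")

normalizer = tf.keras.layers.Normalization(axis=-1)
normalizer.adapt(X)  # learn per-feature mean and variance

model = tf.keras.Sequential([normalizer, tf.keras.layers.Dense(1)])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1.0), loss="mse")
model.fit(X, y, epochs=1000, verbose=0)

print(model.predict(np.array([[2000, 4]], dtype="float32")))  # predicted price (thousands)
```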

Week 5: Generalization

  • Read "Generalization" Article
    • Link: Generalization
    • Complete: Learned about overfitting, underfitting, and improving model generalization.
  • Complete "Overfitting and Underfitting Exercise"
    • Result: Applied techniques to avoid overfitting.
  • Practical Exercise: Implement regularization to improve model performance.
    • Project: Applied L2 regularization to a model to reduce overfitting.
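
How the L2 penalty was wired in, sketched with Keras; the 0.01 penalty strength and layer sizes are example values, not tuned settings.

```python
# L2 (ridge) regularization on the hidden-layer weights of a small Keras model.
import tensorflow as tf

l2 = tf.keras.regularizers.l2(0.01)  # example penalty strength

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),  # placeholder feature count
    tf.keras.layers.Dense(64, activation="relu", kernel_regularizer=l2),
    tf.keras.layers.Dense(64, activation="relu", kernel_regularizer=l2),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```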

Week 6: Training and Testing Sets

  • Read "Training and Testing Sets" Article
  • Complete "Train/Test Split Exercise"
    • Result: Successfully split data into training and testing sets.
  • Practical Exercise: Evaluate model performance using different data splits.
    • Project: Implemented train/test split and evaluated model accuracy on different datasets.
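
The split-and-evaluate pattern, shown on scikit-learn's built-in iris data as a stand-in for the datasets I actually used.

```python
# Hold out 20% of the data, train on the rest, and report test accuracy.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```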

Week 7: Validation

  • Read "Validation" Article
    • Link: Validation
    • Complete: Explored model validation techniques.
  • Complete "Cross-Validation Exercise"
    • Result: Implemented cross-validation to assess model performance.
  • Practical Exercise: Use cross-validation to tune hyperparameters.
    • Project: Conducted cross-validation on a dataset to find optimal hyperparameters for a classification model.
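
The cross-validated search pattern; the dataset and parameter grid below are illustrative, not the ones from my run.

```python
# 5-fold cross-validated grid search over random-forest hyperparameters.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)
param_grid = {"n_estimators": [50, 100, 200], "max_depth": [None, 5, 10]}

search = GridSearchCV(RandomForestClassifier(random_state=0),
                      param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print("best params:", search.best_params_)
print("best CV accuracy:", round(search.best_score_, 3))
```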

Week 8: Regularization for Simplicity

  • Read "Regularization for Simplicity" Article
  • Complete "Regularization Exercise"
    • Result: Applied L1 and L2 regularization to different models.
  • Practical Exercise: Experiment with different regularization techniques.
    • Project: Compared model performance with L1, L2, and dropout regularization.
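
The shape of the comparison: three otherwise-identical Keras models that differ only in how they are regularized (penalty strengths, dropout rate, and layer sizes are example values).

```python
# Build comparable models with L1 weights, L2 weights, or dropout.
import tensorflow as tf

def build_model(regularizer=None, dropout_rate=0.0):
    layers = [tf.keras.Input(shape=(20,)),  # placeholder feature count
              tf.keras.layers.Dense(64, activation="relu",
                                    kernel_regularizer=regularizer)]
    if dropout_rate:
        layers.append(tf.keras.layers.Dropout(dropout_rate))
    layers.append(tf.keras.layers.Dense(1, activation="sigmoid"))
    model = tf.keras.Sequential(layers)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

variants = {
    "l1": build_model(regularizer=tf.keras.regularizers.l1(0.01)),
    "l2": build_model(regularizer=tf.keras.regularizers.l2(0.01)),
    "dropout": build_model(dropout_rate=0.5),
}
# Each variant is then trained on the same data and its validation curve compared.
```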

Week 9: Classification

  • Read "Classification" Article
    • Link: Classification
    • Complete: Learned about binary and multi-class classification.
  • Complete "Classification Algorithms Exercise"
    • Result: Built and evaluated a logistic regression classifier.
  • Practical Exercise: Implement and compare different classification algorithms.
    • Project: Built logistic regression, decision tree, and random forest classifiers and compared their performance on a dataset.
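
The comparison pattern, on a built-in dataset as a stand-in for the one I used.

```python
# Compare three classifiers with 5-fold cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
models = {
    "logistic regression": LogisticRegression(max_iter=5000),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```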

Week 10: Conclusion and Next Steps

  • Watch "Conclusion and Next Steps" Video
  • Complete "Final Project"
    • Result: Developed a comprehensive project to classify images using a convolutional neural network (CNN).
  • Plan Future Learning: Outlined future goals and additional resources for continued learning in machine learning and data science.
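
Roughly the shape of the CNN from the final project; the input size and class count below are placeholders.

```python
# A small convolutional classifier: two conv/pool stages, then dense layers.
import tensorflow as tf

num_classes = 10  # placeholder
model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),  # placeholder image size
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```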

Planned Projects

Project 1: Predictive Maintenance

  • Objective: Develop a model to predict machinery failures before they occur.
  • Description: Used historical maintenance data to build a predictive model that forecasts equipment breakdowns, allowing for proactive maintenance.
  • Tools: Python, TensorFlow, Pandas, Scikit-learn.

Project 2: Sentiment Analysis

  • Objective: Analyze sentiment in social media posts to gauge public opinion on various topics.
  • Description: Collected and processed social media data to classify sentiments (positive, negative, neutral) using natural language processing techniques.
  • Tools: Python, NLTK, Scikit-learn.

Project 3: Image Classification with Convolutional Neural Networks (CNN)

  • Objective: Classify images into different categories using CNNs.
  • Description: Built and trained a CNN on a dataset of images to classify them into predefined categories, such as cats vs. dogs.
  • Tools: Python, TensorFlow, Keras.

Project 4: Customer Segmentation

  • Objective: Segment customers based on purchasing behavior for targeted marketing.
  • Description: Analyzed customer data to create segments using clustering techniques, enabling personalized marketing strategies.
  • Tools: Python, Pandas, Scikit-learn, Matplotlib.

Project 5: Recommendation System

  • Objective: Develop a recommendation system to suggest products to users.
  • Description: Built a collaborative filtering-based recommendation system to recommend products based on user preferences and past behavior.
  • Todo: Implement collaborative filtering and matrix factorization techniques (see the sketch below).
  • Tools: Python, Pandas, Scikit-learn, Surprise.
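
A likely starting point for the todo above: matrix-factorization collaborative filtering with the Surprise library's SVD on its built-in MovieLens sample (downloaded on first run).

```python
# Matrix-factorization collaborative filtering with Surprise's SVD.
from surprise import Dataset, SVD
from surprise.model_selection import cross_validate

data = Dataset.load_builtin("ml-100k")  # MovieLens 100k; prompts to download once
algo = SVD()

# 5-fold cross-validated rating-prediction error.
cross_validate(algo, data, measures=["RMSE", "MAE"], cv=5, verbose=True)
```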

Project 6: Time Series Forecasting

  • Objective: Forecast future stock prices using historical data.
  • Description: Applied time series analysis techniques, including ARIMA and LSTM, to predict stock price trends.
  • Tools: Python, Pandas, Scikit-learn, TensorFlow.

Project 7: Spam Detection

  • Objective: Classify emails as spam or not spam using machine learning.
  • Description: Built a classification model using natural language processing to detect and filter spam emails.
  • Todo: Implement a Naive Bayes classifier.
  • Tools: Python, Scikit-learn, NLTK.

Project 8: Handwritten Digit Recognition

  • Objective: Recognize handwritten digits using neural networks.
  • Description: Trained a neural network on the MNIST dataset to classify handwritten digits with high accuracy.
  • Tools: Python, TensorFlow, Keras.
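
A small fully connected baseline for this project; the eventual model may be deeper or convolutional.

```python
# Dense baseline for MNIST digit recognition.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_split=0.1)
print(model.evaluate(x_test, y_test, verbose=0))  # [test loss, test accuracy]
```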

Project 9: Fraud Detection

  • Objective: Detect fraudulent transactions in financial datasets.
  • Description: Implemented machine learning techniques to identify fraudulent transactions, focusing on anomaly detection.
  • Tools: Python, Pandas, Scikit-learn.
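
One common anomaly-detection starting point, sketched on synthetic transaction features; the contamination rate and features are placeholders.

```python
# Flag unusual transactions with an Isolation Forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[50, 1], scale=[20, 0.5], size=(500, 2))    # amount, daily frequency
unusual = rng.normal(loc=[500, 10], scale=[100, 2], size=(10, 2))   # injected outliers
X = np.vstack([normal, unusual])

detector = IsolationForest(contamination=0.02, random_state=0).fit(X)
labels = detector.predict(X)  # -1 = anomaly, 1 = normal
print("flagged transactions:", int((labels == -1).sum()))
```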

Project 10: Object Detection

  • Objective: Detect and classify objects within images using deep learning.
  • Description: Used the YOLO (You Only Look Once) algorithm to detect and classify multiple objects in images.
  • Tools: Python, TensorFlow, OpenCV.

Project 11: Chatbot Development

  • Objective: Create an intelligent chatbot capable of understanding and responding to user queries.
  • Description: Developed a chatbot using natural language processing and machine learning to simulate human-like interactions.
  • Tools: Python, TensorFlow, NLTK.

Project 12: Image Style Transfer

  • Objective: Apply artistic styles to images using neural networks.
  • Description: Implemented neural style transfer to combine content from one image with the style of another.
  • Tools: Python, TensorFlow, Keras.

Project 13: Natural Language Translation

  • Objective: Build a model to translate text from one language to another.
  • Description: Used sequence-to-sequence learning with neural networks to develop a language translation model.
  • Tools: Python, TensorFlow, Keras.

Project 14: Housing Price Prediction

  • Objective: Predict housing prices using regression models.
  • Description: Built a regression model to predict housing prices based on features such as location, size, and amenities.
  • Todo: Train and compare linear regression and random forest models.
  • Tools: Python, Scikit-learn, Pandas.

Experiments

Experiment 1: Hyperparameter Tuning

  • Objective: Optimize hyperparameters for a neural network to improve performance.
  • Description: Conducted experiments to find the best combination of hyperparameters for a neural network.
  • Tools: Python, TensorFlow, Keras.

Experiment 2: Data Augmentation

  • Objective: Use data augmentation techniques to enhance the robustness of an image classification model.
  • Description: Applied techniques such as rotation, flipping, and zooming to augment the training dataset.
  • Tools: Python, TensorFlow, Keras.
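
The augmentation step expressed as Keras preprocessing layers; the rotation and zoom factors are illustrative.

```python
# Random flip, rotation, and zoom applied to image batches during training.
import tensorflow as tf

data_augmentation = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
])

images = tf.random.uniform((8, 64, 64, 3))           # placeholder image batch
augmented = data_augmentation(images, training=True)
print(augmented.shape)  # same shape, randomly transformed content
```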

Experiment 3: Ensemble Learning

  • Objective: Improve model performance by combining multiple learning algorithms.
  • Description: Tested different ensemble methods like bagging, boosting, and stacking to increase accuracy and robustness.
  • Tools: Python, Scikit-learn.
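
The three ensemble styles side by side, shown on a built-in dataset rather than the experiment's own data.

```python
# Bagging, boosting, and stacking compared with 5-fold cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (BaggingClassifier, GradientBoostingClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
ensembles = {
    "bagging": BaggingClassifier(random_state=0),
    "boosting": GradientBoostingClassifier(random_state=0),
    "stacking": StackingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=5000)),
                    ("dt", DecisionTreeClassifier(random_state=0))],
        final_estimator=LogisticRegression(max_iter=5000)),
}
for name, model in ensembles.items():
    print(f"{name}: mean accuracy {cross_val_score(model, X, y, cv=5).mean():.3f}")
```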

Experiment 4: Feature Engineering

  • Objective: Enhance model performance by creating new features from existing data.
  • Description: Experimented with different feature engineering techniques to uncover insights and improve model predictions.
  • Tools: Python, Pandas.
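
A couple of typical engineered features in pandas; the column names are made up for illustration.

```python
# Ratio, date-part, and boolean features derived from raw columns.
import pandas as pd

df = pd.DataFrame({
    "order_date": pd.to_datetime(["2024-01-05", "2024-02-17", "2024-03-02"]),
    "total_price": [120.0, 80.0, 45.0],
    "num_items": [4, 2, 3],
})

df["price_per_item"] = df["total_price"] / df["num_items"]  # ratio feature
df["order_month"] = df["order_date"].dt.month               # date decomposition
df["is_weekend"] = df["order_date"].dt.dayofweek >= 5       # boolean flag
print(df)
```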

Experiment 5: Model Interpretability

  • Objective: Make machine learning models more interpretable and transparent.
  • Description: Used tools like LIME and SHAP to explain model predictions and ensure accountability.
  • Tools: Python, LIME, SHAP.

Future Plans

  • Explore Kaggle Competitions

    • Objective: Participate in Kaggle competitions to apply machine learning skills in real-world challenges.
    • Status: Reviewing available competitions and selecting one to start with.
  • Read Research Papers

    • Objective: Stay updated with the latest advancements in machine learning by reading research papers.
    • Status: Compiling a list of recommended papers to start reading.
  • Complete an Advanced Course

    • Objective: Enroll in an advanced machine learning course to deepen understanding.
    • Status: Researching courses on platforms like Coursera and edX.