embedded.ai

Repository for DCA0306, an undergraduate course on Embedded Artificial Intelligence

Primary language: Jupyter Notebook. License: MIT.

Federal University of Rio Grande do Norte

Technology Center

Department of Computer Engineering and Automation

Embedded AI

References

  • 📚 Daniel Situnayake and Pete Warden. TinyML: Machine Learning with TensorFlow Lite on Arduino and Ultra-Low-Power Microcontrollers. [Link]
  • 📚 Gian Marco Iodice. TinyML Cookbook: Combine Artificial Intelligence and Ultra-low-power Embedded Devices to Make the World Smarter [Link]
  • 📚 Aurélien Géron. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow [Link]
  • 📚 François Chollet. Deep Learning with Python [Link]

Lessons

Week 01: Course Outline Open in PDF

  • Machine Learning Fundamentals Open in Dataquest
    • You'll learn how machine learning models work, how to build them, and how to optimize them. By the end, you’ll know the basics behind building models that will make data-driven predictions.
    • ⏳ Estimated time: 10h
  • Git and Version Control Open in Dataquest
    • You'll learn how to: a) organize your code using version control; b) resolve conflicts in version control; c) use Git and GitHub to collaborate with others.
    • 👊 Getting a Git repository.
    • ⏳ Estimated time: 5h
  • Complementary materials
    • Google Colab Introduction Open in Loom
    • Google Colab Cont. [optional] Open in Loom Jupyter
    • ⏳ Estimated time: 2h

Week 02: TinyML Fundamentals

  • Why does our business need AI? (And bigger is not always better!) Open in PDF

  • How do we enable TinyML? Open in PDF Open in Loom

    • The three fundamental steps of a TinyML solution
      • Input Open in Loom
      • Processing Open in Loom
      • Output and final remarks Open in Loom
      • ⏳ Estimated time: 30min to 1h.
    • 📄 Further reading paper
      • Vijay Janapa Reddi et al. Widening Access to Applied Machine Learning with TinyML arXiv
      • ⏳ Estimated time: 4h
  • Machine Learning Fundamentals Open in PDF

    • What is Machine Learning (ML)? Open in Loom
    • ML types Open in Loom
    • Main challenges of ML
      • Variables, pipeline, and controlling chaos Open in Loom
      • Train, dev and test sets Open in Loom
      • Bias vs Variance Open in Loom
    • ⏳ Estimated time: 2h
  • Calculus For Machine Learning Open in Dataquest

    • You'll learn how to: a) define mathematical functions using calculus; b) employ intermediate machine learning techniques.
    • ⏳ Estimated time: 6h
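The train/dev/test distinction covered in the ML fundamentals videos can be sketched in plain NumPy. The 70/15/15 ratios below are illustrative assumptions, not the course's prescribed split:

```python
import numpy as np

def train_dev_test_split(X, y, dev_frac=0.15, test_frac=0.15, seed=0):
    """Shuffle the data once, then carve it into train/dev/test partitions."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))          # one shuffle so splits stay disjoint
    n_test = int(len(X) * test_frac)
    n_dev = int(len(X) * dev_frac)
    test_idx = idx[:n_test]
    dev_idx = idx[n_test:n_test + n_dev]
    train_idx = idx[n_test + n_dev:]
    return (X[train_idx], y[train_idx]), (X[dev_idx], y[dev_idx]), (X[test_idx], y[test_idx])

# Example: 100 samples, 70/15/15 split
X = np.arange(100).reshape(100, 1)
y = np.arange(100)
train, dev, test = train_dev_test_split(X, y)
```

The key property to preserve is that the three sets never overlap; the dev set is tuned against repeatedly, while the test set is touched only once.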

Week 03: TinyML Challenges

  • What are the challenges for TinyML? Open in PDF
  • AI lifecycle and ML workflow Open in PDF
    • AI lifecycle introduction Open in Loom
    • AI infrastructure Open in Loom
    • A typical ML workflow Open in Loom
    • A TinyML workflow Open in Loom
    • ⏳ Estimated time: 30min
  • ML evaluation metrics Open in PDF
    • How to choose an evaluation metric? Open in Loom
    • Threshold metrics Open in Loom
    • Ranking metrics Open in Loom
    • ⏳ Estimated time: 1h
  • Linear Algebra For Machine Learning Open in Dataquest
    • You'll learn how to: a) understand the key ideas behind linear systems; b) apply those concepts to machine learning techniques.
    • ⏳ Estimated time: 6h
  • 📄 Further reading paper
    • Visal Rajapakse et al. Intelligence at the Extreme Edge: A Survey on Reformable TinyML arXiv
    • Sam Leroux et al. TinyMLOps: Operational Challenges for Widespread Edge AI Adoption arXiv
    • ⏳ Estimated time: 10h
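Threshold metrics like precision, recall, and F1 follow directly from confusion-matrix counts; a minimal pure-Python sketch (the toy labels are made up for illustration):

```python
def confusion_counts(y_true, y_pred):
    """Count true positives, false positives, false negatives, true negatives (binary 0/1 labels)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def precision_recall_f1(y_true, y_pred):
    tp, fp, fn, _ = confusion_counts(y_true, y_pred)
    precision = tp / (tp + fp) if tp + fp else 0.0   # of predicted positives, how many are right
    recall = tp / (tp + fn) if tp + fn else 0.0      # of actual positives, how many were found
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]
p, r, f = precision_recall_f1(y_true, y_pred)
```

Which metric to prefer depends on the cost of false positives versus false negatives, which is exactly the question the "How to choose an evaluation metric?" lecture addresses.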

Week 04: Deep Learning Fundamentals I

  • The big picture Open in PDF
  • Introduction Open in PDF
    • The perceptron Open in Loom
    • Building Neural Networks Open in Loom
    • Matrix Dimension Open in Loom
    • Applying Neural Networks Open in Loom
    • Training a Neural Network Open in Loom
    • Backpropagation with Pencil & Paper Open in Loom
    • Learning rate & Batch Size Open in Loom
    • Exponentially Weighted Average Open in Loom
    • Adam, Momentum, RMSProp, Learning Rate Decay Open in Loom
    • ⏳ Estimated time: 6h to 8h
  • Hands-on DL fundamentals Open in Dataquest
    • You'll learn how to: a) understand how neural networks are represented; b) understand how adding hidden layers can improve model performance; c) understand how neural networks capture nonlinearity in the data.
    • ⏳ Estimated time: 8h
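The perceptron update rule from the first lecture fits in a few lines of NumPy. Here it learns the AND function, a linearly separable toy problem chosen purely for illustration:

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Classic perceptron: step activation, weights nudged only on mistakes."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            err = target - pred        # -1, 0, or +1
            w += lr * err * xi         # move the decision boundary toward the mistake
            b += lr * err
    return w, b

# AND gate: output 1 only when both inputs are 1
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
preds = [1 if xi @ w + b > 0 else 0 for xi in X]
```

For linearly separable data the perceptron is guaranteed to converge; XOR, famously, is not separable by a single perceptron, which is the motivation for the hidden layers covered next.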

Week 05: Deep Learning Fundamentals II

  • A first image classification model using MLOps best practices Open in PDF Jupyter
  • Project 🌟 😺 🐶 🐼
    • Explore MLOps tools Open in Wandb
    • Hyperparameter tuning using Sweeps
    • Compare MLP vs KNN
    • ⏳ Estimated time: 8h
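A W&B sweep is driven by a small configuration passed to `wandb.sweep(...)` and executed by `wandb.agent(...)`. The sketch below shows the general shape; the metric and parameter names are illustrative assumptions, not the project's actual settings:

```python
# Illustrative W&B sweep configuration (hypothetical parameter names and ranges).
sweep_config = {
    "method": "random",                                  # search strategy: grid, random, or bayes
    "metric": {"name": "val_accuracy", "goal": "maximize"},
    "parameters": {
        "learning_rate": {"values": [1e-2, 1e-3, 1e-4]},
        "hidden_units": {"values": [32, 64, 128]},
        "epochs": {"value": 10},                         # held fixed across runs
    },
}
```

Each agent run samples one combination from `parameters`, logs the metric, and the sweep dashboard ranks the runs, which is how the MLP hyperparameters can be tuned before the MLP-vs-KNN comparison.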

Week 06: Convolutional Neural Networks

  • Recap of previous weeks Open in PDF
  • CNN Fundamentals I Open in PDF
    • Convolution with OpenCV and Python Jupyter
  • CNN Fundamentals II Open in PDF
    • Motivation Open in Loom
    • Convolution Layer Open in Loom
    • Convolution Layer - Case Study TinyVGG Open in Loom
    • Pooling Layer Open in Loom
    • Fully-Connected Layer Open in Loom
    • ⏳ Estimated time: 2h
  • CNN Fundamentals III Open in PDF
    • Batch Normalization Fundamentals Open in Loom
    • Batch Normalization Math Details Open in Loom
    • Batch Normalization - Case Study Open in Loom
    • Dropout Open in Loom
    • ⏳ Estimated time: 1h
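The convolution layer's core operation (slide a kernel over the image, multiply element-wise, sum) can be sketched in plain NumPy; the course notebook does the equivalent with OpenCV. The tiny image and edge kernel below are made up for illustration:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2D cross-correlation, which is what deep-learning 'convolution' layers compute."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # element-wise product of the kernel with the current patch, then sum
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A vertical-edge kernel applied to a 4x4 image with one vertical edge
image = np.array([[0, 0, 10, 10],
                  [0, 0, 10, 10],
                  [0, 0, 10, 10],
                  [0, 0, 10, 10]], dtype=float)
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]], dtype=float)
edges = conv2d(image, kernel)
```

A 3x3 kernel over a 4x4 input yields a 2x2 output, which is the same shape arithmetic the TinyVGG case study walks through.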

Week 07: Using CNN to Classify Images

  • An MLOps pipeline using TensorFlow, Keras, and Wandb Jupyter
    • Preprocessing
    • Data segregation
    • Train
    • Test

Week 08: Going Deeper with CNN

  • Study of Classical Architectures Open in PDF
  • LeNet-5 Jupyter
    • Best practices
    • Extensions using: batch normalization, dropout, data augmentation
    • Sweeps (hyperparameter tuning)

Week 09: Going Deeper with CNN II

  • AlexNet Jupyter
  • VGG and Inception Open in PDF Jupyter

Week 10: Transfer Learning

  • Feature extractor and fine-tuning Open in PDF
  • Hands on Jupyter

Week 11: Edge Impulse crash course

  • A brief overview of the Edge Impulse platform Open in Loom
  • Data Acquisition Open in Loom
  • Create an impulse design and a preprocessing task Open in Loom
  • Training Open in Loom
  • Understanding training evaluation metrics Open in Loom
  • Model testing Open in Loom
  • Live classification using a mobile phone Open in Loom
  • AutoML configuration using EON Tuner Open in Loom
  • Understanding the results of EON Tuner and versioning the model Open in Loom
  • Set a primary model using EON Tuner and Transfer Learning Open in Loom
  • Training an EON Tuner primary model using transfer learning Open in Loom
  • Final remarks Open in Loom

Week 12: TFLite Optimizations and Quantization

  • Post-Training Quantization (PTQ) Open in PDF
  • Introduction to TensorFlow Lite Jupyter
  • PTQ of MNIST Jupyter
  • A regression model using TensorFlow Lite Jupyter
  • Case study using Wandb developed by Ishan Dutta et al.
    • Optimizing Models with Post-Training Quantization in Keras - Part I Open in Wandb
    • Optimizing Models with Quantization-Aware Training in Keras - Part II Open in Wandb
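Under the hood, post-training quantization maps float tensors to int8 with an affine scheme, q = round(x / scale) + zero_point. A NumPy sketch of that arithmetic follows; note that TFLite derives scale and zero-point from calibration data, whereas here they are computed from the tensor's own min/max purely for illustration:

```python
import numpy as np

def quantize_int8(x):
    """Affine (asymmetric) int8 quantization from a tensor's observed range."""
    qmin, qmax = -128, 127
    x_min, x_max = float(x.min()), float(x.max())
    scale = (x_max - x_min) / (qmax - qmin)              # float value of one int8 step
    zero_point = int(round(qmin - x_min / scale))        # int8 code that represents 0.0
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map int8 codes back to (approximate) floats."""
    return (q.astype(np.float32) - zero_point) * scale

x = np.array([-1.0, -0.5, 0.0, 0.5, 1.0], dtype=np.float32)
q, scale, zp = quantize_int8(x)
x_hat = dequantize(q, scale, zp)
```

The round trip loses at most about one quantization step per value, which is why PTQ typically costs little accuracy while shrinking the model to a quarter of its float32 size.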