/deep-learning-argorithms-from-sratch

⌨️ Implementations/tutorials of deep learning papers with side-by-side notes


Deep Learning Algorithms, Paper Implementation from Scratch with Python | PyTorch and JAX Framework

👋 Introduction

This is a compilation of straightforward implementations of neural networks and associated algorithms. Each implementation is thoroughly documented with accompanying explanations.

Note that this project was developed solely for educational purposes, to gain a deeper understanding of the algorithms. These implementations may not be suitable for real-world applications; for building production models, it is highly recommended to use established, state-of-the-art libraries.

📝 Algorithms

Metrics

  • WER (Word Error Rate) | Code, [Blog]
  • CER (Character Error Rate) | Code, [Blog]
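
As a flavor of how the metric implementations look, here is a minimal WER sketch, assuming whitespace tokenization (the function name and exact API are illustrative, not necessarily the repo's):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + deletions + insertions) / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Levenshtein distance over words via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[len(ref)][len(hyp)] / max(len(ref), 1)


print(word_error_rate("the cat sat", "the cat sat down"))  # 1 insertion / 3 words ≈ 0.33
```

CER follows the same recipe, with characters in place of words.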

Transformers

  1. Multi-head attention
  2. Transformer building blocks
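
For reference, a minimal sketch of multi-head self-attention in PyTorch; class and argument names are illustrative and may differ from the implementation in this repo, and masking and dropout are omitted for brevity:

```python
import torch
import torch.nn as nn


class MultiHeadAttention(nn.Module):
    """Minimal multi-head self-attention (no masking or dropout)."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)  # fused Q, K, V projection
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        batch, seq, d_model = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Reshape to (batch, heads, seq, d_head).
        shape = (batch, seq, self.n_heads, self.d_head)
        q, k, v = (t.reshape(*shape).transpose(1, 2) for t in (q, k, v))
        # Scaled dot-product attention.
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        attn = scores.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(batch, seq, d_model)
        return self.out(out)


x = torch.randn(2, 10, 64)
print(MultiHeadAttention(d_model=64, n_heads=8)(x).shape)  # torch.Size([2, 10, 64])
```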

Transforms

PyTorch

  • Center Crop
  • Random Crop
  • Adjust Contrast
  • Adjust Brightness
  • Adjust Saturation
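
A minimal center-crop sketch on a (C, H, W) tensor, assuming the input is already a PyTorch tensor (the function name is illustrative; the repo's transform may also handle batches or PIL images):

```python
import torch


def center_crop(img: torch.Tensor, out_h: int, out_w: int) -> torch.Tensor:
    """Crop the central out_h x out_w window from a (C, H, W) tensor."""
    _, h, w = img.shape
    top = (h - out_h) // 2
    left = (w - out_w) // 2
    return img[:, top:top + out_h, left:left + out_w]


img = torch.randn(3, 224, 224)
print(center_crop(img, 196, 196).shape)  # torch.Size([3, 196, 196])
```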

JAX

  • Center Crop
  • Random Crop
  • Adjust Contrast
  • Adjust Brightness
  • Adjust Saturation
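
In JAX, random transforms take an explicit PRNG key rather than relying on global random state. A minimal, jit-friendly random-crop sketch (names are illustrative, not the repo's API):

```python
import jax
import jax.numpy as jnp


def random_crop(key: jax.Array, img: jnp.ndarray, out_h: int, out_w: int) -> jnp.ndarray:
    """Crop a random out_h x out_w window from a (C, H, W) array."""
    c, h, w = img.shape
    key_h, key_w = jax.random.split(key)
    top = jax.random.randint(key_h, (), 0, h - out_h + 1)
    left = jax.random.randint(key_w, (), 0, w - out_w + 1)
    # dynamic_slice accepts traced start indices, so this works under jit.
    return jax.lax.dynamic_slice(img, (0, top, left), (c, out_h, out_w))


key = jax.random.PRNGKey(0)
img = jnp.ones((3, 224, 224))
print(random_crop(key, img, 196, 196).shape)  # (3, 196, 196)
```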

BERT

GPT

Diffusion models

  1. AutoEncoder
  2. Latent Diffusion Models
  3. Sampling algorithms for Stable Diffusion
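
As an illustration of what a sampler involves, below is the DDPM ancestral update for a single reverse step, one of the simplest sampling schemes; it is not necessarily the one implemented here, and `eps_pred` is assumed to come from a trained noise-prediction network:

```python
import torch


def ddpm_step(x_t: torch.Tensor, eps_pred: torch.Tensor, t: int, betas: torch.Tensor) -> torch.Tensor:
    """One reverse step x_t -> x_{t-1} of DDPM ancestral sampling."""
    alphas = 1.0 - betas
    alpha_bar = torch.cumprod(alphas, dim=0)
    # Posterior mean: remove the predicted noise, rescale.
    mean = (x_t - (1 - alphas[t]) / torch.sqrt(1 - alpha_bar[t]) * eps_pred) / torch.sqrt(alphas[t])
    # Add fresh noise except at the final step.
    noise = torch.randn_like(x_t) if t > 0 else torch.zeros_like(x_t)
    return mean + torch.sqrt(betas[t]) * noise


betas = torch.linspace(1e-4, 0.02, 1000)
x = torch.randn(1, 3, 32, 32)
eps_pred = torch.randn_like(x)  # stands in for a model prediction
print(ddpm_step(x, eps_pred, t=999, betas=betas).shape)  # torch.Size([1, 3, 32, 32])
```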

Generative Adversarial Networks

LSTM

ResNet

Optimizers

Normalization Layers
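
A minimal layer-normalization sketch, one of the normalization layers typically covered (the function name is illustrative):

```python
import torch


def layer_norm(x: torch.Tensor, gamma: torch.Tensor, beta: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Normalize over the last dimension, then apply a learnable affine transform."""
    mean = x.mean(dim=-1, keepdim=True)
    var = x.var(dim=-1, keepdim=True, unbiased=False)
    return gamma * (x - mean) / torch.sqrt(var + eps) + beta


x = torch.randn(2, 5, 16)
out = layer_norm(x, torch.ones(16), torch.zeros(16))
# Matches PyTorch's built-in layer norm up to numerical precision.
print(torch.allclose(out, torch.nn.functional.layer_norm(x, (16,)), atol=1e-6))  # True
```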