Paper-Implementations

My implementations of Machine Learning and Deep Learning papers, written from scratch.

| Paper Name | Link to Paper | Year Published | GitHub Folder |
| --- | --- | --- | --- |
| Improving Language Understanding by Generative Pre-Training | GPT Paper | 2018 | GPT Implementation |
| BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | BERT Paper | 2019 | BERT Implementation |
| Language Models are Unsupervised Multitask Learners | GPT2 Paper | 2019 | GPT2 Implementation |
| LoRA: Low-Rank Adaptation of Large Language Models | LoRA Paper | 2021 | LoRA Implementation |
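The core idea behind the LoRA paper listed above is to freeze a pretrained weight matrix and learn only a low-rank additive update. A minimal sketch in PyTorch is below; the class and argument names are my own illustration, not taken from this repository's implementation:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update (LoRA sketch)."""

    def __init__(self, in_features, out_features, r=8, alpha=16):
        super().__init__()
        # Stand-in for a pretrained layer; its weights are frozen.
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)
        self.base.bias.requires_grad_(False)
        # Low-rank factors A (r x d_in) and B (d_out x r).
        # B starts at zero, so training begins exactly at the base model.
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x):
        # W'x = Wx + (alpha/r) * B A x
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling
```

With this setup only `lora_A` and `lora_B` receive gradients, so the number of trainable parameters drops from `d_in * d_out` to `r * (d_in + d_out)`.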

Some useful resources

A list of resources that I found helpful while understanding and coding the concepts:

  1. Attention Is All You Need (Transformer) - model explanation (including the math), inference, and training, by Umar Jamil: YouTube.

  2. Coding a Transformer from scratch in PyTorch, with full explanation, training, and inference, by Umar Jamil: YouTube.

  3. Let's build GPT: from scratch, in code, spelled out, by Andrej Karpathy: YouTube.

  4. Formal Algorithms for Transformers: arXiv.