# OpenSourceTransformers

This is a GitHub repo to store all notes and documents related to this program.

## 3-Year Curriculum for Learning Transformer Networks
### Year 1: Introduction to Machine Learning and Basic Mathematics

#### Mathematical Foundation

- Linear Algebra: essential for understanding the data structures used in ML.
- Calculus: essential for understanding how learning happens in ML.
- Probability and Statistics: essential for understanding how decision making happens in ML.
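As a small motivating sketch (illustrative only, not part of the curriculum materials), the three foundations meet in the simplest possible learning algorithm: fitting `y = w * x` by gradient descent. The loss and its derivative come from calculus, the sums over data are dot products from linear algebra, and in practice the data would come from a noisy (probabilistic) process. All names here are made up for the example.

```python
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # generated by the true weight w = 2

def loss(w):
    # Mean squared error: (1/n) * sum (w*x - y)^2
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def grad(w):
    # Derivative of the loss w.r.t. w, from basic calculus:
    # (2/n) * sum x * (w*x - y)
    return 2 * sum(x * (w * x - y) for x, y in zip(xs, ys)) / len(xs)

w = 0.0    # arbitrary starting guess
lr = 0.05  # learning rate
for _ in range(200):
    w -= lr * grad(w)  # step against the gradient

print(w)  # converges toward the true weight 2.0
```

The same update rule, with matrices in place of scalars, is what trains every network in this curriculum.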
#### Python Programming

Python is the most commonly used programming language in the field of Machine Learning.
#### Introduction to Machine Learning

Covers the basics of machine learning, the main types of algorithms, and practical implementation.
### Year 2: Deep Learning and Neural Networks

#### Deep Learning Specialization

Covers neural networks, deep learning, structuring machine learning projects, convolutional networks, sequence models, and more.
#### Convolutional Neural Networks

A special type of neural network that is very effective for image-related tasks.
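The core CNN operation is simple enough to sketch in plain Python (an illustrative toy, not a framework implementation): slide a small kernel over an image and sum the element-wise products at each position.

```python
def conv2d_valid(image, kernel):
    """'Valid' 2D cross-correlation, the building block of CNN layers."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            # Sum of element-wise products between kernel and image patch
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(s)
        out.append(row)
    return out

# A hand-made vertical-edge kernel on a tiny dark-to-bright image:
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [
    [1, -1],
    [1, -1],
]
result = conv2d_valid(image, kernel)
print(result)  # non-zero only where the edge is
```

In a real CNN the kernel values are learned rather than hand-picked, and many kernels run in parallel over multi-channel inputs.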
#### Recurrent Neural Networks

A special type of neural network that is very effective for sequence-related tasks such as text and time series.
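What makes a network "recurrent" is a hidden state carried from step to step. A minimal sketch, with scalar weights chosen purely for clarity (a real RNN uses learned weight matrices):

```python
import math

def rnn(sequence, w_in=0.5, w_rec=0.8, b=0.0):
    h = 0.0
    for x in sequence:
        # The new state depends on the current input AND the previous
        # state, which is what lets the network remember earlier inputs.
        h = math.tanh(w_in * x + w_rec * h + b)
    return h

# Two sequences that end identically produce different final states,
# because the hidden state carries history:
h_a = rnn([1.0, 0.0, 0.0])
h_b = rnn([0.0, 0.0, 0.0])
print(h_a, h_b)
```

This step-by-step dependence is also the RNN's weakness (slow sequential computation, fading memory over long sequences), which is part of what motivated attention and Transformers in Year 3.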
### Year 3: Advanced Topics and Transformers

#### Natural Language Processing

A basic understanding of NLP is needed before moving on to Transformers.
#### Attention Mechanisms

Understanding attention mechanisms is essential, as Transformers are built on this concept.
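The core of attention is scaled dot-product attention, which can be sketched in plain Python (a single-query, single-head toy, not a library implementation): each query is scored against every key, the scores are softmax-normalized, and the values are averaged with those weights.

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attention(queries, keys, values):
    d_k = len(keys[0])
    out = []
    for q in queries:
        # Similarity of the query with each key, scaled by sqrt(d_k)
        scores = [dot(q, k) / math.sqrt(d_k) for k in keys]
        weights = softmax(scores)  # normalize to a distribution
        # Weighted sum of the value vectors
        out.append([sum(w * v[i] for w, v in zip(weights, values))
                    for i in range(len(values[0]))])
    return out

# A query matching the first key attends mostly to the first value:
q = [[1.0, 0.0]]
k = [[1.0, 0.0], [0.0, 1.0]]
v = [[10.0, 0.0], [0.0, 10.0]]
out = attention(q, k, v)
print(out)
```

A Transformer runs many such attention "heads" in parallel, with learned projections producing the queries, keys, and values from the input embeddings.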
#### Transformers

A deep dive into Transformer models, which are primarily used in Natural Language Processing tasks.
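One Transformer detail worth previewing: because attention has no recurrence, word order must be injected explicitly. A sketch of the sinusoidal positional encoding scheme, where even dimensions use `sin(pos / 10000^(2i/d_model))` and odd dimensions use the matching `cos`:

```python
import math

def positional_encoding(pos, d_model):
    """Sinusoidal encoding for one position: even dims sin, odd dims cos."""
    pe = []
    for i in range(d_model):
        # Pairs of dimensions (2i, 2i+1) share the same frequency
        angle = pos / (10000 ** ((i // 2 * 2) / d_model))
        pe.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
    return pe

pe0 = positional_encoding(0, 4)  # position 0 -> [0, 1, 0, 1]
pe1 = positional_encoding(1, 4)
print(pe0)
print(pe1)
```

These vectors are added to the token embeddings, giving each position a distinct, smoothly varying signature the model can learn to use.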
Note: Although this is laid out as a 3-year plan, the pace can be adjusted to your comfort and understanding. It's important to practice and implement as you learn these topics to reinforce your understanding: work on mini-projects and participate in competitions on platforms like Kaggle to get hands-on experience.