This repository contains my projects and assignments from the Natural Language Processing course in the Deep Learning Specialization on Coursera. The course focuses on sequence models and their applications in areas such as speech recognition, music synthesis, chatbots, and machine translation.
- Building and training Recurrent Neural Networks (RNNs) and variants such as GRUs and LSTMs (see the sketch after this list).
- Applying RNNs to character-level language modeling.
- Gaining experience with natural language processing and word embeddings.
- Utilizing HuggingFace tokenizers and transformer models for tasks like Named Entity Recognition (NER) and Question Answering.
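To give a concrete flavor of these building blocks, below is a minimal sketch of a small LSTM sequence classifier in Keras. The vocabulary size, layer widths, and binary classification head are illustrative assumptions, not the course's exact architecture; swapping `LSTM` for `GRU` shows how the variants slot into the same design.

```python
import tensorflow as tf

VOCAB_SIZE = 10_000  # assumed vocabulary size
EMBED_DIM = 64       # assumed embedding width

model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,), dtype="int32"),      # variable-length token ids
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),  # token ids -> dense vectors
    tf.keras.layers.LSTM(128),                         # or tf.keras.layers.GRU(128)
    tf.keras.layers.Dense(1, activation="sigmoid"),    # e.g. a binary sentiment head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```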
The course is organized into modules, each pairing lecture topics with hands-on programming assignments.
- Module 1: Recurrent Neural Networks
  - Topics Covered: Basics of RNNs, Backpropagation Through Time, Language Model and Sequence Generation, Vanishing Gradients, GRUs, LSTMs, Bidirectional RNNs, Deep RNNs.
  - Programming Assignments:
    - Building an RNN step by step (see the sketch after this list)
    - Dinosaur Island - Character-level language modeling
    - Jazz improvisation with LSTM
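In the spirit of the "Building an RNN step by step" assignment, here is a minimal NumPy sketch of one forward step of a vanilla RNN cell. The parameter names and shapes are my assumptions for illustration, not necessarily the assignment's exact API.

```python
import numpy as np

def rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by):
    """One time step: a_next = tanh(Waa @ a_prev + Wax @ xt + ba)."""
    a_next = np.tanh(Waa @ a_prev + Wax @ xt + ba)
    z = Wya @ a_next + by                          # output logits
    ez = np.exp(z - z.max(axis=0, keepdims=True))  # numerically stable softmax
    y_pred = ez / ez.sum(axis=0, keepdims=True)    # distribution over outputs
    return a_next, y_pred

# Toy shapes: n_x input features, n_a hidden units, n_y outputs, m examples.
n_x, n_a, n_y, m = 3, 5, 2, 10
rng = np.random.default_rng(0)
a_next, y_pred = rnn_cell_forward(
    rng.standard_normal((n_x, m)), rng.standard_normal((n_a, m)),
    rng.standard_normal((n_a, n_x)), rng.standard_normal((n_a, n_a)),
    rng.standard_normal((n_y, n_a)), rng.standard_normal((n_a, 1)),
    rng.standard_normal((n_y, 1)),
)
print(a_next.shape, y_pred.shape)  # (5, 10) (2, 10)
```

A full RNN then chains this cell over the time steps, feeding each `a_next` back in as the next step's `a_prev`.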
- Module 2: Natural Language Processing & Word Embeddings
  - Topics Covered: Word Representation, Word2Vec, GloVe Word Vectors, Sentiment Classification, Debiasing Word Embeddings.
  - Programming Assignments:
    - Operations on Word Vectors - Debiasing (see the sketch after this list)
    - Emojify
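Two of the word-vector operations from this module fit in a few lines of NumPy: cosine similarity, and the "neutralize" debiasing step that removes an embedding's component along a bias direction. A minimal sketch, with tiny hand-made vectors standing in for real embeddings:

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between vectors u and v."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

def neutralize(e, g):
    """Remove the component of embedding e along bias direction g."""
    return e - (e @ g) / (g @ g) * g

g = np.array([1.0, 0.0, 0.0])  # assumed bias axis (e.g. a gender direction)
e = np.array([0.8, 0.3, 0.5])  # assumed embedding of a gender-neutral word
print(cosine_similarity(neutralize(e, g), g))  # 0.0: no bias component left
```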
- Module 3: Sequence Models & Attention Mechanism
  - Topics Covered: Beam Search (sketched after this list), BLEU Score, Attention Model, Speech Recognition, Trigger Word Detection.
  - Programming Assignments:
    - Neural Machine Translation
    - Trigger Word Detection
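Beam search, one of this module's topics, keeps the `beam_width` highest-scoring partial sequences at each decoding step instead of greedily taking the single best token. A minimal sketch over a toy scorer: `log_probs` is a deterministic stand-in I made up for a trained decoder, and the beam width and vocabulary size are arbitrary.

```python
import numpy as np

def log_probs(seq, vocab_size=5):
    """Stand-in for a trained decoder: deterministic fake log-probabilities."""
    rng = np.random.default_rng(hash(tuple(seq)) % (2**32))
    return np.log(rng.dirichlet(np.ones(vocab_size)))

def beam_search(start_token, steps=3, beam_width=2):
    beams = [(0.0, [start_token])]  # (cumulative log-prob, sequence)
    for _ in range(steps):
        candidates = []
        for score, seq in beams:
            for tok, lp in enumerate(log_probs(seq)):  # expand each beam by every token
                candidates.append((score + lp, seq + [tok]))
        # Keep only the beam_width highest-scoring partial sequences.
        beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_width]
    return beams

print(beam_search(start_token=0))  # the two best toy sequences with their scores
```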
- Module 4: Transformer Network
  - Topics Covered: Transformer Network Intuition, Self-Attention (sketched after this list), Multi-Head Attention, Transformer Network.
  - Programming Assignments:
    - Transformers Architecture with TensorFlow
  - Ungraded Labs:
    - Transformer Pre-processing
    - Transformer Network Application: Named-Entity Recognition
    - Transformer Network Application: Question Answering
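At the heart of the Transformer topics above is scaled dot-product attention, `softmax(Q K^T / sqrt(d_k)) V`. Here is a minimal NumPy sketch for a single head, with illustrative shapes and without the learned Q/K/V projections, masking, or multiple heads that a real Transformer adds.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # weighted sum of the values

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
X = rng.standard_normal((seq_len, d_model))
# Self-attention: queries, keys, and values all come from the same sequence.
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (4, 8)
```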
Each folder in this repository corresponds to a module in the course and includes:
- Jupyter notebooks with code, detailed explanations, and comments.
- Datasets used in the projects (or links to access them).
- Supplementary resources and notes.
- Proficiency in designing and implementing sequence models for various NLP tasks.
- Deep understanding of the mechanics behind RNNs, LSTMs, GRUs, and Transformers.
- Experience with advanced NLP techniques like word embeddings and attention mechanisms.
- Familiarity with Python, TensorFlow, and the HuggingFace libraries for NLP applications.
- Explore Projects: Navigate through each project folder to see the specific models and tasks implemented.
- Datasets and Models: Follow instructions within each project to access the datasets or pre-trained models used.
- Running the Notebooks: Run the Jupyter notebooks to replicate the results, or use them as a starting point for further exploration.
I extend my gratitude to the course instructors and Coursera for an enriching and comprehensive learning experience in the field of NLP and sequence models.