Probabilistic Models in NLP
This is the second course in the Natural Language Processing Specialization.
Week 1: Auto-correct using Minimum Edit Distance
- Create a simple autocorrect algorithm that uses minimum edit distance, computed with dynamic programming, to rank candidate spelling corrections (see the sketch below)
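
A minimal sketch of the dynamic-programming fill for minimum edit distance. The operation costs here (insert 1, delete 1, replace 2) are one common convention and an assumption, not necessarily the exact weights the course assignment uses.

```python
def min_edit_distance(source: str, target: str,
                      ins_cost: int = 1, del_cost: int = 1,
                      rep_cost: int = 2) -> int:
    """Cheapest cost of turning `source` into `target` (costs are assumptions)."""
    m, n = len(source), len(target)
    # D[i][j] = cheapest way to convert source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            same = source[i - 1] == target[j - 1]
            D[i][j] = min(D[i - 1][j] + del_cost,                       # delete
                          D[i][j - 1] + ins_cost,                       # insert
                          D[i - 1][j - 1] + (0 if same else rep_cost))  # keep / replace
    return D[m][n]

print(min_edit_distance("play", "stay"))  # 4: two replacements at cost 2 each
```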
Week 2: Part-of-Speech (POS) Tagging
- Apply the Viterbi algorithm to assign part-of-speech tags with a hidden Markov model, a core task in computational linguistics (a decoding sketch follows)
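
A compact sketch of Viterbi decoding for a hidden Markov model tagger, assuming the start, transition, and emission probabilities have already been estimated from a tagged corpus; the two-tag toy model below is invented purely for illustration.

```python
import numpy as np

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely tag sequence for a sequence of word ids (log-space)."""
    n, m = len(obs), len(states)
    best = np.zeros((m, n))             # best[s, t]: best log-prob ending in state s at step t
    back = np.zeros((m, n), dtype=int)  # back[s, t]: predecessor of s on that best path
    best[:, 0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for t in range(1, n):
        for s in range(m):
            scores = best[:, t - 1] + np.log(trans_p[:, s]) + np.log(emit_p[s, obs[t]])
            back[s, t] = np.argmax(scores)
            best[s, t] = np.max(scores)
    path = [int(np.argmax(best[:, n - 1]))]  # trace the highest-scoring path backwards
    for t in range(n - 1, 0, -1):
        path.append(int(back[path[-1], t]))
    return [states[s] for s in reversed(path)]

# Toy model: two tags, three word ids; all probabilities are made up.
states = ["NN", "VB"]
start_p = np.array([0.6, 0.4])
trans_p = np.array([[0.3, 0.7],          # trans_p[i, j] = P(states[j] | states[i])
                    [0.8, 0.2]])
emit_p = np.array([[0.5, 0.1, 0.4],      # emit_p[s, w] = P(word id w | states[s])
                   [0.1, 0.6, 0.3]])
print(viterbi([0, 1, 2], states, start_p, trans_p, emit_p))
```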
Week 3: N-gram Language Models
- Write a better auto-complete algorithm using an N-gram language model (similar models power machine translation, authorship identification, and speech recognition); a bigram sketch appears below
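
A bigram (N = 2) sketch of the idea with add-k smoothing; the helper names and the toy corpus are illustrative assumptions, not the assignment's API.

```python
from collections import Counter, defaultdict

def build_bigram_counts(tokens):
    """Count word pairs so we can estimate P(next | previous)."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def suggest(counts, prev_word, k=1.0):
    """Rank candidate next words by add-k-smoothed conditional probability."""
    vocab = {w for followers in counts.values() for w in followers} | set(counts)
    followers = counts[prev_word]
    total = sum(followers.values())
    return sorted(((w, (followers[w] + k) / (total + k * len(vocab)))
                   for w in followers),
                  key=lambda pair: pair[1], reverse=True)

tokens = "i like a cat . i like a dog .".split()
counts = build_bigram_counts(tokens)
print(suggest(counts, "a"))  # [('cat', 0.25), ('dog', 0.25)]
```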
Week 4: Word2Vec and Stochastic Gradient Descent
- Write your own Word2Vec model: a shallow neural network that learns word embeddings with the continuous bag-of-words (CBOW) architecture and trains via stochastic gradient descent (see the NumPy sketch below)
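
A from-scratch CBOW sketch in NumPy, trained with plain stochastic gradient descent: each step predicts the center word from the average of its context embeddings. The hyperparameters and the helper name train_cbow are assumptions for illustration.

```python
import numpy as np

def train_cbow(ids, vocab_size, dim=8, window=2, lr=0.05, epochs=50):
    """Learn embeddings by predicting each center word from its context (SGD)."""
    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.1, size=(vocab_size, dim))   # input embeddings
    W2 = rng.normal(scale=0.1, size=(dim, vocab_size))   # output weights
    for _ in range(epochs):
        for i in range(window, len(ids) - window):
            context = ids[i - window:i] + ids[i + 1:i + window + 1]
            h = W1[context].mean(axis=0)                 # hidden layer: mean of context vectors
            z = h @ W2
            z -= z.max()                                 # stabilize the softmax
            y = np.exp(z) / np.exp(z).sum()
            y[ids[i]] -= 1.0                             # y is now dLoss/dz (cross-entropy)
            grad_h = W2 @ y                              # gradient w.r.t. h, before W2 changes
            W2 -= lr * np.outer(h, y)
            np.subtract.at(W1, context, lr * grad_h / len(context))
    return W1                                            # rows are the learned embeddings

text = "the quick brown fox jumps over the lazy dog".split()
vocab = {w: i for i, w in enumerate(sorted(set(text)))}
embeddings = train_cbow([vocab[w] for w in text], len(vocab), window=1)
```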