This is a course on natural language processing.
Lecturer: Felipe Bravo-Marquez
Lectures: Tuesday 14:30 - 16:00, Thursday 14:30 - 16:00 (Lecture Room B104, Beauchef 851, Piso 1, Edificio Norte)
Course Program (in Spanish)
The neural network-related topics of the course are taken from Yoav Goldberg's book: Neural Network Methods for Natural Language Processing. The non-neural-network topics (e.g., grammars, HMMs) are taken from Michael Collins' course.
- Introduction to Natural Language Processing | (tex source file)
- Vector Space Model and Information Retrieval | (tex source file)
- Language Models (slides by Michael Collins), notes, videos 1, videos 2, videos 3
- Text Classification and Naive Bayes (slides by Dan Jurafsky), notes, video 1, video 2, video 3, video 4, video 5, video 6, video 7, video 8, video 9
- Linear Models | (tex source file)
- Neural Networks | (tex source file)
- Word Vectors | (tex source file)
- Tagging, and Hidden Markov Models (slides by Michael Collins), notes, videos
- MEMMs and CRFs | (tex source file)
- Convolutional Neural Networks | (tex source file)
- Recurrent Neural Networks | (tex source file)
- Sequence to Sequence Models | (tex source file)
- Constituency Parsing slides 1, slides 2, slides 3, slides 4 (slides by Michael Collins), notes 1, notes 2, videos 1, videos 2, videos 3, videos 4
- Recursive Networks and Paragraph Vectors | (tex source file)
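As a quick illustration of the text classification topic above, here is a minimal sketch of a multinomial Naive Bayes classifier with add-1 (Laplace) smoothing, using only the Python standard library. The toy training data and labels are hypothetical, chosen just to show the mechanics:

```python
# Minimal multinomial Naive Bayes text classifier (toy data, stdlib only).
from collections import Counter, defaultdict
import math

# Hypothetical toy training set: (label, document) pairs.
train = [
    ("pos", "great movie loved it"),
    ("pos", "wonderful acting great plot"),
    ("neg", "boring movie hated it"),
    ("neg", "terrible plot awful acting"),
]

# Count word occurrences per class and class frequencies.
word_counts = defaultdict(Counter)
class_counts = Counter()
for label, text in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for counts in word_counts.values() for w in counts}

def predict(text):
    """Return the class maximizing log P(c) + sum_w log P(w|c),
    with add-1 smoothing over the vocabulary."""
    best, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / sum(class_counts.values()))
        total = sum(word_counts[label].values())
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

print(predict("loved the great acting"))  # -> pos
```

Unseen words (like "the" here) contribute the same smoothed probability to every class, so they do not change the decision.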
- Speech and Language Processing (3rd ed. draft) by Dan Jurafsky and James H. Martin.
- Michael Collins' NLP notes.
- A Primer on Neural Network Models for Natural Language Processing by Yoav Goldberg.
- Natural Language Understanding with Distributed Representation by Kyunghyun Cho
- Natural Language Processing Book by Jacob Eisenstein
- CS224n: Natural Language Processing with Deep Learning, Stanford course
- NLP-progress: Repository to track the progress in Natural Language Processing (NLP)
- NLTK book
- AllenNLP: Open source project for designing deep learning-based NLP models
- Real World NLP Book: AllenNLP tutorials
- Attention is all you need explained
- ELMo explained
- BERT explained
- Better Language Models and Their Implications OpenAI Blog
- David Bamman NLP Slides @Berkeley
- RNN effectiveness
- SuperGLUE: a benchmark of Natural Language Understanding Tasks
- decaNLP, The Natural Language Decathlon: a benchmark for studying general NLP models that can perform a variety of complex natural language tasks.
- Deep Learning in NLP: slides by Horacio Rodríguez
- Chatbot and Related Research Paper Notes with Images
- XLNet Explained
- PyTorch-Transformers: a library of state-of-the-art pre-trained models for Natural Language Processing (NLP)
- Natural Language Processing MOOC videos by Dan Jurafsky and Chris Manning, 2012
- Natural Language Processing MOOC videos by Michael Collins, 2013
- Natural Language Processing with Deep Learning by Chris Manning and Richard Socher, 2017
- CS224N: Natural Language Processing with Deep Learning | Winter 2019
- Visualizing and Understanding Recurrent Networks