Natural Language Processing with Deep Learning
A course based on Stanford CS224n, taught with the support of the DeepPavlov team.
Time: spring semester 2020, Tuesdays, 19:00
Location: 1C Training Center, Moscow, Dmitrovskoye shosse 9 (metro "Timiryazevskaya"), room 9235 (2nd floor).
News: https://t.me/dlinnlp2020spring
Chat: https://t.me/dlinnlp_discuss
Forum: https://forum.deeppavlov.ai/c/schools-hackatons/Deep-Learning-in-NLP/41
Course Structure
- weekly quizzes
- up to 5 practical home assignments (Jupyter Notebooks), to be announced
- course project (obligatory), to be announced
Spring 2020 syllabus
Week 1. Word Vector Representations (11.02.2020)
- Word Vector Representations (word2vec)
- Word Vectors and Word Senses (0:00-38:40, 58:00-1:20:00)
- Additional materials:
Week 2. Neural Networks. Backpropagation (18.02.2020)
- CS224n: Word Vectors and Word Senses (38:40-58:00)
- Enriching Word Vectors with Subword Information
- CS231n: Backpropagation, Neural Networks 1
- Additional materials:
Week 3. Neural Networks. Initialization and Normalization
Week 4. Neural Networks. Optimization
Week 5. Recurrent Neural Networks and Language Models
Week 6. Vanishing Gradients, Fancy RNNs
- CS224n: Vanishing Gradients, Fancy RNNs
- Paper: On the difficulty of training recurrent neural networks
- Article: Understanding LSTM Networks
Week 7. Convolutional Networks for NLP
- CS224n: Convolutional Networks for NLP
- Paper: Natural Language Processing (Almost) from Scratch
- Paper: Comparative Study of CNN and RNN for Natural Language Processing
- Paper: Convolutional Neural Networks for Sentence Classification
Week 8. Translation, Seq2Seq, Attention
- CS224n: Translation, Seq2Seq, Attention
- Lecture Notes
- Paper: Neural Machine Translation by Jointly Learning to Align and Translate
- Paper: Effective approaches to attention-based neural machine translation
- Article: Attention? Attention! by Lilian Weng
Approximate syllabus for the remaining weeks (subject to change)
Week 6. Deep contextualized word representations
Week 7. Translation, Seq2Seq, Attention
Week 8. Contextual Word Embeddings
Week 9. Question Answering
Week 10. Natural Language Generation
Project Proposals
- The BERT-based Schema-Guided State Tracking [paper] [paper] [pic]
- The BERT Cross-Lingual Transferability [medium] [paper]
- BERT adaptation for new languages and tasks [presentation]
- How conversational is Conversational BERT? [docs]
- Grammatical error correction [Shared Task] [paper for Russian]
- Semi-supervised morpheme segmentation [paper]
- Low-resource morphological inflection [Shared Task] (to be updated)
- More morphological inflection [Shared Task]
- Automatic data augmentation (to be updated)
- Automatic solving of the Russian Unified State Exam (ЕГЭ) [Competition]
- Educational applications of deep learning (teaching Russian or a foreign language)
- [SemEval 2020]
- [SemEval 2018]
- [SemEval 2019]
- [Taxonomy enrichment]
- Russian aspect-based sentiment analysis [Dialog-2015]
Related Courses
- CS224n: Natural Language Processing with Deep Learning [course] [youtube]
- CS231n: Convolutional Neural Networks for Visual Recognition [course] [youtube]
- Machine Learning Glossary
- Open Machine Learning Course by @yorko
- DEEP LEARNING НА ПАЛЬЦАХ ("Deep Learning on Your Fingers", in Russian)
- Theoretical Deep Learning