- For full Data Science tasks, materials, etc. please check Data Science repository.
- For Machine Learning algorithms please check Machine Learning repository.
- For Deep Learning algorithms please check Deep Learning repository.
- For Computer Vision please check Computer Vision repository.
- NLP-progress - Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
Folders with all materials for a specific task/domain
- Data Analysis
- Knowledge Graph
- Models and Algorithms
- Ontologies
- Question Answering System
- Search Engine
- Sentiment Analysis
- Shallow Discourse Parsing
- Text Classification
- Topic Modeling
- Word Embeddings
- Natural Language Processing - Stanford University | Dan Jurafsky, Christopher Manning
- Natural Language Processing with Deep Learning (Stanford CS224N)
- Course website
- Stanford CS224N: Natural Language Processing with Deep Learning | Winter 2021
- Natural Language Processing with Dan Jurafsky and Chris Manning, 2012 Stanford Online
Modern NLP techniques from recurrent neural networks and word embeddings to transformers and self-attention. Covers applied topics like question answering and text generation.
- Stanford CS224U: Natural Language Understanding | Spring 2019
- Natural Language Processing | University of Michigan
- A Code-First Introduction to NLP course by fast.ai
- From Languages to Information by Stanford University
- Deep Learning for Natural Language Processing by University of Oxford
- Natural Language Processing by University of Washington
- Natural Language Processing by Yandex Data School
This is an extension of the (ML for) Natural Language Processing course taught at the Yandex School of Data Analysis (YSDA). For now, only a subset of the topics is covered here.
- Natural Language Processing by National Research University Higher School of Economics (via Coursera)
- Applied Natural Language Processing by UC Berkeley
- Advanced Methods in Natural Language Processing by Tel Aviv University
- Text Retrieval and Search Engines [FULL COURSE] | UIUC
This course covers search engine technologies, which play an important role in any data mining application involving text data, for two reasons. First, while the raw data may be large for any particular problem, it is often only a relatively small subset that is relevant, and a search engine is an essential tool for quickly discovering that subset in a large text collection. Second, search engines are needed to help analysts interpret any patterns discovered in the data, by allowing them to examine the relevant original text and make sense of any discovered pattern. You will learn the basic concepts, principles, and major techniques of text retrieval, the underlying science of search engines. (A minimal retrieval sketch follows below.)
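To make the retrieval idea concrete, here is a minimal, illustrative sketch (not part of the course) that ranks a few toy documents against a query with TF-IDF weighting and cosine similarity; it assumes scikit-learn is installed, and the documents and query are made up for the example.

```python
# Minimal TF-IDF retrieval sketch: rank toy documents by similarity to a query.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "search engines rank documents by relevance to a query",
    "topic models discover latent themes in a text collection",
    "word embeddings map words to dense vectors",
]
query = ["how do search engines rank relevant documents"]

vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(documents)   # one TF-IDF vector per document
query_vector = vectorizer.transform(query)          # same vocabulary as the documents

scores = cosine_similarity(query_vector, doc_vectors).ravel()
for score, doc in sorted(zip(scores, documents), reverse=True):
    print(f"{score:.3f}  {doc}")
```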
- Text Mining and Analytics [FULL COURSE] | UIUC
- Mining Massive Datasets - Stanford University [FULL COURSE]
- Coursera:
- Natural Language Text Processing specialization (in Russian)
- Text Mining (Text Information Analysis and Analytics, in Russian)
- LinkedIn Learning:
- Advanced NLP with Python for Machine Learning
- Folder with code
- Materials for the NLP course at Projector
- Advanced NLP with spaCy
- fast.ai course: A Code-First Introduction to Natural Language Processing
- CS 4650 and 7650 - Course materials for Georgia Tech CS 4650 and 7650, "Natural Language Processing"
- A lot of NLP books (Natural Language Processing)
- Advanced Natural Language Processing with TensorFlow 2, published by Packt
- Natural Language Processing with Python – Analyzing Text with the Natural Language Toolkit, Steven Bird, Ewan Klein, and Edward Loper
- Speech and Language Processing (3rd ed. draft), Dan Jurafsky and James H. Martin
- Neural Network Methods for Natural Language Processing
- Natural Language Processing with Python – Analyzing Text with the Natural Language Toolkit
- NLTK_book on GitHub
- Practical Natural Language Processing - Official repository for 'Practical Natural Language Processing' by O'Reilly Media
- Natural Language Processing Notebooks - Available as a book: NLP in Python - Quickstart Guide
Title | Description |
---|---|
ACL | ACL recordings at Vimeo |
[Stanford CS224N: Natural Language Processing with Deep Learning \| Winter 2021](https://www.youtube.com/playlist?list=PLoROMvodv4rOSH4v6133s9LFPRHjEmbmJ) | YouTube playlist of the Winter 2021 lectures |
Natural Language Processing (NLP) Zero to Hero | by TensorFlow |
Zero to Hero: NLP with Tensorflow and Keras (GDG Sofia meetup) | |
Natural Language Processing | This content is based on Machine Learning University (MLU) Accelerated Natural Language Processing class. Slides, notebooks and datasets are available on GitHub |
- Natural Language Processing on Papers with Code
- Chris McCormick Blog
- ChrisMcCormickAI YouTube Channel
- The NLP Index
- Sebastian Ruder - Blog by a research scientist at Google, covering natural language processing and machine learning.
- Software Engineering Blogs - A curated list of engineering blogs.
- The Gradient - An organization with the mission of making it easier for anyone to learn about AI and of facilitating discussion within the AI community, founded in 2017 by a group of students and researchers at the Stanford AI Lab.
Title | Description |
---|---|
NVIDIA Deep Learning Examples for Tensor Cores - Natural Language Processing | Deep Learning Examples |
Natural Language Processing with Transformers | Notebooks and materials for the O'Reilly book "Natural Language Processing with Transformers" |
Awesome NLP References | A curated list of resources on Knowledge Distillation, Recommendation Systems, and especially Natural Language Processing (NLP) |
NLP - Tutorial | |
Natural Language-Process Tutorials | |
NLP with Python | Scikit-Learn, NLTK, Spacy, Gensim, Textblob and more... |
NLP and Data Science GitHub Repository Spotlight | Daily spotlights of some underrated NLP and Data Science GitHub repositories. |
NLP 101: a Resource Repository for Deep Learning and Natural Language Processing | This document is drafted for those who have enthusiasm for Deep Learning in natural language processing. If there are any good recommendations or suggestions, I will try to add more. |
NLP-progress | Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks. |
Hugging Face | Public repo for HF blog posts |
AllenNLP | An open-source NLP research library, built on PyTorch. Allenai.org |
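As a small taste of the Transformers-based material listed in the table above, a minimal Hugging Face `pipeline` call might look like the sketch below; it loads the library's default pre-trained sentiment model on first use and assumes `transformers` plus a backend such as PyTorch are installed.

```python
# Minimal Hugging Face Transformers sketch: sentiment analysis with a pipeline.
from transformers import pipeline

# With no model specified, the pipeline downloads a default pre-trained
# sentiment-analysis model on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("Natural language processing is fascinating."))
```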
- NeurIPS - Neural Information Processing Systems
- ACL - ACL Home Association for Computational Linguistics
- ACL Anthology - The ACL Anthology currently hosts 80890 papers on the study of computational linguistics and natural language processing.
NLP libraries, frameworks, modules
Title | Description |
---|---|
Natural Language Toolkit (NLTK) | NLTK - the Natural Language Toolkit - is a suite of open source Python modules, data sets, and tutorials supporting research and development in Natural Language Processing. |
flair | A very simple framework for state-of-the-art Natural Language Processing (NLP). |
textacy | textacy is a Python library for performing a variety of natural language processing (NLP) tasks, built on the high-performance spaCy library. With the fundamentals (tokenization, part-of-speech tagging, dependency parsing, etc.) delegated to another library, textacy focuses primarily on the tasks that come before and follow after. |
AllenNLP | NLP research library, built on PyTorch, for developing state-of-the-art deep learning models on a wide variety of linguistic tasks. |
NLPGym | NLPGym is a toolkit to bridge the gap between applications of RL and NLP. It aims to facilitate research and benchmarking of DRL applications on natural language processing tasks. The toolkit provides interactive environments for standard NLP tasks such as sequence tagging, question answering, and sequence classification. |
Gensim | Python library for topic modelling, document indexing, and similarity retrieval with large corpora. |
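For a concrete feel of the fundamentals that several of the libraries above delegate to spaCy (tokenization, part-of-speech tagging, dependency parsing), here is a minimal sketch; it assumes spaCy and its small English model `en_core_web_sm` (installed via `python -m spacy download en_core_web_sm`) are available.

```python
# Minimal spaCy sketch: tokenization, part-of-speech tags, and dependency parse.
import spacy

nlp = spacy.load("en_core_web_sm")  # small English pipeline, assumed installed
doc = nlp("textacy builds on spaCy for tokenization, tagging, and parsing.")

for token in doc:
    # token text, coarse part-of-speech tag, dependency label, and syntactic head
    print(token.text, token.pos_, token.dep_, token.head.text)
```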
Title | Description |
---|---|
nlp-tutorial | Natural Language Processing Tutorial for Deep Learning Researchers |
Natural Language Processing in Python Tutorial | comparing stand up comedians using natural language processing |
- Introduction to Deep Learning for Natural Language Processing
- Deep Learning architectures for NLP - Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
🔹 Stanford, NLP Seminar Schedule
🔹 CS224n: Natural Language Processing with Deep Learning
🔹 CIS 700-008 - Interactive Fiction and Text Generation
Implemented a Bidirectional Attention Flow (BiDAF) neural network as a baseline on SQuAD, building on Chris Chute's model implementation, adding word+character inputs as described in the original paper, and improving GauthierDmns' code.
- Interpretation of Natural Language Rules in Conversational Machine Reading
- Skip-Thought Vectors, Article
- Selectional Preference - (Katz and Fodor, 1963; Wilks, 1975; Resnik, 1993) The tendency of a word to semantically select or constrain which other words may appear in a direct syntactic relation with it. When this selection is expressed in binary terms (allowed / not allowed), it is also called a selectional restriction (Séaghdha and Korhonen, 2014). Selectional preference can be contrasted with verb subcategorization, "with subcategorization describing the syntactic arguments taken by a verb, and selectional preferences describing the semantic preferences verbs have for their arguments" (Van de Cruys et al., 2012). A small illustrative sketch follows after this list.
- Selectional Restrictions
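To make the notion of selectional preference concrete, the rough sketch below (not any of the cited models) counts verb-direct-object pairs with spaCy's parser as a naive, frequency-based stand-in for a preference estimate; the sentences are toy examples and the `en_core_web_sm` model is assumed to be installed.

```python
# Naive selectional-preference illustration: count which direct objects a verb takes.
from collections import Counter

import spacy

nlp = spacy.load("en_core_web_sm")  # small English pipeline, assumed installed
sentences = [
    "She drank coffee.",
    "He drank water.",
    "They drank tea.",
    "She drank the idea.",  # semantically odd: violates the verb's usual preference
]

pairs = Counter()
for doc in nlp.pipe(sentences):
    for token in doc:
        if token.dep_ == "dobj" and token.head.pos_ == "VERB":
            pairs[(token.head.lemma_, token.lemma_)] += 1  # (verb, object) counts

print(pairs.most_common())  # crude frequency-based preference estimate
```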