live@Manning Natural Language Processing Conference 2020

Conference Recording

Watch the video

Deep Transfer Learning for Natural Language Processing

Papers, code, and slides for my session at the live@Manning NLP Conference 2020, covering my talk on Deep Transfer Learning for Natural Language Processing.

The intent of this session is to journey through recent advances in deep transfer learning for NLP by examining several state-of-the-art models and methodologies. These include:

  • Pre-trained embeddings for deep learning models (FastText with CNNs / Bidirectional LSTMs + Attention); a minimal sketch follows this list
  • Universal Embeddings (Sentence Encoders, NNLMs)
  • Transformers (BERT, DistilBERT, etc.)
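
As a first taste of the pre-trained embedding approach above, here is a minimal sketch (not the tutorial code itself): it assumes a matrix of pre-trained word vectors (e.g. FastText) has already been loaded, and plugs it into a frozen Keras `Embedding` layer feeding a bidirectional LSTM. The vocabulary size, dimensions, and the random placeholder matrix are illustrative assumptions, and the attention layer is omitted for brevity.

```python
import numpy as np
import tensorflow as tf

vocab_size, embed_dim, max_len = 20000, 300, 100

# Placeholder: in practice each row would hold the FastText vector for one vocabulary word.
embedding_matrix = np.random.rand(vocab_size, embed_dim).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,)),
    tf.keras.layers.Embedding(
        vocab_size,
        embed_dim,
        embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
        trainable=False,  # keep the pre-trained vectors frozen (pure transfer)
    ),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # e.g. binary sentiment classification
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```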

We will also look at the power of some of these models, especially transformers, in a couple of hands-on tutorials with code.
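
For a flavor of the transformer tutorials, below is a minimal sketch (not the tutorial code itself) that uses the Hugging Face `transformers` library to load a pre-trained DistilBERT and extract sentence embeddings by mean-pooling its token representations. The checkpoint name and the pooling choice are illustrative assumptions.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative checkpoint; the actual tutorials may use a different model or a task-specific head.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

sentences = [
    "Transfer learning lets us reuse knowledge from large corpora.",
    "DistilBERT is a smaller, faster variant of BERT.",
]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the final hidden states (ignoring padding for brevity) to get one vector per sentence.
sentence_embeddings = outputs.last_hidden_state.mean(dim=1)
print(sentence_embeddings.shape)  # e.g. torch.Size([2, 768])
```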

Powered By: