ner-re-with-transformers-odsc2022

Building NER and RE components using HuggingFace Transformers

Title

Transformer based approaches to Named Entity Recognition (NER) and Relationship Extraction (RE)

Session type

Workshop (hands-on)

Abstract

Named Entity Recognition (NER) and Relationship Extraction (RE) are foundational for many downstream NLP tasks such as Information Retrieval and Knowledge Base construction. While pre-trained models exist for both NER and RE, they are usually specialized for a narrow application domain. If your application domain is different, your best bet is to train your own models. However, the costs associated with training, specifically generating training data, can be a significant deterrent. Fortunately, the Language Models learned by pre-trained Transformers capture a great deal about the language of the domain they are trained and fine-tuned on, so NER and RE models built on top of them require fewer training examples to deliver the same level of performance. In this workshop, participants will learn about, train, and evaluate Transformer based neural models for NER and RE.
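
To make this concrete, below is a minimal sketch of transformer-based NER inference using the HuggingFace Transformers pipeline API. The checkpoint name (dslim/bert-base-NER), the example sentence, and the commented-out RE framing with [E1]/[E2] entity markers are illustrative assumptions, not necessarily what the workshop notebooks use.

    # Minimal sketch: NER as token classification with a pre-trained
    # transformer. dslim/bert-base-NER is an illustrative public
    # checkpoint, not the workshop's model.
    from transformers import pipeline

    ner = pipeline(
        "token-classification",
        model="dslim/bert-base-NER",
        aggregation_strategy="simple",  # merge word pieces into entity spans
    )

    for ent in ner("Hugging Face was founded in New York City."):
        print(ent["entity_group"], ent["word"], round(float(ent["score"]), 3))

    # RE is commonly framed as sequence classification over text with
    # entity markers; the marker scheme and model name below are
    # placeholders, not a real checkpoint.
    # re_clf = pipeline("text-classification", model="<your-fine-tuned-re-model>")
    # re_clf("[E1] Hugging Face [/E1] was founded in [E2] New York City [/E2].")

Because the underlying language model already encodes much of the domain's language, fine-tuning heads like these typically needs far less labeled data than training from scratch.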

Outline

Running the Code

  • (optional) Fork this repository
  • Navigate to the Colab Web UI
  • Click on the GitHub tab
  • Enter the URL of your forked (or this) repository in the field titled "Enter a GitHub URL" and hit the Search icon
  • You should see the repository's notebooks listed in the results; click the one you want to open it in Colab (or use the direct-URL form noted below)
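
Alternatively, Colab can open a notebook directly from GitHub via a URL of the form https://colab.research.google.com/github/<user>/<repo>/blob/<branch>/<notebook>.ipynb, where the placeholders are filled in with your fork's details.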

Datasets

Additional Links