
nlu-jointbert-dl2021

Repository for the course project of Deep Learning for Speech and Natural Language Processing at Universität Stuttgart. Winter 2020-2021.

Task

Natural Language Understanding

Data

Custom dataset provided by the instructors.

Dataset properties

{
  "text": "",
  "positions": [{}],
  "slots": [{}],
  "intent": ""
}
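To make the schema concrete, here is a sketch of what one populated record could look like and how the slot values relate to the text. The field contents, and in particular the assumption that "positions" holds inclusive character offsets per slot, are illustrative and not taken from the actual course dataset.

```python
import json

# Hypothetical record matching the schema above; the shape of
# "positions" (inclusive character spans) is an assumption.
record = json.loads("""
{
  "text": "add kansas city, missouri to Stress Relief",
  "positions": [{"entity_name": [4, 24]}, {"playlist": [29, 41]}],
  "slots": [{"entity_name": "kansas city, missouri"}, {"playlist": "Stress Relief"}],
  "intent": "AddToPlaylist"
}
""")

# Recover each slot value from the text via its character span
for pos in record["positions"]:
    for slot, (start, end) in pos.items():
        print(slot, "->", record["text"][start:end + 1])
```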

Model Description

Inputs: text

Labels: slots, intents

Pretrained model: bert-base-cased from the Hugging Face Transformers library.
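The joint setup from Chen et al. (2019) attaches two heads to one encoder: intent classification on the [CLS] position and slot tagging on the per-token outputs. The sketch below shows that wiring only; a toy embedding layer stands in for bert-base-cased so it runs without downloads, and all sizes and names are illustrative, not the repository's actual code.

```python
import torch
import torch.nn as nn

class JointHead(nn.Module):
    """Joint intent + slot classifier in the spirit of Chen et al. (2019).

    A stub embedding layer stands in for bert-base-cased here; in the
    real model the hidden states come from the pretrained encoder.
    """

    def __init__(self, vocab_size=100, hidden=32, num_intents=7, num_slots=15):
        super().__init__()
        self.encoder = nn.Embedding(vocab_size, hidden)   # stand-in for BERT
        self.intent_head = nn.Linear(hidden, num_intents)
        self.slot_head = nn.Linear(hidden, num_slots)

    def forward(self, input_ids):
        hidden = self.encoder(input_ids)                  # (batch, seq, hidden)
        intent_logits = self.intent_head(hidden[:, 0])    # [CLS] position
        slot_logits = self.slot_head(hidden)              # one prediction per token
        return intent_logits, slot_logits

model = JointHead()
ids = torch.randint(0, 100, (2, 10))                      # batch of 2, length 10
intent_logits, slot_logits = model(ids)
print(intent_logits.shape, slot_logits.shape)
```

Training then sums a cross-entropy loss over intents with a token-level cross-entropy over slot tags, so both tasks share the encoder.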

ENV Setup

pip install -r requirements.txt

Sample prediction output

Input

add kansas city, missouri to Stress Relief

Output

{
  "intent": "AddToPlaylist",
  "slots": {
    "playlist": "Stress Relief",
    "entity_name": "kansas city, missouri"
  }
}
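An output like the one above is typically produced by merging per-token BIO slot tags back into spans, as in the JointBERT reference implementation. The helper below sketches that decoding step under the assumption of a standard BIO scheme; the function and tag names are illustrative, not the repository's actual code.

```python
def bio_to_slots(tokens, tags):
    """Merge per-token BIO slot tags into a {slot_name: text} dict.

    Assumes the standard BIO scheme used by JointBERT-style models;
    illustrative only.
    """
    slots, name, words = {}, None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if name:
                slots[name] = " ".join(words)
            name, words = tag[2:], [token]
        elif tag.startswith("I-") and name == tag[2:]:
            words.append(token)
        else:  # an "O" tag (or inconsistent I- tag) closes the open span
            if name:
                slots[name] = " ".join(words)
            name, words = None, []
    if name:
        slots[name] = " ".join(words)
    return slots

tokens = "add kansas city, missouri to Stress Relief".split()
tags = ["O", "B-entity_name", "I-entity_name", "I-entity_name",
        "O", "B-playlist", "I-playlist"]
print(bio_to_slots(tokens, tags))
# → {'entity_name': 'kansas city, missouri', 'playlist': 'Stress Relief'}
```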

Run

python main.py <gpu_id>
# use gpu_id from nvidia-smi for multi-gpu systems
# for single gpu, use 0
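Assuming main.py runs on PyTorch, the gpu_id argument would typically map to a CUDA device string as sketched below; the helper name and fallback behavior are illustrative, not the script's actual logic.

```python
def pick_device(argv):
    """Map the <gpu_id> CLI argument (e.g. sys.argv) to a device string.

    Illustrative only: assumes a CUDA device is selected this way,
    falling back to CPU when no id is given.
    """
    if len(argv) > 1:
        return f"cuda:{int(argv[1])}"
    return "cpu"

print(pick_device(["main.py", "0"]))  # → cuda:0
```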

Reference

  1. Chen et al. (2019), BERT for Joint Intent Classification and Slot Filling. https://arxiv.org/abs/1902.10909

  2. monologg/JointBERT reference implementation. https://github.com/monologg/JointBERT