Repository for the course project of Deep Learning for Speech and Natural Language Processing at Universität Stuttgart. Winter 2020-2021.
Natural Language Understanding
Custom dataset provided by the instructors.
{
"text": "",
"positions": [{}],
"slots": [{}],
"intent": ""
}
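Each record can be read with the standard `json` module. A minimal sketch is below; the field names follow the schema above, but the `positions` format shown (character offsets) and the concrete values are illustrative assumptions, not taken from the real dataset:

```python
import json

# One dataset record following the schema above.
# The positions/slots values here are illustrative, not real data.
record = json.loads("""
{
  "text": "add kansas city, missouri to Stress Relief",
  "positions": [{"entity_name": [4, 25], "playlist": [29, 42]}],
  "slots": [{"entity_name": "kansas city, missouri", "playlist": "Stress Relief"}],
  "intent": "AddToPlaylist"
}
""")

print(record["intent"])                    # AddToPlaylist
print(record["slots"][0]["playlist"])      # Stress Relief
```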
Inputs: text
Labels: slots, intent
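Slot filling is commonly cast as token-level tagging, so the slot labels can be converted into BIO tags over the input tokens. A hedged sketch with whitespace tokenization, assuming each slot value appears verbatim in the text (the real pipeline would align against BERT subword tokens instead):

```python
def bio_tags(text, slots):
    """Convert {slot_name: value} into per-token BIO tags.

    Assumes whitespace tokenization and that each slot value occurs
    verbatim in the text -- both are simplifying assumptions.
    """
    tokens = text.split()
    tags = ["O"] * len(tokens)
    for name, value in slots.items():
        value_tokens = value.split()
        n = len(value_tokens)
        # Find the first matching token span and tag it B-/I-.
        for i in range(len(tokens) - n + 1):
            if tokens[i:i + n] == value_tokens:
                tags[i] = f"B-{name}"
                for j in range(i + 1, i + n):
                    tags[j] = f"I-{name}"
                break
    return list(zip(tokens, tags))

print(bio_tags("add kansas city, missouri to Stress Relief",
               {"playlist": "Stress Relief",
                "entity_name": "kansas city, missouri"}))
```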
Pretrained model: bert-base-cased from Hugging Face Transformers.
pip install -r requirements.txt
add kansas city, missouri to Stress Relief
{
"intent": "AddToPlaylist",
"slots": {
"playlist": "Stress Relief",
"entity_name": "kansas city, missouri"
}
}
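To produce output in this form, predicted slot spans have to be mapped back to their surface strings. A minimal sketch, assuming the `positions` field holds `[start, end)` character offsets per slot (an assumption about this dataset, not confirmed by its documentation):

```python
def extract_slots(text, positions):
    """Map slot names to surface strings using [start, end) character offsets.

    The offset convention is an assumption about the dataset's
    `positions` field, not confirmed by its documentation.
    """
    return {name: text[start:end] for name, (start, end) in positions.items()}

text = "add kansas city, missouri to Stress Relief"
positions = {"entity_name": (4, 25), "playlist": (29, 42)}
print(extract_slots(text, positions))
# {'entity_name': 'kansas city, missouri', 'playlist': 'Stress Relief'}
```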
python main.py <gpu_id>
# use gpu_id from nvidia-smi for multi-gpu systems
# for single gpu, use 0
- Chen et al. (2019). BERT for Joint Intent Classification and Slot Filling. https://arxiv.org/abs/1902.10909