This repository contains the code, processed data, and trained models used in DiSCoL: Toward Engaging Dialogue Systems through Conversational Line Guided Response Generation. Please cite this work as:
```bibtex
@inproceedings{ghazarian2021discol,
  title={DiSCoL: Toward Engaging Dialogue Systems through Conversational Line Guided Response Generation},
  author={Sarik Ghazarian and Zixi Liu and Tuhin Chakrabarty and Xuezhe Ma and Aram Galstyan and Nanyun Peng},
  booktitle={2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), Demonstrations Track},
  pages={26--34},
  year={2021}
}
```
Please feel free to contact us with any suggestions or issues.
Create a new environment from the webdemo.yml file, which includes the libraries and packages needed to run the demo, then install fairseq from the copy included in this repository.
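Assuming a conda setup, the steps might look like the following (the environment name is an assumption; use the name defined inside webdemo.yml):

```shell
# Create and activate the environment (name "webdemo" assumed here).
conda env create -f webdemo.yml
conda activate webdemo

# Install the fairseq copy bundled with this repository in editable mode.
cd fairseq
pip install --editable .
```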
We have four models that must be loaded to run DiSCoL:
- `ent_kwd`: a fine-tuned BART (Lewis et al., 2019) model that predicts convlines given the dialogue context utterance, entities, and topics.
- `topic_cls`: a fine-tuned BERT (Devlin et al., 2019) model that predicts a topic label for each dialogue utterance.
- `bartgen`: a fine-tuned BART (Lewis et al., 2019) model that generates the next utterance (response) given the dialogue context utterance, the predicted convlines, and the topic.
- `baseline`: the pretrained DialoGPT (Zhang et al., 2019) model that we use as the baseline response generator (it does not take the predicted convlines and topic as input).
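How the first three models fit together can be sketched as a simple pipeline. The sketch below is purely illustrative: the arguments are callables standing in for the actual model invocations, and entity extraction is omitted for brevity.

```python
def discol_respond(context, topic_cls, ent_kwd, bartgen):
    """Illustrative DiSCoL data flow; each argument is a callable
    standing in for the corresponding model described above."""
    topic = topic_cls(context)                 # BERT topic classifier
    convlines = ent_kwd(context, topic)        # BART convline predictor
    return bartgen(context, convlines, topic)  # BART response generator
```

The DialoGPT baseline, by contrast, maps the dialogue context directly to a response without the intermediate convline and topic steps.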
Download all these models from here and put them in a folder (e.g., ./Models). Then update the paths in the first four lines of the webdemo/SETTING.py file accordingly, so that DiSCoL can locate and load the models correctly.
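For example, the top of webdemo/SETTING.py might then look like this. The variable names below are illustrative assumptions; keep the names that actually appear in the file and only change the path values:

```python
# Hypothetical path variables; match these to the actual names in SETTING.py.
ent_kwd_path = "./Models/ent_kwd"
topic_cls_path = "./Models/topic_cls"
bartgen_path = "./Models/bartgen"
baseline_path = "./Models/baseline"
```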
We encourage running DiSCoL on a machine with GPUs. If your local machine does not have one, you can forward a port from a GPU machine over SSH: `ssh -L PORT_NUMBER:127.0.0.1:PORT_NUMBER MACHINE_NAME`.
On the connected server, run DiSCoL on a GPU: `python webdemo/app.py`
In your local browser, connect to the server at `http://127.0.0.1:PORT_NUMBER`
DiSCoL should now be ready to converse. Enjoy conversing with DiSCoL!