Authors: Yixuan Su, Deng Cai, Qingyu Zhou, Zibo Lin, Simon Baker, Yunbo Cao, Shuming Shi, Nigel Collier, and Yan Wang
Code for the ACL 2021 paper "Dialogue Response Selection with Hierarchical Curriculum Learning".
In this repository, we provide a simpler and more robust implementation of our ACL 2021 paper that requires less hyper-parameter tuning. We provide the data and pre-trained models for the Douban dataset; the data for the Ubuntu and E-commerce datasets will be released soon.
1. Install the requirements:
pip install -r requirements.txt
2. Download Data here:
Unzip data.zip and use the extracted folder to replace the empty ./data folder.
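For example, assuming data.zip has been downloaded into the repository root and contains a top-level data/ folder (the exact archive layout is an assumption), the replacement can be done as:

# remove the empty placeholder folder, then extract the archive in its place
rm -r ./data
unzip data.zip -d ./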
SA-BERT + HCL:

Training:
(1) Prepare the environment:
a. 4 x Tesla V100 GPUs (16GB)
b. CUDA Version: 11.0
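A quick sanity check of the GPU setup before training (this assumes the NVIDIA driver and CUDA toolkit are installed, so that nvidia-smi and nvcc are available):

# show visible GPUs and the driver's supported CUDA version
nvidia-smi
# show the installed CUDA toolkit version
nvcc --version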
(2) Download the pre-trained BERT parameters here:
Unzip bert-base-chinese.zip and use the extracted folder to replace the empty ./SABERT/bert-base-chinese folder (same procedure as for data.zip above).
(3) Train the model:
cd ./SABERT
chmod +x ./train.sh
./train.sh
Inference:
(a) Download the pre-trained parameters here:
Unzip ckpt.zip and use the extracted folder to replace the empty ./SABERT/ckpt folder.
(b) Run inference:
cd ./SABERT
chmod +x ./inference.sh
./inference.sh
SMN + HCL / MSN + HCL:

Training:
(1) Prepare the environment:
a. 1 x Tesla V100 GPU (16GB)
b. CUDA Version: 11.0
(2) Download the embeddings for both models here:
Unzip embeddings.zip and use the extracted folder to replace the empty ./SMN_MSN/embeddings folder.
(3) Train the model:
cd ./SMN_MSN/
chmod +x ./train_X.sh (X in ['smn', 'msn'])
./train_X.sh
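For instance, substituting X with smn trains the SMN + HCL model:

chmod +x ./train_smn.sh
./train_smn.sh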
Inference:
(a) Download the pre-trained parameters here:
Unzip ckpt.zip and use the extracted folder to replace the empty ./SMN_MSN/ckpt folder.
(b) Run inference:
cd ./SMN_MSN
chmod +x ./inference_X.sh (X in ['smn', 'msn'])
./inference_X.sh
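For example, substituting X with msn runs inference with the MSN + HCL checkpoint:

chmod +x ./inference_msn.sh
./inference_msn.sh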
If you find our paper and resources useful, please kindly cite our paper:
@inproceedings{su-etal-2021-dialogue,
title = "Dialogue Response Selection with Hierarchical Curriculum Learning",
author = "Su, Yixuan and
Cai, Deng and
Zhou, Qingyu and
Lin, Zibo and
Baker, Simon and
Cao, Yunbo and
Shi, Shuming and
Collier, Nigel and
Wang, Yan",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.137",
doi = "10.18653/v1/2021.acl-long.137",
pages = "1740--1751"
}