TransformerCVAE
This repository contains the source code for the paper *Transformer-based Conditional Variational Autoencoder for Controllable Story Generation*:
```bibtex
@article{fang2021transformer,
  title={Transformer-based Conditional Variational Autoencoder for Controllable Story Generation},
  author={Fang, Le and Zeng, Tao and Liu, Chaochun and Bo, Liefeng and Dong, Wen and Chen, Changyou},
  journal={arXiv preprint arXiv:2101.00828},
  year={2021}
}
```
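The model in the paper is a conditional variational autoencoder built on transformers. As general background only (this is not code from the repository), the reparameterization trick at the core of any VAE — sampling a latent `z` from a learned Gaussian while keeping the sampling step differentiable — can be sketched in NumPy as:

```python
import numpy as np

_rng = np.random.default_rng(0)

def reparameterize(mu, logvar):
    """Sample z = mu + sigma * eps with eps ~ N(0, I).

    `mu` and `logvar` are the mean and log-variance predicted by the
    encoder; in a *conditional* VAE they are also functions of the
    conditioning input (e.g. a story prompt).
    """
    std = np.exp(0.5 * logvar)          # sigma = exp(log(sigma^2) / 2)
    eps = _rng.standard_normal(np.shape(mu))
    return mu + std * eps
```

In the actual training scripts this step would be written in an autodiff framework so gradients flow through `mu` and `logvar` to the encoder.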
- Get the source data (arXiv, Yelp, WritingPrompts, WikiPlots).
- Pre-process the data (scripts under `data/`).
- Train the model (choose among implementations that differ in parallelism and precision: `train.py`, `train_dist.py`, `train_dist_half.py`).
- Generate, evaluate, and analyze (`generate.py`/`generate_prefix.py`, `eval_ppl.py`/`eval_ppl_prefix.py`, `tsne_plot.py`).
Contact: lefang@buffalo.edu