Issues
Pre-trained model download is not available.
#30 opened by Yongyeon-Kim · 1 comment
Checkpoints for beta=0 and beta=0.5 at latent size 32 appear to be the same checkpoint
#27 opened by yiminzme · 0 comments
Dataset access denied
#29 opened by Elfsong · 0 comments
What are the reconstruction BLEU scores of the AE and the VAE?
#28 opened by ChawDoe · 2 comments
GPT2ForLatentConnector
#20 opened by Bila12 · 0 comments
Question: Label-Conditional Text Generation
#7 opened by Diego999 · 0 comments
Question about mutual information
#25 opened by smolPixel · 2 comments
Demo website is dead
#16 opened by silverriver · 1 comment
Issue about reproducing results on the SNLI dataset
#24 opened by 20000607-lxc · 0 comments
Format of the NLTK-split input files used for preprocessing: "wikipedia.segmented.nltk.split.seq64.0.json"
#23 opened by gabrer · 0 comments
Chinese Pretrained Model
#22 opened by ywb2018 · 0 comments
About Pre-training on the Wikipedia dataset
#19 opened by dongqian0206 · 1 comment
Missing requirements file
#18 opened by ghazi-f · 0 comments
One question about the decoder of the VAE
#17 opened by Hsintien-Ng · 1 comment
How about using GPT2 as both encoder and decoder?
#15 opened by hanqi-qi · 0 comments
Demo website
#14 opened by amituofo1996 · 4 comments
DailyDialogue dataset
#13 opened by Rabona17 · 2 comments
Question: why this choice of BERT and GPT2?
#12 opened by alxthm · 0 comments
Running your Docker image on an ARM computer
#11 opened by arccoxx · 5 comments
Suggestion for some added functions
#4 opened by summerstay · 1 comment
Interpolation scheme
#5 opened by vseledkin · 15 comments
How can I load your Docker image in Colab?
#9 opened by aidan-collins · 2 comments
Number of pretraining epochs
#8 opened by bo-son · 1 comment
Additional sampling scheme
#6 opened by vseledkin · 1 comment
The default value of the argument "--decoder_model_name_or_path" is bert-base-cased
#3 opened by righ120