A pre-training and fine-tuning framework for text generation.

Backbone code for "An Empirical Investigation of Pre-Trained Transformer Language Models for Open-Domain Dialogue Generation": https://arxiv.org/abs/2003.04195
Usage:
- ./prepare_data.sh (build the training data)
- ./train.sh (run pre-training)
- ./inference.sh (generate text with the trained model; see the decoding sketch below)
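To make the inference step concrete, here is a minimal PyTorch sketch of top-k sampling, a standard decoding strategy for autoregressive language models of this kind. The `model` interface (a callable returning per-token logits) and all parameter defaults are assumptions for illustration, not Guyu's actual API:

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def sample_top_k(model, input_ids, max_new_tokens=50, k=40, temperature=1.0):
    # input_ids: LongTensor of shape (1, prefix_len) holding token ids.
    # Assumes model(input_ids) returns logits of shape (1, seq_len, vocab).
    for _ in range(max_new_tokens):
        logits = model(input_ids)
        next_logits = logits[:, -1, :] / temperature      # last position only
        topk_vals, topk_idx = torch.topk(next_logits, k)  # (1, k)
        probs = F.softmax(topk_vals, dim=-1)
        choice = torch.multinomial(probs, num_samples=1)  # index into top-k
        next_id = topk_idx.gather(-1, choice)             # map back to vocab id
        input_ids = torch.cat([input_ids, next_id], dim=1)
    return input_ids
```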
Example: chat-bot
- cd chat_bot
- ./fine_tune.sh (fine-tune the pre-trained model on dialogue data)
- ./inference.sh
- ./deploy.sh (serve the bot; a client sketch follows this list)
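If ./deploy.sh exposes the bot over HTTP, querying it might look like the following sketch. The port and query-parameter name here are hypothetical, chosen only for illustration; check deploy.sh for the actual endpoint:

```python
import requests

# Hypothetical local endpoint and parameter name; not the script's
# documented interface.
resp = requests.get("http://127.0.0.1:8888", params={"q": "hello"})
print(resp.text)  # generated reply
```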
Pre-trained models:
- 12-layer, 768-hidden, 12-heads, Chinese (News + zhwiki, 200G) and English (Gigawords + Bookscorpus + enwiki, 60G)
- 24-layer, 768-hidden, 12-heads, Chinese (News + zhwiki, 200G) and English (Gigawords + Bookscorpus + enwiki, 60G)
- download them: https://github.com/lipiji/Guyu/tree/master/model
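If the downloaded checkpoints are standard PyTorch files, restoring one might look like the sketch below. The file name and the assumed checkpoint layout (a state dict, possibly nested under a "model" key) are placeholders for illustration; the repository's own loading code is authoritative:

```python
import torch

# "guyu_base.ckpt" is a placeholder file name, not an actual release artifact.
ckpt = torch.load("model/guyu_base.ckpt", map_location="cpu")
state_dict = ckpt.get("model", ckpt) if isinstance(ckpt, dict) else ckpt
# model.load_state_dict(state_dict)  # `model` = the framework's Transformer LM
```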