# Pre-Training a Graph Recurrent Network for Language Representation

Code for pre-training a Sentence-State LSTM, a graph recurrent network, for text representation.
## Requirements

- fairseq
- transformers
- sentencepiece
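Exact install commands and versions are not given here; a minimal setup, assuming Python 3 with pip available and unpinned versions, could be:

```bash
# Install the Python dependencies (versions are not pinned in this README).
pip install transformers sentencepiece

# fairseq is installed from source later, after the model files are copied in
# (see the Pre-training steps below).
git clone https://github.com/facebookresearch/fairseq.git
```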
## Pre-training

We follow fairseq for pre-training the language model:

- Put the files under `models` into `fairseq/fairseq/models`.
- Install fairseq from source with `pip install .`.
- Preprocess the dataset and run `train.sh` for LM pre-training (a sketch of these commands follows below).
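The exact preprocessing and training arguments live in `train.sh`; the following is only a minimal sketch of fairseq's standard masked-LM recipe. The data paths, the dictionary file, and the architecture name `slstm_base` are placeholders (the real name is whatever the files copied into `fairseq/fairseq/models` register).

```bash
# Binarize tokenized (e.g. sentencepiece-encoded) text for fairseq.
# All paths below are placeholders.
fairseq-preprocess \
    --only-source \
    --srcdict dict.txt \
    --trainpref corpus.train.tok \
    --validpref corpus.valid.tok \
    --destdir data-bin/corpus \
    --workers 16

# Pre-train with fairseq's masked LM task. The architecture name
# `slstm_base` is hypothetical; use the name registered by the files
# copied into fairseq/fairseq/models (train.sh is authoritative).
fairseq-train data-bin/corpus \
    --task masked_lm --criterion masked_lm \
    --arch slstm_base \
    --sample-break-mode complete --tokens-per-sample 512 \
    --optimizer adam --adam-betas '(0.9,0.98)' --adam-eps 1e-6 \
    --lr-scheduler polynomial_decay --lr 0.0005 \
    --warmup-updates 10000 --total-num-update 125000 --max-update 125000 \
    --batch-size 16 --update-freq 16
```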
## Fine-tuning

See the corresponding directories; a hypothetical fine-tuning sketch is shown below.
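For orientation only, fine-tuning a pre-trained checkpoint on a binary sentence classification task via fairseq's `sentence_prediction` task might look like the following; the binarized data directory `RTE-bin`, the checkpoint path, and the architecture name are all placeholders, and the scripts in the task directories are authoritative.

```bash
# Hypothetical fine-tuning on a binary classification task; RTE-bin,
# the checkpoint path, and --arch slstm_base are placeholders.
fairseq-train RTE-bin \
    --restore-file checkpoints/checkpoint_best.pt \
    --reset-optimizer --reset-dataloader --reset-meters \
    --task sentence_prediction --criterion sentence_prediction \
    --num-classes 2 \
    --init-token 0 --separator-token 2 \
    --arch slstm_base \
    --optimizer adam --lr 1e-5 \
    --max-epoch 10 \
    --best-checkpoint-metric accuracy --maximize-best-checkpoint-metric
```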
## Citation

    @article{wang2022pre,
      title={Pre-Training a Graph Recurrent Network for Language Representation},
      author={Wang, Yile and Yang, Linyi and Teng, Zhiyang and Zhou, Ming and Zhang, Yue},
      journal={arXiv preprint arXiv:2209.03834},
      year={2022}
    }
## Contact

If you have any questions, please create an issue or contact the authors.