Exo-Machina

A large language model, GPT-2, is trained on ~1.6M manuscript abstracts from the arXiv. The model is then fine-tuned on abstracts from NASA's ADS, and the resulting model is used to generate text for a predictive keyboard.
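The pipeline described above (pretrain on arXiv abstracts, fine-tune on ADS, serve a predictive keyboard) ultimately reduces to next-word suggestion: given what the user has typed, rank candidate continuations. A minimal sketch of that suggestion step, using a toy bigram counter in place of the repository's actual GPT-2 model (all names and data here are illustrative, not taken from the repository):

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count word-to-next-word transitions over a list of sentences."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def suggest(model, prev_word, k=3):
    """Return up to k most likely next words, as a predictive keyboard would."""
    return [w for w, _ in model[prev_word.lower()].most_common(k)]

# Toy stand-in for a corpus of astronomy abstracts.
abstracts = [
    "we detect a transiting exoplanet around a nearby star",
    "we detect a hot jupiter orbiting a sun like star",
]
model = train_bigrams(abstracts)
print(suggest(model, "we"))  # top next-word candidates after "we"
```

A real deployment would instead score continuations with the fine-tuned GPT-2's next-token distribution, but the keyboard-facing interface is the same: a ranked short list of candidate next words.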

Primary Language: Python
