Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking the Positional Encoding in Language Pre-training".
Primary language: Python. License: MIT.
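The core idea of TUPE is to untie the attention logits into separate content-content and position-position terms, each with its own projections, instead of adding positional embeddings to the token embeddings before a single shared projection. A minimal NumPy sketch of that scoring rule follows; the function and parameter names (`tupe_attention_scores`, `Wq`, `Uq`, etc.) are illustrative and not the repo's actual API.

```python
import numpy as np

def tupe_attention_scores(x, p, Wq, Wk, Uq, Uk):
    """Untied attention scores in the style of TUPE.

    x:  (n, d_model) token (content) embeddings
    p:  (n, d_model) absolute positional embeddings
    Wq, Wk: projections for the content-content term
    Uq, Uk: separate projections for the position-position term
    """
    d = Wq.shape[1]
    content = (x @ Wq) @ (x @ Wk).T    # token-to-token correlation
    position = (p @ Uq) @ (p @ Uk).T   # position-to-position correlation
    # The paper scales the summed terms by 1/sqrt(2d) to keep variance comparable
    return (content + position) / np.sqrt(2 * d)

# Toy example: 4 positions, model width 8
rng = np.random.default_rng(0)
n, d_model, d = 4, 8, 8
x = rng.standard_normal((n, d_model))
p = rng.standard_normal((n, d_model))
Wq, Wk, Uq, Uk = (rng.standard_normal((d_model, d)) for _ in range(4))
scores = tupe_attention_scores(x, p, Wq, Wk, Uq, Uk)  # shape (4, 4)
```

The full method also includes details not shown here, such as untying the [CLS] token from positional attention via a learned reset bias.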