bpemb

Pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE)
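A minimal usage sketch of what working with these pre-trained BPE subword embeddings typically looks like in Python. The package name, constructor parameters, and example outputs below are assumptions for illustration, not taken from this page:

```python
# Sketch only: assumes the bpemb package is installed (pip install bpemb)
from bpemb import BPEmb

# Load English subword embeddings (downloads the model on first use).
# vs = vocabulary size (number of BPE subwords), dim = embedding dimension.
bpemb_en = BPEmb(lang="en", vs=25000, dim=100)

# Segment a word into BPE subword units
print(bpemb_en.encode("stratford"))   # e.g. ['▁strat', 'ford']

# Look up the embedding vectors, one row per subword
vectors = bpemb_en.embed("stratford")
print(vectors.shape)                  # e.g. (2, 100)
```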

Primary language: Python. License: MIT.
