mixture-of-experts

A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models.
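
Below is a minimal sketch of how a sparsely-gated mixture-of-experts layer works, not the API of this repository. The class name `SparseMoE` and its parameters (`dim`, `num_experts`, `top_k`, `hidden_mult`) are assumptions for illustration: a learned gating network routes each token to its top-k experts, and the expert outputs are combined using the renormalized gate weights.

```python
# Illustrative sketch of a sparsely-gated MoE layer; names and
# hyperparameters are assumptions, not this repository's API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, dim, num_experts=8, top_k=2, hidden_mult=4):
        super().__init__()
        self.top_k = top_k
        # gating network: one logit per expert for each token
        self.gate = nn.Linear(dim, num_experts, bias=False)
        # each expert is an independent feed-forward network
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(dim, dim * hidden_mult),
                nn.GELU(),
                nn.Linear(dim * hidden_mult, dim),
            )
            for _ in range(num_experts)
        ])

    def forward(self, x):
        # x: (batch, seq, dim) -> flatten to tokens for routing
        b, s, d = x.shape
        tokens = x.reshape(-1, d)

        logits = self.gate(tokens)                      # (n_tokens, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)            # renormalize over top-k only

        out = torch.zeros_like(tokens)
        for expert_idx, expert in enumerate(self.experts):
            # (token, slot) pairs routed to this expert
            token_mask, slot = (indices == expert_idx).nonzero(as_tuple=True)
            if token_mask.numel() == 0:
                continue
            # run the expert only on its assigned tokens, scale by gate weight
            out[token_mask] += weights[token_mask, slot].unsqueeze(-1) * expert(tokens[token_mask])

        return out.reshape(b, s, d)

# Example usage
moe = SparseMoE(dim=512, num_experts=8, top_k=2)
x = torch.randn(2, 1024, 512)
y = moe(x)  # (2, 1024, 512)
```

Because only `top_k` of the `num_experts` feed-forward networks run per token, total parameters grow with the number of experts while per-token compute stays roughly constant, which is the property the description above refers to.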

Primary language: Python. License: MIT.
