mixture-of-experts

A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models.
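The core idea can be sketched as follows: a learned gating network scores all experts per token, keeps only the top-k, and combines those experts' outputs weighted by the renormalised gate scores, so most experts stay idle for any given token. This is a minimal illustrative sketch of top-k gating, not this repository's actual API; all class and parameter names here are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKGateMoE(nn.Module):
    """Hypothetical minimal sparse-MoE layer: top-k gating over a set of
    feedforward experts. Names are illustrative, not the repo's API."""

    def __init__(self, dim, num_experts=8, k=2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(dim, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim * 4), nn.GELU(), nn.Linear(dim * 4, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):                       # x: (tokens, dim)
        logits = self.gate(x)                   # (tokens, num_experts)
        topk_vals, topk_idx = logits.topk(self.k, dim=-1)
        weights = F.softmax(topk_vals, dim=-1)  # renormalise over the chosen k
        out = torch.zeros_like(x)
        # Dense loops for clarity; real implementations dispatch tokens in batches.
        for slot in range(self.k):
            for e in range(len(self.experts)):
                mask = topk_idx[:, slot] == e   # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot, None] * self.experts[e](x[mask])
        return out

moe = TopKGateMoE(dim=32, num_experts=4, k=2)
tokens = torch.randn(10, 32)
out = moe(tokens)
print(out.shape)  # torch.Size([10, 32])
```

With k much smaller than the number of experts, total parameters grow with the expert count while per-token compute stays roughly proportional to k, which is what lets parameter counts scale massively.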

Primary language: Python · License: MIT
