codecaution/Awesome-Mixture-of-Experts-Papers
A curated reading list of research on Mixture-of-Experts (MoE).
License: Apache-2.0