codecaution/Awesome-Mixture-of-Experts-Papers

Three new papers about MoE

XueFuzhao opened this issue · 2 comments

Hi authors,
Thank you for your repo! I also created an awesome MoE repo recently, and I will add your new work to my repo.
https://github.com/XueFuzhao/awesome-mixture-of-experts
Also, I think a few of my papers are missing from your list:
Go Wider Instead of Deeper [AAAI2022]
Cross-token Modeling with Conditional Computation [5 Sep 2021]
One Student Knows All Experts Know: From Sparse to Dense [26 Jan 2022]

Thank you so much!

Hi, XueFu.
Thanks for your suggestions!
I have already added your papers to my list and added a contents section like the one in your repo.

Please also include our system paper HetuMoE:
HetuMoE: An Efficient Trillion-scale Mixture-of-Expert Distributed Training System [pdf] [github]
Thanks!

No problem! Done.