fkodom/soft-mixture-of-experts
PyTorch implementation of Soft MoE, from the Google Brain paper "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf).
Python | MIT License
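For readers new to the technique: Soft MoE replaces hard token-to-expert routing with fully differentiable mixing. Each slot takes a softmax-weighted combination of all tokens as input, each expert processes its own slots, and each output token takes a softmax-weighted combination of all slot outputs. The sketch below implements that algorithm in plain PyTorch; the `SoftMoE` class name, constructor arguments, and expert MLP shape are illustrative assumptions and may not match this repo's public API.

```python
import torch
import torch.nn as nn


class SoftMoE(nn.Module):
    """Minimal Soft MoE layer, following the paper's algorithm.

    Names and defaults here are illustrative sketches, not this repo's
    exact API.
    """

    def __init__(self, dim: int, num_experts: int, slots_per_expert: int = 1):
        super().__init__()
        self.num_experts = num_experts
        self.slots_per_expert = slots_per_expert
        # One learned d-dimensional vector per slot (Phi in the paper).
        self.phi = nn.Parameter(torch.randn(dim, num_experts * slots_per_expert))
        # Each expert is an independent MLP applied only to its own slots.
        self.experts = nn.ModuleList(
            [
                nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
                for _ in range(num_experts)
            ]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape  # (batch, tokens, dim)
        logits = torch.einsum("btd,ds->bts", x, self.phi)
        # Dispatch weights: softmax over tokens, so each slot's input is a
        # convex combination of all tokens (soft routing, no token dropping).
        dispatch = logits.softmax(dim=1)
        # Combine weights: softmax over slots, so each output token mixes
        # all slot outputs.
        combine = logits.softmax(dim=2)
        slots = torch.einsum("bts,btd->bsd", dispatch, x)
        # Route each expert's contiguous block of slots through that expert.
        slots = slots.reshape(b, self.num_experts, self.slots_per_expert, d)
        outs = torch.stack(
            [expert(slots[:, i]) for i, expert in enumerate(self.experts)], dim=1
        )
        outs = outs.reshape(b, -1, d)  # (batch, num_slots, dim)
        return torch.einsum("bts,bsd->btd", combine, outs)


# Example usage: a drop-in token-mixing layer for a Transformer block.
layer = SoftMoE(dim=256, num_experts=8, slots_per_expert=2)
y = layer(torch.randn(4, 128, 256))  # -> shape (4, 128, 256)
```

Because both softmaxes are dense, every token contributes to every slot and vice versa, which is what lets Soft MoE avoid the auxiliary load-balancing losses and token dropping of sparse MoE routing.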