moe
There are 111 repositories under the moe topic.
hiyouga/LLaMA-Factory
Unify Efficient Fine-Tuning of 100+ LLMs
czy0729/Bangumi
:electron: An unofficial UI-first app client for https://bgm.tv on Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit third-party bgm.tv client dedicated to ACG, similar to Douban, for tracking the shows you follow. Redesigned for mobile, it includes many enhanced features that are hard to implement in the web version and offers extensive customization options. Currently supports iOS / Android / WSA, mobile / basic tablet layouts, light / dark themes, and the mobile web.
PKU-YuanGroup/MoE-LLaVA
Mixture-of-Experts for Large Vision-Language Models
davidmrau/mixture-of-experts
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
open-compass/MixtralKit
A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI
pjlab-sys4nlp/llama-moe
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training
microsoft/tutel
Tutel MoE: An Optimized Mixture-of-Experts Implementation
ymcui/Chinese-Mixtral
Chinese Mixtral mixture-of-experts large models (Chinese Mixtral MoE LLMs)
kokororin/pixiv.moe
😘 A Pinterest-style layout site that shows illustrations from pixiv.net ordered by popularity.
mindspore-courses/step_into_llm
MindSpore online courses: Step into LLM
LISTEN-moe/android-app
Official LISTEN.moe Android app
inferflow/inferflow
Inferflow is an efficient and highly configurable inference engine for large language models (LLMs).
libgdx/gdx-pay
A libGDX cross-platform API for in-app purchasing.
IBM/ModuleFormer
ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters.
shalldie/chuncai
A lovely page wizard whose job is being moe.
LISTEN-moe/desktop-app
Official LISTEN.moe Desktop Client
xrsrke/pipegoose
Large-scale 4D-parallelism pre-training of Mixture-of-Experts models for 🤗 transformers *(still a work in progress)*
phanirithvij/twist.moe
Batch-download high-quality videos from https://twist.moe
ianhom/MOE
MOE is an event-driven OS for 8/16/32-bit MCUs. MOE stands for "Minds Of Embedded system"; it's also the name of my lovely baby daughter :sunglasses:
kyegomez/MoE-Mamba
Implementation of MoE-Mamba from the paper "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts" in PyTorch and Zeta
LISTEN-moe/windows-app
Official LISTEN.moe Windows-only Client
marisukukise/japReader
japReader is an app for breaking down Japanese sentences and tracking vocabulary progress
sahuang/priconne-rainbow-fart
"President, I'm hanging on the tree" - a Princess Connect! voice pack for the vscode-rainbow-fart extension (Priconne extension vocal pack)
dragonzurfer/moe
A command line tool for all things anime
dsrkafuu/moe-counter-cf
Fork of Moe Counter powered by Cloudflare Workers.
Harry-Chen/InfMoE
Inference framework for MoE layers based on TensorRT with Python binding
VITA-Group/Random-MoE-as-Dropout
[ICLR 2023] "Sparse MoE as the New Dropout: Scaling Dense and Self-Slimmable Transformers" by Tianlong Chen*, Zhenyu Zhang*, Ajay Jaiswal, Shiwei Liu, Zhangyang Wang
facebookresearch/AdaTT
PyTorch open-source library for the paper "AdaTT: Adaptive Task-to-Task Fusion Network for Multitask Learning in Recommendations"
haxpor/blockbunny
libGDX-based game for Android, iOS, and PC, following the tutorial from ForeignGuyMike's YouTube channel. Read more in README.md
MoeFE/MoeUI
UI component library built with Vue.js (Moe is Justice!!!)
LISTEN-moe/discord-bot
Official LISTEN.moe Discord Bot. Add it to your server!
LISTEN-moe/browser-extension
Official LISTEN.moe browser extension
YeonwooSung/Pytorch_mixture-of-experts
PyTorch implementation of MoE, which stands for mixture of experts