⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training

Primary language: Python · License: Apache-2.0