suyanzhou626/llama-moe
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training
Python · Apache-2.0