Radiantloom-Mixtral-8X7B-Fusion

Radiantloom Mixtral 8X7B Fusion is a large language model (LLM) developed by AI Geek Labs. It features approximately 47 billion parameters and employs a Mixture of Experts (MoE) architecture. The model has a context length of 4096 tokens and is suitable for commercial use.
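Below is a minimal usage sketch for loading the model with the Hugging Face Transformers library. The repository id shown is an assumption and should be replaced with the model's actual Hub id; hardware requirements (a 47B-parameter MoE needs substantial GPU memory, even in bfloat16) are likewise only indicative.

```python
# Sketch: loading and prompting the model via Transformers.
# The repo id below is a placeholder assumption, not the confirmed Hub id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aigeek0x0/radiantloom-mixtral-8x7b-fusion"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # reduce memory; the full MoE is still large
    device_map="auto",           # spread layers across available GPUs
)

prompt = "Explain the Mixture of Experts architecture in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep the prompt plus generated tokens within the 4096-token context window.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```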
