TIGER-AI-Lab/MMLU-Pro

Add Tencent Hunyuan-Large

Opened this issue · 2 comments

They claim an overall MMLU-Pro score of 60.2.

The recently unveiled Hunyuan-Large (Hunyuan-MoE-A52B) is currently the largest open-source Transformer-based MoE model in the industry, featuring 389 billion total parameters and 52 billion active parameters.

https://huggingface.co/tencent/Tencent-Hunyuan-Large

Agreed, this would be very interesting to see


The 60.2 is for the base model; the instruct version is probably better.