Pinned Repositories
MS-AMP
Microsoft Automatic Mixed Precision Library
LLaMA-Factory
Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
HNU-Auto-Clockin
HNU daily automatic check-in
Mrzhang-dada.github.io
My Hexo blog
Megatron-LM
Ongoing research training transformer models at scale
TransformerEngine
A library for accelerating Transformer models on NVIDIA GPUs, including using 8-bit floating point (FP8) precision on Hopper and Ada GPUs, to provide better performance with lower memory utilization in both training and inference.
human-eval
Code for the paper "Evaluating Large Language Models Trained on Code"