Pinned Repositories
DeepSpeedExamples
Example models using DeepSpeed
Megatron-LM
Ongoing research training transformer language models at scale, including BERT & GPT-2
shufflenetv2-tensorflow2.0
ShuffleNet V2 implemented in TensorFlow 2.0 (tf2.0, tf-keras)
TA-MoE
NPKit
NCCL Profiling Kit
nccl
Optimized primitives for collective multi-GPU communication
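To illustrate the collective primitives the nccl repository provides, below is a minimal single-process sketch that sum-reduces a buffer across local GPUs with ncclCommInitAll and ncclAllReduce. The device count, buffer size, fill values, and error-checking macro are illustrative assumptions, not code taken from any of the repositories listed here.

/* Minimal NCCL all-reduce sketch: one process drives ndev local GPUs. */
#include <cuda_runtime.h>
#include <nccl.h>
#include <stdio.h>
#include <stdlib.h>

/* Assumed helper: abort on any non-zero CUDA or NCCL result code. */
#define CHECK(cmd) do { \
    int e = (int)(cmd); \
    if (e != 0) { fprintf(stderr, "error %d at %s:%d\n", e, __FILE__, __LINE__); exit(1); } \
} while (0)

int main(void) {
  const int ndev = 2;              /* assumed number of local GPUs */
  const size_t count = 1 << 20;    /* elements per per-GPU buffer */

  ncclComm_t comms[2];
  float *sendbuf[2], *recvbuf[2];
  cudaStream_t streams[2];

  /* Allocate a send/recv buffer pair and a stream on each device. */
  for (int i = 0; i < ndev; ++i) {
    CHECK(cudaSetDevice(i));
    CHECK(cudaMalloc((void **)&sendbuf[i], count * sizeof(float)));
    CHECK(cudaMalloc((void **)&recvbuf[i], count * sizeof(float)));
    CHECK(cudaMemset(sendbuf[i], 1, count * sizeof(float)));  /* arbitrary fill bytes */
    CHECK(cudaStreamCreate(&streams[i]));
  }

  /* One communicator per device, all owned by this single process
   * (NULL device list means devices 0..ndev-1). */
  CHECK(ncclCommInitAll(comms, ndev, NULL));

  /* Sum-reduce every device's send buffer into every device's recv buffer. */
  CHECK(ncclGroupStart());
  for (int i = 0; i < ndev; ++i)
    CHECK(ncclAllReduce(sendbuf[i], recvbuf[i], count, ncclFloat, ncclSum,
                        comms[i], streams[i]));
  CHECK(ncclGroupEnd());

  /* Wait for the collective to complete on every device. */
  for (int i = 0; i < ndev; ++i) {
    CHECK(cudaSetDevice(i));
    CHECK(cudaStreamSynchronize(streams[i]));
  }

  /* Cleanup. */
  for (int i = 0; i < ndev; ++i) {
    ncclCommDestroy(comms[i]);
    cudaFree(sendbuf[i]);
    cudaFree(recvbuf[i]);
    cudaStreamDestroy(streams[i]);
  }
  return 0;
}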
Chen-Chang's Repositories
Chen-Chang/TA-MoE
Chen-Chang/DeepSpeedExamples
Example models using DeepSpeed
Chen-Chang/Megatron-LM
Ongoing research training transformer language models at scale, including BERT & GPT-2
Chen-Chang/shufflenetv2-tensorflow2.0
ShuffleNet V2 implemented in TensorFlow 2.0 (tf2.0, tf-keras)