Pinned Repositories
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
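In practice the library is driven by a JSON-style config dict passed at initialization. A minimal sketch with illustrative values (the keys shown are standard DeepSpeed config fields; the batch size and stage are placeholders, not values from this repo):

```python
# A minimal DeepSpeed-style training config (illustrative values).
ds_config = {
    "train_batch_size": 32,             # global batch size across all GPUs
    "fp16": {"enabled": True},          # mixed-precision training
    "zero_optimization": {"stage": 2},  # ZeRO stage 2: shard optimizer state and gradients
}

# The dict is passed to deepspeed.initialize(model=..., config=ds_config);
# the call itself is omitted here since it requires GPUs and the deepspeed package.
print(ds_config["zero_optimization"]["stage"])
```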
Megatron-DeepSpeed
Ongoing research on training transformer language models at scale, including BERT and GPT-2
Megatron-LM
Ongoing research on training transformer models at scale
pytorch
Tensors and Dynamic neural networks in Python with strong GPU acceleration
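A minimal sketch of the two ideas in that description, tensors and dynamically built autograd graphs (assumes the torch package is installed):

```python
import torch

# A tensor that tracks gradients; the graph is built imperatively as ops run.
x = torch.tensor([2.0, 3.0], requires_grad=True)

# y = sum(x^2), defined on the fly rather than in a static graph.
y = (x ** 2).sum()

# Reverse-mode autodiff: dy/dx = 2x.
y.backward()

print(x.grad)  # tensor([4., 6.])
```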
DeepSpeed_tzw
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
pytorch_tzw
Tensors and Dynamic neural networks in Python with strong GPU acceleration
xDiT
xDiT: A Scalable Inference Engine for Diffusion Transformers (DiTs) on multi-GPU Clusters
taozhiwei's Repositories
taozhiwei/Megatron-DeepSpeed
Ongoing research on training transformer language models at scale, including BERT and GPT-2
taozhiwei/pytorch_tzw
Tensors and Dynamic neural networks in Python with strong GPU acceleration
taozhiwei/xDiT
xDiT: A Scalable Inference Engine for Diffusion Transformers (DiTs) on multi-GPU Clusters
taozhiwei/DeepSpeed_enflame
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.