Pinned Repositories
Megatron-DeepSpeed
Ongoing research on training transformer language models at scale, including BERT and GPT-2
transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
DeepSpeedExamples
Example models using DeepSpeed
DLWorkspace
Deep Learning Workspace
sc-dnn
transformers-bloom-inference
Fast Inference Solutions for BLOOM
tjruwase's Repositories
tjruwase/DLWorkspace
Deep Learning Workspace
tjruwase/transformers
🤗 Transformers: State-of-the-art Natural Language Processing for Pytorch, TensorFlow, and JAX.
tjruwase/transformers-bloom-inference
Fast Inference Solutions for BLOOM
tjruwase/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
tjruwase/sc-dnn