Pinned Repositories
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
alpa
Training and serving large-scale neural networks
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
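A core memory optimization in DeepSpeed is ZeRO, which partitions optimizer state across data-parallel ranks instead of replicating it. The toy sketch below is not DeepSpeed's API (DeepSpeed is configured via `deepspeed.initialize` and a JSON config); it only illustrates the partitioning idea, with an invented `shard_parameters` helper:

```python
# Toy sketch of the ZeRO idea behind DeepSpeed: instead of every
# data-parallel rank holding a full copy of the optimizer state,
# each rank owns only its 1/N shard. NOT DeepSpeed's API — just an
# illustration of the memory-partitioning principle.

def shard_parameters(param_ids, world_size):
    """Assign each parameter id to exactly one rank, round-robin."""
    shards = {rank: [] for rank in range(world_size)}
    for i, pid in enumerate(param_ids):
        shards[i % world_size].append(pid)
    return shards

params = [f"layer{i}.weight" for i in range(8)]
shards = shard_parameters(params, world_size=4)

# Each of the 4 ranks now stores optimizer state for only 2 of the
# 8 parameters, cutting per-rank optimizer memory roughly 4x.
```

In real ZeRO the shards are gathered on demand during the optimizer step; the sketch stops at the ownership assignment.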
envpool
C++-based high-performance parallel environment execution engine (vectorized env) for general RL environments.
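A vectorized environment engine holds N environment instances and steps them all through one batched call. envpool does this in C++ with thread pools for speed; the pure-Python sketch below (with an invented `CountingEnv`) only shows the batched-step semantics, not envpool's actual interface:

```python
import numpy as np

# Pure-Python sketch of a vectorized ("batched") RL environment.
# envpool implements this pattern in C++; CountingEnv and VectorEnv
# here are invented toys illustrating the semantics only.

class CountingEnv:
    """Trivial env: the state counts steps; the episode ends at 5."""
    def reset(self):
        self.t = 0
        return self.t

    def step(self, action):
        self.t += 1
        done = self.t >= 5
        return self.t, float(action), done

class VectorEnv:
    def __init__(self, num_envs):
        self.envs = [CountingEnv() for _ in range(num_envs)]

    def reset(self):
        return np.array([e.reset() for e in self.envs])

    def step(self, actions):
        obs, rew, done = zip(*(e.step(a) for e, a in zip(self.envs, actions)))
        # Auto-reset finished episodes, as vectorized engines commonly do.
        obs = [e.reset() if d else o for e, o, d in zip(self.envs, obs, done)]
        return np.array(obs), np.array(rew), np.array(done)

venv = VectorEnv(num_envs=4)
obs = venv.reset()                    # one call resets all 4 envs
obs, rew, done = venv.step(np.ones(4))  # one call steps all 4 envs
```

The payoff is that the policy sees a batch of observations per call, so inference and environment execution amortize across environments.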
gear
A distributed GPU-centric experience replay system for large AI models.
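Experience replay systems store transitions as agents generate them and serve random batches back for training. gear distributes that storage across GPUs; the minimal ring-buffer sketch below is a generic illustration of the store-then-sample pattern, not gear's API:

```python
import random
from collections import deque

# Minimal experience-replay buffer: a fixed-capacity ring buffer
# plus uniform random sampling. Generic sketch only — gear shards
# this kind of storage across GPUs for large models.

class ReplayBuffer:
    def __init__(self, capacity):
        self.buf = deque(maxlen=capacity)  # oldest entries evicted first

    def add(self, transition):
        self.buf.append(transition)

    def sample(self, batch_size):
        return random.sample(self.buf, batch_size)

    def __len__(self):
        return len(self.buf)

rb = ReplayBuffer(capacity=100)
for t in range(10):
    rb.add((t, "obs", "action", float(t)))  # (step, obs, action, reward)
batch = rb.sample(4)  # uniform random minibatch for a training step
```

Production replay systems add prioritized sampling and concurrent writers; the ring-buffer-plus-sample core stays the same.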
Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
mksit.github.io
pybind11
Seamless operability between C++11 and Python
mksit's Repositories
mksit/alpa
Training and serving large-scale neural networks
mksit/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
mksit/envpool
C++-based high-performance parallel environment execution engine (vectorized env) for general RL environments.
mksit/gear
A distributed GPU-centric experience replay system for large AI models.
mksit/Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
mksit/mksit.github.io
mksit/pybind11
Seamless operability between C++11 and Python
mksit/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.