Pinned Repositories
flash-attention
Fast and memory-efficient exact attention
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
Hydro
Surrogate-based Hyperparameter Tuning System
ASPLOS23
myinfo
Sylvie
torchgt
zxmeng98.github.io
GitHub Pages template for academic personal websites, forked from mmistakes/minimal-mistakes