Pinned Repositories
Megatron-LM
Ongoing research on training transformer models at scale
OFA
Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework
yh351016's Repositories
yh351016/OFA
yh351016/Megatron-LM