sequence-parallelism
There are 4 repositories under the sequence-parallelism topic.
InternLM/InternEvo
InternEvo is an open-source, lightweight training framework that aims to support model pre-training without the need for extensive dependencies.
xrsrke/pipegoose
Large-scale 4D parallelism pre-training for 🤗 transformers with Mixture of Experts *(still a work in progress)*
AlibabaPAI/FlashModels
Fast and easy distributed model training examples.
InternLM/InternEvo-HFModels
Democratizing Hugging Face model training with InternEvo