EleutherAI/oslo

Integrate ZeroDDP and ShardedModelV2 from ColossalAI

dongsungkim opened this issue · 1 comment

Describe a TODO feature

There are two versions of ZeRO support in ColossalAI: ZeroDDP and ShardedModelV2.

  1. Check the possibility of merging the two into one.
  2. Otherwise, add a function that chooses one of them based on an internal flag (not exposed to users directly); see the sketch after this list.
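A minimal sketch of option 2, under the assumption of a hypothetical oslo-internal helper. The names `_ZeroBackend`, `_select_zero_backend`, and `use_sharded_model` are illustrative only and are not existing oslo or ColossalAI APIs; the actual wrapping calls are omitted because the ZeroDDP / ShardedModelV2 constructor signatures depend on the ColossalAI version being integrated.

```python
# Hypothetical sketch of option 2: one internal switch that decides between
# the two ColossalAI implementations. All names here are illustrative.
from enum import Enum


class _ZeroBackend(Enum):
    ZERO_DDP = "zero_ddp"            # ColossalAI ZeroDDP
    SHARDED_MODEL_V2 = "sharded_v2"  # ColossalAI ShardedModelV2


def _select_zero_backend(use_sharded_model: bool = False) -> _ZeroBackend:
    """Pick the ZeRO implementation from an internal flag.

    The flag is not exposed to users directly; oslo would set it internally,
    e.g. based on the requested ZeRO stage.
    """
    return _ZeroBackend.SHARDED_MODEL_V2 if use_sharded_model else _ZeroBackend.ZERO_DDP


# Example: choose the backend, then wrap the model with the corresponding
# ColossalAI class (the wrapping call itself is version-dependent and omitted).
backend = _select_zero_backend(use_sharded_model=True)
print(backend)  # _ZeroBackend.SHARDED_MODEL_V2
```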

Assignees

  • Dongsung and Hyen

https://github.com/hpcaitech/ColossalAI/blob/6e51d296f07c0ad34d7f85cf9a70d4ceee15ede7/colossalai/zero/sharded_optim/low_level_optim.py#L29

There are updates to ZeRO-DP in ColossalAI:
ZeRO-1 -> LowLevelZeroOptimizer
ZeRO-2 -> LowLevelZeroOptimizer (partition_grad=True)
ZeRO-3 -> ShardedOptimizerV2, ShardedModelV2
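For reference, a small sketch of how this stage-to-component mapping could be encoded on the oslo side. The dictionary and `stage_to_components` helper below are hypothetical, not part of oslo or ColossalAI; only the class names come from the mapping above.

```python
# Illustrative mapping of ZeRO stages to the ColossalAI components listed
# above. _ZERO_STAGE_MAP and stage_to_components() are hypothetical helpers.
from typing import Any, Dict

_ZERO_STAGE_MAP: Dict[int, Dict[str, Any]] = {
    # ZeRO-1: optimizer states are sharded.
    1: {"optimizer": "LowLevelZeroOptimizer", "options": {"partition_grad": False}, "model_wrapper": None},
    # ZeRO-2: optimizer states and gradients are partitioned.
    2: {"optimizer": "LowLevelZeroOptimizer", "options": {"partition_grad": True}, "model_wrapper": None},
    # ZeRO-3: parameters are sharded as well, via the v2 sharded classes.
    3: {"optimizer": "ShardedOptimizerV2", "options": {}, "model_wrapper": "ShardedModelV2"},
}


def stage_to_components(stage: int) -> Dict[str, Any]:
    """Return the ColossalAI component names for a requested ZeRO stage."""
    if stage not in _ZERO_STAGE_MAP:
        raise ValueError(f"Unsupported ZeRO stage: {stage}")
    return _ZERO_STAGE_MAP[stage]


print(stage_to_components(2))
# {'optimizer': 'LowLevelZeroOptimizer', 'options': {'partition_grad': True}, 'model_wrapper': None}
```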