large-model-parallelism

Functional local implementations of the main model parallelism approaches.

Primary language: Jupyter Notebook. License: GNU General Public License v3.0 (GPL-3.0).

[Illustration: large-model-parallelism]

Model parallelism 101

Learn how model parallelism enables training models like Stable Diffusion and ChatGPT, in less than 300 lines of code. This notebook provides practical local implementations of the main model parallelism methods. It explores three approaches: data parallelism, tensor parallelism, and pipeline parallelism, illustrated with a 2-layer MLP example that extends naturally to more complex models.
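To give a flavor of what the notebook covers, here is a minimal sketch of one of the three approaches, tensor parallelism, applied to a 2-layer MLP. This is not the notebook's own code: the shapes, variable names, and the use of NumPy arrays as stand-in "devices" are assumptions for illustration. The first layer's weights are split by columns and the second layer's by rows, so each shard produces a partial output; summing the partials (an all-reduce in a real distributed setting) recovers the unsharded result.

```python
import numpy as np

# Hypothetical 2-layer MLP: y = relu(x @ W1) @ W2 (shapes chosen for illustration)
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))       # batch of 4, input dim 8
W1 = rng.normal(size=(8, 16))
W2 = rng.normal(size=(16, 8))

relu = lambda a: np.maximum(a, 0)

# Reference: unsharded forward pass
y_ref = relu(x @ W1) @ W2

# Tensor parallelism across 2 "devices":
# split W1 by columns and W2 by rows, so the shards pair up
W1_shards = np.split(W1, 2, axis=1)   # each (8, 8)
W2_shards = np.split(W2, 2, axis=0)   # each (8, 8)

# Each device computes relu(x @ W1_i) @ W2_i on its shard; summing the
# partial outputs plays the role of the all-reduce.
y_tp = sum(relu(x @ w1) @ w2 for w1, w2 in zip(W1_shards, W2_shards))

assert np.allclose(y_ref, y_tp)
```

The column/row split works because ReLU is elementwise: concatenating `relu(x @ W1_i)` column-wise and multiplying by the row-stacked `W2` is exactly the sum of the per-shard products.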

Reading this notebook will give you a solid overview of model parallelism techniques and an intuition for how to implement them.

Pull requests welcome. Illustration above generated with Lexica's Aperture model.