
Awesome 3D Diffusion

This repo collects papers that use diffusion models for 3D generation.

🔥🔥🔥 Check out our collection of papers on 4D generation: https://github.com/cwchenwang/awesome-4d-generation

🔥🔥🔥 Please take a look at our survey on diffusion models for 3D generation, which summarizes the papers in this list: https://github.com/cwchenwang/awesome-3d-diffusion/blob/main/survey.pdf

If you find our survey or this list useful, please cite our paper:

@misc{wang2024diffusion,
  title={Diffusion Models for 3D Generation: A Survey},
  author={Wang, Chen and Peng, Hao-Yang and Liu, Ying-Tian and Gu, Jiatao and Hu, Shi-Min},
  howpublished={\url{https://github.com/cwchenwang/awesome-3d-diffusion}},
  year={2024}
}

Note: This list is far from complete. If you would like to add a paper or correct its information, please open a pull request directly; there is no need to open an issue first.

Table of Contents

- 2D Diffusion with Pretraining
  - Text-to-3D Object Generation
  - Compositional or Scene Generation
  - Image-to-3D
  - Human and Animal
  - 3D Editing
  - Texturing
- Multi-view Diffusion
- 2D Diffusion without Pretraining
  - 3D Objects
  - 3D Scenes
- Diffusion in 3D Space
  - 3D Gaussians
  - Point Clouds, Meshes, Volumes
  - Implicit Representation
  - Triplane
  - Latent Representation
  - Novel Representations
- Diffusion for Motion
  - Human Motion