A comprehensive list of papers on Large Language Diffusion Models (LLDMs).
Important

Contributions welcome:
- If you have a relevant paper that is not yet included, please contact us, or consider submitting a pull request directly. Thank you!
- If you think your paper fits better in another category, please contact us or submit a pull request.
- If your paper is accepted, please consider updating its information here.
- 🔥🔥🔥 Awesome-LLDM is now open!
- Gemini Diffusion
- Dream-7B
- DreamOn
- What are Diffusion Language Models?
- Generative Modeling by Estimating Gradients of the Data Distribution
| Paper Title | Year | Conference/Journal | Remark |
|---|---|---|---|
| Discrete Diffusion in Large Language and Multimodal Models: A Survey | 2025 | arXiv | |
| Diffusion-based Large Language Models Survey | 2025 | arXiv | |
| A Survey on Parallel Text Generation: From Parallel Decoding to Diffusion Language Models | 2025 | arXiv | |
| Paper Title | Year | Conference/Journal | Remark |
|---|---|---|---|
| David helps Goliath: Inference-Time Collaboration Between Small Specialized and Large General Diffusion LMs | 2023 | NAACL | |
| Diffusion Language Models Can Perform Many Tasks with Scaling and Instruction-Finetuning | 2023 | arXiv | |
| TESS 2: A Large-Scale Generalist Diffusion Language Model | 2025 | ACL | Adapted from Mistral-7B-v0.1 |
| Scaling Diffusion Language Models via Adaptation from Autoregressive Models | 2025 | ICLR | 127M~7B (GPT2, LLaMA2) |
| Large Language Diffusion Models | 2025 | arXiv | LLaDA-8B |
| LLaDA 1.5: Variance-Reduced Preference Optimization for Large Language Diffusion Models | 2025 | arXiv | |
| Large Language Models to Diffusion Finetuning | 2025 | arXiv | |
| LongLLaDA: Unlocking Long Context Capabilities in Diffusion LLMs | 2025 | arXiv | Long context scaling |
| Dream 7B: Diffusion Large Language Models | 2025 | arXiv | |
| UltraLLaDA: Scaling the Context Length to 128K for Diffusion Large Language Models | 2025 | arXiv | |
We welcome all researchers to contribute to this repository.
If you have a related paper that is not yet included, please contact us.
Email: jake630@snu.ac.kr / wjk9904@snu.ac.kr / qicher@snu.ac.kr