Discrete Diffusion Models: Novel Analysis and New Sampler Guarantees
Yuchen Liang, Yingbin Liang, Lifeng Lai, Ness Shroff
arXiv.org Artificial Intelligence
Discrete diffusion models have recently gained significant prominence in applications involving natural language and graph data. A key factor influencing their effectiveness is the efficiency of discretized samplers. Among these, $\tau$-leaping samplers have become particularly popular due to their theoretical and empirical success. However, existing theoretical analyses of $\tau$-leaping often rely on somewhat restrictive and difficult-to-verify regularity assumptions, and their convergence bounds contain quadratic dependence on the vocabulary size. In this work, we introduce a new analytical approach for discrete diffusion models that removes the need for such assumptions. For the standard $\tau$-leaping method, we establish convergence guarantees in KL divergence that scale linearly with vocabulary size, improving upon prior results with quadratic dependence. Our approach is also more broadly applicable: it provides the first convergence guarantees for other widely used samplers, including the Euler method and Tweedie $\tau$-leaping. Central to our approach is a novel technique based on differential inequalities, offering a more flexible alternative to the traditional Girsanov change-of-measure methods. This technique may also be of independent interest for the analysis of other stochastic processes.
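The $\tau$-leaping idea the abstract analyzes can be illustrated with a generic continuous-time Markov chain update for a single token: hold the jump rates fixed over a window of length $\tau$, draw Poisson jump counts per transition channel, and apply them all at once. The sketch below is a minimal illustration under those assumptions, not the paper's exact sampler; the generator matrix `Q` and the uniform tie-breaking rule for simultaneous jumps are simplifications introduced here for clarity.

```python
import numpy as np

def tau_leap_step(x, Q, tau, rng):
    """One tau-leaping step for a CTMC over a vocabulary of size S.

    x   : current state index (int in [0, S))
    Q   : (S, S) generator matrix; off-diagonal entries are jump rates,
          rows sum to zero
    tau : step size over which the rates are frozen
    rng : np.random.Generator
    """
    rates = Q[x].copy()
    rates[x] = 0.0  # no self-jump channel
    # Poisson number of firings per target state over [t, t + tau]
    jumps = rng.poisson(rates * tau)
    targets = np.nonzero(jumps)[0]
    if len(targets) == 0:
        return x  # no channel fired; state unchanged
    # Categorical states are not additive, so if several channels fire
    # within one leap we resolve the conflict by a uniform choice
    # (an illustrative convention, not the paper's construction).
    return int(rng.choice(targets))

# Usage: a uniform-rate generator over a 4-symbol vocabulary
S = 4
Q = np.full((S, S), 0.5)
np.fill_diagonal(Q, -(S - 1) * 0.5)
rng = np.random.default_rng(0)
x = 0
for _ in range(100):
    x = tau_leap_step(x, Q, tau=0.1, rng=rng)
```

The step size $\tau$ controls the discretization error that the paper's convergence bounds quantify: larger leaps fire more channels per window and stress the frozen-rate approximation.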
Nov-3-2025