Any-Order Flexible Length Masked Diffusion

Jaeyeon Kim, Cheuk-Kit Lee, Carles Domingo-Enrich, Yilun Du, Sham Kakade, Timothy Ngotiaoco, Sitan Chen, Michael Albergo

arXiv.org Artificial Intelligence 

Early diffusion models were formulated as discrete-time Markov chains over continuous spaces with Gaussian transition kernels (Sohl-Dickstein et al., 2015; Ho et al., 2020), and were later connected to continuous-time formulations via stochastic differential equations, offering a unifying perspective on score-based generative modeling (Song et al., 2020). In parallel, discrete diffusion was developed from the viewpoint of Markov chains over discrete spaces (Hoogeboom et al., 2021). Notably, Austin et al. (2021) introduced D3PM with several families of discrete transition kernels, and Lou et al. (2023) proposed SEDD, which adopts a score-based training objective. A complementary line of work studies discrete flows (Campbell et al., 2024; Gat et al., 2024), which seek to understand continuous-time Markov chains (CTMCs) interpolating between the data and base distributions; this perspective aligns with ours. Subsequent extensions consider token-wise and path-wise structure within such flows (Shaul et al., 2024).
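To make the discrete-kernel idea concrete, the following is a minimal sketch of the absorbing ("masking") family of transition kernels from D3PM, where each token is independently replaced by a special mask token with a probability that grows with diffusion time. The linear schedule, the `mask_id` value, and the function name are illustrative assumptions, not the exact choices of any cited paper.

```python
import numpy as np

def absorbing_forward(x0, t, mask_id, rng):
    """Sample x_t from a D3PM-style absorbing (masking) kernel.

    Each token of x0 is independently replaced by mask_id with
    probability t (t in [0, 1]), so t = 0 leaves the sequence intact
    and t = 1 yields a fully masked sequence. The linear-in-t
    masking probability is an illustrative schedule choice.
    """
    x0 = np.asarray(x0)
    mask = rng.random(x0.shape) < t  # independent per-token masking
    return np.where(mask, mask_id, x0)

rng = np.random.default_rng(0)
tokens = np.array([5, 2, 9, 7, 1])
print(absorbing_forward(tokens, 0.0, mask_id=-1, rng=rng))  # → [5 2 9 7 1]
print(absorbing_forward(tokens, 1.0, mask_id=-1, rng=rng))  # → [-1 -1 -1 -1 -1]
```

Because each token is masked independently, the marginal of the chain at time t factorizes over positions, which is what makes per-token training objectives (as in D3PM and SEDD) tractable.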