Distributional Diffusion Models with Scoring Rules
De Bortoli, Valentin, Galashov, Alexandre, Guntupalli, J. Swaroop, Zhou, Guangyao, Murphy, Kevin, Gretton, Arthur, Doucet, Arnaud
Diffusion models generate high-quality synthetic data. They operate by defining a continuous-time forward process which gradually adds Gaussian noise to data until fully corrupted. The corresponding reverse process progressively "denoises" a Gaussian sample into a sample from the data distribution. However, generating high-quality outputs requires many discretization steps to obtain a faithful approximation of the reverse process. This is expensive and has motivated the development of many acceleration methods. We propose to accomplish sample generation by learning the posterior *distribution* of clean data samples given their noisy versions, instead of only the mean of this distribution. This allows us to sample from the probability transitions of the reverse process on a coarse time scale, significantly accelerating inference with minimal degradation of the quality of the output. This is accomplished by replacing the standard regression loss used to estimate conditional means with a scoring rule. We validate our method on image and robot trajectory generation, where we consistently outperform standard diffusion models at few discretization steps.
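The abstract does not specify which scoring rule is used; as context, a widely used proper scoring rule for distributional prediction is the energy score, which rewards samples that are close to the target while remaining diverse. Below is a minimal NumPy sketch of a sample-based energy score loss; the function name and interface are illustrative, not taken from the paper.

```python
import numpy as np

def energy_score_loss(samples, target):
    """Sample-based estimate of the (negative) energy score, to be minimized.

    samples: (m, d) array of draws from the model's conditional distribution
             (requires m >= 2 for the repulsion term)
    target:  (d,) array, the clean data point

    ES = E||X - y|| - 0.5 * E||X - X'||
    The first term pulls samples toward the target; the second term
    pushes samples apart, rewarding a genuinely distributional prediction.
    """
    m = samples.shape[0]
    # Attraction: average distance from each sample to the target
    attraction = np.mean(np.linalg.norm(samples - target, axis=1))
    # Repulsion: average pairwise distance among samples (off-diagonal pairs)
    diffs = samples[:, None, :] - samples[None, :, :]
    repulsion = np.sum(np.linalg.norm(diffs, axis=2)) / (m * (m - 1))
    return attraction - 0.5 * repulsion
```

When all samples coincide with the target the loss is zero, and spreading samples around the target lowers the loss relative to concentrating them at a wrong point, which is the behavior a proper scoring rule should exhibit.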
Feb-4-2025