Exploring Diffusion and Flow Matching Under Generator Matching
Zeeshan Patel, James DeLoye, Lance Mathias
arXiv.org Artificial Intelligence
Recent techniques in deep generative modeling have leveraged Markov generative processes to learn complex, high-dimensional probability distributions in a more structured and flexible manner [17]. By integrating Markov chain methods with deep neural architectures, these approaches aim to exploit the representational power of deep networks while maintaining a tractable and theoretically grounded training procedure. In contrast to early generative models that relied heavily on direct maximum likelihood estimation or adversarial objectives, this class of methods employs iterative stochastic transformations, often expressed as Markovian updates, to gradually refine initial noise samples into samples drawn from the desired target distribution. Diffusion and flow matching models represent two prominent classes of generative approaches that construct data samples through a sequence of continuous transformations. Diffusion models [6, 13] introduce a forward-noising and reverse-denoising process, progressively refining a simple noise distribution into a complex target distribution by learning to undo incremental noise corruption at each step.
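The forward-noising process mentioned above can be sketched in a few lines. The following is a minimal illustration of the standard DDPM-style closed-form noising step; the linear `betas` schedule and the names `alpha_bar` and `forward_noise` are common diffusion conventions chosen for illustration, not quantities defined in this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 1000                                # number of diffusion steps
betas = np.linspace(1e-4, 0.02, T)      # linear noise schedule (illustrative)
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)          # cumulative signal coefficient

def forward_noise(x0, t, eps):
    """Sample x_t ~ q(x_t | x_0) in closed form: a scaled copy of the
    data plus Gaussian noise, with the mix controlled by alpha_bar[t]."""
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

x0 = rng.standard_normal(8)             # toy "data" sample
eps = rng.standard_normal(8)            # noise to inject
x_early = forward_noise(x0, 10, eps)    # barely corrupted, close to x0
x_late = forward_noise(x0, T - 1, eps)  # almost pure noise, close to eps

# As t grows, sqrt(alpha_bar[t]) -> 0, so x_t approaches the simple noise
# prior from which the learned reverse (denoising) process starts.
print(alpha_bar[-1])
```

The reverse process, which the model learns, walks these steps backward: starting from a sample of pure noise, it repeatedly predicts and removes the incremental corruption until a data-like sample remains.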
Dec-17-2024