Universality of Gaussian-Mixture Reverse Kernels in Conditional Diffusion
Nafiz Ishtiaque, Syed Arefinul Haque, Kazi Ashraful Alam, Fatima Jahara
We prove that conditional diffusion models whose reverse kernels are finite Gaussian mixtures with ReLU-network logits can approximate suitably regular target distributions arbitrarily well in context-averaged conditional KL divergence, up to an irreducible terminal mismatch that typically vanishes with increasing diffusion horizon. A path-space decomposition reduces the output error to this mismatch plus per-step reverse-kernel errors; assuming each reverse kernel factors through a finite-dimensional feature map, each step becomes a static conditional density approximation problem, solved by composing Norets' Gaussian-mixture theory with quantitative ReLU bounds. Under exact terminal matching the resulting neural reverse-kernel class is dense in conditional KL.
Apr-16-2026
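The reverse-kernel class in the abstract — a finite Gaussian mixture whose mixing weights come from a ReLU network — can be sketched as follows. This is a minimal illustration, not the paper's construction: all class and parameter names are hypothetical, the conditioning inputs (current state, context, step index) are flattened into a single feature vector, and for simplicity the same ReLU network also emits the component means and log-scales.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

class MixtureReverseKernel:
    """Hypothetical sketch: p(x_{t-1} | feat) = sum_k w_k(feat) N(mu_k(feat), sigma_k(feat)^2 I),
    with a one-hidden-layer ReLU network producing logits, means, and log-scales."""

    def __init__(self, feat_dim, n_components, x_dim, hidden=32):
        self.K, self.d = n_components, x_dim
        out = n_components * (1 + 2 * x_dim)  # K logits + K means + K log-scales
        self.W1 = rng.normal(0.0, 0.3, (hidden, feat_dim))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.3, (out, hidden))
        self.b2 = np.zeros(out)

    def params(self, feat):
        # Forward pass of the ReLU network, then split its output head.
        h = relu(self.W1 @ feat + self.b1)
        o = self.W2 @ h + self.b2
        logits = o[: self.K]
        means = o[self.K : self.K + self.K * self.d].reshape(self.K, self.d)
        log_scale = o[self.K + self.K * self.d :].reshape(self.K, self.d)
        w = np.exp(logits - logits.max())  # stable softmax over mixture logits
        return w / w.sum(), means, np.exp(log_scale)

    def sample(self, feat):
        # Draw a component index from the mixture weights, then a Gaussian sample.
        w, mu, sigma = self.params(feat)
        k = rng.choice(self.K, p=w)
        return mu[k] + sigma[k] * rng.normal(size=self.d)

# One reverse step conditioned on a 4-dimensional feature vector.
kernel = MixtureReverseKernel(feat_dim=4, n_components=3, x_dim=2)
x_prev = kernel.sample(np.ones(4))
```

Under the paper's factorization assumption, each diffusion step reduces to fitting one such conditional mixture, so the per-step KL errors in the path-space decomposition are controlled by how well this static family approximates the true reverse conditional.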