Deep generative models as the probability transformation functions
Bondar, Vitalii, Babenko, Vira, Trembovetskyi, Roman, Korobeinyk, Yurii, Dzyuba, Viktoriya
This paper introduces a unified theoretical perspective that views deep generative models as probability transformation functions. Despite apparent differences in architecture and training methodology among the main families of generative models (autoencoders, autoregressive models, generative adversarial networks, normalizing flows, diffusion models, and flow matching), we demonstrate that all of them fundamentally operate by transforming simple, predefined distributions into complex target data distributions. This unifying perspective facilitates the transfer of methodological improvements between architectures and provides a foundation for developing universal theoretical approaches, potentially leading to more efficient and effective generative modeling techniques.
arXiv.org Artificial Intelligence
Jun-23-2025
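In the invertible case, the unifying claim in the abstract is the classical change-of-variables identity: if z ~ p_Z and x = f(z) with f invertible, then p_X(x) = p_Z(f^{-1}(x)) |det J_{f^{-1}}(x)|, which is the likelihood that normalizing flows train by maximizing. The sketch below is illustrative only and not taken from the paper: it assumes numpy and scipy, and the one-dimensional transform f(z) = exp(z) is a hypothetical stand-in for a learned network, chosen because its inverse and Jacobian are available in closed form. It pushes a standard-normal base distribution forward to a lognormal target and checks the transformed samples against the analytic pushforward density.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical invertible "model": pushes base samples z ~ N(0, 1)
# forward to a lognormal target via x = f(z) = exp(z).
def f(z):
    return np.exp(z)

def f_inv(x):
    return np.log(x)

def pushforward_density(x):
    # Change of variables: p_X(x) = p_Z(f_inv(x)) * |d f_inv / dx|;
    # for f_inv(x) = log(x) the derivative is 1 / x.
    return norm.pdf(f_inv(x)) / x

rng = np.random.default_rng(0)
z = rng.standard_normal(200_000)   # samples from the simple base distribution
x = f(z)                           # "generated" samples from the target

# A histogram of the transformed samples should track the analytic density.
edges = np.linspace(0.05, 5.0, 101)
counts, _ = np.histogram(x, bins=edges)
est = counts / (len(x) * (edges[1] - edges[0]))   # empirical density estimate
centers = 0.5 * (edges[:-1] + edges[1:])
print("max density gap:", np.max(np.abs(est - pushforward_density(centers))))
```

For models whose transformation is non-invertible (GANs, autoencoders) or stochastic (diffusion models), this exact density formula does not apply, but the generation step is structurally the same: draw from a simple base distribution, then transform.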