Building Expressive and Tractable Probabilistic Generative Models: A Review

Sidheekh, Sahil, Natarajan, Sriraam

arXiv.org Artificial Intelligence 

However, they still struggle to capture dependencies as data complexity and dimensionality increase. In contrast, advancements in deep learning have given rise to expressive Deep Generative Models (DGMs) that exploit the power of neural networks to learn flexible representations of complex data distributions. Notable examples include Generative Adversarial Networks, Variational Autoencoders, and Normalizing Flows. These models prioritize expressiveness and have demonstrated impressive proficiency in capturing dependencies and generating high-fidelity samples.

We present a comprehensive survey of the advancements and techniques in the field of tractable probabilistic generative modeling, primarily focusing on Probabilistic Circuits (PCs). We provide a unified perspective on the inherent trade-offs between expressivity and tractability, highlighting the design principles and algorithmic extensions that have enabled building expressive and efficient PCs, and
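To make the expressivity/tractability contrast concrete, below is a minimal, hypothetical sketch of a probabilistic circuit: a sum node (mixture) over product nodes (factorizations) of Bernoulli leaves. All numbers and function names are illustrative assumptions, not from the survey; the point is that queries such as marginals are answered in a single bottom-up pass by setting a marginalized leaf to 1, which is the kind of tractable inference PCs guarantee and most DGMs do not.

```python
def bernoulli(p, x):
    # Leaf distribution: returns P(X = x).
    # Passing x=None marginalizes the variable (the leaf evaluates to 1).
    if x is None:
        return 1.0
    return p if x == 1 else 1.0 - p


def pc(x1, x2):
    # A tiny PC over two binary variables X1, X2 (illustrative parameters):
    # a sum node with weights 0.3 / 0.7 over two product nodes,
    # each a fully factorized distribution over {X1, X2}.
    comp1 = bernoulli(0.9, x1) * bernoulli(0.2, x2)
    comp2 = bernoulli(0.1, x1) * bernoulli(0.8, x2)
    return 0.3 * comp1 + 0.7 * comp2


# The joint distribution is normalized: probabilities sum to 1.
total = sum(pc(a, b) for a in (0, 1) for b in (0, 1))

# Marginal P(X1 = 1) in one evaluation, with no explicit sum over X2.
p_x1 = pc(1, None)
```

The same single-pass mechanism extends to conditionals and MAP-style queries in structurally constrained circuits, which is what the trade-off against neural-network expressivity is about.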