Scaling Continuous Latent Variable Models as Probabilistic Integral Circuits
Gennaro Gala 1, Cassio de Campos 1, Antonio Vergari 2, Erik Quaeghebeur

Neural Information Processing Systems 

Probabilistic integral circuits (PICs) have been recently introduced as probabilistic models enjoying the key ingredient behind expressive generative models: continuous latent variables (LVs). PICs are symbolic computational graphs defining continuous LV models as hierarchies of functions that are summed and multiplied together, or integrated over some LVs. They are tractable if LVs can be analytically integrated out; otherwise, they can be approximated by tractable probabilistic circuits (PCs) encoding a hierarchical numerical quadrature process, called QPCs. So far, only tree-shaped PICs have been explored, and training them via numerical quadrature requires memory-intensive processing at scale. In this paper, we address these issues and present: (i) a pipeline for building DAG-shaped PICs out of arbitrary variable decompositions, (ii) a procedure for training PICs using tensorized circuit architectures, and (iii) neural functional sharing techniques to allow scalable training.
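To make the quadrature idea concrete, here is a minimal illustrative sketch (not the paper's actual algorithm) of approximating a single integral unit by a finite sum, i.e., replacing p(x) = ∫ p(x|z) p(z) dz with a weighted mixture over quadrature nodes. The latent prior, conditional, and node count below are all assumptions chosen for illustration; in this toy setup with z ~ N(0, 1) and x|z ~ N(z, 1), the exact marginal is N(0, 2), so we can check the approximation.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of a univariate Gaussian N(mu, sigma^2)."""
    return np.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def quadrature_marginal(x, n_nodes=32, lo=-6.0, hi=6.0):
    """Approximate p(x) = ∫ p(x|z) p(z) dz with Gauss-Legendre quadrature.

    Illustrative materialization of one integral unit as a sum unit:
    p(x) ≈ Σ_k w_k · p(z_k) · p(x|z_k).
    """
    # Gauss-Legendre nodes/weights on [-1, 1], rescaled to [lo, hi].
    t, w = np.polynomial.legendre.leggauss(n_nodes)
    z = lo + (hi - lo) * (t + 1) / 2
    w = w * (hi - lo) / 2
    # Weighted mixture over quadrature nodes (the "sum unit").
    return np.sum(w * normal_pdf(z, 0.0, 1.0) * normal_pdf(x, z, 1.0))

x = 0.7
approx = quadrature_marginal(x)
exact = normal_pdf(x, 0.0, np.sqrt(2.0))  # analytic marginal N(0, 2)
print(approx, exact)
```

The quadrature-based mixture matches the analytic marginal closely here; the paper's QPC construction applies this kind of discretization hierarchically across the whole circuit rather than to one integral in isolation.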