Scaling Continuous Latent Variable Models as Probabilistic Integral Circuits
Gennaro Gala, Cassio de Campos, Antonio Vergari, Erik Quaeghebeur
Neural Information Processing Systems
Probabilistic integral circuits (PICs) have recently been introduced as probabilistic models enjoying the key ingredient behind expressive generative models: continuous latent variables (LVs). PICs are symbolic computational graphs defining continuous LV models as hierarchies of functions that are summed and multiplied together, or integrated over some LVs. They are tractable if the LVs can be analytically integrated out; otherwise, they can be approximated by tractable probabilistic circuits (PCs) encoding a hierarchical numerical quadrature process, called QPCs. So far, only tree-shaped PICs have been explored, and training them via numerical quadrature requires memory-intensive processing at scale. In this paper, we address these issues and present: (i) a pipeline for building DAG-shaped PICs out of arbitrary variable decompositions, (ii) a procedure for training PICs using tensorized circuit architectures, and (iii) neural functional sharing techniques to allow scalable training.
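To give intuition for the quadrature process the abstract mentions, the following is a minimal sketch (not the authors' implementation) of how numerical quadrature can replace the analytically intractable integral over a single continuous latent variable; QPCs apply this idea hierarchically, once per LV in the circuit. The model here, a Gaussian prior with a Gaussian likelihood, is a hypothetical example chosen so the exact marginal is known in closed form.

```python
import numpy as np

def gauss(x, mean, var):
    """Density of a univariate Gaussian N(x; mean, var)."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

# Gauss-Legendre nodes/weights on [-1, 1], rescaled to [-8, 8],
# an interval wide enough to capture essentially all prior mass.
nodes, weights = np.polynomial.legendre.leggauss(64)
a, b = -8.0, 8.0
z = 0.5 * (b - a) * nodes + 0.5 * (b + a)
w = 0.5 * (b - a) * weights

# Model: prior p(z) = N(z; 0, 1), likelihood p(x|z) = N(x; z, 1).
# Quadrature turns the integral p(x) = ∫ p(x|z) p(z) dz into a
# finite weighted sum -- i.e., a (tractable) mixture over z-nodes.
x = 1.3
approx = np.sum(w * gauss(x, z, 1.0) * gauss(z, 0.0, 1.0))

# For this conjugate pair, the exact marginal is N(x; 0, 2).
exact = gauss(x, 0.0, 2.0)
print(approx, exact)  # the two values agree to high precision
```

The key point is that the sum over quadrature nodes has exactly the form of a finite mixture, which is why the approximation of a PIC yields a probabilistic circuit.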