On the Expressive Power of Tree-Structured Probabilistic Circuits
Neural Information Processing Systems
Probabilistic circuits (PCs) have emerged as a powerful framework for compactly representing probability distributions and performing efficient, exact probabilistic inference. It has been shown that a PC with a general directed acyclic graph (DAG) structure can be understood as a mixture of exponentially many (in its height) components, each of which is a product distribution over univariate marginals. However, existing structure learning algorithms for PCs often generate tree-structured circuits, or use tree-structured circuits as intermediate steps before compressing them into DAG-structured circuits. This raises the intriguing question of whether there exists an exponential gap in expressive power between DAG-structured and tree-structured PCs.
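For orientation, the mixture-of-products view mentioned above can be written in a standard, generic form (the notation below is illustrative and not taken from the paper): a smooth and decomposable PC over variables $X_1, \dots, X_n$ admits an expansion

\[
p(x_1, \dots, x_n) \;=\; \sum_{k=1}^{K} w_k \prod_{i=1}^{n} p_{k,i}(x_i), \qquad w_k \ge 0, \quad \sum_{k=1}^{K} w_k = 1,
\]

where each component $k$ is a product of univariate marginals $p_{k,i}$ and the number of components $K$ can grow exponentially in the height of the circuit.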