Learning Tractable Graphical Models Using Mixture of Arithmetic Circuits
Rooshenas, Amirmohammad (University of Oregon) | Lowd, Daniel (University of Oregon)
In recent years, there has been a growing interest in learning tractable graphical models in which exact inference is efficient. Two main approaches are to restrict the inference complexity directly, as done by low-treewidth graphical models and arithmetic circuits (ACs), or to introduce latent variables, as done by mixtures of trees, latent tree models, and sum-product networks (SPNs). In this paper, we combine these approaches to learn a mixture of ACs (MAC). A mixture can represent many distributions exponentially more compactly than a single AC. By using ACs as mixture components, MAC can represent complex distributions using far fewer components than other mixture models require. MAC generalizes ACs, mixtures of trees, latent class models, and thin junction trees, and can be seen as a special case of an SPN. Compared to state-of-the-art algorithms for learning SPNs and other tractable models, MAC is consistently more accurate while maintaining tractable inference.
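The tractability claim rests on the standard mixture form; as a brief illustrative sketch (the notation below is ours, not drawn from the paper), a mixture over a latent component variable keeps marginal queries tractable whenever each component is tractable:

```latex
% Minimal sketch (our notation, not the paper's): a K-component mixture of
% tractable distributions, each P_k computable by an arithmetic circuit C_k.
\[
  P(X) \;=\; \sum_{k=1}^{K} w_k \, P_k(X),
  \qquad w_k \ge 0,\quad \sum_{k=1}^{K} w_k = 1 .
\]
% Any marginal query over evidence E = e decomposes into K tractable sub-queries:
\[
  P(E = e) \;=\; \sum_{k=1}^{K} w_k \, P_k(E = e),
\]
% so inference cost stays linear in the total size of the component circuits.
```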