Provable Variational Inference for Constrained Log-Submodular Models
Submodular maximization problems appear in several areas of machine learning and data science, as many useful modelling concepts such as diversity and coverage satisfy this natural diminishing-returns property. Because the data defining these functions, as well as the decisions made with the computed solutions, are subject to statistical noise and randomness, it is arguably necessary to go beyond computing a single approximate optimum and to quantify its inherent uncertainty. To this end, we define a rich class of probabilistic models associated with constrained submodular maximization problems. These capture log-submodular dependencies of arbitrary order between the variables, but also satisfy hard combinatorial constraints. Namely, the variables are assumed to jointly take one of a possibly exponentially large set of states, which form the bases of a matroid. To perform inference in these models we design novel variational inference algorithms, which carefully leverage the combinatorial and probabilistic properties of these objects.
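To make the model class concrete, the following sketch instantiates a constrained log-submodular distribution p(S) ∝ exp(F(S)) in the simplest setting: F is a coverage function (a standard submodular example) and the matroid is the uniform matroid of rank k, whose bases are exactly the size-k subsets. The brute-force enumeration, the toy `covers` data, and all function names are illustrative assumptions, not the paper's algorithm; the whole point of the paper is to avoid this exponential-time computation.

```python
from itertools import combinations
import math

def coverage(S, covers):
    # Submodular coverage function: F(S) = number of ground elements
    # covered by the union of the sets assigned to the items in S.
    covered = set()
    for i in S:
        covered |= covers[i]
    return float(len(covered))

def exact_log_partition_and_marginals(n, k, covers):
    # Brute force over the bases of the uniform matroid of rank k,
    # i.e. all size-k subsets of {0, ..., n-1}. Tractable only for tiny n.
    weights = {S: math.exp(coverage(S, covers))
               for S in combinations(range(n), k)}
    Z = sum(weights.values())
    # Marginal inclusion probability P(i in S) for each item i.
    marginals = [sum(w for S, w in weights.items() if i in S) / Z
                 for i in range(n)]
    return math.log(Z), marginals

# Toy instance (hypothetical data): 4 items, each covering two ground elements.
covers = [{0, 1}, {1, 2}, {2, 3}, {0, 3}]
logZ, marg = exact_log_partition_and_marginals(n=4, k=2, covers=covers)
```

Note that since every basis has size exactly k, the marginals always sum to k; this is one of the hard combinatorial constraints that an unconstrained log-submodular model over all subsets cannot express.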
Reviews: Provable Variational Inference for Constrained Log-Submodular Models
The authors present an algorithm for approximate inference in exponential family models over the bases of a given matroid. In particular, the authors show how to leverage standard variational methods to yield a provable approximation to the log partition function of certain restricted families. This is of interest because 1) these families can be difficult to handle using standard probabilistic modeling approaches, and 2) previous bounds derived from variational methods do not necessarily come with performance guarantees. General comments: Although the approach herein is a variational approximation method, the term "variational inference" is more commonly used for a related but different optimization problem. There are many other variational approaches that yield provable upper/lower bounds on the partition function.
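As a minimal illustration of the kind of provable bound the review refers to (not the paper's actual construction), one can exploit the standard fact that any submodular F satisfies the modular upper bound F(S) ≤ F(∅) + Σ_{i∈S} (F({i}) − F(∅)). Plugging this into the partition sum over the bases of a uniform matroid of rank k turns the bound into an elementary symmetric polynomial, computable in O(nk) time. The coverage function and toy data below are illustrative assumptions.

```python
from itertools import combinations
import math

def coverage(S, covers):
    # Submodular coverage function, as in the model definition.
    covered = set()
    for i in S:
        covered |= covers[i]
    return float(len(covered))

def elementary_symmetric(x, k):
    # e_k(x) = sum over size-k subsets T of prod_{i in T} x[i],
    # computed with the standard O(nk) dynamic program.
    e = [1.0] + [0.0] * k
    for xi in x:
        for j in range(k, 0, -1):
            e[j] += xi * e[j - 1]
    return e[k]

def log_partition_upper_bound(n, k, covers):
    # Upper-bound F(S) by a modular function, so that the constrained
    # partition sum factorizes into an elementary symmetric polynomial.
    f_empty = coverage((), covers)
    gains = [math.exp(coverage((i,), covers) - f_empty) for i in range(n)]
    return f_empty + math.log(elementary_symmetric(gains, k))

def exact_log_partition(n, k, covers):
    # Exponential-time ground truth for comparison on tiny instances.
    return math.log(sum(math.exp(coverage(S, covers))
                        for S in combinations(range(n), k)))

covers = [{0, 1}, {1, 2}, {2, 3}, {0, 3}]
ub = log_partition_upper_bound(n=4, k=2, covers=covers)
exact = exact_log_partition(n=4, k=2, covers=covers)
```

The gap between `ub` and `exact` reflects how loose a single modular surrogate is; tighter variational schemes optimize over a family of such surrogates rather than fixing one.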
Djolonga, Josip, Jegelka, Stefanie, Krause, Andreas