Reviews: Provable Variational Inference for Constrained Log-Submodular Models

Neural Information Processing Systems

The authors present an algorithm for approximate inference in exponential family models over the bases of a given matroid. In particular, the authors show how to leverage standard variational methods to yield a provable approximation to the log partition function of certain restricted families. This is of interest because 1) these families can be difficult to handle using standard probabilistic modeling approaches and 2) previous bounds derived from variational methods do not necessarily come with performance guarantees. General comments: Although the approach herein would be considered a variational approximation method, the term variational inference is more commonly used for a related but different optimization problem. There are many other variational approaches that yield provable upper/lower bounds on the partition function.
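The review's point that variational methods can yield certified bounds on the partition function can be illustrated on a toy example. This is a minimal sketch, not the paper's model or algorithm: the pairwise binary model and all parameter values below are made up, and the bound shown is the standard mean-field lower bound, verified against brute-force enumeration.

```python
import itertools
import math

# Hypothetical toy model: p(x) proportional to exp(sum_i theta_i x_i
# + sum_{i<j} w_ij x_i x_j) over binary x in {0,1}^n.
# All parameter values are made up for illustration.
theta = [0.5, -0.3, 0.8]
w = {(0, 1): 0.4, (1, 2): -0.2, (0, 2): 0.1}
n = len(theta)

def log_potential(x):
    """Unnormalized log-probability of a configuration x."""
    s = sum(theta[i] * x[i] for i in range(n))
    s += sum(wij * x[i] * x[j] for (i, j), wij in w.items())
    return s

# Exact log partition function by enumeration (feasible only for tiny n).
logZ = math.log(sum(math.exp(log_potential(x))
                    for x in itertools.product([0, 1], repeat=n)))

def meanfield_bound(q):
    """Mean-field lower bound E_q[log p~(x)] + H(q) for factorized q."""
    expected = sum(theta[i] * q[i] for i in range(n))
    expected += sum(wij * q[i] * q[j] for (i, j), wij in w.items())
    entropy = -sum(qi * math.log(qi) + (1 - qi) * math.log(1 - qi)
                   for qi in q)
    return expected + entropy

# By Jensen's inequality the bound holds for any factorized q in (0,1)^n.
bound = meanfield_bound([0.6, 0.4, 0.7])
assert bound <= logZ
```

The gap between `bound` and `logZ` is exactly the KL divergence from the factorized `q` to the true model, which is why the bound is tight only when the model itself factorizes.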


From Sets to Multisets: Provable Variational Inference for Probabilistic Integer Submodular Models

Sahin, Aytunc, Bian, Yatao, Buhmann, Joachim M., Krause, Andreas

arXiv.org Machine Learning

Submodular functions have been studied extensively in machine learning and data mining. In particular, the optimization of submodular functions over the integer lattice (integer submodular functions) has recently attracted much interest, because this domain relates naturally to many practical problem settings, such as multilabel graph cut, budget allocation and revenue maximization with discrete assignments. In contrast, the use of these functions for probabilistic modeling has received surprisingly little attention so far. In this work, we first propose the Generalized Multilinear Extension, a continuous DR-submodular extension for integer submodular functions. We study central properties of this extension and formulate a new probabilistic model which is defined through integer submodular functions. Then, we introduce a block-coordinate ascent algorithm to perform approximate inference for this class of models. Finally, we demonstrate its effectiveness and viability on several real-world social connection graph datasets with integer submodular objectives.
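The multilinear extension that this abstract generalizes can be sketched in its classical set-function form. This is a toy illustration, not the paper's Generalized Multilinear Extension (which operates on the integer lattice): it evaluates F(x) = E[f(S)], where each element i is included independently with probability x[i], by exact enumeration for a hypothetical coverage function.

```python
import itertools

def coverage(S, sets):
    """Coverage f(S) = size of the union of the chosen sets.
    A standard example of a monotone submodular function."""
    covered = set()
    for i in S:
        covered |= sets[i]
    return len(covered)

def multilinear_extension(x, f):
    """F(x) = E[f(S)], each element i included independently w.p. x[i].
    Computed exactly by enumerating all subsets (tractable for small n)."""
    n = len(x)
    total = 0.0
    for r in range(n + 1):
        for S in itertools.combinations(range(n), r):
            p = 1.0
            for i in range(n):
                p *= x[i] if i in S else 1 - x[i]
            total += p * f(set(S))
    return total

# Hypothetical ground set of three covering sets.
sets = [{1, 2}, {2, 3}, {3, 4, 5}]
f = lambda S: coverage(S, sets)

# At integral corners the extension recovers f itself.
assert multilinear_extension([1, 1, 0], f) == f({0, 1})  # = 3
```

Evaluating the extension exactly costs 2^n function calls, which is why practical methods (including the sampling and block-coordinate schemes the abstract alludes to) estimate or optimize it without full enumeration.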


Provable Variational Inference for Constrained Log-Submodular Models

Djolonga, Josip, Jegelka, Stefanie, Krause, Andreas

Neural Information Processing Systems

Submodular maximization problems appear in several areas of machine learning and data science, as many useful modelling concepts such as diversity and coverage satisfy this natural diminishing returns property. Because the data defining these functions, as well as the decisions made with the computed solutions, are subject to statistical noise and randomness, it is arguably necessary to go beyond computing a single approximate optimum and quantify its inherent uncertainty. To this end, we define a rich class of probabilistic models associated with constrained submodular maximization problems. These capture log-submodular dependencies of arbitrary order between the variables, but also satisfy hard combinatorial constraints. Namely, the variables are assumed to take on one of a -- possibly exponentially large -- set of states, which form the bases of a matroid. To perform inference in these models we design novel variational inference algorithms, which carefully leverage the combinatorial and probabilistic properties of these objects.
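The model class described here can be made concrete with a brute-force toy instance. This is a minimal sketch under stated assumptions, not the paper's variational algorithm: it takes a uniform matroid of rank k (whose bases are simply all k-subsets) and a hypothetical coverage objective, and normalizes exp(f) over the bases by enumeration.

```python
import itertools
import math

# Hypothetical covering sets defining a submodular objective.
sets = [{1, 2}, {2, 3}, {3, 4}, {4, 5}]

def f(B):
    """Coverage function -- submodular, so exp(f) is log-submodular."""
    covered = set()
    for i in B:
        covered |= sets[i]
    return len(covered)

# Uniform matroid of rank k on n elements: bases are all k-subsets.
n, k = 4, 2
bases = list(itertools.combinations(range(n), k))

# Exact log-submodular distribution over the bases, p(B) ~ exp(f(B)).
Z = sum(math.exp(f(B)) for B in bases)
probs = {B: math.exp(f(B)) / Z for B in bases}

assert abs(sum(probs.values()) - 1.0) < 1e-9
# Bases with higher coverage receive higher probability.
best = max(bases, key=f)
assert probs[best] == max(probs.values())
```

Enumerating the bases is only possible here because the instance is tiny; for general matroids the number of bases can be exponential, which is precisely why the paper resorts to variational inference rather than exact normalization.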