Stochastic Submodular Maximization: The Case of Coverage Functions
Mohammad Karimi, Mario Lucic, Hamed Hassani, Andreas Krause
Neural Information Processing Systems
Stochastic optimization of continuous objectives is at the heart of modern machine learning. However, many important problems are discrete in nature and often involve submodular objectives. We seek to bring the power of stochastic continuous optimization, namely stochastic gradient descent and its variants, to bear on such discrete problems. We first introduce the problem of stochastic submodular optimization, where one needs to optimize a submodular objective that is given as an expectation. Our model captures situations where the discrete objective arises as an empirical risk (e.g., in the case of exemplar-based clustering), or is given as an explicit stochastic model (e.g., in the case of influence maximization in social networks).
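To make the problem setup concrete, the following is a minimal sketch (not the paper's method) of maximizing a weighted coverage function given as an expectation: we draw sample scenarios of element weights, form the empirical average objective, and run the standard greedy algorithm on it. All function names (`make_coverage`, `stochastic_greedy_expectation`) and the scenario-sampler interface are illustrative assumptions.

```python
import random

def make_coverage(universe_weights, sets):
    """Weighted coverage: f(S) = total weight of elements covered by the chosen sets."""
    def f(S):
        covered = set().union(*(sets[i] for i in S)) if S else set()
        return sum(universe_weights[e] for e in covered)
    return f

def stochastic_greedy_expectation(sets, sample_scenario, k, n_samples=50):
    """Greedily pick k sets maximizing the empirical mean of f over sampled
    scenarios, as a proxy for the expected objective F(S) = E[f(S)].
    (Illustrative sketch, not the algorithm from the paper.)"""
    scenarios = [sample_scenario() for _ in range(n_samples)]
    fs = [make_coverage(w, sets) for w in scenarios]
    S = set()
    for _ in range(k):
        base = sum(f(S) for f in fs) / n_samples
        best, best_gain = None, 0.0
        for i in range(len(sets)):
            if i in S:
                continue
            # Marginal gain of adding set i, averaged over scenarios.
            gain = sum(f(S | {i}) for f in fs) / n_samples - base
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:
            break  # no set yields positive marginal gain
        S.add(best)
    return S
```

For instance, in an influence-maximization-style setting each scenario would assign random weights (or activations) to elements; averaging over samples turns the stochastic objective into a deterministic one that greedy can handle, at the cost of one function evaluation per set per sample.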