Submodular Maximization via Gradient Ascent: The Case of Deep Submodular Functions
Wenruo Bai, William Stafford Noble, Jeff A. Bilmes
Neural Information Processing Systems
We study the problem of maximizing deep submodular functions (DSFs) subject to a matroid constraint. DSFs are an expressive class of submodular functions that include, as strict subfamilies, facility location, weighted coverage, and sums of concave functions composed with modular functions. We use a strategy similar to the continuous greedy approach, but we show that the multilinear extension of any DSF has a natural and computationally attainable concave relaxation that we can optimize using gradient ascent. Our results show a guarantee of $\max_{0<\delta<1}(1-\epsilon-\delta-e^{-\delta^2\Omega(k)})$ with a running time of $O(n^2/\epsilon^2)$ plus the time for pipage rounding to recover a discrete solution, where $k$ is the rank of the matroid constraint. This guarantee is often better than the standard $1-1/e$ guarantee of the continuous greedy algorithm, and our method runs much faster.
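As a concrete illustration of the recipe the abstract describes (and not the authors' code), the sketch below builds a toy two-layer DSF $f(S)=\sum_j w_j\sqrt{m_j(S)}$ with nonnegative modular functions $m_j$, maximizes its natural concave extension $F(x)=\sum_j w_j\sqrt{m_j\cdot x}$ by projected gradient ascent over the rank-$k$ uniform-matroid polytope $\{x\in[0,1]^n:\sum_i x_i\le k\}$, and then rounds. The instance, step size, iteration count, and the final top-$k$ rounding (used here as a simple stand-in for the paper's pipage rounding) are all illustrative assumptions.

```python
# Toy sketch: gradient ascent on a concave extension of a two-layer DSF,
# under a cardinality-k (uniform matroid) constraint. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n, k, num_parts = 30, 5, 4                  # ground-set size, matroid rank
M = rng.uniform(size=(num_parts, n))        # nonnegative modular weights m_j
w = rng.uniform(1.0, 2.0, size=num_parts)   # outer mixture weights w_j

def concave_extension(x):
    """F(x) = sum_j w_j * sqrt(m_j . x); concave on [0,1]^n and equal
    to f(S) on indicator vectors 1_S."""
    return float(w @ np.sqrt(M @ x))

def gradient(x):
    """dF/dx_i = sum_j w_j * m_{j,i} / (2 * sqrt(m_j . x))."""
    t = np.maximum(M @ x, 1e-12)            # guard against sqrt(0)
    return (w / (2.0 * np.sqrt(t))) @ M

def project(x, k):
    """Euclidean projection onto {x in [0,1]^n : sum(x) <= k},
    via bisection on the dual variable of the sum constraint."""
    y = np.clip(x, 0.0, 1.0)
    if y.sum() <= k:
        return y
    lo, hi = 0.0, float(x.max())
    for _ in range(60):
        theta = 0.5 * (lo + hi)
        if np.clip(x - theta, 0.0, 1.0).sum() > k:
            lo = theta
        else:
            hi = theta
    return np.clip(x - hi, 0.0, 1.0)

x = project(np.full(n, k / n), k)           # interior starting point
for _ in range(500):                        # projected gradient ascent
    x = project(x + 0.05 * gradient(x), k)

S = np.argsort(x)[-k:]                      # top-k rounding: a crude
ind = np.zeros(n)                           # substitute for pipage rounding
ind[S] = 1.0
print("fractional value:", concave_extension(x))
print("rounded set:", sorted(S.tolist()), "value:", concave_extension(ind))
```

Because the extension here is genuinely concave, projected gradient ascent converges to its global maximum over the matroid polytope; the paper's contribution is showing that a comparable concave relaxation exists and is computable for general DSFs, with the stated approximation guarantee after pipage rounding.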