Reviews: Stochastic Continuous Greedy ++: When Upper and Lower Bounds Match
–Neural Information Processing Systems
This paper considers DR-submodular maximization under convex constraints and achieves optimal results in terms of both the approximation ratio and the query complexity. The first paper in this line of work showed that Stochastic Gradient Ascent (SGA) achieves a (1/2 - epsilon)-approximation using O(1/epsilon^2) stochastic gradient evaluations, and multiple papers have studied the problem since. This paper introduces SCG++, which achieves a (1 - 1/e - epsilon)-approximation using O(1/epsilon^2) stochastic oracle calls via a novel variance reduction method.
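The flavor of the algorithm can be sketched as follows. This is only an illustrative toy, not the paper's exact method: the actual estimator in this line of work uses more sophisticated corrections (e.g. Hessian-vector products), whereas here a SPIDER-style recursive gradient difference with correlated samples stands in for variance reduction, and the objective f(x) = 1 - prod_j (1 - x_j) over {x in [0,1]^n : sum(x) <= 1} is an assumed toy DR-submodular instance.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 5, 200  # dimension and number of continuous-greedy iterations

def true_grad(x):
    # Gradient of the toy monotone DR-submodular f(x) = 1 - prod_j (1 - x_j):
    # coordinate i gets prod_{j != i} (1 - x_j).
    return np.array([np.prod(np.delete(1.0 - x, i)) for i in range(n)])

def stoch_grad(x, z):
    # Unbiased stochastic gradient with multiplicative noise driven by sample z.
    return true_grad(x) * (1.0 + 0.1 * z)

def lmo(g):
    # Linear maximization oracle over C = {x in [0,1]^n : sum(x) <= 1}:
    # put all mass on the coordinate with the largest (positive) gradient.
    v = np.zeros(n)
    i = int(np.argmax(g))
    if g[i] > 0:
        v[i] = 1.0
    return v

x = np.zeros(n)
d = stoch_grad(x, rng.standard_normal(n))  # running gradient estimate
for _ in range(T):
    x_new = x + lmo(d) / T  # Frank-Wolfe / continuous-greedy step of size 1/T
    # Variance-reduced (SPIDER-style) update: the same sample z is used at both
    # points, so the correction's noise scales with the step size, not O(1).
    z = rng.standard_normal(n)
    d = d + stoch_grad(x_new, z) - stoch_grad(x, z)
    x = x_new

f_val = 1.0 - np.prod(1.0 - x)
```

Because each of the T steps moves by a vertex of C scaled by 1/T, the final iterate stays feasible; the correlated-sample trick is what keeps the variance of the recursive gradient corrections small, which is the mechanism behind the improved O(1/epsilon^2) oracle complexity.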
Jan-23-2025, 08:07:54 GMT