
Fast Adaptive Non-Monotone Submodular Maximization Subject to a Knapsack Constraint: Supplementary Material

Neural Information Processing Systems

In this appendix, we include all the material missing from the main paper. Moreover, we restate a key result connecting random sampling and submodular maximization; the original version of the theorem is due to Feige et al. In what follows, we exclusively use S and O for their final versions. Before stating the next lemma, we introduce some notation for the sake of readability.
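The sampling result attributed here to Feige et al. is commonly stated as: for a nonnegative submodular function f and a random subset A(p) that contains each element with probability at most p, E[f(A(p))] >= (1 - p) * f(empty set). Below is a minimal brute-force check of that bound on a toy coverage-minus-cost objective; the instance data (`cover`, `OFFSET`, `COST`) is invented purely for illustration and does not come from the paper:

```python
from itertools import combinations

# Hypothetical non-monotone submodular objective: coverage value minus a
# linear cost, shifted so that f >= 0 and f(empty) > 0.
cover = {0: {0, 1}, 1: {1, 2}, 2: {2, 3}, 3: {3, 4}, 4: {4, 0}, 5: {1, 3}}
OFFSET, COST = 4.0, 1.5

def f(S):
    covered = set().union(*(cover[i] for i in S)) if S else set()
    return OFFSET + len(covered) - COST * len(S)

n, p = 6, 0.5
# Exact expectation of f(A(p)) when each element is kept independently
# with probability p, computed by enumerating all 2^n subsets.
expectation = sum(
    (p ** len(S)) * ((1 - p) ** (n - len(S))) * f(S)
    for k in range(n + 1)
    for S in combinations(range(n), k)
)
print(expectation, (1 - p) * f(()))  # E[f(A(p))] vs. the (1-p) f(empty) bound
```

By linearity, the expectation here works out to 3.5, comfortably above the bound (1 - 0.5) * f(empty) = 2.0.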





Discretely Beyond 1 /e: Guided Combinatorial Algorithms for Submodular Maximization

Neural Information Processing Systems

These guarantees are achieved by guiding the randomized greedy algorithm with a fast local search algorithm. Further, we develop deterministic versions of these algorithms that maintain the same approximation ratio and asymptotic time complexity.
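In its basic, unguided form, the randomized greedy referred to here is the algorithm of Buchbinder et al.: in each of k rounds it chooses uniformly at random among the k elements of largest marginal gain, padded with zero-gain "dummy" elements so that a round may add nothing. A hedged sketch under those assumptions, run on a graph-cut objective (the graph and all names below are illustrative, not taken from the paper):

```python
import random

def randomized_greedy(f, ground, k, seed=0):
    # Sketch of the randomized greedy of Buchbinder et al. (2014) for
    # non-monotone submodular maximization under a cardinality constraint.
    rng = random.Random(seed)
    S = set()
    for _ in range(k):
        gains = [(f(S | {e}) - f(S), e) for e in ground - S]
        # Top-k marginal gains, padded with zero-gain dummies so that
        # negative-gain elements are never forced into the sample.
        cand = sorted(gains + [(0.0, None)] * k,
                      key=lambda t: t[0], reverse=True)[:k]
        g, e = rng.choice(cand)
        if e is not None:  # picking a dummy means the round adds nothing
            S.add(e)
    return S

# Illustrative instance: the cut function of a small graph, a standard
# non-monotone submodular objective.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
cut = lambda S: sum((u in S) != (v in S) for u, v in edges)
sol = randomized_greedy(cut, set(range(4)), 2)
print(sol, cut(sol))
```

The guided variants in the paper additionally steer this random choice using a fast local search; the sketch above covers only the randomized greedy core.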


