Fast Adaptive Non-Monotone Submodular Maximization Subject to a Knapsack Constraint
Georgios Amanatidis, Federico Fusco, Philip Lazos, Stefano Leonardi, Rebecca Reiffenhäuser
Constrained submodular maximization is a fundamental problem at the heart of discrete optimization. The reason for this is as simple as it is clear: submodular functions capture the notion of diminishing returns present in a wide variety of real-world settings. Owing to its striking importance and coinciding NP-hardness [20], extensive research has been conducted on submodular maximization since the seventies (e.g., [15, 42]), with the focus lately shifting towards handling the massive datasets emerging in modern applications. With a wide variety of possible constraints, often regarding cardinality, independence in a matroid, or knapsack-type restrictions, the number of applications is vast. To name just a few, there are recent works on feature selection in machine learning [13, 14, 32], influence maximization in viral marketing [2, 31], and data summarization [43, 38, 45]. Many of these applications have non-monotone submodular objectives, meaning that adding an element to an existing set might actually decrease its value. Two such examples are discussed in detail in Section 5. This work was supported by the ERC Advanced Grant 788893 AMDROMA "Algorithmic and Mechanism Design Research in Online Markets" and the MIUR PRIN project ALGADIMAR "Algorithms, Games, and Digital Markets."
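To make the notions of diminishing returns and non-monotonicity concrete, the following minimal Python sketch (not taken from the paper) uses the cut function of a small undirected graph, a standard example of a non-monotone submodular objective; the graph and element choices are illustrative assumptions.

```python
# Illustrative sketch: the cut function f(S) = number of edges with exactly
# one endpoint in S is submodular but non-monotone.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # small example graph

def cut_value(S):
    """Count edges crossing the cut between S and its complement."""
    S = set(S)
    return sum(1 for u, v in edges if (u in S) != (v in S))

# Diminishing returns: for S ⊆ T and e ∉ T,
# f(S ∪ {e}) - f(S) >= f(T ∪ {e}) - f(T).
S, T, e = {0}, {0, 1}, 2
assert cut_value(S | {e}) - cut_value(S) >= cut_value(T | {e}) - cut_value(T)

# Non-monotonicity: adding elements can decrease the value.
print(cut_value({0}), cut_value({0, 1, 2, 3}))  # prints 3 0
```

Here adding all remaining vertices drives the cut value from 3 down to 0, which is exactly the behavior a non-monotone objective exhibits.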
arXiv.org Artificial Intelligence
Oct-23-2023