Horizontally Scalable Submodular Maximization
Mario Lucic, Olivier Bachem, Morteza Zadimoghaddam, Andreas Krause
A variety of large-scale machine learning problems can be cast as instances of constrained submodular maximization. Existing approaches for distributed submodular maximization have a critical drawback: the capacity (the number of instances that can fit in memory) must grow with the data set size. In practice, while one can provision many machines, the capacity of each machine is limited by physical constraints. We propose a truly scalable approach for distributed submodular maximization under fixed capacity. The proposed framework applies to a broad class of algorithms and constraints and provides theoretical guarantees on the approximation factor for any available capacity. We empirically evaluate the proposed algorithm on a variety of data sets and demonstrate that it achieves performance competitive with the centralized greedy solution.
May 31, 2016
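For concreteness, the sketch below illustrates the centralized greedy baseline the abstract compares against (monotone submodular maximization under a cardinality constraint), together with a generic two-round partition-and-merge distributed variant. The toy coverage objective, function names, and sharding scheme are illustrative assumptions only; they do not reproduce the paper's fixed-capacity framework.

```python
# Minimal sketch: greedy submodular maximization with a cardinality constraint,
# plus a generic partition-and-merge distributed wrapper. All names and the toy
# objective are illustrative assumptions, not code from the paper.

def greedy_max(ground_set, f, k):
    """Pick k elements, each time adding the one with the largest marginal gain."""
    selected = []
    remaining = set(ground_set)
    for _ in range(min(k, len(remaining))):
        best, best_gain = None, float("-inf")
        for e in remaining:
            gain = f(selected + [e]) - f(selected)
            if gain > best_gain:
                best, best_gain = e, gain
        selected.append(best)
        remaining.remove(best)
    return selected


def distributed_greedy(ground_set, f, k, num_machines):
    """Two-round sketch: each 'machine' runs greedy on its shard, then a final
    greedy pass runs over the union of the local solutions. This mirrors the
    generic distributed setup the paper builds on, not its fixed-capacity method."""
    items = list(ground_set)
    shards = [items[i::num_machines] for i in range(num_machines)]
    candidates = [e for shard in shards for e in greedy_max(shard, f, k)]
    return greedy_max(candidates, f, k)


if __name__ == "__main__":
    # Toy set-cover objective: f(S) = number of items covered by the chosen sets.
    sets = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {1, 6}}
    coverage = lambda S: len(set().union(*(sets[s] for s in S))) if S else 0
    print(greedy_max(sets.keys(), coverage, k=2))
    print(distributed_greedy(sets.keys(), coverage, k=2, num_machines=2))
```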