Variational Optimization for the Submodular Maximum Coverage Problem

Jian Du, Zhigang Hua, Shuang Yang

arXiv.org Machine Learning 

We provide the first variational approximation for the submodular maximum coverage problem (SMCP) based on the Nemhauser divergence, and show that it can be solved efficiently using variational optimization. The algorithm alternates between two steps: (1) an E step that estimates a variational parameter to maximize a parameterized modular lower bound; and (2) an M step that updates the solution by solving the local approximate problem. We provide theoretical analysis on the performance of the proposed approach and its curvature-dependent approximation factor, and empirically evaluate it on a number of public data sets and several application tasks.

While [25] shows that the greedy method with a modular approximation has good performance, we take a step further and build a mathematical connection between the variational modular approximation to a submodular function based on the Nemhauser divergence and the classical variational approximation based on the Kullback-Leibler divergence. We take advantage of this framework to iteratively solve SMCP, leading to a novel variational approach. Analogous to its counterpart in variational optimization based on the Kullback-Leibler divergence, the proposed method consists of two alternating steps, namely estimation (E step) and maximization (M step), which monotonically improve the solution.

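To make the alternating scheme concrete, the following is a minimal Python sketch of this style of E/M procedure on a toy weighted-coverage instance under a cardinality constraint. The toy objective, the permutation-based modular lower bound used in the E step (in the spirit of the Nemhauser/Iyer-Bilmes bounds), and all function names are illustrative assumptions rather than the paper's exact formulation.

# Sketch of an alternating E/M loop for maximizing a monotone submodular
# coverage function under a cardinality constraint. Illustrative only: the
# objective, the bound construction, and the names are assumptions, not the
# paper's exact algorithm.

def coverage_value(selected, sets, weights):
    # f(S): total weight of elements covered by the chosen sets (monotone submodular).
    covered = set().union(*(sets[i] for i in selected)) if selected else set()
    return sum(weights[e] for e in covered)

def e_step(current, ground, f):
    # E step (assumed form): build a modular lower bound that is tight at `current`
    # by ordering the ground set with the current solution first and assigning each
    # element its marginal gain along that chain (a permutation-based lower bound).
    order = list(current) + [v for v in ground if v not in current]
    bound, prefix, prev = {}, [], f([])
    for v in order:
        prefix.append(v)
        val = f(prefix)
        bound[v] = val - prev
        prev = val
    return bound

def m_step(bound, k):
    # M step: maximize the modular bound under |S| <= k, i.e. take the k largest weights.
    return set(sorted(bound, key=bound.get, reverse=True)[:k])

def variational_maximize(sets, weights, k, iters=50):
    ground = list(range(len(sets)))
    f = lambda S: coverage_value(S, sets, weights)
    solution = set(ground[:k])                      # arbitrary feasible start
    for _ in range(iters):
        bound = e_step(solution, ground, f)         # (1) estimate the variational parameter
        candidate = m_step(bound, k)                # (2) solve the local modular problem
        if f(candidate) <= f(solution):             # bound is tight at `solution`, so this
            break                                   # loop improves f monotonically until it stalls
        solution = candidate
    return solution, f(solution)

# Usage: choose k = 2 of these sets to cover the most total weight.
sets = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {0, 5}]
weights = {0: 1.0, 1: 2.0, 2: 1.5, 3: 0.5, 4: 3.0, 5: 1.0}
print(variational_maximize(sets, weights, k=2))

Because the modular bound is tight at the current solution and lower-bounds f everywhere, any candidate that improves the bound also improves f, which is what makes the alternation monotone.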