SoS Degree Reduction with Applications to Clustering and Robust Moment Estimation

David Steurer, Stefan Tiegel

arXiv.org Machine Learning 

The Sum-of-Squares hierarchy is a hierarchy of semidefinite programs that has proven to be a powerful tool in the theory of approximation algorithms [GW95, ARV09]. More recently, it has also given rise to a flurry of algorithms for estimation problems, such as various tensor problems [BKS15, BM16, MSS16, HSS15], clustering [HL18], and robust estimation [KSS18, KKK19, KKM18], often yielding significant improvements over existing algorithms and in some cases even the first efficient ones. The hierarchy is based on the sum-of-squares proof system, which, at a high level, allows one to argue about the non-negativity of polynomials by manipulating a set of polynomial inequalities. Most importantly, the proof system can be exploited algorithmically: certain proofs in it directly certify the approximation guarantees of algorithms based on the hierarchy. The running time of these algorithms depends mainly on the number of variables involved and the maximum degree of the polynomials occurring in the inequalities above. In general, using a higher degree often leads to more accurate solutions but also requires more running time. In this work, we show how to significantly reduce the degree of a wide range of sum-of-squares proofs in an almost black-box manner while still certifying similar guarantees, thus giving a direct speedup for concrete algorithms. As two examples, we consider estimation algorithms for clustering and outlier-robust moment estimation. We hope that this technique can inform future algorithms based on the sum-of-squares hierarchy.
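
As a minimal illustration of the kind of statement the proof system certifies (an example of ours, not taken from the paper): the univariate polynomial p(x) = 2x^2 - 2x + 1 admits the degree-2 sum-of-squares decomposition

    p(x) = 2x^2 - 2x + 1 = x^2 + (x - 1)^2 >= 0,

which certifies its non-negativity over the reals. Finding such a decomposition amounts to a small semidefinite feasibility problem over the Gram matrix of p in the monomial basis (1, x); allowing higher-degree certificates enlarges this monomial basis and hence the semidefinite program, which is the source of the degree-dependent running time discussed above.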
