Local Linear Convergence of Gradient Methods for Subspace Optimization via Strict Complementarity

Neural Information Processing Systems 

In this work we bridge these two approaches under a strict complementarity assumption, which in particular implies that the optimal solution to the convex relaxation is unique and is also the optimal solution to the original nonconvex problem.
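As a loose illustration of the "local linear convergence" in the title (not the paper's algorithm or setting), the sketch below runs plain gradient descent on a strongly convex quadratic, where the error contracts geometrically at each step; all names and constants here are hypothetical.

```python
import numpy as np

# Illustrative sketch only: gradient descent on a strongly convex quadratic
# f(x) = 0.5 * (x - x_star)^T A (x - x_star) converges at a linear
# (geometric) rate, the kind of rate the paper establishes locally under
# strict complementarity for its subspace-optimization setting.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)          # positive definite Hessian, eigenvalues >= 1
x_star = rng.standard_normal(n)  # the unique minimizer

L = np.linalg.eigvalsh(A).max()  # smoothness constant of f
step = 1.0 / L                   # standard 1/L step size

x = np.zeros(n)
errors = []
for _ in range(2000):
    grad = A @ (x - x_star)      # gradient of f at x
    x = x - step * grad
    errors.append(np.linalg.norm(x - x_star))

# Geometric decay: each step contracts the distance to the minimizer
# by a factor of at most (1 - mu/L), so successive ratios stay below 1.
ratios = [errors[k + 1] / errors[k] for k in range(500)]
print(max(ratios) < 1.0, errors[-1] < 1e-6 * errors[0])
```

The contraction factor depends on the condition number of `A`; the paper's contribution is establishing a comparable local rate for a nonconvex problem via its convex relaxation, which this toy quadratic does not capture.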
