Finding Second-Order Stationary Points Efficiently in Smooth Nonconvex Linearly Constrained Optimization Problems
This paper proposes two efficient algorithms for computing approximate second-order stationary points (SOSPs) of problems with generic smooth non-convex objective functions and generic linear constraints. Although finding (approximate) SOSPs for the class of smooth non-convex linearly constrained problems is computationally intractable in the worst case, we show that generic problem instances in this class can be solved efficiently. Specifically, for a generic problem instance, we show that a certain strict complementarity (SC) condition holds for all Karush-Kuhn-Tucker (KKT) solutions. Based on this condition, we design an algorithm named Successive Negative-curvature grAdient Projection (SNAP), which performs either conventional gradient projection steps or negative-curvature-based projection steps to find SOSPs. Building on SNAP, we further propose a first-order algorithm, named SNAP+, that requires \mathcal{O}(1/\epsilon^{2.5}) iterations to compute an (\epsilon, \sqrt{\epsilon})-SOSP. The per-iteration computational complexities of both algorithms are polynomial in the number of constraints and the problem dimension.
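The abstract describes SNAP as alternating conventional gradient-projection steps with negative-curvature projection steps. Below is a heavily simplified, hypothetical sketch of that general idea for a box-constrained problem; the function `snap_sketch` and all step sizes and tolerances are illustrative assumptions, not the authors' algorithm, and the sketch ignores the strict-complementarity condition and the careful handling of active constraints that the paper relies on:

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box lo <= x <= hi,
    a simple instance of a linearly constrained feasible set."""
    return np.clip(x, lo, hi)

def snap_sketch(f_grad, f_hess, x0, lo, hi,
                step=0.1, stat_tol=1e-4, curv_tol=1e-3, iters=1000):
    """Toy alternation of gradient-projection and negative-curvature
    steps (an illustrative simplification, not the paper's SNAP)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = f_grad(x)
        x_new = project_box(x - step * g, lo, hi)
        if np.linalg.norm(x_new - x) > stat_tol:
            x = x_new  # conventional gradient projection step
            continue
        # Near-stationary: probe the Hessian for negative curvature.
        w, V = np.linalg.eigh(f_hess(x))
        if w[0] < -curv_tol:
            d = V[:, 0]            # eigenvector of the smallest eigenvalue
            if g @ d > 0:          # orient d so it does not oppose descent
                d = -d
            x = project_box(x + step * d, lo, hi)
        else:
            return x  # no significant negative curvature: approximate SOSP
    return x
```

On the nonconvex toy objective f(x) = -x_1^2 + x_2^2 over the box [-1, 1]^2, started at (0, 0.5), gradient-projection steps alone stall with x_1 = 0 at a saddle, and it is the negative-curvature step that pushes x_1 toward the boundary.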
Review for NeurIPS paper: Finding Second-Order Stationary Points Efficiently in Smooth Nonconvex Linearly Constrained Optimization Problems
Additional Feedback: To summarize, the authors consider the problem of finding approximate Second Order Stationary Points (SOSPs). Compared with other works, the authors assume that the objective is generally non-convex and the constraints are linear. Two methods are designed to solve this optimization problem. Both methods are proved to have polynomial per-iteration complexity and global sublinear convergence rates. In general, the problem is both theoretically and empirically interesting.
Meta Review for NeurIPS paper: Finding Second-Order Stationary Points Efficiently in Smooth Nonconvex Linearly Constrained Optimization Problems
All the reviewers were positive towards the paper. Originally, the paper received scores of 6, 8, and 7, with relatively high confidences. The major concern about the paper was that it may not fit large-scale problems. During discussion, Reviewer #3 deemed that the rebuttal was "clear and makes sense", and hence raised his/her score from 7 to 8. The AC judged the theoretical contribution of the paper to be strong and was willing to overlook the weakness in the experiments. Thus the AC recommended acceptance.