A Feasible Level Proximal Point Method for Nonconvex Sparse Constrained Optimization

Neural Information Processing Systems

Nonconvex sparse models have received significant attention in high-dimensional machine learning. In this paper, we study a new model consisting of a general convex or nonconvex objective and a variety of continuous nonconvex sparsity-inducing constraints. For this constrained model, we propose a novel proximal point algorithm that solves a sequence of convex subproblems with gradually relaxed constraint levels. Each subproblem, having a proximal point objective and a convex surrogate constraint, can be solved efficiently via a fast routine for projection onto the surrogate constraint set. We establish the asymptotic convergence of the proposed algorithm to Karush-Kuhn-Tucker (KKT) solutions. We also establish new convergence complexities for reaching an approximate KKT solution when the objective is smooth or nonsmooth, deterministic or stochastic, and convex or nonconvex, with complexity on a par with that of gradient descent for unconstrained optimization in the respective cases. To the best of our knowledge, this is the first study of first-order methods with complexity guarantees for nonconvex sparse-constrained problems. We perform numerical experiments to demonstrate the effectiveness of the new model and the efficiency of the proposed algorithm on large-scale problems.
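The outer-loop structure described in the abstract (a proximal point objective, a convex surrogate constraint handled by projection, and a gradually relaxed constraint level) can be illustrated with a minimal sketch. This is not the paper's actual algorithm: here the convex surrogate is assumed to be an l1 ball, the inner solver is plain projected gradient, and all function names and parameters (`prox_point_method`, `relax`, step sizes) are illustrative choices.

```python
import numpy as np

def project_l1_ball(v, radius):
    """Euclidean projection onto {x : ||x||_1 <= radius} via
    the standard sort-and-threshold procedure."""
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # sorted magnitudes, descending
    css = np.cumsum(u)
    ks = np.arange(1, len(v) + 1)
    idx = np.max(np.where(u - (css - radius) / ks > 0)[0])
    theta = (css[idx] - radius) / (idx + 1)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_point_method(grad_f, x0, radius0, rho=1.0, step=0.1,
                      outer_iters=20, inner_iters=100, relax=1.05):
    """Sketch of a proximal point outer loop: each subproblem
    minimizes f(x) + (rho/2)||x - x_k||^2 over a convex surrogate
    set (an l1 ball here), solved by projected gradient, with the
    constraint level relaxed between outer iterations."""
    x = x0.copy()
    radius = radius0
    for _ in range(outer_iters):
        center = x.copy()                 # proximal center x_k
        for _ in range(inner_iters):
            g = grad_f(x) + rho * (x - center)
            x = project_l1_ball(x - step * g, radius)
        radius *= relax                   # gradually relax the level
    return x
```

For instance, with a least-squares objective one would pass `grad_f = lambda x: A.T @ (A @ x - b)` and a step size below `2 / (||A||^2 + rho)` so the inner projected-gradient loop converges.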


Review for NeurIPS paper: A Feasible Level Proximal Point Method for Nonconvex Sparse Constrained Optimization

Neural Information Processing Systems

Summary and Contributions: UPDATE AFTER REBUTTAL Thank you for your response. The authors have agreed to clarify the motivation for using non-convex constraints and to be precise about the use of the word "suboptimal". They have also agreed to state, when comparing with prior work, that they have considerably more knowledge of the specific kind of non-convexity they are dealing with. As a consequence I have increased my score, although I believe the experimental section remains rather unconvincing. This paper proposes a method based on a sequence of convex approximations to solve optimization problems with a non-convex sparsity constraint.


