Accelerated Proximal Gradient Methods for Nonconvex Programming
–Neural Information Processing Systems
Nonconvex and nonsmooth problems have recently received considerable attention in signal/image processing, statistics, and machine learning. However, solving nonconvex and nonsmooth optimization problems remains a major challenge. Accelerated proximal gradient (APG) is an excellent method for convex programming, but it is still unknown whether the usual APG can ensure convergence to a critical point in nonconvex programming. In this paper, we extend APG to general nonconvex and nonsmooth programs by introducing a monitor that satisfies the sufficient descent property. Accordingly, we propose a monotone APG and a nonmonotone APG. The latter drops the requirement of monotonically decreasing the objective function and needs less computation in each iteration.
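For context, the classical (convex) APG iteration that the paper builds on can be sketched as a FISTA-style loop. The sketch below is illustrative only and applies APG to a lasso problem with the soft-thresholding proximal operator; the function names and the choice of problem are assumptions, not taken from the paper, and the paper's monotone/nonmonotone variants add a monitor step on top of an update of this kind.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def apg_lasso(A, b, lam, n_iter=500):
    # Accelerated proximal gradient (FISTA-style) for the convex problem
    #   min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1.
    # Illustrative sketch only; the paper's nonconvex variants additionally
    # monitor sufficient descent, which is not implemented here.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()                           # extrapolated (momentum) point
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)           # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)  # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov extrapolation
        x, t = x_new, t_new
    return x
```

The extrapolation step `y = x_new + ((t - 1) / t_new) * (x_new - x)` is what accelerates plain proximal gradient; in the nonconvex setting this momentum can overshoot, which is why the paper introduces a monitor enforcing sufficient descent.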