Nonmonotone subgradient methods based on a local descent lemma
Aragón-Artacho, Francisco J., Campoy, Rubén, Pérez-Aros, Pedro, Torregrosa-Belén, David
arXiv.org Artificial Intelligence
The aim of this paper is to extend the context of nonmonotone descent methods to the class of nonsmooth and nonconvex functions called upper-$\mathcal{C}^2$, which satisfy a nonsmooth and local version of the descent lemma. Under this assumption, we propose a general subgradient method that performs a nonmonotone linesearch, and we prove subsequential convergence to a stationary point of the optimization problem. Our approach allows us to cover the setting of various subgradient algorithms, including Newton and quasi-Newton methods. In addition, we propose a specification of the general scheme, named Self-adaptive Nonmonotone Subgradient Method (SNSM), which automatically updates the parameters of the linesearch. Particular attention is paid to the minimum sum-of-squares clustering problem, for which we provide a concrete implementation of SNSM. We conclude with some numerical experiments where we exhibit the advantages of SNSM in comparison with some known algorithms.
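The core idea described in the abstract — a subgradient step accepted by a nonmonotone (max-type) linesearch rather than a strictly decreasing one — can be sketched as follows. This is a minimal illustrative implementation, not the paper's SNSM: the function names, the fixed window size `M`, and the Armijo-style parameters are assumptions for the example (SNSM updates its linesearch parameters self-adaptively).

```python
import numpy as np

def nonmonotone_subgradient(f, subgrad, x0, *, M=5, sigma=1e-4,
                            beta=0.5, t0=1.0, max_iter=200, tol=1e-8):
    """Generic subgradient descent with a nonmonotone linesearch.

    Illustrative sketch only: accepts a step whenever the new value
    improves on the maximum of the last M objective values, instead
    of requiring monotone decrease at every iteration.
    """
    x = np.asarray(x0, dtype=float)
    hist = [f(x)]                      # recorded objective values
    for _ in range(max_iter):
        g = np.asarray(subgrad(x), dtype=float)
        if np.linalg.norm(g) < tol:    # (near-)stationary point
            break
        d = -g                         # negative subgradient direction
        fref = max(hist[-M:])          # nonmonotone reference value
        t = t0
        # backtrack until the nonmonotone Armijo-type condition holds
        while f(x + t * d) > fref - sigma * t * float(np.dot(g, g)):
            t *= beta
            if t < 1e-16:              # safeguard against stagnation
                break
        x = x + t * d
        hist.append(f(x))
    return x, hist
```

For instance, minimizing the smooth test function $f(x) = \|x\|^2$ (subgradient $2x$) from $x_0 = (3, -4)$ drives the objective from 25 toward 0; the point of the max-type reference is that intermediate iterations are allowed to increase the objective as long as they stay below the recent maximum.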
Oct-23-2025