Breaking the Nonsmooth Barrier: A Scalable Parallel Method for Composite Optimization
Fabian Pedregosa, Rémi Leblond, Simon Lacoste-Julien
Neural Information Processing Systems
Due to their simplicity and excellent performance, parallel asynchronous variants of stochastic gradient descent have become popular methods to solve a wide range of large-scale optimization problems on multi-core architectures. Yet, despite their practical success, support for nonsmooth objectives is still lacking, making them unsuitable for many problems of interest in machine learning, such as the Lasso, group Lasso, or empirical risk minimization with convex constraints.
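To make the nonsmooth setting concrete: the Lasso mentioned above is a composite problem whose nonsmooth L1 term is handled via its proximal operator (soft-thresholding) rather than a gradient. The sketch below is a minimal, sequential proximal gradient loop for the Lasso, not the paper's parallel asynchronous method; the function names and the fixed step size are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: shrinks each coordinate toward zero,
    # zeroing out entries smaller than t in magnitude (source of sparsity).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_prox_grad(A, b, lam, step, n_iters=500):
    # Proximal gradient descent for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # The smooth least-squares part is handled by a gradient step; the
    # nonsmooth L1 part is handled by the prox (soft-thresholding) step.
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                      # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)  # prox of nonsmooth part
    return x
```

A standard choice for `step` is `1 / L`, where `L` is the largest eigenvalue of `A.T @ A` (the Lipschitz constant of the smooth gradient).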