Reviews: Homotopy Smoothing for Non-Smooth Problems with Lower Complexity than O(1/\epsilon)
–Neural Information Processing Systems
The submission considers algorithms for solving a specific class of optimization problems, namely min_{x \in \Omega_1} F(x), where F(x) = g(x) + \max_{u \in \Omega_2} \langle Ax, u \rangle - \phi(u). Here, g is convex, \Omega_1 is closed and convex, \Omega_2 is closed, convex, and bounded, and the set of optimal solutions \Omega_* \subset \Omega_1 is convex, compact, and non-empty. The submission also assumes that the proximal mapping of g can be computed efficiently. This framework is general enough to capture a number of applications, including various regularized empirical loss minimization problems that arise in machine learning. Classic work of Nesterov combined a smooth approximation technique with accelerated proximal gradient descent to converge to a solution within epsilon of optimal in O(1/epsilon) iterations.
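For concreteness, a minimal sketch of the smoothing step underlying this line of work (not the submission's algorithm itself): the non-smooth function |x| = max_{|u| <= 1} u*x is replaced by f_mu(x) = max_{|u| <= 1} u*x - (mu/2)*u^2, which is smooth with a 1/mu-Lipschitz gradient (this is the Huber function). The function names below are illustrative, not taken from the paper.

```python
import numpy as np

def smoothed_abs(x, mu):
    # Nesterov smoothing of |x| = max_{|u|<=1} u*x, using the prox term (mu/2)*u^2:
    #   f_mu(x) = max_{|u|<=1} u*x - (mu/2)*u^2  (the Huber function).
    u_star = np.clip(x / mu, -1.0, 1.0)  # maximizer is available in closed form
    return u_star * x - 0.5 * mu * u_star**2

def smoothed_abs_grad(x, mu):
    # grad f_mu(x) = u_star(x); the gradient is (1/mu)-Lipschitz,
    # so first-order methods apply, and f_mu(x) <= |x| <= f_mu(x) + mu/2.
    return np.clip(x / mu, -1.0, 1.0)
```

Taking mu ~ epsilon makes the smoothed problem an epsilon-accurate surrogate, and running an accelerated gradient method on it yields the classical O(1/epsilon) complexity; the submission's homotopy scheme improves on this by shrinking mu in stages.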
Jan-20-2025, 18:26:00 GMT