Efficient Algorithms for Smooth Minimax Optimization
Neural Information Processing Systems
This paper studies first-order methods for solving smooth minimax optimization problems \min_x \max_y g(x,y), where g(\cdot,\cdot) is smooth and g(x,\cdot) is concave for each x. In terms of g(\cdot,y), we consider two settings -- strongly convex and nonconvex -- and improve upon the best known rates in both. When g(\cdot, y) is strongly convex for every y, we propose a new direct optimal algorithm combining Mirror-Prox and Nesterov's accelerated gradient descent (AGD), and show that it can find the global optimum in \widetilde{O}\left(1/k^{2}\right) iterations, improving over the current state-of-the-art rate of O(1/k). We use this result along with an inexact proximal point method to obtain an \widetilde{O}\left(1/k^{1/3}\right) rate for finding stationary points in the nonconvex setting, where g(\cdot, y) can be nonconvex. This improves over the current best-known rate of O(1/k^{1/5}).
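In the Euclidean setting, Mirror-Prox reduces to the extragradient method: each iteration takes a gradient descent-ascent probe step and then re-evaluates the gradients at the probe point to form the actual update. The sketch below is not the paper's combined algorithm; it only illustrates the extragradient building block on a toy strongly-convex-strongly-concave objective g(x,y) = x^2/2 + xy - y^2/2 (our illustrative choice), whose saddle point is (0, 0).

```python
def grad_x(x, y):
    # partial derivative of g(x, y) = x^2/2 + x*y - y^2/2 w.r.t. x
    return x + y

def grad_y(x, y):
    # partial derivative of g w.r.t. y
    return x - y

def extragradient(x, y, eta=0.1, iters=200):
    """Euclidean Mirror-Prox (extragradient) for min_x max_y g(x, y)."""
    for _ in range(iters):
        # probe step: descend in x, ascend in y
        xh = x - eta * grad_x(x, y)
        yh = y + eta * grad_y(x, y)
        # update step: use gradients evaluated at the probe point
        x = x - eta * grad_x(xh, yh)
        y = y + eta * grad_y(xh, yh)
    return x, y

x_star, y_star = extragradient(1.0, 1.0)
print(x_star, y_star)  # both converge toward the saddle point (0, 0)
```

For this strongly-convex-strongly-concave g, the extragradient iterates contract linearly toward the saddle point; plain simultaneous gradient descent-ascent on the bilinear part alone would instead spiral, which is why the lookahead probe step matters.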