A Single-Loop Accelerated Extra-Gradient Difference Algorithm with Improved Complexity Bounds for Constrained Minimax Optimization
–Neural Information Processing Systems
In this paper, we propose a novel extra-gradient difference acceleration algorithm for solving constrained nonconvex-nonconcave (NC-NC) minimax problems. In particular, we design a new extra-gradient difference step to obtain an important quasi-cocoercivity property, which plays a key role in significantly improving the convergence rate in the constrained NC-NC setting without additional structural assumptions. Momentum acceleration is also introduced into our dual accelerating update step. As special cases of the constrained NC-NC setting, our algorithm attains the same complexity \mathcal{O}(\epsilon^{-2}) for both the nonconvex-concave (NC-C) and convex-nonconcave (C-NC) cases, while the best-known complexity bounds are \widetilde{\mathcal{O}}(\epsilon^{-2.5}) for the NC-C case and \widetilde{\mathcal{O}}(\epsilon^{-4}) for the C-NC case. For a fair comparison with existing algorithms, we also analyze the complexity bound for finding an \epsilon-stationary point of the primal function \phi for the constrained NC-C problem, which shows that our algorithm improves the complexity bound from \widetilde{\mathcal{O}}(\epsilon^{-3}) to \mathcal{O}(\epsilon^{-2}).
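The extra-gradient difference step described above builds on the classical projected extra-gradient template for constrained minimax problems. As background only (this is not the paper's algorithm, and the step size, toy objective, and feasible set below are illustrative assumptions), here is a minimal sketch of that classical template on a bilinear saddle problem min_x max_y xy over the box [-1, 1]^2, whose saddle point is the origin:

```python
import numpy as np

# Toy problem: f(x, y) = x * y, constrained to the box [-1, 1]^2.
# The unique saddle point is (0, 0). This sketch illustrates only the
# classical projected extra-gradient template, not the paper's
# extra-gradient *difference* algorithm.

def proj(z, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]."""
    return float(np.clip(z, lo, hi))

def grad_x(x, y):  # partial derivative of f(x, y) = x * y in x
    return y

def grad_y(x, y):  # partial derivative in y
    return x

x, y = 0.8, -0.6       # illustrative starting point
eta = 0.5              # illustrative step size (must be < 1/L; here L = 1)
for _ in range(200):
    # Extrapolation (look-ahead) step: descend in x, ascend in y.
    x_half = proj(x - eta * grad_x(x, y))
    y_half = proj(y + eta * grad_y(x, y))
    # Update step, reusing the gradients at the look-ahead point.
    x = proj(x - eta * grad_x(x_half, y_half))
    y = proj(y + eta * grad_y(x_half, y_half))

print(f"final iterate: ({x:.2e}, {y:.2e})")  # converges toward (0, 0)
```

The two-step structure is what distinguishes extra-gradient from plain gradient descent-ascent, which diverges on this bilinear problem; the look-ahead gradient supplies the stabilizing correction that the paper's difference step strengthens into quasi-cocoercivity.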
Jan-19-2025, 21:12:27 GMT