An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization

Lesi Chen, Haishan Ye, Luo Luo

arXiv.org Artificial Intelligence 

This paper studies stochastic optimization for decentralized nonconvex-strongly-concave (NC-SC) minimax problems over a multi-agent network. We propose an efficient algorithm, called the Decentralized Recursive-gradient descEnt Ascent Method (DREAM), which achieves the best-known theoretical guarantees for finding an $\epsilon$-stationary point of the primal function. The proposed method requires $\mathcal{O}(\min(\kappa^3\epsilon^{-3},\sqrt{N} \kappa^2 \epsilon^{-2}))$ stochastic first-order oracle (SFO) calls and $\tilde{\mathcal{O}}(\kappa^2 \epsilon^{-2})$ communication rounds, where $\kappa$ is the condition number and $N$ is the number of component functions in the offline (finite-sum) setting. DREAM attains the best-known complexity in both the online and offline setups.
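To make the algorithmic ingredients concrete, the sketch below illustrates the general pattern combined in methods of this kind: local stochastic descent-ascent updates, a recursive (STORM-style) variance-reduced gradient estimator, and gossip averaging over a mixing matrix. It is a hypothetical toy example, not the authors' DREAM implementation; the quadratic local objectives, ring-topology mixing matrix $W$, and all step sizes are illustrative assumptions.

```python
# Hypothetical sketch: decentralized stochastic gradient descent-ascent with
# recursive (STORM-style) variance reduction and gossip averaging.
# NOT the authors' DREAM algorithm; objective, topology, and step sizes are toy choices.
import numpy as np

rng = np.random.default_rng(0)
n_agents, d = 4, 5                  # agents in the network, primal dimension
T, eta_x, eta_y, beta = 200, 0.02, 0.05, 0.1

# Ring-topology doubly stochastic mixing matrix W (gossip weights).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

# Toy local stochastic gradients of f_i(x, y) = <a_i, x> y - 0.5 * y^2 (strongly concave in y).
A = rng.normal(size=(n_agents, d))
def grad_x(i, x, y):
    return A[i] * y + 0.01 * rng.normal(size=d)
def grad_y(i, x, y):
    return A[i] @ x - y + 0.01 * rng.normal()

x = np.zeros((n_agents, d))         # local primal iterates
y = np.zeros(n_agents)              # local dual iterates
gx = np.array([grad_x(i, x[i], y[i]) for i in range(n_agents)])
gy = np.array([grad_y(i, x[i], y[i]) for i in range(n_agents)])

for t in range(T):
    x_prev, y_prev = x.copy(), y.copy()
    # Gossip step: average iterates with neighbors, then move along the
    # tracked (variance-reduced) gradient directions.
    x = W @ x - eta_x * gx
    y = W @ y + eta_y * gy
    # Recursive gradient estimator: correct the previous estimate with the
    # gradient difference between consecutive iterates.
    for i in range(n_agents):
        gx[i] = grad_x(i, x[i], y[i]) + (1 - beta) * (gx[i] - grad_x(i, x_prev[i], y_prev[i]))
        gy[i] = grad_y(i, x[i], y[i]) + (1 - beta) * (gy[i] - grad_y(i, x_prev[i], y_prev[i]))

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
```

The recursive correction term is what drives the improved SFO complexity of variance-reduced methods relative to plain stochastic gradient descent-ascent, while the gossip step controls consensus error across agents; the communication count quoted in the abstract refers to rounds of such neighbor averaging.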
