Near-Optimal Distributed Minimax Optimization under the Second-Order Similarity
Neural Information Processing Systems
This paper considers distributed convex-concave minimax optimization under the second-order similarity condition. We propose the stochastic variance-reduced optimistic gradient sliding (SVOGS) method, which takes advantage of the finite-sum structure of the objective via mini-batch client sampling and variance reduction.
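The abstract names two ingredients, mini-batch client sampling and variance reduction, inside an optimistic gradient scheme. The following is a minimal sketch of those ingredients on a toy strongly-convex-strongly-concave saddle problem; it is a generic variance-reduced optimistic gradient loop, not the paper's SVOGS algorithm, and all problem data (`A`, `mu`, step size, batch size) are illustrative assumptions.

```python
import numpy as np

# Toy finite-sum saddle problem over n "clients":
#   f_i(x, y) = (mu/2)||x||^2 + x^T A_i y - (mu/2)||y||^2
# The monotone operator is F(z) = (grad_x f, -grad_y f), with root z* = 0.
rng = np.random.default_rng(0)
n, d, mu = 20, 5, 0.5
A = [rng.standard_normal((d, d)) for _ in range(n)]

def F_i(i, z):
    x, y = z[:d], z[d:]
    return np.concatenate([mu * x + A[i] @ y, mu * y - A[i].T @ x])

def F_full(z):
    return sum(F_i(i, z) for i in range(n)) / n

z = rng.standard_normal(2 * d)
res0 = np.linalg.norm(F_full(z))       # initial residual, for reference
g_prev = F_full(z)                     # previous gradient (optimistic term)
snapshot, g_snap = z.copy(), g_prev.copy()  # SVRG-style anchor point
eta, batch = 0.05, 5

for t in range(200):
    idx = rng.choice(n, size=batch, replace=False)  # mini-batch client sampling
    # Variance-reduced estimator: F(snapshot) + mean_i [F_i(z) - F_i(snapshot)];
    # unbiased, with variance shrinking as z approaches the snapshot.
    g = g_snap + sum(F_i(i, z) - F_i(i, snapshot) for i in idx) / batch
    # Optimistic gradient step: extrapolate using the previous estimate.
    z = z - eta * (2 * g - g_prev)
    g_prev = g
    if t % 20 == 0:                                 # periodically refresh anchor
        snapshot, g_snap = z.copy(), F_full(z)

res = np.linalg.norm(F_full(z))
print(res0, res)  # residual should shrink substantially toward the saddle point
```

Strong monotonicity (from the `mu` terms) is what makes this toy loop converge linearly; the paper's setting is the harder general convex-concave case under second-order similarity.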