Optimal Extragradient-Based Algorithms for Stochastic Variational Inequalities with Separable Structure
Huizhuo Yuan, Chris Junchi Li, Michael I. Jordan
Neural Information Processing Systems
We consider the problem of solving stochastic monotone variational inequalities (VIs) with a separable structure using a stochastic first-order oracle. Building on the standard extragradient method for VIs, we propose a novel algorithm, stochastic accelerated gradient-extragradient (AG-EG), for strongly monotone VIs. Our approach combines the strengths of extragradient and Nesterov acceleration. By showing that its iterates remain in a bounded domain and applying scheduled restarting, we prove that AG-EG achieves an optimal convergence rate for strongly monotone VIs.
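To make the baseline concrete, below is a minimal sketch of the classical stochastic extragradient step that AG-EG builds on. The `oracle` interface, step size, and the toy operator in the usage example are illustrative assumptions; the paper's Nesterov-acceleration terms and scheduled restarting are not reproduced here.

```python
import numpy as np


def stochastic_extragradient(oracle, x0, step_size, num_iters, rng=None):
    """Baseline stochastic extragradient (EG) for a monotone VI with operator F.

    `oracle(x, rng)` is assumed to return an unbiased stochastic estimate of F(x);
    the interface and parameter names are illustrative, not taken from the paper.
    """
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        # Extrapolation (half) step using one stochastic sample of F.
        x_half = x - step_size * oracle(x, rng)
        # Update step: evaluate the operator at the extrapolated point.
        x = x - step_size * oracle(x_half, rng)
    return x


if __name__ == "__main__":
    # Toy strongly monotone operator F(z) = mu * z plus Gaussian noise (illustrative);
    # its unique VI solution is z* = 0.
    mu = 1.0
    rng = np.random.default_rng(0)
    oracle = lambda z, r: mu * z + 0.1 * r.standard_normal(z.shape)
    z_final = stochastic_extragradient(oracle, x0=np.ones(2), step_size=0.1,
                                       num_iters=1000, rng=rng)
    print(z_final)  # expected to be close to z* = 0
```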