Learning Variational Inequalities from Data: Fast Generalization Rates under Strong Monotonicity
Eric Zhao, Tatjana Chavdarova, Michael Jordan
Variational inequalities (VIs) are a broad class of optimization problems encompassing machine learning tasks ranging from standard convex minimization to more complex settings like min-max optimization and computing the equilibria of multi-player games. In convex optimization, strong convexity enables fast statistical learning rates, requiring only $\Theta(1/\epsilon)$ stochastic first-order oracle calls to find an $\epsilon$-optimal solution, rather than the standard $\Theta(1/\epsilon^2)$ calls. In this paper, we explain how one can similarly obtain fast $\Theta(1/\epsilon)$ rates for learning VIs that satisfy strong monotonicity, a generalization of strong convexity. Specifically, we demonstrate that standard stability-based generalization arguments for convex minimization extend directly to VIs when the domain admits a small covering, or when the operator is integrable and suboptimality is measured by potential functions, such as when finding equilibria in multi-player games.
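To make the setting concrete, the following is a minimal sketch (not the paper's method) of a strongly monotone VI solved with a stochastic first-order oracle. The operator $F(z) = Az + b$, with $A$ the sum of $\mu I$ and a skew-symmetric matrix, is $\mu$-strongly monotone; running a stochastic forward step with step size $1/(\mu t)$ illustrates the $O(1/t)$ (i.e., $\Theta(1/\epsilon)$ oracle calls) convergence in squared distance to the solution. The specific matrices and noise level here are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 1.0
S = np.array([[0.0, 1.0], [-1.0, 0.0]])  # skew-symmetric: monotone but not strongly
A = mu * np.eye(2) + S                   # <Az - Az', z - z'> >= mu * ||z - z'||^2
b = np.array([1.0, -2.0])
z_star = np.linalg.solve(A, -b)          # unique solution of the VI: F(z*) = 0

z = np.zeros(2)
for t in range(1, 20001):
    noise = rng.normal(scale=0.1, size=2)  # stochastic first-order oracle call
    g = A @ z + b + noise
    z -= (1.0 / (mu * t)) * g              # 1/(mu t) step size exploits strong monotonicity

err = float(np.linalg.norm(z - z_star) ** 2)
```

Under strong monotonicity the squared error contracts at rate $O(1/t)$; without the $\mu I$ term (a purely skew, merely monotone operator), this plain forward iteration can fail to converge, which is why the fast rates require the stronger assumption.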
Dec-10-2024