Learning Variational Inequalities from Data: Fast Generalization Rates under Strong Monotonicity
Eric Zhao, Tatjana Chavdarova, Michael Jordan
Variational inequalities (VIs) are a broad class of optimization problems encompassing machine learning settings ranging from standard convex minimization to more complex scenarios like min-max optimization and computing the equilibria of multi-player games. In convex optimization, strong convexity allows for fast statistical learning rates, requiring only $\Theta(1/\epsilon)$ stochastic first-order oracle calls to find an $\epsilon$-optimal solution rather than the standard $\Theta(1/\epsilon^2)$ calls. In this paper, we explain how one can similarly obtain fast $\Theta(1/\epsilon)$ rates for learning VIs that satisfy strong monotonicity, a generalization of strong convexity. Specifically, we demonstrate that standard stability-based generalization arguments for convex minimization extend directly to VIs when the domain admits a small covering, or when the operator is integrable and suboptimality is measured by potential functions, as when finding equilibria in multi-player games.
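For reference, the standard definitions behind the abstract (the notation $F$, $\mathcal{Z}$, $\mu$ is chosen here for illustration, not taken from the paper): a VI over a convex set $\mathcal{Z}$ asks for $z^\star \in \mathcal{Z}$ such that $\langle F(z^\star),\, z - z^\star \rangle \ge 0$ for all $z \in \mathcal{Z}$, and an operator $F$ is $\mu$-strongly monotone if $\langle F(z) - F(z'),\, z - z' \rangle \ge \mu \| z - z' \|^2$ for all $z, z' \in \mathcal{Z}$; strong convexity is recovered as the special case $F = \nabla f$ with $f$ a $\mu$-strongly convex function.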
Dec-10-2024