High-probability complexity guarantees for nonconvex minimax problems
– Neural Information Processing Systems
Stochastic smooth nonconvex minimax problems are prevalent in machine learning, e.g., GAN training, fair classification, and distributionally robust learning. Stochastic gradient descent ascent (GDA)-type methods are popular in practice due to their simplicity and single-loop nature. However, a significant gap remains between theory and practice regarding high-probability complexity guarantees for these methods on stochastic nonconvex minimax problems. Existing high-probability bounds for GDA-type single-loop methods apply only to convex/concave minimax problems and to particular non-monotone variational inequality problems under restrictive assumptions. In this work, we address this gap by providing the first high-probability complexity guarantees for nonconvex/PL minimax problems, i.e., smooth minimax problems whose objective satisfies the Polyak-Łojasiewicz (PL) condition in the dual variable.
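For concreteness, the problem class the abstract describes can be written in the following standard form; the notation below is the usual one for this setting and is assumed here rather than quoted from the paper.

```latex
% Stochastic smooth minimax template: f is nonconvex in x and
% satisfies the PL condition in y (standard notation, assumed).
\min_{x \in \mathbb{R}^{d_1}} \; \max_{y \in \mathbb{R}^{d_2}} \;
  f(x, y) := \mathbb{E}_{\xi \sim \mathcal{D}} \big[ F(x, y; \xi) \big],
\qquad
\tfrac{1}{2} \big\| \nabla_y f(x, y) \big\|^2
  \;\ge\; \mu \Big( \max_{y'} f(x, y') - f(x, y) \Big)
  \quad \text{for all } x, y .
```

The sketch below shows the single-loop stochastic GDA template the abstract refers to, assuming user-supplied stochastic gradient oracles and a two-timescale step-size choice; all names, defaults, and the toy example are illustrative, not the paper's method or API.

```python
import numpy as np

def stochastic_gda(grad_x, grad_y, x0, y0, lr_x=1e-3, lr_y=1e-2,
                   iters=10_000, seed=0):
    """Single-loop stochastic gradient descent ascent (GDA), sketched.

    grad_x(x, y, rng) and grad_y(x, y, rng) are assumed to return
    unbiased stochastic gradient estimates of f at (x, y) w.r.t. the
    primal and dual variables; lr_y > lr_x reflects the common
    two-timescale choice for nonconvex-PL objectives.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    y = np.asarray(y0, dtype=float).copy()
    for _ in range(iters):
        gx = grad_x(x, y, rng)   # stochastic gradient in x
        gy = grad_y(x, y, rng)   # stochastic gradient in y
        x = x - lr_x * gx        # descent step on the primal variable
        y = y + lr_y * gy        # ascent step on the dual variable
    return x, y


# Usage on a toy quadratic f(x, y) = 0.5*x**2 + x*y - 0.5*y**2,
# which is strongly concave (hence PL) in y, with additive gradient noise:
if __name__ == "__main__":
    gx = lambda x, y, rng: x + y + 0.1 * rng.standard_normal()
    gy = lambda x, y, rng: x - y + 0.1 * rng.standard_normal()
    print(stochastic_gda(gx, gy, 1.0, 1.0))
```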