Reviews: Efficient Algorithms for Smooth Minimax Optimization

Neural Information Processing Systems 

This paper aims to solve the saddle point problem \min_{x} \max_{y} g(x, y) where g(x, \cdot) is concave for each x, and g(\cdot, y) is either strongly convex or nonconvex for each y . The authors introduce a new algorithm called DIAG, which combines Mirror-Prox and Nesterov's AGD, to solve the minimax problem. For the case that g is strongly convex with respect to x, the authors show that DIAG has a convergence rate of \mathcal{O}(1/k^2) when the suboptimality of a pair of points (\hat{x},\hat{y}) is defined as the primal-dual optimality gap \max_{y} g(\hat{x},y) - \min_{x} g(x, \hat{y}) . For the case that g is nonconvex with respect to x, the authors show that DIAG finds an \epsilon -first-order stationary point (FOSP) after at most \mathcal{O}(1/\epsilon^3) gradient evaluations. The convergence criterion considered in this paper for solving a strongly convex-concave problem is interesting.
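
As a toy illustration (not taken from the paper), the primal-dual optimality gap \max_{y} g(\hat{x},y) - \min_{x} g(x, \hat{y}) can be evaluated in closed form for a simple strongly convex-concave quadratic; the objective g and the helper function below are hypothetical choices for this sketch:

```python
def g(x, y):
    # Toy strongly convex-concave objective: g(x, y) = x^2/2 + x*y - y^2/2
    # (strongly convex in x for fixed y, strongly concave in y for fixed x)
    return 0.5 * x**2 + x * y - 0.5 * y**2

def primal_dual_gap(x_hat, y_hat):
    # Closed-form inner optima for this particular g:
    #   argmax_y g(x_hat, y) is y = x_hat  (set dg/dy = x_hat - y = 0)
    #   argmin_x g(x, y_hat) is x = -y_hat (set dg/dx = x + y_hat = 0)
    primal = g(x_hat, x_hat)   # = x_hat^2
    dual = g(-y_hat, y_hat)    # = -y_hat^2
    return primal - dual       # = x_hat^2 + y_hat^2 >= 0
```

The gap is nonnegative and vanishes exactly at the saddle point (0, 0), which is why it serves as a natural suboptimality measure in the strongly convex-concave setting.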