
Collaborating Authors

 Kiran K. Thekumparampil



Efficient Algorithms for Smooth Minimax Optimization

Neural Information Processing Systems

In terms of g(·, y), we consider two settings, strongly convex and nonconvex, and improve upon the best known rates in both. For strongly convex g(·, y), for every y, we propose a new direct optimal algorithm combining Mirror-Prox and Nesterov's AGD, and show that it can find the global optimum in Õ(1/k
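The abstract above mentions combining Mirror-Prox with Nesterov's AGD. As a minimal illustration of the Mirror-Prox (extragradient) ingredient only, here is a hedged sketch for a smooth minimax problem min_x max_y g(x, y); the function names and step size are illustrative assumptions, and the paper's actual method additionally wraps this with Nesterov-style acceleration, which is not reproduced here.

```python
# Sketch of the extragradient (Mirror-Prox) update for min_x max_y g(x, y).
# grad_x / grad_y are the partial gradients of g; eta is an illustrative
# step size, not a tuned value from the paper.

def extragradient(grad_x, grad_y, x, y, eta=0.1, iters=2000):
    for _ in range(iters):
        # Look-ahead step: evaluate gradients at the current point.
        xh = x - eta * grad_x(x, y)
        yh = y + eta * grad_y(x, y)
        # Actual update: reuse gradients evaluated at the look-ahead point.
        x = x - eta * grad_x(xh, yh)
        y = y + eta * grad_y(xh, yh)
    return x, y

# Toy bilinear saddle g(x, y) = x * y with unique saddle point (0, 0);
# plain gradient descent-ascent cycles on this problem, while the
# extragradient iterates contract toward the saddle point.
x, y = extragradient(lambda x, y: y, lambda x, y: x, 1.0, 1.0)
```

On the bilinear example, each extragradient iteration shrinks the iterate norm by a factor strictly below one for small eta, which is exactly the behavior that distinguishes it from naive simultaneous gradient descent-ascent.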


