Accelerated Mini-Batch Stochastic Dual Coordinate Ascent
Shai Shalev-Shwartz, Tong Zhang
Stochastic dual coordinate ascent (SDCA) is an effective technique for solving regularized loss minimization problems in machine learning. This paper considers an extension of SDCA to the mini-batch setting that is often used in practice. Our main contribution is to introduce an accelerated mini-batch version of SDCA and to prove a fast convergence rate for this method. We discuss an implementation of our method on a parallel computing system, and compare its results with both vanilla stochastic dual coordinate ascent and the accelerated deterministic gradient method of Nesterov (2007).
May 12, 2013
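To make the setting concrete, here is a minimal sketch of the non-accelerated mini-batch SDCA baseline that the paper builds on, specialized to ridge regression, where the dual coordinate step has a closed form. This is an illustrative assumption on my part, not the paper's ASDCA algorithm: the function name, parameter defaults, and the conservative averaging of the batch steps are all my choices, and the Nesterov-style momentum step that gives the paper its accelerated rate is omitted.

```python
import numpy as np

def minibatch_sdca_ridge(X, y, lam=0.1, batch_size=8, epochs=20, seed=0):
    """Naive mini-batch SDCA sketch for ridge regression:
        min_w (1/n) * sum_i 0.5*(w @ x_i - y_i)**2 + (lam/2)*||w||^2.
    Maintains dual variables alpha and the primal iterate
    w = X.T @ alpha / (lam * n).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    sq_norms = np.einsum("ij,ij->i", X, X)  # precomputed ||x_i||^2

    for _ in range(epochs):
        for _ in range(max(1, n // batch_size)):
            batch = rng.choice(n, size=batch_size, replace=False)
            # Closed-form dual coordinate step for the squared loss,
            # all computed at the same w (embarrassingly parallel).
            margins = X[batch] @ w
            delta = (y[batch] - alpha[batch] - margins) / (
                1.0 + sq_norms[batch] / (lam * n))
            # Average the steps across the batch: the result is a convex
            # combination of single-coordinate SDCA updates, so the
            # (concave) dual objective cannot decrease.
            delta /= batch_size
            alpha[batch] += delta
            w += X[batch].T @ delta / (lam * n)
    return w, alpha
```

Averaging the per-coordinate steps is the safe but pessimistic choice: the effective progress per iteration is roughly that of a single coordinate update. The point of the paper's accelerated variant is to replace this conservative scaling with a momentum-based scheme whose convergence rate provably improves as the mini-batch size grows.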