Fast Rates for Regularized Objectives
Sridharan, Karthik; Shalev-Shwartz, Shai; Srebro, Nathan
Neural Information Processing Systems
We study convergence properties of empirical minimization of a stochastic strongly convex objective, where the stochastic component is linear. We show that the value attained by the empirical minimizer converges to the optimal value with rate 1/n. The result applies, in particular, to the SVM objective. Thus, we obtain a rate of 1/n on the convergence of the SVM objective (with fixed regularization parameter) to its infinite data limit. We demonstrate how this is essential for obtaining a certain type of oracle inequality for SVMs.
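The claim can be illustrated numerically. The sketch below (not from the paper; the toy 1-D data distribution, the grid-search solver, and the large-sample proxy for the population objective are all illustrative assumptions) minimizes the regularized SVM objective, (lam/2)||w||^2 plus average hinge loss, on samples of increasing size n and measures the suboptimality of the empirical minimizer relative to an approximate infinite-data optimum:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5                           # fixed regularization parameter (strong-convexity modulus); illustrative choice
ws = np.linspace(-5.0, 5.0, 2001)   # candidate weights; a 1-D model keeps the search trivial

def sample(n):
    # toy data (an assumption, not the paper's setup): label y in {-1,+1}, feature x ~ N(y, 1)
    y = rng.choice([-1.0, 1.0], size=n)
    return y + rng.normal(size=n), y

def objective(w, x, y):
    # regularized SVM objective: (lam/2) w^2 + mean hinge loss
    return 0.5 * lam * w * w + np.mean(np.maximum(0.0, 1.0 - y * x * w))

def empirical_minimizer(x, y):
    # exhaustive grid search suffices in one dimension
    return ws[np.argmin([objective(w, x, y) for w in ws])]

# a large held-out sample serves as a proxy for the population objective F
x_pop, y_pop = sample(100_000)
w_star = empirical_minimizer(x_pop, y_pop)
F = lambda w: objective(w, x_pop, y_pop)

mean_gap = {}
for n in (100, 6400):
    # average the excess objective F(w_hat_n) - F(w*) over repeated draws
    gaps = [F(empirical_minimizer(*sample(n))) - F(w_star) for _ in range(20)]
    mean_gap[n] = float(np.mean(gaps))
    print(n, mean_gap[n])
```

Under the theorem, the average gap should shrink on the order of 1/(lam * n); multiplying n by 64 should shrink it roughly 64-fold, markedly faster than the 1/sqrt(n) behavior typical without strong convexity.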
Dec-31-2009