Beyond Tikhonov: Faster Learning with Self-Concordant Losses via Iterative Regularization

Neural Information Processing Systems 

The theory of spectral filtering is a remarkable tool for understanding the statistical properties of learning with kernels. For least squares, it allows one to derive various regularization schemes that yield faster convergence rates of the excess risk than Tikhonov regularization. This is typically achieved by leveraging classical assumptions called source and capacity conditions, which characterize the difficulty of the learning task. In order to understand estimators derived from other loss functions, Marteau-Ferey et al.
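The contrast the abstract draws between Tikhonov regularization and alternative spectral-filtering schemes can be illustrated on kernel least squares. The sketch below, a minimal illustration and not the paper's construction, compares Tikhonov (kernel ridge) with iterative regularization via early-stopped gradient descent, where the number of iterations plays the role of an inverse regularization parameter; the kernel, bandwidth, step size, and stopping time are all illustrative assumptions.

```python
import numpy as np

# Toy 1-D regression data (illustrative choices throughout).
rng = np.random.default_rng(0)
n = 60
X = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(n)

def gaussian_kernel(A, B, gamma=5.0):
    """Gaussian kernel matrix between row-sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = gaussian_kernel(X, X)

# Tikhonov regularization: solve (K + n*lam*I) alpha = y.
lam = 1e-3
alpha_tik = np.linalg.solve(K + n * lam * np.eye(n), y)

# Iterative regularization: gradient descent on the least-squares
# fit, stopped early; the iteration count t acts like 1/lam.
step = 1.0 / np.linalg.eigvalsh(K).max()  # step size for stability
alpha_it = np.zeros(n)
for _ in range(500):  # early stopping after a fixed number of steps
    alpha_it += step * (y - K @ alpha_it)

mse_tik = np.mean((K @ alpha_tik - y) ** 2)
mse_it = np.mean((K @ alpha_it - y) ** 2)
print(f"Tikhonov train MSE: {mse_tik:.4f}, early-stopped GD train MSE: {mse_it:.4f}")
```

Both estimators shrink the contribution of small kernel eigendirections, Tikhonov via the filter mu/(mu + n*lam) and early stopping via 1 - (1 - step*mu)^t; the spectral-filtering viewpoint treats both as instances of one family, which is what enables the rate comparisons the abstract refers to.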
