Globally Convergent Newton Methods for Ill-conditioned Generalized Self-concordant Losses

Neural Information Processing Systems 

Second, in the non-parametric machine learning setting, we provide an explicit algorithm combining the previous scheme with Nyström projection techniques, and prove that it achieves optimal generalization bounds with a time complexity of order O(n dfλ), a memory complexity of order O(dfλ²), and no dependence on the condition number, generalizing the results known for least-squares regression. Here n is the number of observations and dfλ is the associated degrees of freedom.
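To give a concrete sense of the Nyström projection idea mentioned above, the sketch below shows it in its simplest setting: kernel ridge regression restricted to the span of m sampled landmark points, which cuts the cost from O(n³) to roughly O(n m²). This is an illustrative standalone example, not the paper's algorithm (which combines such projections with a globally convergent Newton scheme for general self-concordant losses); all function names and parameters here are invented for the demonstration.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def nystrom_krr(X, y, m=50, lam=1e-3, sigma=1.0, seed=0):
    # Nystrom-projected kernel ridge regression: restrict the estimator
    # to the span of m uniformly sampled landmarks, so only the n x m
    # cross-kernel and the m x m landmark kernel are ever formed.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    Z = X[idx]                           # landmark points
    Knm = gaussian_kernel(X, Z, sigma)   # n x m cross-kernel
    Kmm = gaussian_kernel(Z, Z, sigma)   # m x m landmark kernel
    n = len(X)
    # Projected, ridge-regularized normal equations; the small jitter
    # keeps the m x m system numerically well posed.
    A = Knm.T @ Knm + n * lam * Kmm + 1e-10 * np.eye(len(Z))
    alpha = np.linalg.solve(A, Knm.T @ y)
    return Z, alpha, sigma

def predict(Z, alpha, sigma, Xtest):
    # Evaluate the Nystrom estimator at new points.
    return gaussian_kernel(Xtest, Z, sigma) @ alpha
```

For the quadratic loss this already recovers the known least-squares results the abstract refers to; the paper's contribution is extending the same complexity profile to general self-concordant losses (e.g. logistic) without paying for the condition number.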
