Globally Convergent Newton Methods for Ill-conditioned Generalized Self-concordant Losses
– Neural Information Processing Systems
Second, in the non-parametric machine learning setting, we provide an explicit algorithm combining the previous scheme with Nyström projection techniques, and prove that it achieves optimal generalization bounds with a time complexity of order O(n dfλ), a memory complexity of order O(dfλ²) and no dependence on the condition number, generalizing the results known for least squares regression. Here n is the number of observations and dfλ is the associated degrees of freedom.
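A minimal sketch of the kind of scheme described above: Nyström projection onto m landmark points followed by damped Newton iterations on a regularized logistic loss (a generalized self-concordant loss). All names, the Gaussian kernel choice, and the landmark count m are illustrative assumptions, not the paper's exact algorithm or step-size rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification data (illustrative assumption).
n, d, m, lam = 200, 5, 20, 1e-3
X = rng.standard_normal((n, d))
y = (X[:, 0] + 0.5 * rng.standard_normal(n) > 0).astype(float)

def gauss_kernel(A, B, gamma=0.5):
    # Gaussian kernel; any positive-definite kernel would do here.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Nyström projection: m landmarks give features Phi with Phi @ Phi.T ≈ K,
# so all linear algebra below is m-dimensional instead of n-dimensional.
idx = rng.choice(n, m, replace=False)
Kmm = gauss_kernel(X[idx], X[idx])
Knm = gauss_kernel(X, X[idx])
U, s, _ = np.linalg.svd(Kmm)
Phi = Knm @ U @ np.diag(1.0 / np.sqrt(s + 1e-12)) @ U.T  # n x m

# Damped Newton iterations on the regularized logistic loss in the
# projected space; each step costs O(n m^2) time and O(m^2) memory.
w = np.zeros(m)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-(Phi @ w)))      # sigmoid predictions
    g = Phi.T @ (p - y) / n + lam * w          # gradient
    W = p * (1.0 - p)                          # Hessian weights
    H = (Phi * W[:, None]).T @ Phi / n + lam * np.eye(m)
    step = np.linalg.solve(H, g)
    # Damping by the Newton decrement (a standard choice for
    # self-concordant-type losses) makes the method globally convergent.
    dec = np.sqrt(step @ g)
    w -= step / (1.0 + dec)

acc = ((Phi @ w > 0) == (y > 0.5)).mean()
print(f"training accuracy: {acc:.2f}")
```

With m of order dfλ, the per-iteration cost matches the O(n dfλ) time and O(dfλ²) memory scaling quoted in the abstract; the decrement-damped step is one standard globalization for self-concordant-type losses, not necessarily the paper's.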