On the Asymptotic Learning Curves of Kernel Ridge Regression under Power-law Decay

Neural Information Processing Systems 

The widely observed 'benign overfitting' phenomenon in the neural network literature challenges the 'bias-variance trade-off' doctrine in statistical learning theory. Since the generalization ability of a 'lazily trained' over-parametrized neural network can be well approximated by that of the neural tangent