Data driven estimation of Laplace-Beltrami operator

Frederic Chazal, Ilaria Giulini, Bertrand Michel

Neural Information Processing Systems

Approximations of Laplace-Beltrami operators on manifolds through graph Laplacians have become popular tools in data analysis and machine learning. These discretized operators usually depend on bandwidth parameters whose tuning remains a theoretical and practical problem.
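A minimal sketch of the kind of discretized operator the abstract refers to: an unnormalized graph Laplacian built from a Gaussian kernel on sampled points, where the bandwidth `h` is exactly the parameter whose tuning the paper studies. The construction below is a generic textbook version, not the paper's specific estimator.

```python
import numpy as np

def graph_laplacian(X, h):
    """Unnormalized graph Laplacian L = D - W from a Gaussian kernel
    with bandwidth h (h is the tuning parameter discussed above)."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (2 * h ** 2))
    np.fill_diagonal(W, 0.0)              # no self-loops
    D = np.diag(W.sum(axis=1))            # degree matrix
    return D - W

# Points sampled near a circle: a simple 1-D manifold embedded in R^2.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2 * np.pi, 200)
X = np.column_stack([np.cos(theta), np.sin(theta)])
L = graph_laplacian(X, h=0.3)             # h = 0.3 is an arbitrary choice here
```

As `h` shrinks the graph disconnects and the estimate becomes noisy; as it grows the operator oversmooths, which is why data-driven bandwidth selection matters.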


Active Learning for Non-Parametric Regression Using Purely Random Trees

Jack Goetz, Ambuj Tewari, Paul Zimmerman

Neural Information Processing Systems

Active learning is the task of using labelled data to select additional points to label, with the goal of fitting the most accurate model with a fixed budget of labelled points. In binary classification, active learning is known to produce faster rates than passive learning for a broad range of settings. However, in regression, restrictive structure and tailored methods were previously needed to obtain theoretically superior performance. In this paper we propose an intuitive tree-based active learning algorithm for non-parametric regression with provable improvement over random sampling. When implemented with Mondrian Trees, our algorithm is tuning-parameter-free, consistent, and minimax optimal for Lipschitz functions.
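To illustrate the general idea of tree-based active regression (not the paper's exact algorithm): partition the input space with label-independent random cuts, spend a small pilot budget per leaf, then allocate the remaining labels in proportion to each leaf's observed label spread. The target function, noise level, cut count, and allocation rule below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
f = lambda x: np.where(x < 0.5, 0.1 * x, np.sin(8 * x))  # flat half, wiggly half
noise = 0.05

# A "purely random" partition of [0, 1]: leaf boundaries drawn at random,
# independently of any labels (a stand-in for one purely random tree).
cuts = np.sort(rng.uniform(0.0, 1.0, 7))
edges = np.concatenate([[0.0], cuts, [1.0]])
n_leaves = len(edges) - 1

def sample_leaf(k, n):
    """Draw n labelled points uniformly from leaf k."""
    x = rng.uniform(edges[k], edges[k + 1], n)
    return x, f(x) + noise * rng.normal(size=n)

pilot = 5                                  # pilot labels per leaf
ys = [sample_leaf(k, pilot)[1] for k in range(n_leaves)]
sd = np.array([y.std(ddof=1) for y in ys]) # per-leaf label spread

# Spend the remaining budget proportionally to each leaf's spread, so the
# estimator refines hard (high-variance) leaves before easy (flat) ones.
budget = 200
alloc = np.maximum(1, np.round(budget * sd / sd.sum())).astype(int)
```

Passive (random) sampling would instead spread the 200 labels uniformly, wasting many of them on leaves where the function is nearly constant.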



On the Asymptotic Learning Curves of Kernel Ridge Regression under Power-law Decay

Neural Information Processing Systems

The widely observed 'benign overfitting phenomenon' in the neural network literature raises a challenge to the 'bias-variance trade-off' doctrine in statistical learning theory. Since the generalization ability of the 'lazy trained' over-parametrized neural network can be well approximated by that of the neural tangent