Nonparametric Estimation via Variance-Reduced Sketching

Khoo, Yuehaw, Peng, Yifan, Wang, Daren

Nonparametric models have extensive applications across diverse fields, including biology (MacFarland et al. (2016)), economics (Ullah and Pagan (1999); Li and Racine (2023)), engineering (Lanzante (1996)), and machine learning (Hofmann et al. (2008); Schmidt-Hieber (2020)). The most representative nonparametric approaches are kernel methods, which are known for their numerical robustness and statistical stability in lower-dimensional settings but often suffer from the curse of dimensionality in higher-dimensional spaces. Recently, a number of significant studies have tackled modern challenges in nonparametric models. For example, Ravikumar et al. (2009), Raskutti et al. (2012), and Yuan and Zhou (2016) studied additive models for high-dimensional nonparametric regression; Zhang et al. (2015) and Yang et al. (2017) analyzed randomized algorithms for kernel regression estimation; and Liu et al. (2007) explored nonparametric density estimation in higher dimensions. Despite these contributions, the curse of dimensionality in nonparametric problems, particularly with respect to statistical accuracy and computational efficiency, remains an open area for further exploration.

In this paper, we develop a new framework specifically designed for nonparametric estimation problems. Within this framework, we conceptualize functions as matrices or tensors and explore new methods for handling the bias-variance trade-off, with the aim of reducing the curse of dimensionality in higher dimensions.

Matrix approximation algorithms, such as the singular value decomposition and the QR decomposition, play a crucial role in computational mathematics and statistics.
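To make the functions-as-matrices viewpoint concrete, the following minimal NumPy sketch is offered as our own illustration rather than the estimator developed in the paper; the test function f, the grid size n, the rank r, and the Gaussian sketching matrix are all hypothetical choices. It evaluates a smooth bivariate function on a grid, treats the resulting array as a matrix, and compresses it in two standard ways: a truncated SVD and a randomized QR-based sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical smooth test function; stands in for any bivariate f that is
# well approximated by a low-rank expansion.
def f(x, y):
    return np.exp(-(x - y) ** 2) + 0.5 * np.sin(np.pi * x) * np.cos(np.pi * y)

n = 200
grid = np.linspace(0.0, 1.0, n)
F = f(grid[:, None], grid[None, :])   # the function, viewed as an n x n matrix

# (a) Deterministic compression: rank-r truncated SVD.
U, s, Vt = np.linalg.svd(F, full_matrices=False)
r = 5
F_svd = (U[:, :r] * s[:r]) @ Vt[:r, :]

# (b) Randomized compression: sketch the column space with a Gaussian test
# matrix, orthonormalize via QR, then project onto that basis.
Omega = rng.standard_normal((n, r + 5))   # small oversampling beyond rank r
Q, _ = np.linalg.qr(F @ Omega)            # orthonormal basis for the sketch
F_sketch = Q @ (Q.T @ F)

for name, A in [("truncated SVD", F_svd), ("randomized sketch", F_sketch)]:
    err = np.linalg.norm(F - A) / np.linalg.norm(F)
    print(f"{name}: relative error {err:.2e}")
```

For a smooth f, both compressions attain small relative error with r much smaller than n; this kind of low-rank structure is what makes the matrix and tensor viewpoint on functions attractive for nonparametric estimation.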