Extrapolation Towards Imaginary 0-Nearest Neighbour and Its Improved Convergence Rate
School of Statistical Thinking, The Institute of Statistical Mathematics
The weights and the parameter k ∈ ℕ regulate the bias-variance trade-off of the weighted k-nearest neighbour (k-NN) classifier, and this trade-off implicitly affects the convergence rate of the excess risk; several existing studies have considered selecting optimal k and weights to obtain a faster convergence rate. Whereas k-NN with non-negative weights has been developed widely, it has also been proved that negative weights are essential for eradicating the bias terms and attaining the optimal convergence rate. In this paper, we propose a novel multiscale k-NN (MS-k-NN) that extrapolates unweighted k-NN estimators from several k ≥ 1 values to k = 0, thus giving an imaginary 0-NN estimator. Our method implicitly computes optimal real-valued weights that are adaptive to the query and its neighbour points. We theoretically prove that MS-k-NN attains an improved convergence rate, which coincides with the existing optimal rate under some conditions.
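To make the extrapolation concrete, the following is a minimal sketch, not the authors' implementation: it assumes that the unweighted k-NN estimates for several k are regressed on powers of the squared k-th neighbour distance, and that the fitted intercept (covariate → 0, i.e. k → 0) serves as the imaginary 0-NN estimate. The function name ms_knn_predict, the grid ks, and the polynomial degree are illustrative choices.

```python
import numpy as np

def ms_knn_predict(X_train, y_train, x_query, ks=(5, 10, 20, 40), degree=2):
    """Sketch of multiscale k-NN extrapolation to an imaginary 0-NN estimate.

    For each k in ks, compute the unweighted k-NN estimate at x_query,
    then fit a low-degree polynomial in the squared k-th neighbour
    distance and read off the intercept as the k -> 0 extrapolation.
    """
    # Distances from the query to all training points, sorted once.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    order = np.argsort(dists)

    covariates, estimates = [], []
    for k in ks:
        idx = order[:k]
        estimates.append(y_train[idx].mean())   # unweighted k-NN estimate
        covariates.append(dists[idx[-1]] ** 2)  # squared k-th NN distance

    # Polynomial regression of the k-NN estimates on the covariate;
    # the intercept is the imaginary 0-NN (k = 0) estimate.
    V = np.vander(np.asarray(covariates), N=degree + 1, increasing=True)
    coef, *_ = np.linalg.lstsq(V, np.asarray(estimates), rcond=None)
    return coef[0]
```

For binary classification with labels in {0, 1}, the returned intercept estimates the class-posterior probability at the query, so thresholding it at 1/2 yields the predicted label; the least-squares fit over the grid ks is what implicitly induces the real-valued (possibly negative) weights on the neighbours.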
Neural Information Processing Systems