Fast, smooth and adaptive regression in metric spaces

Kpotufe, Samory

Neural Information Processing Systems 

It was recently shown that certain nonparametric regressors can escape the curse of dimensionality in the sense that their convergence rates adapt to the intrinsic dimension of the data [BL:65, SK:77]. We prove stronger results in more general settings. In particular, we consider a regressor which, by combining aspects of both tree-based regression and kernel regression, operates on a general metric space, yields a smooth function, and evaluates in time $O(\log n)$. We derive a tight convergence rate of the form $n^{-2/(2+d)}$, where $d$ is the Assouad dimension of the input space.
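The estimator itself is defined in the body of the paper; as a rough, non-authoritative sketch of the general recipe the abstract describes (a tree structure for fast neighbor lookup combined with a kernel for smoothness), the snippet below pairs a k-d tree with a smooth, compactly supported kernel. The names `fit` and `predict`, the bandwidth `h`, and the use of a k-d tree are illustrative assumptions, not the paper's construction.

```python
# Illustrative sketch only: tree-accelerated kernel regression.
# A k-d tree stands in for the paper's partitioning structure.
import numpy as np
from scipy.spatial import cKDTree

def fit(X, y):
    """Build a k-d tree over the inputs once; lookups are then fast."""
    return cKDTree(X), np.asarray(y, dtype=float)

def predict(tree, y, x, h=0.2):
    """Kernel-weighted average over points within radius h of x,
    retrieved via the tree rather than a linear scan."""
    idx = tree.query_ball_point(x, h)
    if not idx:                      # no neighbors in range: nearest point
        _, j = tree.query(x)
        return float(y[j])
    d = np.linalg.norm(tree.data[idx] - x, axis=1)
    w = np.maximum(1.0 - (d / h) ** 2, 0.0) ** 2   # smooth, compact kernel
    s = w.sum()
    return float(np.dot(w, y[idx]) / s) if s > 0 else float(np.mean(y[idx]))

# Toy usage on synthetic 2-d data.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 2))
y = np.sin(4 * X[:, 0]) + 0.1 * rng.normal(size=500)
tree, yv = fit(X, y)
print(predict(tree, yv, np.array([0.5, 0.5])))
```

Because the kernel weights vary continuously with $x$, the fitted function is smooth, and the tree restricts each evaluation to a small neighborhood rather than all $n$ points, which is the intuition behind the $O(\log n)$ evaluation time claimed in the abstract.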