Regression-tree Tuning in a Streaming Setting

Samory Kpotufe, Francesco Orabona

Neural Information Processing Systems 

We consider the problem of maintaining the data structures of a partition-based regression procedure in a setting where the training data arrives sequentially over time. We prove that it is possible to maintain such a structure in time $O(\log n)$ at any time step $n$ while achieving a nearly-optimal regression rate of $\tilde{O}\left(n^{-2/(2+d)}\right)$ in terms of the unknown metric dimension $d$. Finally, we prove a new regression lower bound which is independent of a given data size, and hence is more appropriate for the streaming setting.
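As an illustration only, the sketch below shows the general flavor of maintaining a partition-based regressor incrementally: each arriving point is routed from the root to a leaf, and a leaf is split once it holds too many points, so the per-point update cost is proportional to the tree depth. This is not the authors' procedure; the class name `StreamingTreeRegressor`, the `max_leaf_size` threshold, and the cycling dyadic split rule are hypothetical choices and do not reflect the tuning rule analyzed in the paper.

```python
# Minimal sketch (hypothetical, not the paper's algorithm): an incremental
# dyadic regression tree on [0, 1]^D.  Each insertion descends to a leaf in
# time proportional to the tree depth; a leaf splits when it exceeds
# `max_leaf_size` points.

class Node:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi          # cell bounds, one entry per dimension
        self.pts = []                      # (x, y) pairs stored at a leaf
        self.left = self.right = None
        self.dim = 0                       # dimension this node will split on

    def is_leaf(self):
        return self.left is None


class StreamingTreeRegressor:
    def __init__(self, D, max_leaf_size=8):
        self.root = Node([0.0] * D, [1.0] * D)
        self.max_leaf_size = max_leaf_size

    def _descend(self, x):
        # Route a point to the leaf cell containing it (cost = tree depth).
        node = self.root
        while not node.is_leaf():
            mid = 0.5 * (node.lo[node.dim] + node.hi[node.dim])
            node = node.left if x[node.dim] <= mid else node.right
        return node

    def insert(self, x, y):
        leaf = self._descend(x)
        leaf.pts.append((x, y))
        if len(leaf.pts) > self.max_leaf_size:
            self._split(leaf)

    def _split(self, leaf):
        # Halve the cell along the current split dimension and push the
        # stored points down to the two children.
        d = leaf.dim
        mid = 0.5 * (leaf.lo[d] + leaf.hi[d])
        lo_hi = list(leaf.hi); lo_hi[d] = mid
        hi_lo = list(leaf.lo); hi_lo[d] = mid
        leaf.left, leaf.right = Node(leaf.lo, lo_hi), Node(hi_lo, leaf.hi)
        leaf.left.dim = leaf.right.dim = (d + 1) % len(leaf.lo)
        for x, y in leaf.pts:
            (leaf.left if x[d] <= mid else leaf.right).pts.append((x, y))
        leaf.pts = []

    def predict(self, x):
        # Predict with the average response in the query's leaf cell.
        pts = self._descend(x).pts
        return sum(y for _, y in pts) / len(pts) if pts else 0.0
```

In use, one would interleave `insert` calls as examples stream in with `predict` calls at any time; the paper's contribution is showing that a partition structure of this general kind can be kept properly tuned with only $O(\log n)$ work per step while remaining nearly rate-optimal.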