Non-parametric classification via expand-and-sparsify representation

Neural Information Processing Systems 

We propose two algorithms for non-parametric classification using the expand-and-sparsify (EaS) representation. For our first algorithm, we use a winner-take-all operation for the sparsification step, show that the resulting classifier admits the form of a locally weighted average classifier, and establish its consistency via Stone's Theorem. Further, assuming that the conditional probability function P(y = 1 | x) = η(x) is Hölder continuous, we show that for an optimal choice of the expanded dimension m the convergence rate of this classifier is minimax-optimal.
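The pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact construction: the Gaussian projection matrix W, the choice of k active coordinates, and the overlap-based weighting rule below are all assumptions made for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

def eas_transform(X, W, k):
    """Expand-and-sparsify: project each point to m dimensions, then apply
    a winner-take-all step that sets the k largest coordinates to 1 and
    the rest to 0 (a sketch; W and k are assumptions of this example)."""
    Z = X @ W.T                                   # expand: (n, m) projections
    out = np.zeros_like(Z)
    top = np.argpartition(Z, -k, axis=1)[:, -k:]  # indices of the k winners
    np.put_along_axis(out, top, 1.0, axis=1)      # sparsify to k ones per row
    return out

def predict(x, X_train, y_train, W, k):
    """Locally weighted average classifier on the EaS representation: each
    training point is weighted by how many active coordinates it shares
    with the query (a hypothetical weighting, not the paper's exact rule)."""
    h = eas_transform(x[None, :], W, k)[0]
    H = eas_transform(X_train, W, k)
    w = H @ h                                     # active-coordinate overlaps
    if w.sum() == 0:                              # no overlap: fall back to prior
        return int(y_train.mean() >= 0.5)
    return int((w @ y_train) / w.sum() >= 0.5)    # weighted average of labels

# Toy usage: two Gaussian blobs in d=5, expanded to m=200 with k=10 active.
d, m, k, n = 5, 200, 10, 400
W = rng.standard_normal((m, d))
X = np.vstack([rng.normal(-1, 1, (n // 2, d)), rng.normal(1, 1, (n // 2, d))])
y = np.array([0] * (n // 2) + [1] * (n // 2))
print(predict(np.full(d, 1.0), X, y, W, k))
```

Because a winner-take-all query shares many active coordinates with nearby training points and few with distant ones, the overlap weights behave like a local neighborhood, which is the intuition behind the locally-weighted-average form established in the paper.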