Reviews: An adaptive nearest neighbor rule for classification

Neural Information Processing Systems 

The paper proposes a variant of the k-nearest neighbors algorithm (called adaptive k-NN) in which k is chosen separately for each example to classify, instead of being tuned as a global hyperparameter. To do so, the authors define a new notion, which they call the advantage, that applies locally in the input space, in place of the local Lipschitz condition often used in this setting. An important contribution of the paper is the proof that the proposed algorithm is consistent and has pointwise convergence in the limit. The proposed notion of advantage is also related to error bounds for pointwise convergence. The experimental part is clearly sufficient for this type of paper, even if there is no comparison with other state-of-the-art algorithms.
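To make the per-point choice of k concrete, here is a minimal sketch of the general idea, not the paper's exact advantage condition: grow k at the query point until one label holds a clear majority among the k nearest neighbors. The margin threshold `delta * sqrt(k)` below is a hypothetical stopping rule chosen for illustration.

```python
import numpy as np

def adaptive_knn_predict(X_train, y_train, x, delta=1.0, k_max=None):
    """Classify x by growing k until one label holds a clear majority.

    Sketch of the adaptive-k idea only; the stopping rule
    (majority margin > delta * sqrt(k)) is a hypothetical stand-in
    for the paper's advantage-based condition.
    """
    n = len(X_train)
    if k_max is None:
        k_max = n
    # Order training points by distance to the query point.
    order = np.argsort(np.linalg.norm(X_train - x, axis=1))
    labels = y_train[order]
    for k in range(1, k_max + 1):
        values, counts = np.unique(labels[:k], return_counts=True)
        top = counts.argmax()
        # Votes for the leading label minus votes for all others.
        margin = counts[top] - (k - counts[top])
        if margin > delta * np.sqrt(k):
            return values[top]
    # Fallback: plain majority vote over the k_max nearest neighbors.
    values, counts = np.unique(labels[:k_max], return_counts=True)
    return values[counts.argmax()]
```

In a noisy region the margin stays small and k keeps growing, while in a region where one class dominates the rule stops early, which matches the paper's motivation for adapting k locally.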