Active Nearest-Neighbor Learning in Metric Spaces

Neural Information Processing Systems

We propose a pool-based non-parametric active learning algorithm for general metric spaces, called MArgin Regularized Metric Active Nearest Neighbor (MARMANN), which outputs a nearest-neighbor classifier. We give prediction error guarantees that depend on the noisy-margin properties of the input sample, and are competitive with those obtained by previously proposed passive learners. We prove that the label complexity of MARMANN is significantly lower than that of any passive learner with similar error guarantees. Our algorithm is based on a generalized sample compression scheme and a new label-efficient active model-selection procedure.
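As a loose illustration of the pool-based active nearest-neighbor setting the abstract describes, the sketch below shows a toy active 1-NN learner. It is not the paper's MARMANN procedure: the query rule here is a simple farthest-first heuristic standing in for the paper's margin-regularized model selection, and every name in the code is invented for this sketch.

```python
import math

def dist(a, b):
    # Euclidean distance, standing in for a general metric
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_label(point, labeled):
    # 1-NN prediction: return the label of the closest queried point
    return min(labeled, key=lambda pl: dist(point, pl[0]))[1]

def active_1nn(pool, oracle, seed_indices, budget):
    """Toy pool-based active learner (illustrative only, NOT MARMANN).

    Greedily queries the pool point farthest from all labeled points
    (farthest-first traversal), then returns a 1-NN classifier over
    the queried points -- a small "compressed" labeled sample.
    """
    labeled = [(pool[i], oracle(pool[i])) for i in seed_indices]
    queries = len(labeled)
    while queries < budget:
        # crude proxy for uncertainty: distance to the closest labeled point
        candidate = max(
            (p for p in pool if all(p != q for q, _ in labeled)),
            key=lambda p: min(dist(p, q) for q, _ in labeled),
            default=None,
        )
        if candidate is None:  # pool exhausted before the budget ran out
            break
        labeled.append((candidate, oracle(candidate)))
        queries += 1
    return lambda x: nearest_label(x, labeled)
```

For example, on a pool with two well-separated clusters, a handful of queries suffices for the returned 1-NN classifier to separate them, which is the intuition behind the claimed label-complexity savings of active learning over labeling the whole pool.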


Reviews: Active Nearest-Neighbor Learning in Metric Spaces

Neural Information Processing Systems

I am not qualified to evaluate this work in terms of its relevance within the literature, so my judgment concerns only the paper's content itself. Also, among the appendix material I have only reviewed the proof of Lemma A.1, in addition to the proofs contained in the main paper. Theorem 3.2 guarantees a significant improvement upon the passive learner characterized by Theorem 3.1. I find the example in lines 141-143 about the 1/sqrt(m) order very helpful, and I suggest that the authors include it in the introduction as well.


Active Nearest-Neighbor Learning in Metric Spaces

Kontorovich, Aryeh, Sabato, Sivan, Urner, Ruth

Neural Information Processing Systems
