Export Reviews, Discussions, Author Feedback and Meta-Reviews – Neural Information Processing Systems
The paper proposes a new approach to Mahalanobis metric learning that casts metric learning as a structured-prediction problem. The structure to be predicted is the set h of appropriate k-nearest neighbors (k-NNs) of an instance, appropriate in the sense that they lead to a small classification error. The authors' idea is quite elegant: when learning a metric for k-NN prediction, an instance x_i is predicted correctly as long as the majority of its k-NNs belong to the same class as x_i. The metric learning algorithm they propose does exactly that: it seeks the Mahalanobis metric matrix W that induces, for each instance x_i, a k-nearest neighborhood in which the majority of instances belong to the correct class, i.e. that of x_i. Note that there can be many such neighborhoods that predict the correct label for a given instance x_i; it suffices to prefer at least one of them over all neighborhoods that predict the wrong label.
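The majority-vote criterion the review describes can be made concrete with a short sketch. This is an illustrative check of whether a given metric W yields correct k-NN majority votes, not the paper's learning algorithm; the function name `mahalanobis_knn_correct` and the toy two-blob data are assumptions for demonstration:

```python
import numpy as np

def mahalanobis_knn_correct(X, y, W, k=3):
    """For each x_i, find its k nearest neighbors under the Mahalanobis
    metric d(a, b)^2 = (a - b)^T W (a - b), and check whether the
    majority of those neighbors share x_i's label (correct prediction)."""
    n = X.shape[0]
    correct = np.zeros(n, dtype=bool)
    for i in range(n):
        diffs = X - X[i]                                  # (n, d) differences to x_i
        d2 = np.einsum('nd,de,ne->n', diffs, W, diffs)    # squared Mahalanobis distances
        d2[i] = np.inf                                    # exclude x_i itself
        nn = np.argsort(d2)[:k]                           # indices of the k nearest neighbors
        correct[i] = np.sum(y[nn] == y[i]) > k / 2        # majority vote matches x_i's class
    return correct

# Toy example (assumed data): two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (10, 2)), rng.normal(3.0, 0.3, (10, 2))])
y = np.array([0] * 10 + [1] * 10)
W = np.eye(2)  # identity W recovers the plain Euclidean metric as a special case
print(mahalanobis_knn_correct(X, y, W, k=3).mean())  # fraction of correctly predicted points
```

A metric learning algorithm of the kind reviewed would then search for a W (positive semidefinite) that makes this criterion hold for as many training instances as possible.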
Feb-9-2025, 17:09:40 GMT