Distance Metric Learning for Large Margin Nearest Neighbor Classification
Weinberger, Kilian Q., Blitzer, John, Saul, Lawrence K.
Neural Information Processing Systems
We show how to learn a Mahalanobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k-nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven data sets of varying size and difficulty, we find that metrics trained in this way lead to significant improvements in kNN classification, for example achieving a test error rate of 1.3% on the MNIST handwritten digits. As in support vector machines (SVMs), the learning problem reduces to a convex optimization based on the hinge loss. Unlike learning in SVMs, however, our framework requires no modification or extension for problems in multiway (as opposed to binary) classification.
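A minimal sketch of the large-margin objective described in the abstract, written in terms of a linear map L with Mahalanobis metric M = LᵀL (optimizing over M directly is what yields the semidefinite program). The function name, the weighting parameter mu, and the brute-force impostor search are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lmnn_loss(L, X, y, target_neighbors, mu=0.5):
    """Hinge-based large-margin nearest-neighbor loss for a linear map L (d' x d).

    Pull term: squared distances from each point to its k target neighbors
    (same-class nearest neighbors, fixed in advance).
    Push term: hinge penalty whenever a differently labeled point
    ("impostor") comes within one unit of margin of a target neighbor.
    """
    Z = X @ L.T                      # map inputs into the learned space
    pull, push = 0.0, 0.0
    for i, neighbors in enumerate(target_neighbors):
        impostors = np.where(y != y[i])[0]          # differently labeled points
        for j in neighbors:
            d_ij = np.sum((Z[i] - Z[j]) ** 2)       # distance to target neighbor
            pull += d_ij
            d_il = np.sum((Z[i] - Z[impostors]) ** 2, axis=1)
            push += np.sum(np.maximum(0.0, 1.0 + d_ij - d_il))
    return (1.0 - mu) * pull + mu * push
```

In this form the loss is evaluated for a given L; the paper instead minimizes the equivalent objective over the positive semidefinite matrix M, which makes the problem convex.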
Dec-31-2006