Non-linear Metric Learning

Neural Information Processing Systems 

How to compare examples is a fundamental question in machine learning. If an algorithm could perfectly determine whether two examples were semantically similar or dissimilar, most subsequent machine learning tasks would become trivial (e.g., a nearest-neighbor classifier would achieve perfect results). Guided by this motivation, a surge of recent research [10, 13, 15, 24, 31, 32] has focused on Mahalanobis metric learning. The resulting methods greatly improve the performance of metric-dependent algorithms, such as k-means clustering and kNN classification, and have gained popularity in many research areas and applications within and beyond machine learning. One reason for this success is the out-of-the-box usability and robustness of several popular methods for learning these linear metrics.
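
For concreteness, the Mahalanobis metrics referred to here are standardly parameterized by a positive semi-definite matrix, equivalently a linear transformation of the input space; the notation below is a generic sketch of that standard form, not necessarily the notation used later in this paper:

d_{\mathbf{M}}(\mathbf{x}_i, \mathbf{x}_j) = \sqrt{(\mathbf{x}_i - \mathbf{x}_j)^{\top} \mathbf{M} (\mathbf{x}_i - \mathbf{x}_j)}, \qquad \mathbf{M} = \mathbf{L}^{\top}\mathbf{L} \succeq 0,

so that learning \mathbf{M} (or \mathbf{L}) amounts to learning a global linear transformation under which standard Euclidean distance is computed, which is why such metrics are referred to as linear.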