Characterizing Generalization under Out-Of-Distribution Shifts in Deep Metric Learning
Neural Information Processing Systems
Deep Metric Learning (DML) aims to learn representations that transfer zero-shot to a priori unknown test distributions. However, common evaluation protocols test only a single, fixed data split in which train and test classes are assigned randomly. More realistic evaluations should cover a broad spectrum of distribution shifts of varying degree and difficulty. In this work, we systematically construct train-test splits of increasing difficulty and present the ooDML benchmark to characterize generalization under out-of-distribution shifts in DML.
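The abstract's core idea of "splits of increasing difficulty" can be sketched as a greedy search: starting from a random class split, repeatedly swap a train class with a test class whenever the swap increases a measure of train-test distribution shift. This is a minimal illustration, not the paper's actual procedure; the shift measure here is the mean distance between class feature centroids (a simple stand-in for the FID-style measure one might use), and all function names are hypothetical.

```python
import numpy as np

def split_shift(train_c, test_c):
    # Mean pairwise Euclidean distance between train and test class
    # centroids -- a crude proxy for distribution shift between splits.
    d = np.linalg.norm(train_c[:, None, :] - test_c[None, :, :], axis=-1)
    return d.mean()

def harden_split(centroids, train_idx, test_idx, n_swaps=5, seed=0):
    """Greedily swap train/test classes to increase train-test shift.

    centroids: (n_classes, dim) array of per-class feature centroids.
    Returns a new (train, test) partition with shift >= the input split's.
    """
    rng = np.random.default_rng(seed)
    train, test = list(train_idx), list(test_idx)
    for _ in range(n_swaps):
        base = split_shift(centroids[train], centroids[test])
        best = None
        # Sample a subset of candidate swaps to keep the search cheap.
        for i in rng.choice(len(train), size=min(10, len(train)), replace=False):
            for j in rng.choice(len(test), size=min(10, len(test)), replace=False):
                train[i], test[j] = test[j], train[i]      # try the swap
                s = split_shift(centroids[train], centroids[test])
                if s > base and (best is None or s > best[0]):
                    best = (s, i, j)
                train[i], test[j] = test[j], train[i]      # undo
        if best is None:  # no swap increases the shift; stop early
            break
        _, i, j = best
        train[i], test[j] = test[j], train[i]              # commit best swap
    return train, test
```

Running this repeatedly with larger `n_swaps` yields a family of splits ordered by shift, which is the kind of spectrum the benchmark evaluates over.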