Learning with Symmetric Label Noise: The Importance of Being Unhinged
Brendan van Rooyen, Aditya Krishna Menon, Robert C. Williamson
The Australian National University
Neural Information Processing Systems
Convex potential minimisation is the de facto approach to binary classification. However, Long and Servedio [2010] proved that under symmetric label noise (SLN), minimisation of any convex potential over a linear function class can result in classification performance equivalent to random guessing. This ostensibly shows that convex losses are not SLN-robust. In this paper, we propose a convex, classification-calibrated loss and prove that it is SLN-robust. The loss avoids the Long and Servedio [2010] result by virtue of being negatively unbounded. The loss is a modification of the hinge loss, where one does not clamp at zero; hence, we call it the unhinged loss.
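As a concrete illustration of the loss described above (a minimal sketch, not code from the paper; the function names and margin-based formulation with labels y in {-1, +1} are assumptions), the unhinged loss is simply the hinge loss without the clamp at zero, which makes it linear in the margin and negatively unbounded:

```python
import numpy as np

def hinge_loss(y, score):
    """Standard hinge loss: clamped at zero, hence bounded below."""
    return np.maximum(0.0, 1.0 - y * score)

def unhinged_loss(y, score):
    """Unhinged loss: the hinge loss without clamping at zero,
    so it is linear in the margin y * score and negatively unbounded."""
    return 1.0 - y * score

# For a confidently correct prediction the unhinged loss goes negative,
# whereas the hinge loss is clamped at zero.
y, score = 1, 3.0
print(hinge_loss(y, score))     # 0.0
print(unhinged_loss(y, score))  # -2.0
```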