Variation-Bounded Loss for Noise-Tolerant Learning

Jialiang Wang, Xiong Zhou, Xianming Liu, Gangfeng Hu, Deming Zhai, Junjun Jiang, Haoliang Li

arXiv.org Artificial Intelligence

Mitigating the negative impact of noisy labels has been a perennial issue in supervised learning. Robust loss functions have emerged as a prevalent solution to this problem. In this work, we introduce the Variation Ratio as a novel property related to the robustness of loss functions, and propose a new family of robust loss functions, termed Variation-Bounded Loss (VBL), which is characterized by a bounded variation ratio. We provide theoretical analyses of the variation ratio, proving that a smaller variation ratio leads to better robustness. Furthermore, we reveal that the variation ratio provides a feasible way to relax the symmetric condition and offers a more concise path to achieving the asymmetric condition. Based on the variation ratio, we reformulate several commonly used loss functions into a variation-bounded form for practical applications.
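As background for the symmetric condition mentioned above: in the robust-loss literature, a loss L is called symmetric when the sum over all classes, sum_k L(f(x), k), is a constant independent of the prediction. The sketch below (not from the paper; the function names and setup are illustrative) numerically checks this for two standard losses, showing that MAE satisfies the condition while cross entropy does not.

```python
import numpy as np

K = 5  # number of classes (illustrative choice)

def mae(p, k):
    """Mean absolute error between prediction p and the one-hot vector of class k."""
    onehot = np.zeros_like(p)
    onehot[k] = 1.0
    return np.abs(p - onehot).sum()

def ce(p, k):
    """Standard cross-entropy loss for class k."""
    return -np.log(p[k])

# A random softmax prediction over K classes
rng = np.random.default_rng(0)
logits = rng.normal(size=K)
p = np.exp(logits) / np.exp(logits).sum()

# Sum each loss over all possible labels
mae_sum = sum(mae(p, k) for k in range(K))  # equals 2*(K-1) for any p
ce_sum = sum(ce(p, k) for k in range(K))    # varies with p

print(f"MAE symmetric sum: {mae_sum:.6f}")  # constant: 8.0 here
print(f"CE sum:            {ce_sum:.6f}")   # not constant across predictions
```

For MAE, the per-class sum is 2(K - 1) regardless of the prediction, which is exactly the symmetric condition; cross entropy's sum depends on p, so it is not symmetric. The paper's variation ratio is presented as a way to relax this strict condition.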