Multi-Label Learning with Stronger Consistency Guarantees
Neural Information Processing Systems
We present a detailed study of surrogate losses and algorithms for multi-label learning, supported by H-consistency bounds. We first show that, for the simplest form of multi-label loss (the popular Hamming loss), the well-known consistent binary relevance surrogate suffers from a sub-optimal dependency on the number of labels in terms of H-consistency bounds when used with smooth losses such as the logistic loss. Furthermore, this surrogate fails to account for label correlations. To address both drawbacks, we introduce a novel surrogate, the multi-label logistic loss, which accounts for label correlations and admits label-independent H-consistency bounds. We then broaden our analysis to a more extensive family of multi-label losses, including all common ones as well as a new extension defined in terms of linear-fractional functions of the confusion matrix.
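To make the setup concrete, the following is a minimal sketch (not the paper's method) of the two quantities the abstract contrasts: the Hamming loss, which counts per-label disagreements, and the binary relevance surrogate, which sums an independent logistic loss over labels and therefore ignores label correlations. Labels are encoded in {-1, +1}; the function names and the score vector are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def hamming_loss(y_true, y_pred):
    """Fraction of label positions where prediction and ground truth disagree.

    y_true, y_pred: arrays of shape (n_labels,) with entries in {-1, +1}.
    """
    return np.mean(y_true != y_pred)

def binary_relevance_logistic(y_true, scores):
    """Binary relevance surrogate: a sum of per-label logistic losses.

    scores: real-valued scores, one per label; the induced prediction is
    sign(scores). Each label is treated independently, which is the
    limitation (no label correlations) discussed in the abstract.
    """
    return np.sum(np.log1p(np.exp(-y_true * scores)))

# Illustrative example with four labels.
y = np.array([1, -1, 1, 1])
s = np.array([2.0, -1.5, 0.3, -0.2])
pred = np.sign(s)
print(hamming_loss(y, pred))            # one of four labels wrong -> 0.25
print(binary_relevance_logistic(y, s))  # smooth surrogate value
```

Because the surrogate decomposes as a sum over labels, its minimization reduces to independent binary problems; the multi-label logistic loss proposed in the paper is designed precisely to avoid this decomposition.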