Analyzing Cost-Sensitive Surrogate Losses via $\mathcal{H}$-calibration
Sanket Shah, Milind Tambe, Jessie Finocchiaro
This paper aims to understand whether machine learning models should be trained using cost-sensitive surrogates or cost-agnostic ones (e.g., cross-entropy). Analyzing this question through the lens of $\mathcal{H}$-calibration, we find that cost-sensitive surrogates can strictly outperform their cost-agnostic counterparts when learning small models under common distributional assumptions. Since these distributional assumptions are hard to verify in practice, we also show empirically that cost-sensitive surrogates consistently outperform cost-agnostic ones on classification datasets from the UCI repository. Together, these results make a strong case for using cost-sensitive surrogates in practice.
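To make the abstract's central contrast concrete, here is a minimal sketch of a cost-agnostic loss (standard cross-entropy) next to one simple cost-sensitive variant that reweights the log-loss by a per-class misclassification cost. This weighting scheme is an illustrative assumption on our part, not necessarily the specific surrogate construction analyzed in the paper.

```python
import math

def cross_entropy(probs, label):
    # Cost-agnostic surrogate: -log p_y, the same penalty scale for every class.
    return -math.log(probs[label])

def cost_weighted_cross_entropy(probs, label, costs):
    # A simple cost-sensitive surrogate (illustrative, hypothetical choice):
    # scale the log-loss by the cost of misclassifying the true class,
    # so expensive-to-miss classes dominate the training signal.
    return costs[label] * -math.log(probs[label])

probs = [0.7, 0.2, 0.1]   # model's predicted class probabilities
costs = [1.0, 5.0, 1.0]   # cost of misclassifying each true class

print(round(cross_entropy(probs, 0), 4))                        # → 0.3567
print(round(cost_weighted_cross_entropy(probs, 1, costs), 4))   # → 8.0472
```

Under the cost-agnostic loss, an error on class 1 is penalized no differently from any other error; the cost-sensitive variant makes that same mistake five times more expensive, which is the kind of asymmetry the paper argues a small model cannot recover after the fact.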
arXiv: 2502.19522
Country:
- North America > United States > New York > New York County > New York City (0.14)
- North America > Canada > Quebec > Montreal (0.04)
- Asia > Middle East > Jordan (0.04)
Technology: Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (1.00)