Cost-Sensitive Conformal Training with Provably Controllable Learning Bounds
Jia, Xuesong, Shi, Yuanjie, Liu, Ziquan, Xu, Yi, Yan, Yan
Conformal prediction (CP) is a general framework for quantifying the predictive uncertainty of machine learning models; it produces a set prediction that includes the true label with a valid probability. To tighten the uncertainty measured by CP, conformal training methods minimize the size of the prediction sets. A typical approach replaces the indicator function with a surrogate, usually the Sigmoid or Gaussian error function. However, these surrogate functions do not admit a uniform error bound with respect to the indicator function, leading to uncontrollable learning bounds. In this paper, we propose a simple cost-sensitive conformal training algorithm that does not rely on the indicator-approximation mechanism. Specifically, we theoretically show that the expected size of the prediction sets is upper bounded by the expected rank of the true labels. To this end, we develop a rank-weighting strategy that assigns each data sample a weight based on the rank of its true label. Our analysis provably demonstrates the tightness between the proposed weighted objective and the expected size of conformal prediction sets. Extensive experiments verify the validity of our theoretical insights and show superior empirical performance over other conformal training methods in terms of predictive efficiency, with a 21.38% reduction in average prediction set size.
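The rank-weighting idea described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's exact objective: the function names and the choice of a rank-weighted cross-entropy are illustrative assumptions; the only element taken from the abstract is that each sample's weight is the rank of its true label among the model's class scores.

```python
import numpy as np

def true_label_rank(probs, labels):
    """Rank of the true label's score within each row (1 = top-ranked).

    probs  : (n, k) array of class scores.
    labels : (n,) array of true class indices.
    """
    true_scores = probs[np.arange(len(labels)), labels]
    # Count the classes scoring at least as high as the true class.
    return (probs >= true_scores[:, None]).sum(axis=1)

def rank_weighted_loss(probs, labels, eps=1e-12):
    """Hypothetical cost-sensitive objective: per-sample cross-entropy
    weighted by the rank of the true label, so samples whose true label
    is ranked poorly contribute more to the loss."""
    n = len(labels)
    ce = -np.log(probs[np.arange(n), labels] + eps)
    weights = true_label_rank(probs, labels).astype(float)
    return float(np.mean(weights * ce))

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.3, 0.6]])
labels = np.array([0, 1])
print(true_label_rank(probs, labels))  # [1 2]
```

Because the weight is the rank itself rather than a smooth surrogate of an indicator, no Sigmoid/erf approximation of set membership appears in this sketch, which mirrors the abstract's claim of avoiding the indicator-approximation mechanism.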
Nov-25-2025
- Country:
  - Asia
    - China > Liaoning Province
      - Dalian (0.04)
    - Middle East > Jordan (0.04)
  - Europe
    - Slovakia > Bratislava
      - Bratislava (0.04)
    - Sweden (0.04)
  - North America > United States
    - Washington (0.04)
- Genre:
  - Research Report > New Finding (0.67)
- Industry:
  - Health & Medicine > Diagnostic Medicine (0.46)