Data-Efficient Prediction-Powered Calibration via Cross-Validation
Seonghoon Yoo, Houssem Sifaou, Sangwoo Park, Joonhyuk Kang, Osvaldo Simeone
Calibration data are necessary to formally quantify the uncertainty of the decisions produced by an existing artificial intelligence (AI) model. To overcome the common issue of scarce calibration data, a promising approach is to employ synthetic labels produced by a (generally different) predictive model. This paper introduces a novel approach that efficiently utilizes limited calibration data to simultaneously fine-tune a predictor and estimate the bias of the synthetic labels. The proposed method yields prediction sets with rigorous coverage guarantees for AI-generated decisions. In many AI applications, it is critical to quantify the uncertainty in model decisions by constructing prediction sets or confidence intervals.
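For context on how calibration data yield prediction sets with coverage guarantees, the sketch below shows standard split conformal prediction for regression. This is an illustrative baseline only, not the paper's prediction-powered method; all function and variable names here are hypothetical, and the data are synthetic.

```python
import numpy as np

def split_conformal_interval(cal_preds, cal_labels, test_pred, alpha=0.1):
    """Illustrative split conformal prediction (baseline, not the paper's
    method): calibrate on held-out residuals to obtain an interval with
    marginal coverage at least 1 - alpha."""
    n = len(cal_labels)
    # Nonconformity scores: absolute residuals on the calibration set.
    scores = np.abs(cal_labels - cal_preds)
    # Finite-sample-corrected quantile level, capped at 1.
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, q_level, method="higher")
    return test_pred - q, test_pred + q

# Toy example with synthetic calibration data.
rng = np.random.default_rng(0)
cal_preds = rng.normal(size=200)
cal_labels = cal_preds + rng.normal(scale=0.5, size=200)
lo, hi = split_conformal_interval(cal_preds, cal_labels, test_pred=1.0, alpha=0.1)
```

The interval width is driven by the calibration residual quantile, which is why scarce calibration data (the problem this paper targets) inflate uncertainty estimates.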
Jul-29-2025