Data-Efficient Prediction-Powered Calibration via Cross-Validation

Seonghoon Yoo, Houssem Sifaou, Sangwoo Park, Joonhyuk Kang, Osvaldo Simeone

arXiv.org Machine Learning 

Calibration data are necessary to formally quantify the uncertainty of the decisions produced by an existing artificial intelligence (AI) model. To overcome the common issue of scarce calibration data, a promising approach is to employ synthetic labels produced by a (generally different) predictive model. This paper introduces a novel approach that efficiently utilizes limited calibration data to simultaneously fine-tune a predictor and estimate the bias of the synthetic labels. The proposed method yields prediction sets with rigorous coverage guarantees for AI-generated decisions. In many AI applications, it is critical to quantify the uncertainty in model decisions by constructing prediction sets or confidence intervals.
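To make the general idea concrete, the sketch below illustrates prediction-powered calibration of a conformal prediction set: a large pool of synthetic labels supplies most of the calibration signal, while a small set of real labels estimates and corrects the bias introduced by the synthetic labeler. This is a minimal illustration of the generic prediction-powered quantile correction, not the paper's specific cross-validation scheme; the function names (`pp_conformal_quantile`, `nonconformity`), the absolute-residual score, and the toy models are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def nonconformity(y, y_hat):
    # Absolute-residual nonconformity score (an assumed choice).
    return np.abs(y - y_hat)

def pp_conformal_quantile(scores_real, scores_synth_paired, scores_synth_pool, alpha):
    """Bias-corrected (1 - alpha) quantile of nonconformity scores.

    The empirical CDF computed from synthetic-label scores on the large
    unlabeled pool is corrected by a "rectifier" measured on the small
    labeled set, where both real and synthetic scores are available.
    """
    candidates = np.sort(np.concatenate([scores_real, scores_synth_pool]))
    for t in candidates:
        cdf_pool = np.mean(scores_synth_pool <= t)
        rectifier = np.mean(scores_real <= t) - np.mean(scores_synth_paired <= t)
        if cdf_pool + rectifier >= 1.0 - alpha:
            return float(t)
    return float(candidates[-1])

# Toy demo: a small real-labeled calibration set and a large unlabeled pool.
n_cal, n_pool = 50, 5000
x_cal = rng.normal(size=n_cal)
y_cal = x_cal + rng.normal(scale=0.5, size=n_cal)
x_pool = rng.normal(size=n_pool)

predictor = lambda x: x        # model whose decisions we want to calibrate
labeler = lambda x: x + 0.1    # synthetic-label model (deliberately biased)

s_real = nonconformity(y_cal, predictor(x_cal))                 # real labels
s_synth_paired = nonconformity(labeler(x_cal), predictor(x_cal))  # synthetic, same inputs
s_synth_pool = nonconformity(labeler(x_pool), predictor(x_pool))  # synthetic, pool

q = pp_conformal_quantile(s_real, s_synth_paired, s_synth_pool, alpha=0.1)
# Prediction set for a new input x: {y : |y - predictor(x)| <= q}.
print(f"calibrated set radius: {q:.3f}")
```

In this sketch the rectifier term shifts the synthetic-score CDF by the discrepancy observed on the real-labeled points, so a systematically biased labeler inflates (or deflates) the calibrated radius accordingly; the paper's contribution, by contrast, lies in reusing the same limited calibration data via cross-validation to both fine-tune the predictor and estimate this bias.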
