Noise-Adaptive Conformal Classification with Marginal Coverage
Teresa Bortolotti, Y. X. Rachel Wang, Xin Tong, Alessandra Menafoglio, Simone Vantini, Matteo Sesia
Conformal inference seeks rigorous uncertainty quantification for the predictions of any black-box machine learning model, without requiring parametric assumptions (Vovk et al., 2005). In classification, these methods aim to construct a prediction set for the label of a new test point while guaranteeing a specified coverage level. The split-conformal approach achieves this by leveraging residuals (or non-conformity scores) from a pre-trained model applied to an independent calibration data set, assuming exchangeability with the test data. Perfect exchangeability, however, may not always hold in practice, due for example to possible distribution shifts between the available data and the future test points of interest, creating a need to relax the assumptions underlying conformal inference (Barber et al., 2023).
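The split-conformal recipe described above can be sketched in a few lines. This is a minimal illustration under assumed ingredients (a toy random "pre-trained" classifier and synthetic calibration labels), not the noise-adaptive method of the paper; names such as `calibration_scores` and `qhat` are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cal, n_classes = 500, 3

# Stand-in for a pre-trained black-box classifier: random softmax scores.
logits = rng.normal(size=(n_cal, n_classes))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
labels = rng.integers(0, n_classes, size=n_cal)

# Non-conformity score on the calibration set:
# 1 minus the estimated probability of the true label.
calibration_scores = 1.0 - probs[np.arange(n_cal), labels]

# Conformal quantile giving marginal coverage >= 1 - alpha
# under exchangeability of calibration and test data.
alpha = 0.1
level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
qhat = np.quantile(calibration_scores, min(level, 1.0), method="higher")

# Prediction set for a new test point: every label whose score is below qhat.
test_logits = rng.normal(size=n_classes)
test_probs = np.exp(test_logits) / np.exp(test_logits).sum()
prediction_set = [k for k in range(n_classes) if 1.0 - test_probs[k] <= qhat]
print(prediction_set)
```

The small inflation `(n_cal + 1)(1 - alpha) / n_cal` in the quantile level is what turns the empirical quantile into a finite-sample coverage guarantee; it is exactly this step that relies on exchangeability and breaks down under the distribution shifts mentioned above.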
Jan-29-2025