Calibration of Neural Networks
Vasilev, Ruslan; D'yakonov, Alexander
arXiv.org Artificial Intelligence
Neural networks solving real-world problems are often required not only to make accurate predictions but also to provide a confidence level for the forecast. The calibration of a model indicates how close the estimated confidence is to the true probability. This paper presents a survey of confidence calibration problems in the context of neural networks and provides an empirical comparison of calibration methods. We analyze the problem statement, calibration definitions, and different approaches to evaluation: visualizations and scalar measures that estimate whether a model is well-calibrated. We review modern calibration techniques, both those based on post-processing and those requiring changes in training. Empirical experiments cover various datasets and models, comparing calibration methods according to different criteria.
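One of the scalar measures the abstract alludes to is the widely used Expected Calibration Error (ECE), which bins predictions by confidence and averages the gap between confidence and accuracy within each bin. The sketch below is a minimal NumPy implementation of this standard binned estimator, not code from the paper; the function name and bin scheme are illustrative assumptions.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Binned ECE: weighted average |confidence - accuracy| over confidence bins.

    confidences: predicted top-class probabilities in (0, 1].
    correct: 1 if the prediction was right, else 0.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        # half-open bins (lo, hi]; softmax confidences are > 0, so none are lost
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        avg_conf = confidences[mask].mean()
        accuracy = correct[mask].mean()
        ece += mask.mean() * abs(avg_conf - accuracy)
    return ece
```

A perfectly calibrated model (e.g. predictions made with 70% confidence that are right 70% of the time) yields an ECE near zero, while an overconfident model pushes it up.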
Mar-19-2023