Fixing Overconfidence in Dynamic Neural Networks
Lassi Meronen, Martin Trapp, Andrea Pilzer, Le Yang, Arno Solin
arXiv.org Artificial Intelligence
Dynamic neural networks are a recent technique that promises a remedy for the increasing size of modern deep learning models by dynamically adapting their computational cost to the difficulty of the inputs. In this way, the model can adjust to a limited computational budget. However, the poor quality of uncertainty estimates in deep learning models makes it difficult to distinguish between hard and easy samples. To address this challenge, we present a computationally efficient approach for post-hoc uncertainty quantification in dynamic neural networks. We show that adequately quantifying and accounting for both aleatoric and epistemic uncertainty through a probabilistic treatment of the last layers improves the predictive performance and aids decision-making when determining the computational budget. In the experiments, we show improvements on CIFAR-100, ImageNet, and Caltech-256 in terms of accuracy, capturing uncertainty, and calibration error.
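The idea described in the abstract can be illustrated with a minimal sketch: an early-exit network routes each input out at the first classifier head whose predictive uncertainty is low enough, and the head itself averages over sampled last-layer weights to capture epistemic as well as aleatoric uncertainty. All names, shapes, and the Monte Carlo weight-sample head below are illustrative assumptions, not the paper's actual implementation (which uses a post-hoc probabilistic treatment of the last layers).

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - np.max(z, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predictive_entropy(p):
    # Entropy of the averaged predictive distribution, in nats.
    return -(p * np.log(p + 1e-12)).sum(axis=-1)

def probabilistic_head(features, weight_samples):
    # Hypothetical stand-in for a probabilistic last layer: average the
    # softmax outputs over sampled weight matrices (a Bayesian model
    # average, so the spread across samples reflects epistemic uncertainty).
    return np.stack([softmax(features @ W) for W in weight_samples]).mean(axis=0)

def dynamic_predict(x, early_backbone, late_backbone, heads, threshold=0.3):
    # Exit at the first head whose predictive entropy is below the
    # threshold; otherwise pay for the deeper backbone stage.
    h = early_backbone(x)
    p = probabilistic_head(h, heads[0])
    if predictive_entropy(p) < threshold:
        return p, "early"
    p = probabilistic_head(late_backbone(h), heads[1])
    return p, "late"

# Toy usage: random linear backbones and 5 sampled last-layer weight sets.
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(3, 3))
heads = [[rng.normal(size=(3, 2)) for _ in range(5)] for _ in range(2)]
probs, exit_used = dynamic_predict(
    rng.normal(size=4),
    lambda x: np.tanh(x @ W1),
    lambda h: np.tanh(h @ W2),
    heads,
)
```

Raising `threshold` trades accuracy for compute: more samples take the early exit, which is only safe if the entropy estimate is well calibrated, hence the paper's focus on uncertainty quality.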
Dec 8, 2023