Calibrating Uncertainties in Object Localization Task
Buu Phan, Rick Salay, Krzysztof Czarnecki, Vahdat Abdelzad, Taylor Denouden, Sachin Vernekar
In many safety-critical applications, such as autonomous driving and surgical robots, it is desirable to obtain prediction uncertainties from object detection modules to support safe decision-making. Specifically, such modules need to estimate the probability of each predicted object in a given region and the confidence interval for its bounding box. While recent Bayesian deep learning methods provide a principled way to estimate this uncertainty, the bounding-box estimates obtained with these methods are uncalibrated. In this paper, we address this problem for the single-object localization task by adapting an existing technique for calibrating regression models. We show experimentally that the resulting calibrated model produces more reliable uncertainty estimates.
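The abstract's key idea, recalibrating a regression model's predictive distribution so that stated confidence intervals achieve their nominal coverage, can be sketched in a few lines. The sketch below follows the general quantile-recalibration recipe for regression (fit a monotone map from predicted CDF values to their empirical frequencies); it is an illustration of that technique, not the paper's exact procedure, and all data and parameter values are made up for the demo.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def gauss_cdf(y, mu, sigma):
    """Predicted CDF value F(y) under N(mu, sigma^2)."""
    z = (y - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * (1.0 + np.array([math.erf(v) for v in z]))

def simulate(n):
    # Toy setup: the model predicts N(mu, 1) for a box coordinate, but the
    # true noise has sd 2, so its raw credible intervals are overconfident.
    mu = rng.normal(0.0, 1.0, n)
    y = mu + rng.normal(0.0, 2.0, n)
    return mu, y

# Step 1: on a held-out calibration set, compute p_i = F_i(y_i),
# the predicted CDF evaluated at each observed ground truth.
mu_cal, y_cal = simulate(5000)
p_cal = gauss_cdf(y_cal, mu_cal, 1.0)

# Step 2: the recalibration map R is the empirical CDF of the p_i.
# For a perfectly calibrated model, p_i ~ Uniform(0,1) and R is the identity.
p_sorted = np.sort(p_cal)
emp = np.arange(1, len(p_sorted) + 1) / len(p_sorted)
def R(p):
    return np.interp(p, p_sorted, emp)

# Step 3: check coverage on fresh data. A nominal 90% central interval is
# the event F(y) in [0.05, 0.95] before recalibration, and
# R(F(y)) in [0.05, 0.95] after.
mu_te, y_te = simulate(5000)
p_te = gauss_cdf(y_te, mu_te, 1.0)
raw_cov = np.mean((p_te >= 0.05) & (p_te <= 0.95))        # well below 0.90
cal_cov = np.mean((R(p_te) >= 0.05) & (R(p_te) <= 0.95))  # close to 0.90
```

In this toy run the raw 90% intervals cover only about 60% of the ground truth, while the recalibrated intervals cover roughly 90%, which is the behavior the abstract describes as "more reliable uncertainty estimates."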
Nov-27-2018
- Country:
- Europe > Sweden (0.14)
- North America > Canada (0.14)
- Genre:
- Research Report > New Finding (0.47)
- Industry:
- Automobiles & Trucks (0.35)
- Information Technology > Robotics & Automation (0.35)
- Transportation > Ground > Road (0.35)