Consensus-Driven Uncertainty for Robotic Grasping based on RGB Perception
Joyce, Eric C., Zhao, Qianwen, Burgdorfer, Nathaniel, Wang, Long, Mordohai, Philippos
arXiv.org Artificial Intelligence
Deep object pose estimators are notoriously overconfident. A grasping agent that both estimates the 6-DoF pose of a target object and predicts the uncertainty of its own estimate could avoid task failure by choosing not to act under high uncertainty. Yet even as object pose estimation improves and uncertainty quantification research continues to make strides, few studies have connected the two to the downstream task of robotic grasping. We propose a method for training lightweight deep networks to predict whether a grasp guided by an image-based pose estimate will succeed, before that grasp is attempted. We generate training data for these networks via object pose estimation on real images and simulated grasping. We also find that, despite high variability across objects in grasping trials, the networks benefit from joint training on all objects, suggesting that a diverse set of objects can nevertheless contribute to the same goal.

Remarkable progress in object pose estimation from single RGB images has been made in the past few years [1]-[4], driven primarily by deep learning and by the ability to reduce the so-called sim2real gap. This has enabled end-to-end system training on large amounts of synthetic data with precise ground truth. Consider, for example, the pose estimates illustrated in Figure 1. They were produced by current methods, yet all four caused grasp attempts to fail when used as guides. Motivated by this disconnect between pose-estimation accuracy and success in downstream grasping, we propose an approach that estimates the likelihood of success before a grasp is actually attempted.
Jun-27-2025
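The abstract does not specify the networks' architecture or input features, so the sketch below is only illustrative of the general idea: a lightweight binary classifier that maps features derived from an image-based 6-DoF pose estimate (e.g., consensus statistics across estimators) to the probability that the guided grasp will succeed. All names, dimensions, and hyperparameters here are assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's code): a lightweight MLP predicting
# grasp success from hypothetical pose/consensus features, trained on
# labels that, in the paper's pipeline, would come from simulated
# grasping driven by pose estimates on real images.
import torch
import torch.nn as nn

class GraspSuccessPredictor(nn.Module):
    """Binary classifier: P(grasp succeeds | pose-estimate features)."""
    def __init__(self, in_dim: int = 16, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # logit for grasp success
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = GraspSuccessPredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Placeholder data; real features and outcome labels would be pooled
# across all objects, per the abstract's joint-training finding.
features = torch.randn(256, 16)                 # hypothetical feature vectors
labels = torch.randint(0, 2, (256, 1)).float()  # 1 = simulated grasp succeeded

for epoch in range(10):
    opt.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    opt.step()

# At deployment, the agent could decline to act when the predicted
# success probability falls below a chosen threshold.
with torch.no_grad():
    p_success = torch.sigmoid(model(features[:1]))
```

Such a predictor would let the agent abstain from grasping under high predicted uncertainty, matching the abstract's stated goal of avoiding task failure by choosing not to act.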