Hypothesis-based Belief Planning for Dexterous Grasping

Zito, Claudio, Ortenzi, Valerio, Adjigble, Maxime, Kopicki, Marek, Stolkin, Rustam, Wyatt, Jeremy L.

arXiv.org Artificial Intelligence 

Abstract Belief space planning is a viable way to formalise partially observable control problems and, in recent years, its application to robot manipulation problems has grown. However, this planning approach has so far been tried successfully only on simplified control problems. In this paper, we apply belief space planning to the problem of planning dexterous reach-to-grasp trajectories under object pose uncertainty. In our framework, the robot perceives the object to be grasped on-the-fly as a point cloud and computes a full 6D, non-Gaussian distribution over the object's pose (our belief space). The system places no limitations on the geometry of the object, i.e., non-convex objects can be represented, nor does it assume that the point cloud is a complete representation of the object. A plan in the belief space is then created to reach and grasp the object, such that the information value of expected contacts along the trajectory is maximised to compensate for the pose uncertainty. If an unexpected contact occurs when performing the action, this information is used to refine the pose distribution and triggers a re-planning. Experimental results show that our planner (IR3ne) improves grasp reliability and compensates for the pose uncertainty.

Figure 1: Boris: half-humanoid robot platform developed at the University of Birmingham.

1 Introduction

Imagine that you are reaching into the fridge to grasp an object you can only partially see. Rather than relying solely on vision, you must use touch in order to localise it and securely grasp it.
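The core machinery the abstract describes, a non-Gaussian belief over the object's pose that is refined whenever a contact is observed, can be sketched with a particle filter. The sketch below is illustrative and not the authors' implementation: it simplifies the full 6D pose to a 3D position, and the Gaussian contact likelihood with its `sigma` parameter is an assumption chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)


class PoseBelief:
    """Particle-based, non-Gaussian belief over an object's pose.

    Simplification (not from the paper): poses are 3D positions rather
    than full 6D SE(3) elements, and the contact observation model is a
    hypothetical Gaussian likelihood around the contact point.
    """

    def __init__(self, particles):
        self.particles = np.asarray(particles, dtype=float)
        n = len(self.particles)
        self.weights = np.full(n, 1.0 / n)

    def update_on_contact(self, contact_point, sigma=0.01):
        """Bayesian reweighting after an (expected or unexpected) contact.

        Pose hypotheses consistent with the observed contact point gain
        weight; inconsistent ones are suppressed.
        """
        d = np.linalg.norm(self.particles - contact_point, axis=1)
        likelihood = np.exp(-0.5 * (d / sigma) ** 2)
        w = self.weights * likelihood
        self.weights = w / w.sum()

    def mean(self):
        """Weighted mean of the pose hypotheses."""
        return (self.weights[:, None] * self.particles).sum(axis=0)


# Usage: an initial belief, e.g. from noisy point-cloud registration,
# is refined by a single contact observation during the reach.
particles = rng.normal(loc=[0.5, 0.0, 0.1], scale=0.05, size=(500, 3))
belief = PoseBelief(particles)
prior_mean = belief.mean()
contact = np.array([0.52, 0.01, 0.1])
belief.update_on_contact(contact)
posterior_mean = belief.mean()
```

After the update the weighted mean moves toward pose hypotheses that explain the contact, which is the sense in which touch compensates for the initial pose uncertainty; a re-planning step would then be run against the refined belief.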
