GeoDEx: A Unified Geometric Framework for Tactile Dexterous and Extrinsic Manipulation under Force Uncertainty

Sirui Chen, Sergio Aguilera Marinovic, Soshi Iba, Rana Soltani Zarrin

arXiv.org Artificial Intelligence 

Sense of touch, which allows robots to detect contact and measure interaction forces, enables them to perform challenging tasks such as grasping fragile objects or using tools. Tactile sensors can, in theory, equip robots with such capabilities. However, the accuracy of the measured forces is not on a par with that of dedicated force sensors, due to calibration challenges and noise. This has limited the value these sensors can offer in manipulation applications that require force control. In this paper, we introduce GeoDEx, a unified estimation, planning, and control framework based on geometric primitives such as planes, cones, and ellipsoids, which enables dexterous as well as extrinsic manipulation in the presence of uncertain force readings. Through various experimental results, we show that while relying directly on inaccurate and noisy force readings from tactile sensors results in unstable or failed manipulation, our method enables successful grasping and extrinsic manipulation of different objects. Additionally, compared to directly running optimization with SOCP (Second-Order Cone Programming), planning and force estimation using our framework achieves a 14x speed-up.

Dexterous manipulation capabilities are essential for robots to function effectively in human-centered environments, for example when handling different objects or tools [1]. Planning and control methods that enable robust and generalizable contact-rich manipulation need contact force information. While force sensors can provide accurate force readings, the physical limitations of embedding such sensors into robotic hands, as well as their lack of high-resolution tactile information, limit their use. On the other hand, recent developments in tactile sensing have produced ever lighter and higher-resolution sensors, which can be installed on different robot end-effectors such as the fingertips of a dexterous hand.
However, it remains challenging and expensive to develop tactile sensors that can provide accurate force readings, especially in both the normal and shear directions. While developing such sensors is important, ongoing research in dexterous manipulation must rely on the available tactile sensors despite these shortcomings. Previous works have shown that even binary [2] or highly discretized [3] tactile information can already significantly improve performance during in-hand object reorientation.
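To make the geometric-primitive idea concrete, one operation the cone primitive enables is handling a noisy tactile force reading against a friction cone in closed form, rather than through an SOCP solve. The sketch below shows the standard Euclidean projection onto a second-order cone; the function name, parameter values, and the exact role such a projection plays inside GeoDEx are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def project_onto_friction_cone(f, n, mu):
    """Closed-form projection of a (possibly noisy) contact force f onto
    the friction cone {f : ||f_t|| <= mu * f_n} with unit normal n.

    This is the standard Euclidean projection onto a second-order cone,
    shown here to illustrate replacing an optimization solve with a
    direct geometric operation. Hypothetical helper, not the paper's API.
    """
    f = np.asarray(f, dtype=float)
    n = np.asarray(n, dtype=float)
    fn = float(f @ n)           # normal component of the reading
    ft = f - fn * n             # tangential (shear) component
    t = np.linalg.norm(ft)
    if t <= mu * fn:            # reading already inside the cone
        return f
    if mu * t <= -fn:           # inside the polar cone: project to apex
        return np.zeros_like(f)
    # Otherwise project onto the cone boundary.
    fn_new = (fn + mu * t) / (1.0 + mu**2)
    return fn_new * n + mu * fn_new * (ft / t)
```

For example, with normal `n = [0, 0, 1]` and `mu = 0.5`, the reading `[1, 0, 1]` violates the friction constraint and projects to `[0.6, 0, 1.2]`, which lies exactly on the cone boundary, while a reading already inside the cone is returned unchanged.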