Learning Haptic-based Object Pose Estimation for In-hand Manipulation Control with Underactuated Robotic Hands

Azulay, Osher, Ben-David, Inbar, Sintov, Avishai

arXiv.org Artificial Intelligence 

Abstract--Unlike traditional robotic hands, underactuated compliant hands are challenging to model due to inherent uncertainties. Consequently, pose estimation of a grasped object is usually performed based on visual perception. However, visual perception of the hand and object can be limited in occluded or partly-occluded environments. In this paper, we explore the use of haptics, i.e., kinesthetic and tactile sensing, for pose estimation and in-hand manipulation with underactuated hands. Such a haptic approach mitigates occluded environments where a line-of-sight is not always available. We put an emphasis on identifying a feature state representation of the system that does not include vision and can be obtained with simple and low-cost hardware. For tactile sensing, we therefore propose a low-cost and flexible sensor that is mostly 3D-printed along with the fingertip and can provide implicit contact information. Taking a two-finger underactuated hand as a test case, we analyze the contribution of kinesthetic and tactile features, along with various regression models, to the accuracy of the predictions.

(Figure caption: Visual perception is not available within the cabinet and, therefore, the hand must use haptic perception.)

I. INTRODUCTION

While the ability to manipulate an object within the hand is a fundamental everyday task for humans, such a problem remains challenging for robots. To cope with the lack of an analytical solution, data-based modeling was shown to intrinsically estimate model parameters that can be difficult to obtain.
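The core pipeline described in the abstract, regressing a grasped object's pose from kinesthetic and tactile features without vision, can be sketched minimally. The sketch below uses synthetic data and a closed-form ridge regression; the feature dimensions, noise level, and regularization weight are all hypothetical and not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical haptic feature vectors: e.g., actuator angles/torques
# (kinesthetic) concatenated with tactile readings. 8 features here.
X = rng.normal(size=(500, 8))

# Hypothetical ground-truth object pose targets (x, y, theta),
# generated from a synthetic linear map plus sensor noise.
W_true = rng.normal(size=(8, 3))
Y = X @ W_true + 0.01 * rng.normal(size=(500, 3))

# Ridge regression in closed form: W = (X^T X + lam*I)^-1 X^T Y.
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

# Predict poses and measure mean Euclidean error on the training set.
Y_pred = X @ W
mean_err = float(np.mean(np.linalg.norm(Y_pred - Y, axis=1)))
```

In practice the paper compares several regression models; a linear ridge fit is only the simplest baseline one might swap in, and real haptic-to-pose maps are generally nonlinear.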
