Investigating Active Sampling for Hardness Classification with Vision-Based Tactile Sensors
Chen, Junyi, Kshirsagar, Alap, Heller, Frederik, Andreu, Mario Gómez, Belousov, Boris, Schneider, Tim, Lin, Lisa P. Y., Doerschner, Katja, Drewing, Knut, Peters, Jan
arXiv.org Artificial Intelligence
One of the most important object properties that humans and robots perceive through touch is hardness. This paper investigates information-theoretic active sampling strategies for sample-efficient hardness classification with vision-based tactile sensors. We evaluate three probabilistic classifier models and two model-uncertainty-based sampling strategies on a robotic setup as well as on a previously published dataset of samples collected by human testers. Our findings indicate that the active sampling approaches, driven by uncertainty metrics, surpass a random sampling baseline in both accuracy and stability. Additionally, in our human study, the participants achieve an average accuracy of 48.00%.

I. INTRODUCTION

Robots are increasingly being utilized in a variety of fields, from manufacturing to healthcare, where they interact with objects in their environment and plan their actions based on sensory feedback. A significant challenge in robotics is accurately perceiving object properties. This work focuses on a crucial property perceived through touch: hardness. Specifically, we investigate active sampling strategies for rapid hardness classification with a Vision-Based Tactile Sensor (VBTS). VBTSs such as the GelSight Mini [1] or FingerVision [2] provide a cost-effective, high-resolution alternative to traditional tactile sensors and also allow leveraging advancements in camera technology and computer vision.
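The model-uncertainty-based sampling described above can be illustrated with a simple predictive-entropy criterion: among candidate touches, probe the one whose current class-probability estimate is most uncertain. This is a minimal sketch of that general idea, not the paper's exact models or metrics; the candidate probabilities here are made-up toy values.

```python
import numpy as np

def predictive_entropy(probs):
    """Shannon entropy of each row of class probabilities; higher = more uncertain."""
    p = np.clip(probs, 1e-12, 1.0)  # avoid log(0)
    return -(p * np.log(p)).sum(axis=1)

def select_next_sample(probs):
    """Active-sampling step: index of the candidate with maximal predictive entropy."""
    return int(np.argmax(predictive_entropy(probs)))

# Toy posteriors over three hardness classes for four candidate presses
# (hypothetical numbers for illustration only).
probs = np.array([
    [0.90, 0.05, 0.05],  # classifier already confident
    [0.40, 0.35, 0.25],
    [0.60, 0.30, 0.10],
    [0.34, 0.33, 0.33],  # nearly uniform, i.e. most uncertain
])
print(select_next_sample(probs))  # -> 3
```

After each probe, the classifier's posterior would be updated with the new observation and the selection repeated, which is what makes such strategies more sample-efficient than random probing.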
May-20-2025