Tactile Gesture Recognition with Built-in Joint Sensors for Industrial Robots
Song, Deqing, Yang, Weimin, Rezayati, Maryam, van de Venn, Hans Wernher
arXiv.org Artificial Intelligence

While gesture recognition using vision or robot skins is an active research area in Human-Robot Collaboration (HRC), this paper explores deep learning methods relying solely on a robot's built-in joint sensors, eliminating the need for external sensors. We evaluated various convolutional neural network (CNN) architectures and collected two datasets to study the impact of data representation and model architecture on recognition accuracy. Our results show that spectrogram-based representations significantly improve accuracy, while model architecture plays a smaller role. We also tested generalization to new robot poses, where spectrogram-based models performed better. Implemented on a Franka Emika Research robot, two of our methods, STFT2DCNN and STT3DCNN, achieved over 95% accuracy in contact detection and gesture classification. These findings demonstrate the feasibility of external-sensor-free tactile recognition and promote further research toward cost-effective, scalable solutions for HRC.

I. INTRODUCTION

In the transition from Industry 4.0 to Industry 5.0, industry is placing greater emphasis on the well-being of workers at the center of the production process [1], [2], [3].
Aug-19-2025
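The abstract's STFT2DCNN method feeds spectrogram representations of joint-sensor signals to a CNN. As an illustration of the general idea (not the authors' implementation), the sketch below converts a single hypothetical joint-torque channel into a magnitude spectrogram using only NumPy; the sampling rate, frame length, and simulated contact transient are all assumed values.

```python
import numpy as np

def stft_spectrogram(signal, frame_len=64, hop=32):
    """Magnitude spectrogram via a windowed short-time FFT (NumPy only)."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    # One row per time frame, one column per frequency bin
    return np.abs(np.fft.rfft(frames, axis=1))

# Hypothetical joint-torque trace: 1 kHz sampling, sensor noise,
# plus a brief 120 Hz oscillation simulating a tap contact at t = 0.5 s
fs = 1000
t = np.arange(0, 1.0, 1 / fs)
torque = 0.01 * np.random.randn(len(t))
torque[500:520] += np.sin(2 * np.pi * 120 * t[500:520])

spec = stft_spectrogram(torque)
print(spec.shape)  # (30, 33): 30 time frames x 33 frequency bins
```

In a multi-joint setting, stacking one such spectrogram per joint yields the image-like tensor a 2D or 3D CNN can classify.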