A Bimanual Gesture Interface for ROS-Based Mobile Manipulators Using TinyML and Sensor Fusion
Najeeb Ahmed Bhuiyan, M. Nasimul Huq, Sakib H. Chowdhury, Rahul Mangharam
arXiv.org Artificial Intelligence
Gesture-based control for mobile manipulators faces persistent challenges in reliability, efficiency, and intuitiveness. This paper presents a dual-hand gesture interface that integrates TinyML, spectral analysis, and sensor fusion within a ROS framework to address these limitations. The system uses left-hand tilt and finger flexion, captured by an accelerometer and flex sensors, to drive mobile base navigation, while right-hand IMU signals are processed through spectral analysis and classified by a lightweight neural network. This pipeline enables TinyML-based gesture recognition to control a 7-DOF Kinova Gen3 manipulator. By supporting simultaneous navigation and manipulation, the framework improves efficiency and coordination over sequential control methods. Key contributions include a bimanual control architecture, real-time low-power gesture recognition, robust multimodal sensor fusion, and a scalable ROS-based implementation. The proposed approach advances Human-Robot Interaction (HRI) for industrial automation, assistive robotics, and hazardous environments, offering a cost-effective, open-source solution with strong potential for real-world deployment and further optimization.
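The abstract's pipeline has two halves: a left-hand mapping from tilt and flexion to base velocity, and a right-hand path from IMU windows through spectral features to a lightweight classifier. A minimal sketch of both halves is below; the thresholds, feature layout, and network shape are illustrative assumptions, not the paper's published design, and the MLP weights are random placeholders standing in for a trained TinyML model.

```python
import numpy as np


def tilt_to_twist(accel, flex, max_lin=0.3, max_ang=1.0):
    """Map left-hand accelerometer tilt and finger flexion to a base
    velocity command (illustrative mapping; scales are assumptions)."""
    ax, ay, az = accel
    pitch = np.arctan2(-ax, np.hypot(ay, az))   # forward/back tilt (rad)
    roll = np.arctan2(ay, az)                   # left/right tilt (rad)
    speed = np.clip(flex, 0.0, 1.0)             # flex sensor gates speed
    lin = max_lin * speed * np.clip(pitch / (np.pi / 4), -1.0, 1.0)
    ang = max_ang * speed * np.clip(roll / (np.pi / 4), -1.0, 1.0)
    return lin, ang                             # e.g. fills a ROS Twist msg


def spectral_features(imu_window, n_bins=8):
    """Pool the magnitude spectrum of each IMU axis into n_bins bands,
    the kind of compact spectral feature a TinyML model can afford."""
    spec = np.abs(np.fft.rfft(imu_window, axis=0))        # (freqs, axes)
    bands = np.array_split(spec, n_bins, axis=0)
    return np.concatenate([b.mean(axis=0) for b in bands])  # (n_bins*axes,)


class TinyMLP:
    """Forward pass of a two-layer MLP, a typical lightweight gesture
    classifier; weights here are random stand-ins for trained ones."""

    def __init__(self, n_in, n_hidden, n_classes, rng):
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_classes))
        self.b2 = np.zeros(n_classes)

    def predict(self, x):
        h = np.maximum(0.0, x @ self.W1 + self.b1)  # ReLU hidden layer
        return int(np.argmax(h @ self.W2 + self.b2))
```

In a ROS deployment the left-hand output would be published as a `geometry_msgs/Twist` on the base's command topic while the right-hand classifier output selects manipulator actions, which is what lets navigation and manipulation run simultaneously rather than sequentially.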
Sep-25-2025