A Comparative Study of EMG- and IMU-based Gesture Recognition at the Wrist and Forearm
Baghernezhad, Soroush, Mohammadreza, Elaheh, da Fonseca, Vinicius Prado, Zou, Ting, Jiang, Xianta
Gestures are an integral part of our daily interactions with the environment. Hand gesture recognition (HGR) is the process of interpreting human intent through various input modalities, such as visual data (images and videos) and bio-signals. Bio-signals are widely used in HGR due to their ability to be captured non-invasively via sensors placed on the arm. Among these, surface electromyography (sEMG), which measures the electrical activity of muscles, is the most extensively studied modality. However, less-explored alternatives such as inertial measurement units (IMUs) can provide complementary information on subtle muscle movements, which makes them valuable for gesture recognition. In this study, we investigate the potential of using IMU signals from different muscle groups to capture user intent. Our results demonstrate that IMU signals contain sufficient information to serve as the sole input sensor for static gesture recognition. Moreover, we compare different muscle groups and assess the quality of pattern recognition achievable from each individually. We further found that tendon-induced micro-movement captured by IMUs is a major contributor to static gesture recognition. We believe that leveraging muscle micro-movement information can enhance the usability of prosthetic arms for amputees. This approach also offers new possibilities for hand gesture recognition in fields such as robotics, teleoperation, sign language interpretation, and beyond.
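The static-gesture pipeline the abstract implies (window the IMU stream, extract features, classify) can be sketched as below. The mean/RMS features, the nearest-centroid classifier, and the toy data are illustrative assumptions, not the paper's actual method.

```python
# Minimal sketch of static gesture classification from 3-axis IMU windows.
# Features and classifier are illustrative, not the paper's pipeline.
import math

def window_features(window):
    """Mean and RMS per axis over a window of (ax, ay, az) samples."""
    n = len(window)
    feats = []
    for axis in range(3):
        vals = [s[axis] for s in window]
        mean = sum(vals) / n
        rms = math.sqrt(sum(v * v for v in vals) / n)
        feats.extend([mean, rms])
    return feats

def train_centroids(labelled_windows):
    """Average feature vector per gesture label (nearest-centroid model)."""
    sums, counts = {}, {}
    for label, window in labelled_windows:
        f = window_features(window)
        if label not in sums:
            sums[label], counts[label] = [0.0] * len(f), 0
        sums[label] = [a + b for a, b in zip(sums[label], f)]
        counts[label] += 1
    return {lbl: [v / counts[lbl] for v in s] for lbl, s in sums.items()}

def classify(window, centroids):
    """Assign the window to the label with the closest feature centroid."""
    f = window_features(window)
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(f, centroids[lbl])))

# Toy data: a "rest" gesture (low activity) vs a "fist" gesture (offset axes).
rest = [(0.0, 0.0, 1.0)] * 10
fist = [(0.5, -0.3, 0.9)] * 10
centroids = train_centroids([("rest", rest), ("fist", fist)])
print(classify([(0.45, -0.25, 0.95)] * 10, centroids))  # prints "fist"
```

In practice a deep model over raw windows would replace the hand-crafted features, but the windowing-and-classification structure is the same.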
- North America > Canada > Newfoundland and Labrador > Newfoundland > St. John's (0.04)
- North America > United States (0.04)
- Research Report > New Finding (1.00)
- Research Report > Experimental Study (1.00)
- Health & Medicine > Health Care Technology (0.88)
- Health & Medicine > Therapeutic Area > Neurology (0.48)
- Health & Medicine > Therapeutic Area > Musculoskeletal (0.46)
- (2 more...)
Development of a 15-Degree-of-Freedom Bionic Hand with Cable-Driven Transmission and Distributed Actuation
Han, Haoqi, Yang, Yi, Yu, Yifei, Zhou, Yixuan, Zhu, Xiaohan, Wang, Hesheng
In robotic hand research, minimizing the number of actuators while maintaining human-hand-consistent dimensions and degrees of freedom constitutes a fundamental challenge. Drawing bio-inspiration from human hand kinematic configurations and muscle distribution strategies, this work proposes a novel 15-DoF dexterous robotic hand, with detailed analysis of its mechanical architecture, electrical system, and control system. The bionic hand employs a new tendon-driven mechanism, significantly reducing the number of motors required by traditional tendon-driven systems while enhancing motion performance and simplifying the mechanical structure. This design integrates five motors in the forearm to provide strong gripping force, while ten small motors are installed in the palm to support fine manipulation tasks. Additionally, a corresponding joint sensing and motor driving electrical system was developed to ensure efficient control and feedback. The entire system weighs only 1.4 kg, combining lightweight construction with high performance. Through experiments, the bionic hand exhibited exceptional dexterity and robust grasping capabilities, demonstrating significant potential for robotic manipulation tasks. The development of actuator systems with human-level dexterity presents significant challenges [1], [2], stemming from the bio-integrated nature of the human hand: it is not an isolated entity but a highly coupled system intricately connected through skeletal-muscular-neural networks to the forearm, forming a synergistic functional unit.
- North America > United States > California > San Francisco County > San Francisco (0.14)
- Asia > China > Shanghai > Shanghai (0.07)
- Asia > China > Heilongjiang Province > Harbin (0.05)
- (7 more...)
Toward the smooth mesh climbing of a miniature robot using bioinspired soft and expandable claws
Wang, Hong, Liu, Peng, Ngoc, Phuoc Thanh Tran, Li, Bing, Li, Yao, Sato, Hirotaka
While most micro-robots have difficulty traveling on rugged and uneven terrain, beetles can walk smoothly on complex substrates without slipping or getting stuck, thanks to their stiffness-variable tarsi and the expandable hooks at the tips of their tarsi. In this study, we found that beetles actively bent and expanded their claws in a regular cycle to crawl freely on mesh surfaces. Inspired by this crawling mechanism, we designed an 8-cm miniature climbing robot equipped with artificial claws that open and bend in the same cyclic manner as natural beetles' claws. The robot can climb freely with a controllable gait on mesh surfaces, steep inclines of 60°, and even transition surfaces. To the best of our knowledge, this is the first micro-scale robot that can climb both mesh surfaces and steep inclines. Its small size, light weight, and strong navigation capabilities allow it to be deployed quickly in complicated environments. Numerous insect-scale robots have been developed with diverse locomotion modes, including crawling [1-3], rolling [4-6], jumping [7-9], gliding [10, 11], and flying [12-14]. Their actuators range from traditional motors [15] and pneumatics [16] to shape memory alloys [17], piezoelectric ceramics [18], and dielectric elastomers [19]. However, these robots can only locomote on nearly level surfaces, which leaves them unable to overcome barriers several times larger than their body size.
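The cyclic open-and-bend claw motion described above can be sketched as a phase-driven gait generator. The period, angle limits, and half-sine profiles below are illustrative assumptions, not the robot's actual parameters.

```python
# Hypothetical phase-based claw gait: open during the swing half of the
# cycle, bend (hook the mesh) during the stance half. Angles in degrees.
import math

def claw_command(t, period=1.0, open_max=60.0, bend_max=45.0):
    """Return claw opening and bending angles at time t (seconds)."""
    phase = (t % period) / period            # normalised phase in [0, 1)
    if phase < 0.5:
        # Swing half: claw opens and closes again following a half-sine.
        return {"open": open_max * math.sin(2 * math.pi * phase), "bend": 0.0}
    # Stance half: claw bends to hook the mesh, also as a half-sine.
    return {"open": 0.0, "bend": bend_max * math.sin(2 * math.pi * (phase - 0.5))}
```

Peak opening occurs a quarter of the way through the cycle and peak bending three quarters through, so the two motions never overlap within one cycle.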
- Asia > China > Heilongjiang Province > Harbin (0.05)
- Asia > China > Guangdong Province > Shenzhen (0.05)
- Asia > Singapore > Central Region > Singapore (0.04)
- (9 more...)
How Much Do Large Language Models Know about Human Motion? A Case Study in 3D Avatar Control
Li, Kunhang, Naradowsky, Jason, Feng, Yansong, Miyao, Yusuke
We explore the human motion knowledge of Large Language Models (LLMs) through 3D avatar control. Given a motion instruction, we prompt LLMs to first generate a high-level movement plan with consecutive steps (High-level Planning), then specify body part positions in each step (Low-level Planning), which we linearly interpolate into avatar animations. Using 20 representative motion instructions that cover fundamental movements and balance body part usage, we conduct comprehensive evaluations, including human and automatic scoring of both high-level movement plans and generated animations, as well as automatic comparison with oracle positions in low-level planning. Our findings show that LLMs are strong at interpreting high-level body movements but struggle with precise body part positioning. While decomposing motion queries into atomic components improves planning, LLMs face challenges in multi-step movements involving high-degree-of-freedom body parts. Furthermore, LLMs provide reasonable approximations for general spatial descriptions, but fall short in handling precise spatial specifications. Notably, LLMs demonstrate promise in conceptualizing creative motions and distinguishing culturally specific motion patterns.
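The low-level planning stage described above linearly interpolates body-part positions between consecutive plan steps to produce an animation. A minimal sketch follows; the joint name, coordinates, and frame count are illustrative assumptions.

```python
# Linear interpolation of per-joint keyframes into a dense animation,
# mirroring the paper's low-level planning stage. Joint names are examples.
def interpolate_keyframes(steps, frames_per_step=4):
    """steps: list of {joint: (x, y, z)} keyframes, one per plan step.
    Returns a list of frames with positions interpolated between steps."""
    animation = []
    for a, b in zip(steps, steps[1:]):
        for i in range(frames_per_step):
            t = i / frames_per_step
            frame = {j: tuple(pa + t * (pb - pa)
                              for pa, pb in zip(a[j], b[j]))
                     for j in a}
            animation.append(frame)
    animation.append(steps[-1])  # include the final keyframe exactly
    return animation

# Example: the right hand moves from the origin to (1, 2, 0) in one step.
steps = [{"right_hand": (0.0, 0.0, 0.0)}, {"right_hand": (1.0, 2.0, 0.0)}]
anim = interpolate_keyframes(steps, frames_per_step=4)
```

Halfway through, the hand sits at (0.5, 1.0, 0.0); smoother motion would use easing curves, but linear interpolation matches the setup described in the abstract.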
Improving Tactile Gesture Recognition with Optical Flow
Zhong, Shaohong, Albini, Alessandro, Caroleo, Giammarco, Cannata, Giorgio, Maiolino, Perla
Tactile gesture recognition systems play a crucial role in Human-Robot Interaction (HRI) by enabling intuitive communication between humans and robots. The literature mainly addresses this problem by applying machine learning techniques to classify sequences of tactile images encoding the pressure distribution generated when executing the gestures. However, some gestures can be hard to differentiate based on the information provided by tactile images alone. In this paper, we present a simple yet effective way to improve the accuracy of a gesture recognition classifier. Our approach focuses solely on processing the tactile images used as input by the classifier. In particular, we propose to explicitly highlight the dynamics of the contact in the tactile image by computing the dense optical flow. This additional information makes it easier to distinguish between gestures that produce similar tactile images but exhibit different contact dynamics. We validate the proposed approach in a tactile gesture recognition task, showing that a classifier trained on tactile images augmented with optical flow information achieved a 9% improvement in gesture classification accuracy compared to one trained on standard tactile images.
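Dense optical flow is normally computed with an off-the-shelf method (e.g. OpenCV's Farneback implementation). As a self-contained stand-in, the sketch below recovers per-block motion between two consecutive tactile frames by block matching; frame contents and sizes are toy assumptions.

```python
# Toy dense optical flow by block matching: for each block of the previous
# frame, find the (dy, dx) shift that best matches the current frame
# (minimum sum of absolute differences). Illustrates the extra motion
# channel used to augment tactile images; not the paper's exact method.
def block_flow(prev, curr, block=2, search=1):
    h, w = len(prev), len(prev[0])
    flow = []
    for by in range(0, h - block + 1, block):
        row = []
        for bx in range(0, w - block + 1, block):
            best, best_d = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    if not (0 <= by + dy and by + dy + block <= h
                            and 0 <= bx + dx and bx + dx + block <= w):
                        continue  # candidate block falls outside the frame
                    sad = sum(abs(prev[by + y][bx + x]
                                  - curr[by + dy + y][bx + dx + x])
                              for y in range(block) for x in range(block))
                    if best is None or sad < best:
                        best, best_d = sad, (dy, dx)
            row.append(best_d)
        flow.append(row)
    return flow

# A pressure blob in the top-left corner slides one pixel to the right.
prev = [[9, 9, 0, 0], [9, 9, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
curr = [[0, 9, 9, 0], [0, 9, 9, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
flow = block_flow(prev, curr)
```

Stacking such a flow field as extra channels alongside the pressure image gives the classifier the contact dynamics that static tactile frames alone miss.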
- Information Technology > Artificial Intelligence > Vision > Gesture Recognition (1.00)
- Information Technology > Artificial Intelligence > Robots (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Pattern Recognition (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (1.00)
Strong, Accurate, and Low-Cost Robot Manipulator
Chebly, Georges, Little, Spencer, Perera, Nisal, Abedeen, Aliya, Suzuki, Ken, Kim, Donghyun
This paper presents Forte, a fully 3D-printable, 6-DoF robotic arm designed to achieve near industrial-grade performance. As an accessible robot for broad applications from classroom education to AI experiments, Forte pushes forward the performance limits of existing low-cost educational arms. We introduce a cost-effective mechanical design that combines capstan-based cable drives, timing belts, simple tensioning mechanisms, and lightweight 3D-printed structures, along with topology optimization for structural stiffness. Through careful drivetrain engineering, we minimize backlash and maintain control fidelity without relying on high-power electronics or expensive manufacturing processes. Experimental validation demonstrates that Forte achieves high repeatability and load capacity, offering a compelling robotic platform for both classroom instruction and advanced robotics research. Can we build a 6-degree-of-freedom (DoF) robotic arm with a material cost under $400, while achieving a half-meter workspace, a payload capacity of more than 0.5 kg, and repeatability within 0.5 mm? We introduce Forte, a fully 3D-printed robotic manipulator, developed to affirmatively answer this question. In light of surging interest in robotics and artificial intelligence, providing accessible, hands-on educational tools has never been more important, as practical experience and experimental validation are essential components of robotics education.
- Research Report (0.50)
- Instructional Material (0.48)
Effects of Wrist-Worn Haptic Feedback on Force Accuracy and Task Speed during a Teleoperated Robotic Surgery Task
Vuong, Brian B., Davidson, Josie, Cheon, Sangheui, Cho, Kyujin, Okamura, Allison M.
Previous work has shown that adding haptic feedback at the hands can improve awareness of tool-tissue interactions and enhance performance of teleoperated tasks in robot-assisted minimally invasive surgery. However, hand-based haptic feedback occludes direct interaction with the manipulanda of the surgeon console in teleoperated surgical robots. We propose relocating haptic feedback to the wrist using a wearable haptic device so that haptic feedback mechanisms do not need to be integrated into the manipulanda. However, it is unknown whether such feedback will be effective, given that it is not co-located with the finger movements used for manipulation. To test whether relocated haptic feedback improves force application during teleoperated tasks using the da Vinci Research Kit (dVRK) surgical robot, participants learned to palpate a phantom tissue to desired forces. Participants performed the palpation task with and without wrist-worn haptic feedback and were evaluated for the accuracy of applied forces. Participants demonstrated significantly lower force error when wrist-worn haptic feedback was provided. They also performed the palpation task with longer movement times when provided wrist-worn haptic feedback, indicating that the haptic feedback may have caused them to operate at a different point on the speed-accuracy tradeoff curve.
- North America > United States > California > San Francisco County > San Francisco (0.14)
- North America > United States > Massachusetts > Middlesex County > Somerville (0.04)
- North America > United States > California > Santa Clara County > Sunnyvale (0.04)
- (4 more...)
- Research Report > New Finding (1.00)
- Research Report > Experimental Study (1.00)
- Health & Medicine > Surgery (1.00)
- Health & Medicine > Health Care Technology (1.00)
Whole-body Multi-contact Motion Control for Humanoid Robots Based on Distributed Tactile Sensors
Murooka, Masaki, Fukumitsu, Kensuke, Hamze, Marwan, Morisawa, Mitsuharu, Kaminaga, Hiroshi, Kanehiro, Fumio, Yoshida, Eiichi
To enable humanoid robots to work robustly in confined environments, multi-contact motion that makes contacts not only at extremities, such as hands and feet, but also at intermediate areas of the limbs, such as knees and elbows, is essential. We develop a method to realize such whole-body multi-contact motion involving contacts at intermediate areas by a humanoid robot. Deformable sheet-shaped distributed tactile sensors are mounted on the surface of the robot's limbs to measure the contact force without significantly changing the robot body shape. The multi-contact motion controller developed earlier, which is dedicated to contact at extremities, is extended to handle contact at intermediate areas, and the robot motion is stabilized by feedback control using not only force/torque sensors but also distributed tactile sensors. Through verification in dynamics simulations, we show that the developed tactile feedback improves the stability of whole-body multi-contact motion against disturbances and environmental errors. Furthermore, the life-sized humanoid RHP Kaleido demonstrates whole-body multi-contact motions, such as stepping forward while supporting the body with forearm contact and balancing in a sitting posture with thigh contacts. Humanoid robots are expected to realize various manipulation and locomotion tasks to support or replace humans.
Learning and Online Replication of Grasp Forces from Electromyography Signals for Prosthetic Finger Control
Arbaud, Robin, Motta, Elisa, Avaro, Marco Domenico, Picinich, Stefano, Lorenzini, Marta, Ajoudani, Arash
Partial hand amputations significantly affect the physical and psychosocial well-being of individuals, yet intuitive control of externally powered prostheses remains an open challenge. To address this gap, we developed a force-controlled prosthetic finger activated by electromyography (EMG) signals. The prototype, constructed around a wrist brace, functions as a supernumerary finger placed near the index, allowing for early-stage evaluation on unimpaired subjects. A neural network-based model was then implemented to estimate fingertip forces from EMG inputs, allowing for online adjustment of the prosthetic finger's grip strength. The force estimation model was validated through experiments with ten participants, demonstrating its effectiveness in predicting forces. Additionally, online trials with four users wearing the prosthesis exhibited precise control over the device. I. INTRODUCTION Upper extremity amputations make up 3% to 23% of all amputations, with approximately 50% to 90% of these being related to trauma. Our findings highlight the potential of using EMG-based force estimation to enhance the functionality of prosthetic fingers.
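The paper maps EMG to fingertip force with a neural network. As a hedged illustration of the same regression idea, the sketch below fits a linear model (force ≈ w · RMS(EMG) + b) by gradient descent; the RMS feature, synthetic data, and learning rate are all assumptions.

```python
# Toy EMG-to-force regression: RMS feature plus a linear model fit by
# gradient descent. A stand-in for the paper's neural-network estimator.
def rms(window):
    """Root-mean-square amplitude of an EMG window."""
    return (sum(v * v for v in window) / len(window)) ** 0.5

def fit_force_model(samples, lr=0.5, epochs=2000):
    """Fit force ≈ w * RMS(EMG) + b by batch gradient descent on
    mean squared error. samples: list of (emg_window, force) pairs."""
    w, b = 0.0, 0.0
    n = len(samples)
    for _ in range(epochs):
        gw = gb = 0.0
        for emg, force in samples:
            err = (w * rms(emg) + b) - force
            gw += err * rms(emg)
            gb += err
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Synthetic training pairs: stronger EMG activity <-> larger grip force.
data = [([0.1, -0.1, 0.1], 1.0),
        ([0.3, -0.3, 0.3], 3.0),
        ([0.5, -0.5, 0.5], 5.0)]
w, b = fit_force_model(data)
predicted = w * rms([0.4, -0.4, 0.4]) + b  # roughly 4.0 for this toy data
```

Online control would then scale the prosthetic finger's grip command by the predicted force each time a new EMG window arrives.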
- North America > United States > Texas (0.04)
- Europe > Italy > Liguria > Genoa (0.04)
- North America > United States > Massachusetts > Middlesex County > Natick (0.04)
Exploring Interference between Concurrent Skin Stretches
Cheng, Ching Hei, Eden, Jonathan, Oetomo, Denny, Tan, Ying
Proprioception is essential for coordinating human movements and enhancing the performance of assistive robotic devices. Skin stretch feedback, which closely aligns with natural proprioception mechanisms, presents a promising method for conveying proprioceptive information. To better understand the impact of interference on skin stretch perception, we conducted a user study with 30 participants that evaluated the effect of two simultaneous skin stretches on user perception. We observed that when participants experience simultaneous skin stretch stimuli, a masking effect occurs that deteriorates perception performance in collocated skin stretch configurations. However, the perceived workload stays the same. These findings show that interference can affect the perception of skin stretch, such that multi-channel skin stretch feedback designs should avoid locating modules in close proximity. I. INTRODUCTION Proprioception, the sense of limb position relative to the body [1], is crucial for coordinating human movements.
- North America > United States > California > Los Angeles County > Los Angeles (0.14)
- Asia > China (0.04)
- Oceania > Australia > Victoria > Melbourne (0.04)
- Africa > Central African Republic > Ombella-M'Poko > Bimbo (0.04)