
GAT-Grasp: Gesture-Driven Affordance Transfer for Task-Aware Robotic Grasping

Wang, Ruixiang, Zhou, Huayi, Yao, Xinyue, Liu, Guiliang, Jia, Kui

arXiv.org Artificial Intelligence

Achieving precise and generalizable grasping across diverse objects and environments is essential for intelligent and collaborative robotic systems. However, existing approaches often struggle with ambiguous affordance reasoning and limited adaptability to unseen objects, leading to suboptimal grasp execution. In this work, we propose GAT-Grasp, a gesture-driven grasping framework that directly utilizes human hand gestures to guide the generation of task-specific grasp poses with appropriate positioning and orientation. Specifically, we introduce a retrieval-based affordance transfer paradigm, leveraging the implicit correlation between hand gestures and object affordances to extract grasping knowledge from large-scale human-object interaction videos. By eliminating the reliance on predefined object priors, GAT-Grasp enables zero-shot generalization to novel objects and cluttered environments. Real-world evaluations confirm its robustness across diverse and unseen scenarios, demonstrating reliable grasp execution in complex task settings.
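
As a rough illustration of the retrieval-based affordance transfer idea described in the abstract, the following minimal Python sketch retrieves the stored gesture-interaction template closest to an observed gesture embedding and maps its contact points onto a newly observed object. All names here (AffordanceDB, transfer_contact_points, the toy embeddings and poses) are illustrative assumptions, not the authors' actual pipeline, which extracts grasping knowledge from large-scale human-object interaction videos.

# Hypothetical sketch of gesture-conditioned affordance retrieval and transfer.
import numpy as np

class AffordanceDB:
    """Toy in-memory store of (gesture embedding, contact template) pairs."""
    def __init__(self):
        self.keys = []       # gesture/interaction embeddings
        self.templates = []  # associated contact points in a canonical object frame

    def add(self, key, template):
        self.keys.append(np.asarray(key, dtype=float))
        self.templates.append(np.asarray(template, dtype=float))

    def retrieve(self, query):
        # Cosine similarity over stored keys; return the best-matching template.
        keys = np.stack(self.keys)
        q = np.asarray(query, dtype=float)
        sims = keys @ q / (np.linalg.norm(keys, axis=1) * np.linalg.norm(q) + 1e-8)
        return self.templates[int(np.argmax(sims))]

def transfer_contact_points(template_pts, object_pose):
    """Map retrieved contact points into the observed object's frame."""
    R, t = object_pose  # 3x3 rotation and 3-vector translation of the object
    return template_pts @ R.T + t

# Usage with dummy data: a 4-D gesture embedding and two stored interactions.
db = AffordanceDB()
db.add([1, 0, 0, 0], [[0.00, 0.02, 0.10], [0.00, -0.02, 0.10]])   # pinch-like gesture
db.add([0, 1, 0, 0], [[0.05, 0.00, 0.00], [-0.05, 0.00, 0.00]])   # power-grasp-like gesture
query_gesture = [0.9, 0.1, 0.0, 0.0]
template = db.retrieve(query_gesture)
object_pose = (np.eye(3), np.array([0.4, 0.0, 0.05]))
contacts = transfer_contact_points(template, object_pose)
print(contacts)  # candidate contact points from which a grasp pose can be fitted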


The Hand-object Kinematic Model for Bimanual Manipulation

Li, Jingyi

arXiv.org Artificial Intelligence

This paper addresses planar finger kinematics with the aim of deriving optimized manipulation strategies. The first step builds a model from the geometric features of linear and rotational motion so that the robot can select finger configurations; this kinematic model captures the relative motion between the hands and the object. Building on two-finger manipulation cases, the model outputs strategies for bimanual manipulation. To execute these strategies, the second step solves for appropriate finger-joint values given the final orientation of the fingers. Simulations show that the computed solutions can achieve the desired relative rotation and linear motion of unknown objects.
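
A minimal sketch of the second step described above, under the common assumption of a 2-link planar finger: the desired relative rotation and linear motion of the object are applied to each contact point, and the corresponding finger-joint values are recovered by standard planar inverse kinematics. The function names, link lengths, and contact positions below are hypothetical and are not the paper's exact formulation.

# Hypothetical 2-link planar finger IK for a desired object rotation plus translation.
import numpy as np

def planar_2link_ik(x, y, l1, l2):
    """Joint angles (theta1, theta2) placing a 2-link planar fingertip at (x, y)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    c2 = np.clip(c2, -1.0, 1.0)           # guard against numerical round-off
    theta2 = np.arccos(c2)                 # elbow-down solution
    k1 = l1 + l2 * np.cos(theta2)
    k2 = l2 * np.sin(theta2)
    theta1 = np.arctan2(y, x) - np.arctan2(k2, k1)
    return theta1, theta2

def move_contact(p, rotation, translation):
    """Apply the desired relative rotation (about the origin) and linear motion to a contact point."""
    c, s = np.cos(rotation), np.sin(rotation)
    R = np.array([[c, -s], [s, c]])
    return R @ np.asarray(p, dtype=float) + np.asarray(translation, dtype=float)

# Usage: two fingers hold an object at p_left / p_right; rotate the object by
# 15 degrees and slide it 1 cm along x, then recompute each finger's joint angles.
l1, l2 = 0.04, 0.03                        # assumed link lengths in metres
p_left, p_right = np.array([0.03, 0.05]), np.array([-0.03, 0.05])
for p in (p_left, p_right):
    p_new = move_contact(p, np.deg2rad(15.0), [0.01, 0.0])
    print(planar_2link_ik(p_new[0], p_new[1], l1, l2))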