hand gesture recognition



[P] Hand Gesture Recognition with Python, OpenCV and Keras Demo

r/MachineLearning

I'll be posting all code and relevant files soon; this demo is part of a tutorial series I'm doing at my university. I'll probably do a Twitch stream and eventually a YouTube playlist if people like it.
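Since the poster's code isn't released yet, here is a minimal sketch of what a Python/OpenCV/Keras webcam gesture demo typically looks like. The label set, input size, region of interest, and untrained model are assumptions for illustration, not the author's code.

```python
# Minimal webcam gesture-classification sketch (illustrative only; not the
# original poster's code). Labels, input size, and ROI are assumptions.
import cv2
import numpy as np
from tensorflow.keras import layers, models

GESTURES = ["fist", "palm", "peace", "ok"]   # assumed label set

def build_model(num_classes=len(GESTURES)):
    # Small CNN over 64x64 grayscale hand crops.
    return models.Sequential([
        layers.Input(shape=(64, 64, 1)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])

def main():
    model = build_model()   # in practice you would load trained weights instead
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Fixed region of interest where the hand is expected.
        roi = frame[100:300, 100:300]
        gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
        x = cv2.resize(gray, (64, 64)).astype("float32") / 255.0
        probs = model.predict(x[None, ..., None], verbose=0)[0]
        label = GESTURES[int(np.argmax(probs))]
        cv2.rectangle(frame, (100, 100), (300, 300), (255, 0, 0), 2)
        cv2.putText(frame, label, (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("gesture", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```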


Deformable Deep Convolutional Generative Adversarial Network in Microwave Based Hand Gesture Recognition System

arXiv.org Machine Learning

Traditional vision-based hand gesture recognition systems are limited in dark conditions. In this paper, we build a hand gesture recognition system based on a microwave transceiver and a deep learning algorithm. A Doppler radar sensor with dual receiving channels at 5.8 GHz is used to acquire a large database of hand gesture signals. The received hand gesture signals are then processed with time-frequency analysis. Based on this large database of hand gestures, we propose a new machine learning architecture called the deformable deep convolutional generative adversarial network. Experimental results show the new architecture can improve the recognition rate by 10% and the deformable kernel can reduce the testing time cost by 30%.
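The time-frequency preprocessing the abstract mentions can be illustrated with a short spectrogram sketch. The synthetic I/Q signal, sampling rate, and window parameters below are assumptions; the paper's actual radar front end and network are not reproduced here.

```python
# Illustrative time-frequency preprocessing of a Doppler radar gesture signal
# (synthetic data; sampling rate and motion model are assumptions).
import numpy as np
from scipy.signal import spectrogram

fs = 2000                       # assumed baseband sampling rate in Hz
t = np.arange(0, 1.0, 1 / fs)

# Fake I/Q return: a hand moving toward and away from the sensor produces a
# time-varying Doppler shift, modeled here as a sinusoidally varying frequency.
doppler_hz = 200 * np.sin(2 * np.pi * 1.5 * t)            # instantaneous Doppler
phase = 2 * np.pi * np.cumsum(doppler_hz) / fs
iq = np.exp(1j * phase) + 0.1 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))

# Short-time Fourier analysis; the resulting 2D map is the kind of "image"
# a CNN- or GAN-based classifier would consume.
f, tt, Sxx = spectrogram(iq, fs=fs, nperseg=128, noverlap=96,
                         return_onesided=False)
Sxx = np.fft.fftshift(Sxx, axes=0)           # center zero Doppler
log_map = 10 * np.log10(Sxx + 1e-12)         # dB scale, shape (freq_bins, time_frames)
print(log_map.shape)
```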


SLIRS: Sign Language Interpreting System for Human-Robot Interaction

AAAI Conferences

Deaf-mute communities around the world need an effective human-robot interaction system that can act as an interpreter in public places such as banks, hospitals, or police stations. The focus of this work is to address the challenges faced by hearing-impaired people by developing an interpreting robotic system for effective communication in public places. To this end, we utilize a previously developed neural network-based learning architecture to recognize the Cyrillic manual alphabet, which is used for finger spelling in Kazakhstan. In order to train and test the performance of the recognition system, we collected a depth data set of ten people and applied it to a learning-based method for gesture recognition by modeling motion data. We report results that show an average accuracy of 77.2% for recognition of the complete 33-letter alphabet.
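As a rough illustration of how a per-subject data set like this is often evaluated (leave-one-subject-out accuracy over 33 letter classes), here is a sketch. The synthetic features, simple classifier, and sample counts are placeholders, not the SLIRS pipeline.

```python
# Sketch of leave-one-subject-out evaluation for a 33-class fingerspelling
# recognizer (synthetic features stand in for the depth/motion data; the
# actual SLIRS features and model are not reproduced).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

NUM_LETTERS, NUM_SUBJECTS, SAMPLES_PER_CLASS, DIM = 33, 10, 5, 64
rng = np.random.default_rng(0)

# Synthetic per-sample feature vectors, labels, and subject ids.
X = rng.normal(size=(NUM_SUBJECTS * NUM_LETTERS * SAMPLES_PER_CLASS, DIM))
y = np.tile(np.repeat(np.arange(NUM_LETTERS), SAMPLES_PER_CLASS), NUM_SUBJECTS)
subjects = np.repeat(np.arange(NUM_SUBJECTS), NUM_LETTERS * SAMPLES_PER_CLASS)
# Inject a weak class-dependent signal so the toy task is learnable.
X[np.arange(len(y)), y] += 0.5

accs = []
for held_out in range(NUM_SUBJECTS):
    train, test = subjects != held_out, subjects == held_out
    clf = LogisticRegression(max_iter=1000).fit(X[train], y[train])
    accs.append(accuracy_score(y[test], clf.predict(X[test])))

print(f"average leave-one-subject-out accuracy: {np.mean(accs):.3f}")
```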


Continuous Body and Hand Gesture Recognition for Natural Human-Computer Interaction: Extended Abstract

AAAI Conferences

We present a new approach to gesture recognition that tracks body and hands simultaneously and recognizes gestures continuously from an unsegmented and unbounded input stream. Our system estimates 3D coordinates of upper body joints and classifies the appearance of hands into a set of canonical shapes. A novel multi-layered filtering technique with a temporal sliding window is developed to enable online sequence labeling and segmentation. Experimental results on the NATOPS dataset show the effectiveness of the approach. We also report on our recent work on multimodal gesture recognition and deep hierarchical sequence representation learning that achieves state-of-the-art performance on several real-world datasets.
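A toy version of online labeling with a temporal sliding window over a stream of per-frame class probabilities might look like the following. The window length, smoothing rule, threshold, and "rest" class are assumptions for illustration and do not reproduce the paper's multi-layered filter.

```python
# Toy online gesture labeling with a temporal sliding window over per-frame
# class probabilities (illustrative only; window length, threshold, and the
# 'rest' class are assumed, not taken from the paper).
from collections import deque
import numpy as np

CLASSES = ["rest", "wave", "point", "stop"]     # assumed label set
WINDOW, THRESHOLD = 15, 0.6                     # frames, mean-probability cutoff

def stream_labels(frame_probs):
    """Yield (frame_index, label) pairs from a stream of per-frame probabilities."""
    window = deque(maxlen=WINDOW)
    for i, probs in enumerate(frame_probs):
        window.append(probs)
        mean_probs = np.mean(window, axis=0)    # temporal smoothing
        best = int(np.argmax(mean_probs))
        # Emit a gesture label only when the smoothed evidence is strong;
        # otherwise fall back to the neutral 'rest' class (implicit segmentation).
        label = CLASSES[best] if mean_probs[best] >= THRESHOLD else "rest"
        yield i, label

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake stream: 30 frames of noise, 30 frames dominated by 'wave', 30 of noise.
    noise = rng.dirichlet(np.ones(4), size=30)
    wave = rng.dirichlet([1, 10, 1, 1], size=30)
    stream = np.concatenate([noise, wave, noise])
    for i, label in stream_labels(stream):
        if i % 10 == 0:
            print(i, label)
```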