SLIRS: Sign Language Interpreting System for Human-Robot Interaction
Tazhigaliyeva, Nazgul (University of Edinburgh) | Nurgabulov, Yerniyaz (Nazarbayev University) | Parisi, German I. (University of Hamburg) | Sandygulova, Anara (Nazarbayev University)
Deaf communities around the world need an effective human-robot interaction system that can act as an interpreter in public places such as banks, hospitals, or police stations. The focus of this work is to address the challenges faced by hearing-impaired people by developing an interpreting robotic system for effective communication in public places. To this end, we utilize a previously developed neural network-based learning architecture to recognize the Cyrillic manual alphabet, which is used for fingerspelling in Kazakhstan. To train and test the recognition system, we collected a depth data set from ten people and applied a learning-based gesture recognition method that models motion data. We report an average accuracy of 77.2% for recognition of the complete 33-letter alphabet.
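The pipeline summarized above (depth-frame sequences in, letter labels out) can be sketched as a toy baseline. This is a hypothetical illustration only: it uses synthetic depth frames and a simple nearest-neighbor classifier in place of the paper's neural network-based architecture, and all names here are invented for the example.

```python
# Hypothetical sketch: classify fingerspelled letters from sequences of
# depth frames with a 1-nearest-neighbor baseline. The actual system uses
# a neural network-based architecture; this only illustrates the data flow.
import math
import random

def motion_feature(frames):
    """Flatten a sequence of depth frames into one feature vector by
    concatenating per-frame mean depths and frame-to-frame differences."""
    means = [sum(f) / len(f) for f in frames]
    diffs = [b - a for a, b in zip(means, means[1:])]
    return means + diffs

def classify(sample, training):
    """Return the label of the nearest training sequence (Euclidean)."""
    fv = motion_feature(sample)
    best, best_d = None, math.inf
    for label, frames in training:
        d = math.dist(fv, motion_feature(frames))
        if d < best_d:
            best, best_d = label, d
    return best

# Toy data: two "letters", each a short sequence of 4x4 depth frames
# whose depth values cluster around a different base level.
random.seed(0)
def fake_seq(level):
    return [[level + random.random() for _ in range(16)] for _ in range(5)]

train = [("A", fake_seq(0.0)), ("B", fake_seq(5.0))]
print(classify(fake_seq(0.1), train))  # → A
```

A real system would replace `motion_feature` with learned spatiotemporal features and `classify` with the trained network, but the train/test split over per-person depth recordings follows the same shape.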
Nov-19-2016