Yet simple sentences like "The dog that ran past the barn fell" still miss the mark when translated to Chinese and back (although the result, "The dog ran past the barn," is getting close). Because language understanding hinges on knowing what a particular phrase actually means right here, right now, any system that fails at this level has not truly solved the problem of natural language understanding (NLU). Only once it does can we achieve true AI and human-like language interactions with machines. San Jose, California-based Viv is a machine learning platform, recently acquired by Samsung, that lets developers plug into and create an intelligent, conversational interface to anything.
This project explores the development of a sign-language-to-speech translation glove that implements a Support Vector Machine (SVM) on the Intel Edison to recognize letters signed by sign language users. Support Vector Machines are supervised machine learning models with associated learning algorithms that analyze data for classification and regression. Given a set of training examples, each marked as belonging to one of two categories, an SVM training algorithm builds a model that assigns new examples to one category or the other, making it a non-probabilistic binary linear classifier; a multi-class task such as letter recognition is handled by combining several such binary classifiers. It is important that users set their own preferred sampling interval, since a user just beginning to learn sign language will sign more slowly than a sign language expert.
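A minimal sketch of such a letter classifier, assuming scikit-learn rather than whatever library runs on the actual Edison, and using made-up flex-sensor readings and letter labels purely for illustration:

```python
# Hypothetical sketch: classifying glove flex-sensor readings into
# signed letters with a linear SVM. Sensor values and labels are
# invented for illustration, not taken from the real project.
from sklearn.svm import SVC

# Each sample: five flex-sensor readings (one per finger), normalized to 0-1.
X_train = [
    [0.9, 0.9, 0.9, 0.9, 0.9],  # fist-like shape        -> "A" (illustrative)
    [0.1, 0.1, 0.1, 0.1, 0.9],  # fingers open, thumb bent -> "B"
    [0.5, 0.5, 0.5, 0.5, 0.5],  # curved fingers           -> "C"
    [0.8, 0.1, 0.9, 0.9, 0.9],  # index extended           -> "D"
]
y_train = ["A", "B", "C", "D"]

# Multi-class is handled internally by combining binary SVMs (one-vs-one).
clf = SVC(kernel="linear")
clf.fit(X_train, y_train)

# A new reading, sampled at the user's chosen interval.
reading = [0.88, 0.92, 0.91, 0.90, 0.87]
predicted_letter = clf.predict([reading])[0]
print(predicted_letter)  # closest to the fist-like "A" sample
```

On-device, the same idea applies: the glove samples the sensors at the user-set interval, feeds the feature vector to the trained model, and speaks the predicted letter.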