Language Learning


Automatic sign language translators turn signing into text

New Scientist

Machine translation systems that convert sign language into text and back again are helping people who are deaf or have difficulty hearing to communicate with those who cannot sign. A sign language user can approach a bank teller and sign to the KinTrans camera that they'd like assistance, for example. KinTrans's machine learning algorithm translates each sign as it is made and then a separate algorithm turns those signs into a sentence that makes grammatical sense. KinTrans founder Mohamed Elwazer says his system can already recognise thousands of signs in both American and Arabic sign language with 98 per cent accuracy.
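The two stages described above — a model that labels each sign as it is made, followed by a separate step that turns the sign sequence into a grammatical sentence — can be sketched roughly as below. This is a minimal illustrative sketch, not KinTrans's actual system: the function names, the toy lookup tables, and the gloss-to-English template are all assumptions.

```python
def classify_signs(captured_signs):
    """Stage 1: label each captured sign with a gloss.

    A toy lookup table stands in for the trained per-sign
    recognition model behind the KinTrans camera."""
    lookup = {"sign_help": "HELP", "sign_me": "ME", "sign_please": "PLEASE"}
    return [lookup[s] for s in captured_signs]

def glosses_to_sentence(glosses):
    """Stage 2: turn the gloss sequence into a grammatical sentence.

    Sign-language gloss order often differs from English word order,
    which is why a separate step is needed; a template stands in for
    the real grammar model here."""
    if glosses == ["HELP", "ME", "PLEASE"]:
        return "Please help me."
    return " ".join(g.lower() for g in glosses) + "."

# Example: signs made at the bank teller's window.
signs = ["sign_help", "sign_me", "sign_please"]
print(glosses_to_sentence(classify_signs(signs)))  # -> Please help me.
```

Splitting recognition from sentence assembly, as in the sketch, lets each stage be improved independently of the other.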


How Silicon Valley is teaching language to machines

#artificialintelligence

Yet even short sentences like "The dog that ran past the barn fell" still miss the mark when translated to Chinese and back (although the result, "The dog ran past the barn," is getting close). Because language comprehension hinges on knowing what this particular phrase actually means, right here, right now, any system that fails at this level has not truly solved the problem of natural language understanding (NLU). Only then do we have the possibility of achieving true AI and human-like language interactions with machines. San Jose, California-based Viv is a machine learning platform, recently acquired by Samsung, that lets developers plug into and create an intelligent, conversational interface to anything.


IBM Research Demonstrates Innovative 'Speech to Sign Language' Translation System

AITopics Original Links

HURSLEY, UK--(Marketwire - September 13, 2007) - IBM (NYSE: IBM) has developed an ingenious system called SiSi (Say It Sign It) that automatically converts the spoken word into British Sign Language (BSL), which is then signed by an animated digital character, or avatar. SiSi brings together a number of computer technologies: a speech recognition module converts the spoken word into text, which SiSi then interprets into gestures that are used to animate an avatar signing in BSL. Once developed, the system would see a signing avatar 'pop up' in the corner of whatever display screen is in use -- whether that be a laptop, personal computer, TV, meeting-room display or auditorium screen. Users would be able to select the size and appearance of the avatar.
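The chain of stages the article describes — speech recognition to text, text interpreted into gestures, gestures animating an avatar — can be sketched as a simple pipeline. This is a hypothetical sketch only: the function names, the canned recognition result, and the BSL gesture identifiers are all invented for illustration, not part of IBM's SiSi.

```python
def speech_to_text(audio_clip):
    # Stand-in for the speech recognition module; a canned result
    # replaces a real recogniser.
    return {"clip_greeting.wav": "hello how are you"}[audio_clip]

def text_to_gestures(text):
    # Stand-in for SiSi's interpretation step: map words to BSL
    # gesture identifiers, skipping words with no distinct sign.
    gesture_map = {"hello": "BSL_HELLO", "how": "BSL_HOW", "you": "BSL_YOU"}
    return [gesture_map[w] for w in text.split() if w in gesture_map]

def animate_avatar(gestures):
    # Stand-in for the avatar renderer: report what would be signed.
    return "avatar signs: " + " ".join(gestures)

result = animate_avatar(text_to_gestures(speech_to_text("clip_greeting.wav")))
print(result)  # -> avatar signs: BSL_HELLO BSL_HOW BSL_YOU
```

Each stage consumes the previous stage's output, which is what lets the avatar appear on any screen the text pipeline can reach.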


Google Translate is tapping into neural networks for smarter language learning

PCWorld

Google Translate is rolling out a major upgrade that promises more human-like language translations. Google is bullish on its Neural Machine Translation technology, claiming that it's a bigger upgrade to the service than everything that's been accomplished in the last ten years combined. Because whole sentences are translated as single units rather than piece by piece, each sentence is easier to understand, and translated paragraphs and articles are a lot smoother and easier to read. And this is all possible because of an end-to-end learning system built on Neural Machine Translation, which basically means that the system learns over time to create better, more natural translations.
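The difference between translating piece by piece and translating a whole sentence at once can be shown with a toy example. This is not Google's system: the tiny English-to-French lookup tables below are made-up assumptions, chosen only because French adjective order makes the contrast visible.

```python
# Word-level table: preserves English word order when applied piecewise.
WORDS = {"the": "la", "red": "rouge", "car": "voiture"}

# Sentence-level table: stands in for a model that treats the whole
# sentence as the unit of translation.
SENTENCES = {"the red car": "la voiture rouge"}

def word_by_word(text):
    # Piecewise translation keeps English adjective-noun order,
    # producing the unnatural "la rouge voiture".
    return " ".join(WORDS[w] for w in text.split())

def whole_sentence(text):
    # Sentence-level translation can move the adjective after the
    # noun, as French requires.
    return SENTENCES[text]

print(word_by_word("the red car"))    # -> la rouge voiture
print(whole_sentence("the red car"))  # -> la voiture rouge
```

The second output reads naturally because the reordering decision needs context beyond any single word, which is exactly what sentence-level translation provides.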