Language Learning


AI-powered language learning promises to fast-track fluency

#artificialintelligence

A linguistics company is using AI to shorten the time it takes to learn a new language. Using traditional methods, it takes about 200 hours to gain basic proficiency in a new language; this AI-powered platform claims it can take a learner from beginner to fluency in just a few months through once-daily 20-minute lessons. Learning a new language is hard: some people seem to pick up new languages with ease, but for the rest of us it's a trudge through rote memorization.


Glove turns sign language into text for real-time translation

New Scientist

A new glove developed at the University of California, San Diego, can convert the 26 letters of American Sign Language (ASL) into text on a smartphone or computer screen. "For thousands of people in the UK, sign language is their first language," says Jesal Vishnuram, the technology research manager at the charity Action on Hearing Loss. In the UK, someone who is deaf is entitled to a sign language translator at work or when visiting a hospital, but at a train station, for example, it can be incredibly difficult to communicate with people who don't sign. The flexible sensors mean that you hardly notice you are wearing the glove, says Timothy O'Connor, who is working on the technology at the University of California, San Diego.
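
The article doesn't describe how sensor readings become letters, so the following is only a minimal sketch of one plausible approach: nearest-neighbour matching of five flex-sensor readings against per-letter templates. The sensor layout, calibration values, and letter templates are invented for illustration and are not the UCSD team's actual method.

```python
# Minimal sketch: map glove flex-sensor readings to ASL letters by
# nearest-neighbour matching against hand-shape templates.
# All values below are illustrative assumptions, not real calibration data.
import math

# Hypothetical average bend per finger (0 = straight, 1 = fully bent)
# for a few ASL letters: index, middle, ring, pinky, thumb.
TEMPLATES = {
    "A": [1.0, 1.0, 1.0, 1.0, 0.2],   # fist, thumb alongside
    "B": [0.0, 0.0, 0.0, 0.0, 0.9],   # flat hand, thumb tucked
    "L": [1.0, 1.0, 1.0, 0.0, 0.0],   # index and thumb extended
}

def classify(reading):
    """Return the template letter closest to a 5-sensor reading."""
    return min(TEMPLATES, key=lambda letter: math.dist(TEMPLATES[letter], reading))

print(classify([0.95, 1.0, 0.9, 0.05, 0.1]))  # -> "L"
```

A real device would also need motion sensing to separate letters such as I and J, which differ mainly in movement rather than hand shape.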


Number of foreign students at public schools who lack Japanese language skills hits record high

The Japan Times

By prefecture, Aichi tops the list with 7,277 non-Japanese children with poor Japanese skills, followed by Kanagawa at 3,947, Tokyo at 2,932, Shizuoka at 2,673 and Osaka at 2,275. The survey also found 9,612 children who hold Japanese citizenship but have poor Japanese skills, needing remedial language instruction. Such children often have no choice but to learn basic Japanese at language schools or in classes provided by nonprofit groups like the center before entering a public school, Hazeki said. "There are a lot of language schools in Japan for international students, but Japan does not have a well-established system to train people who can teach Japanese to those elementary and junior high school children," Hazeki said.


Automatic sign language translators turn signing into text

New Scientist

Machine translation systems that convert sign language into text and back again are helping people who are deaf or have difficulty hearing to communicate with those who cannot sign. A sign language user can approach a bank teller and sign to the KinTrans camera that they'd like assistance, for example. KinTrans's machine learning algorithm translates each sign as it is made and then a separate algorithm turns those signs into a sentence that makes grammatical sense. KinTrans founder Mohamed Elwazer says his system can already recognise thousands of signs in both American and Arabic sign language with 98 per cent accuracy.
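
KinTrans's models aren't public, but the two-stage pipeline described above, a per-sign recogniser followed by a separate step that produces a grammatical sentence, can be sketched in miniature. The glosses, lexicon, and rules below are invented examples for illustration only.

```python
# Sketch of a two-stage sign-to-text pipeline: stage 1 labels each sign
# ("glosses"), stage 2 rewrites the gloss stream into an English sentence.
def recognise(frames):
    # Stand-in for the per-sign classifier: pretend each chunk of camera
    # frames has already been labelled with a gloss.
    return ["ME", "WANT", "OPEN", "ACCOUNT"]

def to_sentence(glosses):
    # Toy grammar pass: map ASL-style glosses to English words and add
    # the function words that signing typically omits.
    lexicon = {"ME": "I", "WANT": "would like to", "OPEN": "open", "ACCOUNT": "an account"}
    words = [lexicon.get(g, g.lower()) for g in glosses]
    return " ".join(words).capitalize() + "."

print(to_sentence(recognise(frames=None)))  # "I would like to open an account."
```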


Investigating Bias In AI Language Learning

#artificialintelligence

We recommend addressing this through the explicit characterization of acceptable behavior. One such approach is seen in the nascent field of fairness in machine learning, which specifies and enforces mathematical formulations of nondiscrimination in decision-making. Another approach can be found in modular AI architectures, such as cognitive systems, in which implicit learning of statistical regularities can be compartmentalized and augmented with explicit instruction of rules of appropriate conduct. Certainly, caution must be used in incorporating modules constructed via unsupervised machine learning into decision-making systems.
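
As a concrete example of a "mathematical formulation of nondiscrimination", here is a sketch of one of the simplest, demographic parity, which asks that a model's positive-decision rate be similar across groups. This is a generic illustration, not a formulation taken from the article itself.

```python
# Demographic parity check: compare positive-decision rates across groups
# and report the largest gap. A fairness constraint might require this
# gap to stay under a chosen tolerance.
from collections import defaultdict

def demographic_parity_gap(decisions, groups):
    """Largest difference in positive-decision rate between any two groups.

    decisions: list of 0/1 model outputs
    groups:    list of group labels, same length as decisions
    """
    totals, positives = defaultdict(int), defaultdict(int)
    for d, g in zip(decisions, groups):
        totals[g] += 1
        positives[g] += d
    rates = [positives[g] / totals[g] for g in totals]
    return max(rates) - min(rates)

gap = demographic_parity_gap([1, 0, 1, 1, 0, 0], ["a", "a", "a", "b", "b", "b"])
print(gap)  # ~0.33 -> would fail a 0.1 tolerance
```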


GIFs can teach sign language

Daily Mail

Aside from adding a funny spin to a message, GIFs can now teach you sign language. Giphy recently released a GIF library of more than 2,000 words and phrases in American Sign Language. The GIFs are based on the video series Sign with Robert, featuring Robert DeMayo, who has been deaf since birth.


How Silicon Valley is teaching language to machines

#artificialintelligence

Yet simple sentences like "The dog that ran past the barn fell" still miss the mark when translated to Chinese and back (although the result, "The dog ran past the barn," is getting close). Since with language we need to know "what does THIS particular phrase actually mean, right here, right now," any system that fails at this level hasn't truly solved the problem of natural language understanding (NLU). Only then do we have the possibility of achieving true AI and human-like language interactions with machines. San Jose, California-based Viv is a machine learning platform, recently acquired by Samsung, that lets developers plug into and create an intelligent, conversational interface to anything.
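
The round-trip test implied above is easy to automate. In the sketch below, translate() is only a placeholder standing in for whichever machine-translation service is being evaluated; the scoring rule is an invented, crude word-overlap measure.

```python
# Round-trip translation check: translate a sentence to a pivot language
# and back, then measure how much of the original survived.
def translate(text, src, dst):
    # Placeholder, not a real API; wire this to an MT service of your choice.
    raise NotImplementedError("plug in a machine-translation service here")

def round_trip_score(sentence, pivot="zh"):
    """Fraction of the original words that survive an en -> pivot -> en round trip."""
    back = translate(translate(sentence, "en", pivot), pivot, "en")
    original, returned = set(sentence.lower().split()), set(back.lower().split())
    return len(original & returned) / len(original), back

# round_trip_score("The dog that ran past the barn fell") would score below 1.0
# if the round trip returns "The dog ran past the barn", since "that" and
# "fell" are lost along the way.
```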


Automatic sign language translator translates gestures

AITopics Original Links

For years scientists have worked to find a way to make it easier for deaf and hearing-impaired people to communicate. And now it is hoped that a new intelligent system could be about to transform their lives. Researchers have used image recognition to translate sign language into 'readable language' and, while it is early days, the tool could one day be used on smartphones. Scientists from Malaysia and New Zealand came up with the Automatic Sign Language Translator (ASLT), which can capture, interpret and translate sign language. It has been tested on gestures and signs representing both isolated words and continuous sentences in Malaysian sign language, with what they claim is a high degree of recognition accuracy and speed.


Kinect sensor can translate sign language into SPEECH and TEXT

AITopics Original Links

Microsoft's Kinect has already proved its credentials in reading simple hand and body movements in the gaming world. But now a team of Chinese researchers have added sign language to its motion-sensing capabilities. Scientists at Microsoft Research Asia recently demonstrated software that allows Kinect to read sign language using hand tracking. What's impressive is that it can do this in real-time, translating sign language to spoken language and vice versa at conversational speeds. The system, dubbed the Kinect Sign Language Translator, is capable of capturing a conversation from both sides.


Humanoid Robot Demonstrates Sign Language

AITopics Original Links

With the DARPA Robotics Challenge looming large on the horizon, it's easy to overlook robots that aren't taking part. One of them is Nino, a humanoid unveiled earlier this year by the National Taiwan University's Robotics Laboratory. Unlike the DARPA robots, Nino may not find itself performing tasks in dangerous situations any time soon. But this robot has some special skills: it is likely the first full-sized humanoid to demonstrate sign language. "Sign language has a high degree of difficulty, requiring the use of both arms, hands, and fingers as well as facial expressions," said Professor Han-Pang Huang, who leads NTU's Robotics Lab.