Results


Sign language turned to text with new electric glove

Daily Mail

An electric glove that can convert sign language into text messages has been unveiled by scientists. The device consists of a sports glove fitted with nine stretchable sensors positioned over the knuckles. When a user bends their fingers or thumb to sign a letter, the sensors stretch, which causes an electrical signal to be produced.
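The article does not publish how the nine sensor readings are matched to letters, but the described pipeline (stretch sensors over the knuckles producing a signal pattern that is mapped to a sign) can be sketched roughly. Below is a minimal Python sketch under that assumption, using a hypothetical calibration table and nearest-neighbour matching; none of the values or function names come from the researchers' system.

```python
# Minimal sketch of the described pipeline: nine knuckle-mounted stretch
# sensors produce a reading each, and the reading pattern is matched to a
# letter. The calibration values below are hypothetical -- the article does
# not publish the real sensor-to-letter mapping.
import math

# Hypothetical calibration: each letter maps to a template of nine
# normalized sensor readings (0.0 = relaxed, 1.0 = fully bent).
CALIBRATION = {
    "A": [0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.1],  # closed fist, thumb out
    "B": [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.8],  # fingers straight
    "C": [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5],  # curved hand
}

def classify_letter(readings):
    """Return the calibrated letter whose template is closest (Euclidean)."""
    def distance(template):
        return math.sqrt(sum((r - t) ** 2 for r, t in zip(readings, template)))
    return min(CALIBRATION, key=lambda letter: distance(CALIBRATION[letter]))

if __name__ == "__main__":
    sample = [0.85, 0.9, 0.88, 0.92, 0.87, 0.9, 0.91, 0.89, 0.15]
    print(classify_letter(sample))  # -> "A" with this hypothetical data
```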


Glove turns sign language into text for real-time translation

New Scientist

A new glove developed at the University of California, San Diego, can convert the 26 letters of American Sign Language (ASL) into text on a smartphone or computer screen. "For thousands of people in the UK, sign language is their first language," says Jesal Vishnuram, the technology research manager at the charity Action on Hearing Loss. In the UK, someone who is deaf is entitled to a sign language translator at work or when visiting a hospital, but at a train station, for example, it can be incredibly difficult to communicate with people who don't sign. The flexible sensors mean that you hardly notice you are wearing the glove, says Timothy O'Connor, who is working on the technology at the University of California, San Diego.


[R] From DeepMind: Grounded Language Learning in a Simulated 3D World • r/MachineLearning

#artificialintelligence

Abstract: We are increasingly surrounded by artificially intelligent technology that takes decisions and executes actions on our behalf. This creates a pressing need for general means to communicate with, instruct and guide artificial agents, with human language the most compelling means for such communication. Here we present an agent that learns to interpret language in a simulated 3D environment where it is rewarded for the successful execution of written instructions. Trained via a combination of reinforcement and unsupervised learning, and beginning with minimal prior knowledge, the agent learns to relate linguistic symbols to emergent perceptual representations of its physical surroundings and to pertinent sequences of actions.
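The abstract does not reproduce DeepMind's architecture or training details; as a rough illustration of the core idea of grounding language in perception, here is a minimal PyTorch sketch of an instruction-conditioned policy in which a text encoder and a vision encoder are fused and mapped to action logits and a value estimate. All layer choices and sizes are illustrative assumptions, not the paper's model.

```python
# Rough sketch (not DeepMind's architecture): an agent that conditions its
# policy on both a written instruction and a visual observation, which is
# the core idea of grounded language learning. All sizes are illustrative.
import torch
import torch.nn as nn

class InstructionConditionedPolicy(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=32, hidden_dim=64, num_actions=8):
        super().__init__()
        # Language pathway: embed instruction tokens and summarize with an LSTM.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Vision pathway: a small CNN over the simulated first-person frame.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2), nn.ReLU(),
            nn.Flatten(),
        )
        self.vision_fc = nn.LazyLinear(hidden_dim)
        # Fusion plus policy/value heads, as in standard actor-critic setups.
        self.policy = nn.Linear(hidden_dim * 2, num_actions)
        self.value = nn.Linear(hidden_dim * 2, 1)

    def forward(self, instruction_tokens, frame):
        _, (h, _) = self.lstm(self.embed(instruction_tokens))
        lang = h[-1]                                   # (batch, hidden_dim)
        vis = torch.relu(self.vision_fc(self.cnn(frame)))
        fused = torch.cat([lang, vis], dim=-1)
        return self.policy(fused), self.value(fused)

if __name__ == "__main__":
    model = InstructionConditionedPolicy()
    tokens = torch.randint(0, 1000, (1, 6))   # e.g. a short written instruction
    frame = torch.rand(1, 3, 84, 84)           # simulated camera view
    logits, value = model(tokens, frame)
    print(logits.shape, value.shape)           # torch.Size([1, 8]) torch.Size([1, 1])
```

In the paper, a model of this general shape would be trained with reinforcement learning against rewards for executing the instruction, alongside unsupervised auxiliary objectives; that training loop is omitted here.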


Automatic sign language translator translates gestures

AITopics Original Links

For years scientists have worked to find a way to make it easier for deaf and hearing impaired people to communicate. And now it is hoped that a new intelligent system could be about to transform their lives. Researchers have used image recognition to translate sign language into 'readable language' and, while it is early days, the tool could one day be used on smartphones. Scientists from Malaysia and New Zealand came up with the Automatic Sign Language Translator (ASLT), which can capture, interpret and translate sign language. It has been tested on gestures and signs representing both isolated words and continuous sentences in Malaysian sign language, with what they claim is a high degree of recognition accuracy and speed.
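The article does not describe the ASLT's actual recognition method, so the following is only a generic, hedged sketch of the broad approach: reduce each video frame to a small feature vector, treat a gesture as the sequence of those vectors, and match it against labeled reference gestures with dynamic time warping. The features and reference data below are hypothetical.

```python
# Hedged sketch of gesture-sequence recognition: each video frame is assumed
# to already be reduced to a small feature vector (e.g. hand position/shape);
# a gesture is the sequence of those vectors, matched against labeled
# reference sequences with dynamic time warping. The ASLT's real method is
# not described in the article.
import math

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two sequences of feature vectors."""
    n, m = len(seq_a), len(seq_b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(seq_a[i - 1], seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m]

def translate_gesture(gesture, references):
    """Return the label of the closest reference gesture."""
    return min(references, key=lambda label: dtw_distance(gesture, references[label]))

if __name__ == "__main__":
    # Toy reference gestures (hypothetical 2-D features per frame).
    references = {
        "hello":     [(0.0, 0.0), (0.5, 0.5), (1.0, 1.0)],
        "thank you": [(1.0, 0.0), (0.5, 0.0), (0.0, 0.0)],
    }
    observed = [(0.1, 0.0), (0.4, 0.4), (0.9, 1.1)]
    print(translate_gesture(observed, references))  # -> "hello"
```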


Learn a new language with Duolingo's chatbots

Engadget

Duolingo has been offering language learning tools for a while now, but today the company debuted a new tool inside its iPhone app that could make the task a bit easier. Thanks to AI-powered chatbots, the language-learning app offers a way to have conversations while you're trying to learn French, German and Spanish. Duolingo gave these bots a bit of personality to make them more like real people and created them to be flexible with the answers they'll accept when there are multiple ways for you to respond. The new feature gives users of the free iOS app a way to learn through conversations without the anxiety of making mistakes when speaking with a real person.
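The article only says the bots accept several phrasings of the same answer; as a hedged illustration of that behaviour (not Duolingo's actual matching logic), here is a tiny normalization-and-lookup sketch in Python. The prompt and accepted answers are invented.

```python
# Hedged illustration (not Duolingo's actual logic) of accepting several
# valid phrasings of the same answer: normalize the learner's reply and
# check it against a set of acceptable responses for the current prompt.
import string
import unicodedata

def normalize(text):
    """Lowercase, strip accents and punctuation, collapse whitespace."""
    text = unicodedata.normalize("NFD", text.lower())
    text = "".join(c for c in text if unicodedata.category(c) != "Mn")
    text = text.translate(str.maketrans("", "", string.punctuation))
    return " ".join(text.split())

ACCEPTED = {  # hypothetical prompt: the bot asks "Comment ça va ?"
    "ca va bien",
    "ca va bien merci",
    "je vais bien",
    "tres bien merci",
}

def is_accepted(reply):
    return normalize(reply) in ACCEPTED

if __name__ == "__main__":
    print(is_accepted("Ça va bien, merci!"))   # True
    print(is_accepted("Je ne sais pas."))      # False
```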


New AI-Based App Develops Kids' Tech, Photo and Language Skills

#artificialintelligence

Los Angeles, California – Indie developer and computer vision engineer, Mustafa Jaber, is pleased to announce the release of Capture Caption Lite, an AI-based app developed for iOS and Android devices. With the Capture Caption app, users can snap a photo with their smartphone or iPad and artificial intelligence will generate a word cloud using cutting-edge computer vision technology. These word clouds can then be downloaded to the user's image library and shared across multiple social media platforms. The brainchild of electrical engineer and image processing expert Mustafa Jaber, the app uses an artificial intelligence platform that derives information from images. This program understands the content of any image by using powerful machine-learning models, which can quickly classify images into thousands of categories.
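Capture Caption's own model is not public; as a hedged sketch of the described idea (classify a photo into one of thousands of categories with a pretrained model, then turn the top labels into word-cloud weights), here is a version built on an off-the-shelf ImageNet classifier from torchvision. The model choice, the file path and the weighting scheme are illustrative assumptions, not the app's actual pipeline.

```python
# Hedged sketch of "photo in, word cloud out" using an off-the-shelf
# ImageNet classifier from torchvision. This is not Capture Caption's
# actual model or pipeline, only an illustration of the described idea.
import torch
from PIL import Image
from torchvision.models import resnet50, ResNet50_Weights

def word_cloud_weights(image_path, top_k=10):
    """Return {label: probability} for the top-k predicted categories."""
    weights = ResNet50_Weights.DEFAULT
    model = resnet50(weights=weights).eval()
    preprocess = weights.transforms()

    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)

    with torch.no_grad():
        probs = model(batch).softmax(dim=1)[0]

    top = probs.topk(top_k)
    labels = weights.meta["categories"]
    return {labels[int(i)]: float(p) for p, i in zip(top.values, top.indices)}

if __name__ == "__main__":
    # The returned dict can be fed to any word-cloud renderer,
    # with word size proportional to predicted probability.
    print(word_cloud_weights("photo.jpg"))
```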


'SignAloud' gloves translate sign language movements into spoken English

Daily Mail

For people living in a world without sound, sign language can make sure their points of view are heard. But outside of the deaf and hard-of-hearing communities, this gesture-based language can lose its meaning. Now a pair of entrepreneurial technology students in the US has designed a pair of gloves to break down the communication barriers by translating hand gestures into speech. The gloves, called 'SignAloud', use embedded sensors to monitor the position and movement of the user's hands, while a central computer analyses the data and converts the gestures to spoken English.
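The article describes only the division of labour (sensors stream hand position and movement; a central computer analyses the data and produces speech), not the actual firmware or classifier. As a hedged sketch of that division, here is a small Python loop that segments a stream of hand samples into gestures and maps each completed gesture to a word; the segmentation rule and templates are hypothetical, and a real system would pass the resulting string to a text-to-speech engine rather than print it.

```python
# Hedged sketch of the SignAloud-style division of labor described in the
# article: the glove streams hand position/movement samples, and a central
# computer segments the stream into gestures and converts each one to a word.
# The segmentation rule and the gesture templates below are hypothetical.
import math

# Hypothetical gesture templates: average (x, y, speed) over a gesture.
TEMPLATES = {
    "hello":     (0.2, 0.8, 0.6),
    "thank you": (0.7, 0.3, 0.4),
}

MOTION_THRESHOLD = 0.05  # below this speed the hand is considered at rest

def classify(samples):
    """Average the samples and return the nearest template's word."""
    avg = tuple(sum(axis) / len(samples) for axis in zip(*samples))
    return min(TEMPLATES, key=lambda word: math.dist(avg, TEMPLATES[word]))

def translate_stream(stream):
    """Group samples into gestures (separated by rest) and yield words."""
    gesture = []
    for x, y, speed in stream:
        if speed > MOTION_THRESHOLD:
            gesture.append((x, y, speed))
        elif gesture:
            yield classify(gesture)   # a real system would speak this aloud
            gesture = []
    if gesture:
        yield classify(gesture)

if __name__ == "__main__":
    stream = [(0.2, 0.8, 0.7), (0.25, 0.75, 0.5), (0.2, 0.8, 0.0),
              (0.7, 0.3, 0.5), (0.65, 0.35, 0.3), (0.7, 0.3, 0.0)]
    print(list(translate_stream(stream)))  # ['hello', 'thank you']
```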