Using artificial intelligence to translate sign language in real time: see how we used Python to train a neural network to 86% accuracy in less than a day. People who are hearing impaired are often left behind in video consultations. Imagine a world where anyone can communicate using sign language over video. Inspired by this vision, some of our engineering team brought the idea to HealthHack 2018. In less than 48 hours, and using the power of artificial intelligence, their team produced a working prototype that translated signs from the Auslan alphabet into English text in real time.
It seems voice interfaces will be a big part of the future of computing, popping up in phones, smart speakers, and even household appliances. But how useful is this technology for people who don't communicate using speech? Are we building systems that lock out certain users? These questions inspired software developer Abhishek Singh to create a mod that lets Amazon's Alexa assistant understand some simple sign language commands. In a video, Singh demonstrates how the system works.
Speaking aloud and performing sign language engage the same parts of the brain, according to a new study. Researchers at New York University found that the neural skills needed to perform sign language are similar to those required for speaking out loud. Their report is the first of its kind to demonstrate the association between the two forms of communication. Sign language users and verbal English speakers rely on the same neural skills, the report says. The research was published in the journal Scientific Reports.
A robotic hand that can translate words into sign language gestures for deaf people has been created by scientists. Named Project Aslan, the 3D-printed hand costs as little as £400 ($560) to make and interprets both written text and spoken words. The device communicates through 'fingerspelling', a type of sign language in which words are spelled out letter by letter through separate gestures on a single hand. The robot, which its makers expect to be ready in five years, could one day be carried around in a rucksack, scientists say. It could help some of the 70 million people worldwide who are deaf or hard of hearing to communicate with those who don't know sign language.
Translating is difficult work, the more so the further two languages are from one another. But sign language is a unique case, and translating it uniquely difficult, because it is fundamentally different from spoken and written languages. All the same, SignAll has been working hard for years to make accurate, real-time machine translation of ASL a reality.
Sign language translators are scarce. If you're hearing impaired, that's a huge problem. Three engineering students from the University of Antwerp have a novel solution: cheap 3D-printed humanoids that can translate to sign language on the fly. It's a solution that's only become possible with the convergence of 3D printing, the massive popularity of microcontrollers like the Arduino Due, and falling prices for robotics components. It's also the kind of "why didn't I think of that?" idea we'll see more of in robotics as barriers to development keep falling.
While we usually see robotics applied in industrial or research settings, there are plenty of ways robots could help in everyday life as well: an autonomous guide for blind people, for instance, or a kitchen bot that helps disabled folks cook. Or -- and this one is real -- a robot arm that can perform rudimentary sign language. It's part of a master's thesis by grad students at the University of Antwerp who wanted to address the needs of the deaf and hearing impaired. In classrooms, courts and at home, these people often need interpreters -- who aren't always available. Their solution is "Antwerp's Sign Language Actuating Node," or ASLAN.
An electronic glove that can convert sign language into text messages has been unveiled by scientists. The $100 (£77) device will allow deaf people to instantly send messages to those who don't understand sign language, according to its inventors. To build it, researchers fitted a standard sports glove with nine stretchable strain sensors positioned over the knuckles. When a user bends their fingers or thumb to sign a letter, the sensors stretch, producing an electrical signal.
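The decoding step the article describes — nine sensor signals turned into a letter — can be sketched as a lookup over binary bend patterns. To be clear, the codebook below is entirely hypothetical: the article doesn't publish the glove's actual sensor-to-letter mapping, and a real device would calibrate thresholds per signer.

```python
# Illustrative sketch only: the glove's real letter encodings aren't
# published, so the codebook below is hypothetical. The idea: each of the
# nine strain sensors reads high when its knuckle is bent, and the binary
# bend pattern is looked up to find the signed letter.
BEND_THRESHOLD = 0.5  # normalised reading above which a knuckle counts as bent

# Hypothetical codebook: 9-bit bend patterns -> letters.
CODEBOOK = {
    (1, 1, 1, 1, 0, 0, 0, 0, 1): "A",
    (0, 0, 0, 0, 0, 0, 0, 0, 0): "B",
    (1, 1, 1, 1, 1, 1, 1, 1, 0): "C",
}

def decode(readings):
    """Map nine normalised sensor readings to a letter, or None if unknown."""
    pattern = tuple(int(r > BEND_THRESHOLD) for r in readings)
    return CODEBOOK.get(pattern)

# Readings matching the hypothetical "A" pattern above.
print(decode([0.9, 0.8, 0.7, 0.95, 0.1, 0.2, 0.0, 0.1, 0.6]))  # A
```

Thresholding keeps the electronics simple; a production glove would more likely learn the mapping from calibration data rather than hard-code it.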
Handwriting will never be the same again. A new glove developed at the University of California, San Diego, can convert the 26 letters of American Sign Language (ASL) into text on a smartphone or computer screen. Because it's cheaper and more portable than other automatic sign language translators on the market, it could be a game changer. People in the deaf community will be able to communicate effortlessly with those who don't understand their language. ASL is a language all of its own, but few people outside the deaf community speak it.
Machine translation systems that convert sign language into text and back again are helping people who are deaf or have difficulty hearing to communicate with those who cannot sign. KinTrans, a start-up based in Dallas, Texas, is trialling its technology in a bank and government offices in the United Arab Emirates, and plans to install it in more places over the next couple of months. SignAll, a company based in Budapest, Hungary, will begin its own trials next year. KinTrans uses a 3D camera to track the movement of a person's hands as they sign words. A sign language user can approach a bank teller and sign to the KinTrans camera that they'd like assistance, for example.
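The recognition step described above — a 3D camera tracks hand movement, and the trajectory is matched to a known sign — can be illustrated with a toy nearest-template classifier. This is not KinTrans's method, which is not detailed in the article; the template words and coordinate sequences below are made up for the sketch.

```python
# Toy sketch, not KinTrans's algorithm: classify a 3D hand trajectory by
# finding the closest stored template. The templates and query are
# invented coordinate sequences for illustration.
import math

TEMPLATES = {
    "hello": [(0, 0, 0), (1, 0, 0), (2, 1, 0)],
    "help":  [(0, 0, 0), (0, 1, 1), (0, 2, 2)],
}

def distance(a, b):
    """Sum of pointwise Euclidean distances between equal-length trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b))

def classify(trajectory):
    """Return the template word whose trajectory is closest to the query."""
    return min(TEMPLATES, key=lambda word: distance(TEMPLATES[word], trajectory))

print(classify([(0, 0, 0), (0.9, 0.1, 0), (2.1, 1.0, 0)]))  # hello
```

Real systems handle trajectories of varying length and speed (e.g. with dynamic time warping or learned sequence models) rather than this fixed-length point-by-point comparison.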