Speaking verbally and performing sign language engage the same parts of the brain, according to a new study. Researchers at New York University found that the neural skills needed to perform sign language are similar to those required for speaking out loud. Their report is the first of its kind to demonstrate the association between the two forms of communication. The research was published in the journal Scientific Reports.
A robotic hand that can translate words into sign language gestures for deaf people has been created by scientists. Named Project Aslan, the 3D-printed hand costs as little as £400 ($560) to make and interprets both written text and spoken words. The device communicates through 'fingerspelling', a type of sign language in which words are spelled out letter by letter through separate gestures on a single hand. The robot, which scientists say will be ready in five years, could one day be carried around in a rucksack. It could help some of the 70 million people worldwide who are deaf or hard of hearing to communicate with people who don't know sign language.
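The fingerspelling approach described above can be sketched in a few lines: a word is broken into letters, and each letter is looked up in a table of hand gestures. The gesture descriptions below are illustrative placeholders, not Project Aslan's actual servo commands, which have not been published.

```python
# Minimal sketch of fingerspelling: spell a word letter by letter,
# mapping each letter to a (hypothetical) one-handed gesture.
# Only two letters are defined here for brevity.
GESTURES = {
    "h": "fist with index and middle fingers extended sideways",
    "i": "fist with little finger extended",
    # ... the remaining letters would be defined similarly
}

def fingerspell(word):
    """Return the sequence of (letter, gesture) pairs needed to spell a word."""
    return [(letter, GESTURES.get(letter, "unknown gesture"))
            for letter in word.lower()]

for letter, gesture in fingerspell("hi"):
    print(f"{letter}: {gesture}")
```

A real robotic hand would replace each gesture description with motor positions and add timing between letters, but the letter-by-letter pipeline is the core of the idea.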
Translating is difficult work, the more so the further two languages are from one another. But sign language is a unique case, and translating it uniquely difficult, because it is fundamentally different from spoken and written languages. All the same, SignAll has been working hard for years to make accurate, real-time machine translation of ASL a reality.
Sign language translators are scarce. If you're hearing impaired, that's a huge problem. Three engineering students from the University of Antwerp have a novel solution: cheap 3D-printed humanoids that can translate to sign language on the fly. It's a solution that's only become possible with the convergence of 3D printing, the massive popularity of microcontrollers like the Arduino Due, and falling prices for robotics components. It's also the kind of "why didn't I think of that?" idea we'll see more of in the field of robotics as barriers to development keep falling.
While we usually see robotics applied to industrial or research applications, there are plenty of ways they could help in everyday life as well: an autonomous guide for blind people, for instance, or a kitchen bot that helps disabled folks cook. Or -- and this one is real -- a robot arm that can perform rudimentary sign language. It's part of a master's thesis from grad students at the University of Antwerp who wanted to address the needs of the deaf and hearing impaired. In classrooms, courts and at home, these people often need interpreters -- who aren't always available. Their solution is "Antwerp's Sign Language Actuating Node," or ASLAN.
An electric glove which can convert sign language into text messages has been unveiled by scientists. The $100 (£77) device will allow deaf people to instantly send messages to those who don't understand sign language, according to its inventors. The device consists of a standard sports glove fitted with nine stretchable strain sensors positioned over the knuckles. When a user bends their fingers or thumb to sign a letter, the sensors stretch, producing an electrical signal.
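One plausible way a glove like this could turn nine sensor readings into a letter is template matching: store a reference pattern of sensor values for each letter and pick the closest match. The templates below are invented for illustration; the real device's calibration and classification method are not described in the article.

```python
# Hedged sketch: classify a 9-sensor strain reading by nearest template.
# Each template is a list of nine hypothetical normalized strain values
# (0.0 = finger straight, 1.0 = fully bent). Only two letters are shown.
LETTER_TEMPLATES = {
    "a": [0.9, 0.9, 0.9, 0.9, 0.1, 0.0, 0.0, 0.0, 0.0],  # fingers curled, thumb out
    "b": [0.0, 0.0, 0.0, 0.0, 0.8, 0.0, 0.0, 0.0, 0.0],  # fingers straight, thumb folded
}

def classify(reading):
    """Return the letter whose template has the smallest squared distance to the reading."""
    def dist(template):
        return sum((r - t) ** 2 for r, t in zip(reading, template))
    return min(LETTER_TEMPLATES, key=lambda letter: dist(LETTER_TEMPLATES[letter]))

print(classify([0.85, 0.9, 0.95, 0.88, 0.15, 0.0, 0.0, 0.0, 0.0]))  # closest to "a"
```

A production system would likely use per-user calibration and a trained classifier rather than fixed templates, but the principle -- bent-finger patterns map to letters -- is the same.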
Handwriting will never be the same again. A new glove developed at the University of California, San Diego, can convert the 26 letters of American Sign Language (ASL) into text on a smartphone or computer screen. Because it's cheaper and more portable than other automatic sign language translators on the market, it could be a game changer. People in the deaf community will be able to communicate effortlessly with those who don't understand their language. ASL is a language all of its own, but few people outside the deaf community speak it.
Machine translation systems that convert sign language into text and back again are helping people who are deaf or have difficulty hearing to communicate with those who cannot sign. KinTrans, a start-up based in Dallas, Texas, is trialling its technology in a bank and government offices in the United Arab Emirates, and plans to install it in more places over the next couple of months. SignAll, a company based in Budapest, Hungary, will begin its own trials next year. KinTrans uses a 3D camera to track the movement of a person's hands as they sign words. A sign language user can approach a bank teller and sign to the KinTrans camera that they'd like assistance, for example.
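The camera-tracking approach can be illustrated with a toy trajectory matcher: represent each sign as a sequence of 3D hand positions and match an observed path to the closest stored sign. The paths and sign vocabulary below are invented; systems like KinTrans use far richer models, and this only sketches the trajectory-matching concept.

```python
import math

# Hypothetical stored sign trajectories: sequences of (x, y, z) hand positions.
SIGN_PATHS = {
    "hello": [(0.0, 1.0, 0.5), (0.2, 1.1, 0.5), (0.4, 1.2, 0.5)],
    "help":  [(0.0, 0.5, 0.5), (0.0, 0.7, 0.5), (0.0, 0.9, 0.5)],
}

def path_distance(a, b):
    """Sum of Euclidean distances between corresponding 3D points."""
    return sum(math.dist(p, q) for p, q in zip(a, b))

def recognise(observed):
    """Return the stored sign whose path is closest to the observed trajectory."""
    return min(SIGN_PATHS, key=lambda sign: path_distance(observed, SIGN_PATHS[sign]))

# A noisy observation near the "hello" path is matched to "hello".
print(recognise([(0.01, 1.0, 0.5), (0.21, 1.1, 0.5), (0.39, 1.2, 0.5)]))
```

Real recognisers must also handle variable signing speed (e.g. with dynamic time warping), hand shape, and facial expression, which is part of why sign-language translation is a hard problem.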
Aside from adding a funny spin to a message, GIFs can now teach you sign language. Giphy recently released a GIF library of more than 2,000 words and phrases in American Sign Language. The GIFs are based on the video series Sign with Robert, featuring Robert DeMayo, who has been deaf since birth.
GIFs can do more than add a sassy quip to the end of your tweet. Now, they can even help you learn a new language. Giphy released an extensive GIF library on Thursday with more than 2,000 words and phrases in American Sign Language. To create the GIFs, Giphy cut videos from the popular educational series Sign With Robert, adding text descriptions to make the GIFs look like looping flash cards. At first glance, the GIFs might seem a bit unremarkable -- they simply show Sign with Robert creator Robert DeMayo, who has been deaf since birth, signing a word over and over.