Results


Arduino powered and 3D printed, this robot translates to sign language

ZDNet

Sign language translators are scarce. Three engineering students from the University of Antwerp have a novel solution: cheap 3D-printed humanoids that can translate into sign language on the fly. It's a solution that has only become possible with the convergence of 3D printing, the massive popularity of microcontrollers like the Arduino Due, and falling prices for robotics components. ASLAN stands for "Antwerp's Sign Language Actuating Node."
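The article does not describe ASLAN's firmware or command protocol, but the general pattern of an Arduino-driven fingerspelling hand can be sketched from the host side. The following is a minimal illustrative sketch, not ASLAN's actual interface: the serial port name, the "letter:angles" wire format, and the FINGER_ANGLES table are all assumptions made for the example.

```python
# Host-side sketch: send fingerspelling poses to an Arduino-driven hand over serial.
# Hypothetical: port name, baud rate, and the "letter:angles\n" command format are
# assumptions; ASLAN's real protocol is not documented in the article above.
import time

import serial  # pyserial

# Hypothetical mapping from letters to five finger-servo angles
# (thumb, index, middle, ring, pinky) in degrees; 0 = extended, 180 = curled.
FINGER_ANGLES = {
    "A": (90, 180, 180, 180, 180),   # fist with thumb alongside
    "B": (150, 0, 0, 0, 0),          # flat hand, thumb tucked
    "L": (0, 0, 180, 180, 180),      # thumb and index extended
}

def spell(word: str, port: str = "/dev/ttyACM0", pause_s: float = 0.8) -> None:
    """Send one servo-angle command per letter to the hand controller."""
    with serial.Serial(port, 115200, timeout=1) as link:
        for letter in word.upper():
            angles = FINGER_ANGLES.get(letter)
            if angles is None:
                continue  # skip letters we have no pose for
            cmd = f"{letter}:" + ",".join(str(a) for a in angles) + "\n"
            link.write(cmd.encode("ascii"))
            time.sleep(pause_s)  # give the servos time to reach the pose

if __name__ == "__main__":
    spell("LAB")
```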


This mind-reading system can correct a robot's error!

#artificialintelligence

A new brain-computer interface developed by scientists can read a person's thoughts in real time to identify when a robot makes a mistake, an advance that may lead to safer self-driving cars. By relying on brain signals called "error-related potentials" (ErrPs) that occur automatically when humans make a mistake or spot someone else making one, the new approach allows even complete novices to control a robot with their minds. The technology, developed by researchers at Boston University and the Massachusetts Institute of Technology (MIT), may offer intuitive and instantaneous ways of communicating with machines, for applications ranging from supervising factory robots to controlling robotic prostheses. "When humans and robots work together, you basically have to learn the language of the robot, learn a new way to communicate with it, adapt to its interface," said Joseph DelPreto, a PhD candidate at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL).
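The article does not detail the team's actual pipeline, but the general ErrP idea can be illustrated: epoch the EEG around the moment the observer sees the robot act, extract simple time-domain features, and train a binary classifier. The sketch below uses synthetic data and scikit-learn's LDA purely for illustration; the array shapes, feature window, and classifier choice are assumptions, not the MIT/BU method.

```python
# Sketch of error-related-potential (ErrP) detection on synthetic "EEG".
# Shapes, the feature window, and the LDA classifier are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N_TRIALS, N_CHANNELS, N_SAMPLES = 200, 8, 64  # 64 samples ~ 250 ms at 256 Hz

# Fake epochs, each time-locked to the moment the person sees the robot act.
epochs = rng.normal(size=(N_TRIALS, N_CHANNELS, N_SAMPLES))
labels = rng.integers(0, 2, size=N_TRIALS)  # 1 = robot made an error

# Inject a crude ErrP-like deflection into the error trials on a few channels.
epochs[labels == 1, :4, 20:40] += 0.8

# Feature vector: mean amplitude per channel in a post-stimulus window.
features = epochs[:, :, 20:40].mean(axis=2)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0
)

clf = LinearDiscriminantAnalysis()
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

# In a closed loop, clf.predict on a single fresh epoch would tell the robot,
# within a few hundred milliseconds, whether to reverse its last choice.
```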


Toshiba's new robot can speak in sign language

AITopics Original Links

The "communication android", as Toshiba is calling its creation, was unveiled this week at the Cutting-Edge IT & Electronics Comprehensive Exhibition (CEATEC), Japan, and has been designed for a maximum of movement fluidity in its hands and arms, employing 43 actuators in its joints, in order to speak in Japanese sign language. At this point, its range is fairly limited: it can mimic simple movements, such as greetings, but the company has plans to develop the robot -- named Aiko Chihira -- into a full communications robot by 2020. This will include speech synthesis, speech recognition, robotic control and other sensors. The end goal, the company said, is a robot that can serve as a "companion for the elderly and people with dementia, to offer telecounseling in natural speech, communicate through sign language and allow healthcare workers or family members to keep an eye on elderly people." If the robot looks familiar, that's because it was developed in collaboration with Osaka University, which has been developing humanoid robots for some time.


Humanoid Robot Demonstrates Sign Language

AITopics Original Links

With the DARPA Robotics Challenge looming large on the horizon, it's easy to overlook robots that aren't taking part. One of them is Nino, a humanoid unveiled earlier this year by National Taiwan University's Robotics Laboratory. Unlike the DARPA robots, Nino may not find itself performing tasks in dangerous situations any time soon. But this robot has some special skills: it is likely the first full-sized humanoid to demonstrate sign language. "Sign language has a high degree of difficulty, requiring the use of both arms, hands, and fingers as well as facial expressions," said Professor Han-Pang Huang, who leads NTU's Robotics Lab.