We pay close attention to social cues, such as eye gaze, emotion, and politeness, whether those cues come from a person or from a machine. There is even a classic book on the subject, published by Byron Reeves and Clifford Nass in 1996: The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places. Among their findings: people assign personalities to digital devices, and people are polite to computers; for example, they rate a computer more positively when asked to evaluate it "to its face" (on that same computer) than when asked on a different machine. Research since that book has shown again and again that these findings still hold: humans treat machines as social beings.
Madeline Gannon of Carnegie Mellon University is the designer of Mimus, a new gesture-controlled robot featured in an art installation at the Design Museum in London. To many, RealDoll has crossed the "uncanny valley" of creepiness with sex dolls that look and talk like humans. Florence Gildea writes on the organization's blog: "The personalities and voices that doll owners project onto their dolls is pertinent for how sex robots may develop, given that sex doll companies like RealDoll are working on installing increasing AI capacities in their dolls and the expectation that owners will be able to customize their robots' personalities." The example given is how a doll expresses her "feelings" for her owner on Twitter: obviously a robot companion has no feelings; they are a projection of the doll owner's.
The robotics work programme implements the robotics strategy developed by SPARC, the Public-Private Partnership for Robotics in Europe (see the Strategic Research Agenda). The research and innovation projects focus on a wide variety of robotics and autonomous systems and their capabilities, such as navigation, human-robot interaction, recognition, cognition, and handling. ANDY will develop the ANDYSUIT, a wearable technology for monitoring humans involved in whole-body physical interaction tasks. From the resulting ANDYDATASET, ANDY will develop ANDYMODEL, a set of models describing human and robot behaviour when engaged in collaborative tasks.
An "emotional chatting machine" has been developed by scientists, signalling the approach of an era in which human-robot interactions are seamless and go beyond the purely functional. Huang and colleagues started by creating an "emotion classifying" algorithm that learned to detect emotion from 23,000 posts taken from the Chinese social media site Weibo. The resulting program could be switched into five possible modes – happy, sad, angry, disgusted, liking – depending on the user's preference. The latest study shows that chatbots, driven by a machine learning approach, are starting to make significant headway.
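The idea of a switchable emotional mode can be illustrated with a toy sketch. Everything below is an illustrative assumption (the keyword lexicons, reply templates, and function names are invented); the actual system was a machine-learning model trained on roughly 23,000 Weibo posts, not a keyword matcher.

```python
# Toy sketch of emotion classification plus an emotion-conditioned reply.
# Lexicons and templates are invented stand-ins for a learned model.

EMOTION_LEXICON = {
    "happy": {"great", "love", "wonderful", "glad"},
    "sad": {"miss", "lost", "alone", "cry"},
    "angry": {"hate", "unfair", "furious"},
    "disgusted": {"gross", "awful", "disgusting"},
    "liking": {"like", "enjoy", "fond"},
}

REPLY_STYLE = {
    "happy": "That's wonderful to hear!",
    "sad": "I'm sorry, that sounds hard.",
    "angry": "That does sound frustrating.",
    "disgusted": "Ugh, I understand the reaction.",
    "liking": "It's nice that you enjoy it.",
}

def classify_emotion(text):
    """Pick the emotion whose lexicon overlaps the text the most."""
    words = set(text.lower().split())
    scores = {emo: len(words & lex) for emo, lex in EMOTION_LEXICON.items()}
    return max(scores, key=scores.get)

def reply(text, mode=None):
    """Respond in a user-chosen mode, or mirror the detected emotion."""
    emotion = mode or classify_emotion(text)
    return REPLY_STYLE[emotion]

print(reply("I really love this wonderful day"))      # happy-style reply
print(reply("I hate how unfair this is", mode="sad")) # forced sad mode
```

The `mode` parameter mirrors the described design: the program can be switched into one of five modes, or left to mirror the emotion it detects in the input.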
Médecins Sans Frontières International (MSF) designed a treatment capability system to identify patients based on the risk of transmission, exposure to others, and the patient care required. This was the premise of a recent paper out of Oregon State University that explored robot and human collaboration specifically to improve patient care in infectious disease environments. The third phase will use humans and robots together to optimize the task set and balance the cognitive load on the human caregivers. Using machine learning, we'll run permutations to find the best mix, balancing task completion, task optimization, and cognitive load.
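The "run permutations" idea can be sketched with a brute-force enumeration of human/robot task assignments. The tasks, times, loads, and cost weighting below are all invented for illustration; the Oregon State work would use learned models rather than this toy cost function.

```python
# Hedged sketch: enumerate every human/robot split of a task list and
# keep the assignment that minimizes completion time plus a weighted
# cognitive-load penalty on the human caregiver. All numbers are made up.
from itertools import product

# (task name, human time, robot time, human cognitive load)
TASKS = [
    ("check vitals", 3, 6, 2),
    ("deliver meds", 4, 3, 1),
    ("disinfect room", 8, 5, 1),
    ("talk to patient", 2, 10, 3),
]

def cost(assignment, load_weight=2.0):
    """Total time plus weighted cognitive load on the human caregiver."""
    time = load = 0
    for (name, h_time, r_time, h_load), agent in zip(TASKS, assignment):
        if agent == "human":
            time += h_time
            load += h_load
        else:
            time += r_time
    return time + load_weight * load

# Enumerate every possible split and keep the cheapest one.
best = min(product(["human", "robot"], repeat=len(TASKS)), key=cost)
print(dict(zip([t[0] for t in TASKS], best)))
```

With these invented numbers the search hands the robot the slow, low-touch chores and keeps the human on the interaction-heavy task; changing `load_weight` shifts that balance, which is exactly the trade-off the excerpt describes.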
The traffic light's three colored circles work hard, managing the flow of traffic around the world. Evgeny Arinin acknowledges this, and admires the design for its enduring effectiveness. But the Russian industrial designer believes traffic signals could communicate instructions more clearly. With that in mind, he's crafted an alternative: an LED display that uses its shape, big arrows, and punchy icons to loudly articulate the rules of the road.
Don't get excited about buying the new robots created by the Japanese company Dentsu in conjunction with Toyota and the University of Tokyo: they won't be hitting stores anytime soon. In fact, Kirobo and his backup, Mirata, were endowed with voice recognition, natural language processing, speech synthesis, realistic body language, and facial recognition for a very different purpose. They'll be participating in the "world's first conversational experiment" between people and robots in space, while also mixing it up with kids on Earth through educational activities. Hopefully the astronauts won't give Kirobo any HAL 9000-like control of the station, though the cute 'bots seem malice-free; their creators say they "wanted to create a future where humans and robots live together and get along."
As robots start to look an awful lot like humans, science fiction is starting to look a lot less like fiction. At the University of Minnesota's Artificial Intelligence, Robotics, and Vision Laboratory (AIRVL), researchers study things like Intelligent Transportation Systems and build mini-robots (including adorable Lego-based ones). The idea of robots replacing humans in basic jobs, like auto assembly lines or serving fast food, has never really scared us; we have always thought of ourselves as much more (collectively, at least) than minimum-wage drone work. We already have to contend with killing machines and drones equipped with the ability to make kill decisions.
In addition to the panel of judges from the first contest, Beauty.AI 2.0 featured three new robot judges: "Average Face", built on the hypothesis that the closer a face is to the average face within its ethnic group, the more attractive it is; "AntiAgeist", which evaluates the difference between predicted and actual chronological age; and "PIMPL", which evaluates the number and distribution of pimples and other dark spots (but not freckles). The results were sent to the individual participants via secure link, and winners were announced at http://winners2.beauty.ai/#win. About Beauty.AI: Beauty.AI is the first beauty contest judged entirely by a robot jury, where humans and robots can apply to participate. About Youth Laboratories: Youth Laboratories is a company dedicated to helping people retain a youthful state for as long as possible using advances in machine vision and artificial intelligence. The company develops a series of mobile applications that track age-related changes (wrinkles, pimples, dark spots, and other parameters affecting the perception of beauty, health, and youthfulness) and help evaluate the effectiveness of multiple interventions.
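The "Average Face" hypothesis, that faces closer to the group average score higher, is easy to sketch. The 2-D feature vectors and function names below are illustrative assumptions standing in for whatever facial features the real system extracted.

```python
# Toy sketch of averageness scoring: a face's score is the negative of
# its distance to the component-wise mean face of its group, so the
# face nearest the average ranks first. Vectors here are invented.
import math

def average(faces):
    """Component-wise mean of a list of feature vectors."""
    n = len(faces)
    return [sum(f[i] for f in faces) / n for i in range(len(faces[0]))]

def averageness_score(face, group):
    """Higher score means closer to the group's average face."""
    return -math.dist(face, average(group))

group = [[1.0, 2.0], [4.0, 2.0], [2.0, 5.0]]
ranked = sorted(group, key=lambda f: averageness_score(f, group), reverse=True)
print(ranked[0])  # the face nearest the group average
```

The same ranking machinery would apply per ethnic group, as the excerpt describes, by computing a separate mean face for each group.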
Humans might take heart from the recent decision by Mercedes-Benz to replace robots with humans on some lines. The machines were simply not agile enough to keep pace with the growing demand for customised products, while we humans can "reprogram" ourselves in a fraction of a second. "We're moving away from trying to maximise automation, with people taking a bigger part in industrial processes again," says Markus Schaefer, head of production planning at the automaker. "When we have people and machines co-operate, such as a person guiding a part-automatic robot, we're much more flexible and can produce many more products on one production line."