Efforts to develop self-driving vehicles have largely focused on tracking what's going on outside the car: laser-based sensors to track other vehicles, for example, and digital mapping technologies to help navigate. Now, the industry is turning some of its attention to technologies that sense what's going on inside the vehicle. An initial goal is to better monitor driver alertness to help reduce the number of car accidents. But if fully autonomous vehicles one day become the norm, technology that can understand the mood and preferences of passengers might enable the vehicle to automatically make adjustments that improve the riding experience. The jury is still out on whether vehicle occupants will prefer that to controlling changes themselves, but companies are developing the technological capabilities anyway.
Affectiva Automotive AI hopes to improve driver safety. To date, artificial intelligence (AI) has helped autonomous vehicles mainly by monitoring the world around them. Unfortunately, as the fatal Uber self-driving car crash showed, the technology is not perfect.
Softbank Robotics today announced that its robot Pepper will now use emotion recognition AI from Affectiva to interpret and respond to human activity. Pepper is about four feet tall, gets around on wheels, and has a tablet in the center of its chest. The humanoid robot made its debut in 2015 and was designed to interact with people. Cameras and microphones are used to help Pepper recognize human emotions, like hostility or joy, and respond appropriately with a smile or indications of sadness. This type of intelligence likely comes in handy for the environments where Pepper operates, like banks, hotels, and Pizza Huts in some parts of Asia.
Nuance Communications is already well known for its tech industry innovations. It has been at the forefront of speech recognition software and has also made substantial inroads into the automotive industry; in fact, you'll find its Dragon Drive software in more than a few cars on the roads today. The company also works in a range of other sectors, including healthcare, telecommunications, financial services, and retail. Now, though, it is partnering with Affectiva, an MIT Media Lab spin-off and a leading provider of AI software that detects complex and nuanced human emotions and cognitive states from face and voice.
Imagine if your car could pull itself over when you're drowsy or nauseous, or adjust the temperature and music when gridlock is stressing you out. Maybe it could even refuse to start if it knows you're intoxicated. With advanced driver-assistance systems (ADAS) already in place and autonomous vehicles on the horizon, a lot of work is being done around sensing and machine learning to help vehicles better understand the roads and the world around them. But Boston-based startup Affectiva thinks more needs to be done around the internal world of the car, specifically the emotional state of the driver. Affectiva has built its business model around creating "emotional AI," algorithms capable of recognizing human emotional states.