This is part of a series on machine intelligence companies. We interviewed Beagle, Mariana, Beyond Verbal, Preteckt, and Eigen Innovations.

Machines don't know what it's like to have feelings. Yet for most of us, emotions are the prism through which we view our lives. What seems like a benign request when we're in a good mood can feel like nagging when we're stressed.
Imagine a world in which machines interpret the emotional state of humans and adapt their behavior to respond appropriately. Artificial emotional intelligence, also known as emotion AI or affective computing, is already being used to develop systems and products that can recognize, interpret, process, and simulate human affects (with an "a," not an "e"). In psychology, "affect" is the experience of feeling or emotion. If you've seen "Solo: A Star Wars Story," then you've seen the poster child for artificial emotional intelligence: L3-37. Lando Calrissian's droid companion and navigator (voiced by Phoebe Waller-Bridge) instigates a slave revolt to escape from Kessel but is severely damaged in the diversion.
It's a bright April day in Boston, and Gabi Zijderveld, a pioneer in the field of emotional artificial intelligence, is trying to explain why teaching robots to feel is as important as teaching them to think. "We live in a world surrounded by all these super-advanced technologies, hyper-connected devices, AI systems with super cognitive abilities -- or, as I like to say, lots of IQ but absolutely no EQ," says Zijderveld, chief marketing officer of Affectiva, the startup that spun out of the MIT Media Lab 10 years ago to build emotionally intelligent machines. "Just like humans that are successful in business and in life -- they have high emotional intelligence and social skills -- we should expect the same with technology, especially for these technologies that are designed to interact with humans." Giving machines a soul has been a dream of scientists -- and sci-fi writers -- for decades. But until recently, the idea of robots with heart was the stuff of moviemaking.
Is emotional AI ready to be a key component of our cars and other devices? Analysts predict huge growth for emotional AI in the coming years, albeit with widely differing estimates. A 2018 study by Market Research Future (MRFR) predicted that the "emotional analytics" market, which includes video, speech, and facial analytics technologies, among others, will be worth a whopping $25 billion globally by 2025. Tractica made a more conservative estimate in its own analysis, still predicting that the "emotion recognition and sentiment analysis" market will reach $3.8 billion by 2025. Researchers at Gartner have predicted that by 2022, 10 percent of all personal electronic devices will have emotion AI capabilities, either on the device itself or via cloud-based services.