Affectiva, a startup developing "emotion recognition technology" that reads people's moods from facial expressions captured in digital video, raised $14 million in a Series D round of funding led by Fenox Venture Capital. According to co-founder Rana el Kaliouby, the Waltham, Mass.-based company wants its technology to become the de facto means of adding emotional intelligence and empathy to any interactive product, and the best way for organizations to obtain unvarnished insights about customers, patients or constituents. She explained that Affectiva uses computer vision and deep learning to analyze facial expressions and other non-verbal cues in visual content online, but not the language or conversations in a video. The company's technology ingests digital images (including video from chat applications, live-streamed or recorded video, and even GIFs), typically captured through simple webcams. Its system first categorizes facial expressions, then maps them to a number of emotional states, such as happy, sad, nervous, interested or surprised.
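The two-stage flow described above (categorize facial expressions, then map them to emotional states) can be sketched roughly as follows. Everything here is illustrative: the expression names, the mapping table, and the stand-in classifier are assumptions for the sake of the example, not Affectiva's actual models or taxonomy.

```python
# Illustrative mapping from detected facial expressions to emotion labels.
# These pairings are hypothetical, chosen to mirror the states named in
# the article (happy, sad, nervous, interested, surprised).
EXPRESSION_TO_EMOTION = {
    "smile": "happy",
    "lip_corner_depressor": "sad",
    "brow_furrow": "nervous",
    "eye_widen": "interested",
    "brow_raise": "surprised",
}

def categorize_expressions(frame):
    """Stand-in for the computer-vision stage: in a real system a
    deep-learning model would score expressions in a webcam frame.
    Returns per-expression confidence scores in [0, 1]."""
    # Hard-coded placeholder output for illustration only.
    return {"smile": 0.9, "brow_raise": 0.2}

def map_to_emotions(expression_scores, threshold=0.5):
    """Second stage: keep expressions scored above a threshold and
    translate them into emotional states."""
    return {
        EXPRESSION_TO_EMOTION[name]
        for name, score in expression_scores.items()
        if score >= threshold and name in EXPRESSION_TO_EMOTION
    }

if __name__ == "__main__":
    scores = categorize_expressions(frame=None)  # frame would be an image
    print(map_to_emotions(scores))  # {'happy'}
```

The separation into two stages mirrors the article's description: vision output first, emotion interpretation second, with no analysis of speech or language anywhere in the pipeline.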
In the push to humanize technology, Affectiva is carving out a niche. Its software development kit (SDK) and cloud-based API allow developers to enrich digital experiences by adding "emotion awareness" to apps ranging from games to medical devices. That means machines can collect data and respond to users' emotions in real time, based largely on facial recognition techniques. It's what the company calls Emotion AI. As noted in a recent Forbes article: "Affectiva's technology has proven transformative for industries like automotive, market research, robotics, education, and gaming, but also for use cases like teaching autistic children emotion recognition and nonverbal social cues."
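An "emotion-aware" app of the kind described above might look something like the loop below. This is a sketch under assumed names only: `detect_emotions` stands in for whatever per-frame scores an emotion SDK would return, and is not Affectiva's actual API, whose real interface differs.

```python
def detect_emotions(frame):
    """Stand-in for an SDK call that scores emotions in a video frame.
    Returns confidence scores in [0, 1]; values here are placeholders."""
    return {"joy": 0.8, "sadness": 0.1}

def respond(emotions):
    """App-side logic: adapt the experience to the user's emotional
    state, e.g. a game tuning itself to the player's mood."""
    if emotions.get("joy", 0.0) > 0.5:
        return "keep current difficulty"
    if emotions.get("sadness", 0.0) > 0.5:
        return "offer encouragement"
    return "no change"

def run_loop(frames):
    """Process a stream of frames, reacting to each reading in turn."""
    return [respond(detect_emotions(frame)) for frame in frames]
```

The point of the sketch is the shape of the integration: the SDK supplies emotion readings frame by frame, and the application decides in real time how to react to them.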
Growing up in Egypt in the 1980s, Rana el Kaliouby was fascinated by hidden languages--the rapid-fire blinks of 1s and 0s computers use to transform electricity into commands and the infinitely more complicated nonverbal cues that teenagers use to transmit volumes of hormone-laden information to each other. In the Middle East, culture and social stigma discouraged girls like el Kaliouby from hacking either code, but she wasn't deterred. When her father brought home an Atari video game console and challenged the three el Kaliouby sisters to figure out how it worked, Rana gleefully did. When she wasn't allowed to date, el Kaliouby studied her peers the same way she did the Atari. "I was always the first one to say, 'Oh, he has a crush on her,' because of all of the gestures and the eye contact," she says.
SoftBank Robotics today announced that its robot Pepper will now use emotion recognition AI from Affectiva to interpret and respond to human activity. Pepper is about four feet tall, gets around on wheels, and has a tablet in the center of its chest. The humanoid robot made its debut in 2015 and was designed to interact with people. Cameras and microphones help Pepper recognize human emotions, like hostility or joy, and respond appropriately with a smile or indications of sadness. This type of intelligence likely comes in handy in the environments where Pepper operates, like banks, hotels, and Pizza Huts in some parts of Asia.
What did you think of the last commercial you watched? Would you buy the product? You might not remember or know for certain how you felt, but increasingly, machines do. New artificial intelligence technologies are learning to recognize human emotions, and using that knowledge to improve everything from marketing campaigns to health care. These technologies, collectively referred to as "emotion AI," form a subset of artificial intelligence (the broad term for machines replicating the way humans think) that measures, understands, simulates, and reacts to human emotions.