Affectiva CEO: AI needs emotional intelligence to facilitate human-robot interaction


Affectiva, one in a series of companies to come out of MIT's Media Lab whose work revolves around affective computing, used to be best known for sensing emotion in videos. It recently expanded into emotion detection in audio with the Speech API for companies making robots and AI assistants.

Adding an Emotional Face to Machine Learning – MIT Initiative on the Digital Economy – Medium


In the evolution to humanize technology, Affectiva is carving a niche. Its software development kit (SDK) and cloud-based API allow developers to enrich digital experiences by adding "emotion awareness" to apps from games to medical devices. That means machines can collect data and respond to users' emotions in real time, mostly based on facial recognition techniques. It's what the company calls Emotion AI. As noted in a recent Forbes article: "Affectiva's technology has proven transformative for industries like automotive, market research, robotics, education, and gaming, but also for use cases like teaching autistic children emotion recognition and nonverbal social cues."

Could AI Finally Learn To Be Emotionally Intelligent? -- AI Daily - Artificial Intelligence News


When we think of robots, we often picture mechanical objects that repeatedly carry out simple tasks or serve in the more basic roles in society. And while movies such as Big Hero 6 and WALL-E have portrayed robots that can mimic human behavior, the idea of a robot not only interacting with humans but understanding the nuances of human behavior once seemed almost impossible. This is where Rana el Kaliouby comes in. An academic who studied at Cambridge and MIT, she has spent her career tackling an increasingly important limitation of technology: that computers do not understand humans. She went on to co-found a Boston-based start-up called Affectiva and has been working in the dynamic field of Human-Robot Interaction (HRI) for more than 20 years. In a recent interview, el Kaliouby stated, "Technology today has a lot of cognitive intelligence, or IQ, but no emotional intelligence, or EQ," and went on to say, "We are facing an empathy crisis. We need to redesign technology in a more human-centric way." This isn't a major concern for AI that performs data-driven, logical tasks such as data processing, but it becomes a bigger concern when the AI is in contact with clients, whether it is an AI receptionist or a robot driver. Increasingly, artificial intelligence is being used in direct contact with humans.

Can AI Learn to Understand Emotions? -- NOVA Next PBS


Growing up in Egypt in the 1980s, Rana el Kaliouby was fascinated by hidden languages: the rapid-fire blinks of 1s and 0s computers use to transform electricity into commands, and the infinitely more complicated nonverbal cues that teenagers use to transmit volumes of hormone-laden information to each other. Culture and social stigma discouraged girls like el Kaliouby in the Middle East from hacking either code, but she wasn't deterred. When her father brought home an Atari video game console and challenged the three el Kaliouby sisters to figure out how it worked, Rana gleefully did. When she wasn't allowed to date, el Kaliouby studied her peers the same way that she did the Atari. "I was always the first one to say, 'Oh, he has a crush on her,' because of all of the gestures and the eye contact," she says.

Emerging Artificial Intelligence (AI) Leaders: Rana el Kaliouby, Affectiva


"Without our emotions, we can't make smart decisions," says Rana el Kaliouby. In the field of artificial intelligence, this is sheer heresy. Isn't the goal of AI to create a machine with human-level intelligence but without the human "baggage" of emotions, biases, and intuitions that only get in the way of smart decisions? As the co-founder and CEO of Affectiva, el Kaliouby is on a mission to expand what we mean by "artificial intelligence" and create intelligent machines that understand our emotions. Surveying the evolution of how we have interacted with computers, she asks, "What's the next, more natural interface?"