Emotion AI, explained – MIT Sloan


What did you think of the last commercial you watched? Would you buy the product? You might not remember or know for certain how you felt, but increasingly, machines do. New artificial intelligence technologies are learning and recognizing human emotions, and using that knowledge to improve everything from marketing campaigns to health care. These technologies are referred to as "emotion AI." Emotion AI is a subset of artificial intelligence (the broad term for machines replicating the way humans think) that measures, understands, simulates, and reacts to human emotions.

When your tech knows you better than you know yourself


For more on new technology that can read human emotions, check out the third episode of Should This Exist?, the podcast that debates how emerging technologies will impact humanity. If we were sitting across a table from each other at a cafe and I asked about your day, you might answer with a polite response, like, "Fine." But if you were lying, I'd know from your expression, tone, twitches, and tics. We read subtext (unspoken clues) to get at the truth, to cut through what people say to understand what they mean. And now, with so many of our exchanges taking place in text online, much of our messaging, traditionally delivered via subtext, tells us less than ever before.

Don't look now: why you should be worried about machines reading your emotions

The Guardian

Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis put to the test by the US Transportation Security Administration (TSA) in 2003, as it began testing a new surveillance program called the Screening of Passengers by Observation Techniques program, or Spot for short. While developing the program, they consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them onto corresponding emotions. This method was used to train "behavior detection officers" to scan faces for signs of deception.

When our devices can read our emotions: Affectiva's Gabi Zijderveld


I've always had this vision too that we as humans would maybe perhaps carry with us, let's call it our emotion passport. It's our emotional digital footprint that we control.

A Crucial Step for Averting AI Disasters


The expanding use of AI is attracting new attention to the importance of workforce diversity. Although tech companies have stepped up efforts to recruit women and minorities, computer and software professionals who write AI programs are still largely white and male, Bureau of Labor Statistics data show. Developers testing their products often rely on data sets that lack adequate representation of women or minority groups. One widely used data set is more than 74% male and 83% white, research shows. Thus, when engineers test algorithms on databases dominated by people like themselves, the algorithms may appear to work fine.
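A first step against this failure mode is simply auditing a data set's demographics before trusting test results on it. The sketch below mirrors the cited ~74% male / ~83% white skew with invented records; the field names and figures are illustrative, not drawn from any real data set.

```python
# Hypothetical demographic audit of a face data set. The 100 invented
# records are constructed to mirror the skew cited in the article
# (~74% male, ~83% white); fields and counts are illustrative only.
from collections import Counter

records = (
    [{"gender": "male", "race": "white"}] * 62
    + [{"gender": "male", "race": "other"}] * 12
    + [{"gender": "female", "race": "white"}] * 21
    + [{"gender": "female", "race": "other"}] * 5
)

def share(key, value):
    """Fraction of records whose `key` field equals `value`."""
    counts = Counter(r[key] for r in records)
    return counts[value] / len(records)

print(f"male:  {share('gender', 'male'):.0%}")   # 74%
print(f"white: {share('race', 'white'):.0%}")    # 83%
```

An algorithm that scores well on such a set may still fail badly on the underrepresented groups, which is exactly the risk the article describes.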

Aptiv Partners With Affectiva To Enable Next-Gen In-Vehicle Experience


Aptiv has signed a commercial partnership agreement with Affectiva to deliver innovative, scalable software to enhance perception capabilities in advanced safety solutions and reimagine the future of the in-cabin experience. Affectiva is a Boston-based MIT Media Lab spin-off and a leader in Human Perception artificial intelligence (AI). The new software, which aims to enhance the in-vehicle experience, will be built on deep-learning architectures, the company noted. Aptiv and Affectiva will work closely to commercialise advanced sensing solutions for OEM and fleet customers, and to further support the commercial partnership, the former has made a minority investment in Affectiva. Affectiva's patented software is the first multi-modal interior sensing solution to unobtrusively identify complex cognitive states of vehicle occupants in real time, Aptiv said.
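"Multi-modal" here means combining evidence from more than one sensor channel, typically face and voice, into a single estimate of an occupant's state. The sketch below shows the simplest possible fusion rule; the weight, the scores, and the `fuse` function itself are invented for illustration, not Affectiva's actual method, which learns such combinations with deep models.

```python
# Minimal sketch of multi-modal fusion: per-modality confidence scores for
# a cognitive state (e.g. drowsiness) are combined into one estimate.
# The fixed weight is an invented illustration; real systems learn the
# fusion jointly with deep networks.
def fuse(face_score: float, voice_score: float, w_face: float = 0.6) -> float:
    """Weighted average of face- and voice-based scores, both in [0, 1]."""
    return w_face * face_score + (1 - w_face) * voice_score

drowsiness = fuse(face_score=0.9, voice_score=0.5)
print(f"fused drowsiness estimate: {drowsiness:.2f}")  # 0.74
```

The appeal of fusing modalities is robustness: a face partly hidden from the camera can be compensated for by the voice channel, and vice versa.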

7 Ways AI Is Changing How You Shop, Eat, and Live


Getting an autonomous vehicle to drive safely under idealized road conditions has technically been possible for a while now, but for the real world, the cars are going to have to learn to drive a little bit more like us. That's where Comma.ai, a startup founded by notorious iPhone hacker George Hotz, comes in. Rather than teaching its computer systems what a tree or a stop sign looks like, Comma.ai's Openpilot technology analyzes the patterns of everyday drivers to train its self-driving models. The company is pulling in millions of miles of driving data from a dashcam app called Chffr and a plug-in module called Panda, then aggregating that data to create an autonomous system that mimics human drivers.
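Learning to drive by imitating logged human behavior is known as behavioral cloning: fit a model that maps the observed road state to the control a human chose. The toy below fits a linear model to synthetic "logged" data; the features, data, and least-squares fit are invented for illustration, and Openpilot's real models are deep networks trained on far richer inputs.

```python
# Toy behavioral cloning: learn steering from logged human driving.
# Synthetic log: (curvature, lane_offset) -> human steering angle.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))       # invented road-state features
true_w = np.array([2.0, -0.5])              # the "human policy" generating the log
y = X @ true_w + rng.normal(0, 0.05, 500)   # human steering, with noise

# Fit the clone by least squares: it imitates whatever the humans did.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(w, 2))  # close to the human policy [2.0, -0.5]
```

The same idea scales up: the more miles of human data aggregated, the closer the cloned policy tracks how people actually drive, including in messy real-world conditions.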

SoftBank Robotics enhances Pepper the robot's emotional intelligence


SoftBank Robotics today announced that its robot Pepper will now use emotion recognition AI from Affectiva to interpret and respond to human activity. Pepper is about four feet tall, gets around on wheels, and has a tablet in the center of its chest. The humanoid robot made its debut in 2015 and was designed to interact with people. Cameras and microphones are used to help Pepper recognize human emotions, like hostility or joy, and respond appropriately with a smile or indications of sadness. This type of intelligence likely comes in handy for the environments where Pepper operates, like banks, hotels, and Pizza Huts in some parts of Asia.
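At its simplest, the detect-then-respond loop described above is a policy mapping a recognized emotion to a gesture. The labels and gestures below are invented for illustration and are not SoftBank's or Affectiva's actual API.

```python
# Toy emotion -> gesture policy for a service robot. Labels and gestures
# are illustrative, not an actual SoftBank or Affectiva interface.
RESPONSES = {
    "joy": "smile",
    "hostility": "show_sadness",
    "neutral": "greet",
}

def respond(detected_emotion: str) -> str:
    """Pick a gesture for the detected emotion; fall back to a greeting."""
    return RESPONSES.get(detected_emotion, "greet")

print(respond("joy"))        # smile
print(respond("hostility"))  # show_sadness
```

A real robot's policy would also weigh context (location, conversation state, confidence of the emotion estimate) before committing to a gesture.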

Will an A.I. Ever Become Sentient? – Predict – Medium


Our planet is an amazing place, full of life that defies expectations at every turn. There are other animals on Earth aside from humans that exhibit both intelligence and sentience, in every way you might choose to interpret those definitions. Is intelligence unique to Earth? We may never know for sure, but science so far has shown us that it is not unique to humanity. Consider the bottlenose dolphin, a creature that shares a similarly large and complex brain with humans and that is capable of understanding numerical continuity, perhaps even of discriminating between numbers.

In-car AI could soon know if you're having a good or bad day


Nuance Communications is already well known for its tech industry innovations. It's been at the forefront of speech recognition software and has also made substantial inroads into the automotive industry. In fact, you'll find its Dragon Drive software in more than a few cars out there on the roads. But the company also works in a stack of other business sectors including healthcare, telecommunications, financial services and even retail. Now, though, the company is working with Affectiva, an MIT Media Lab spin-off and a leading provider of AI software that detects complex and nuanced human emotions and cognitive states from face and voice.