Mars "emotions" study shows which ads sell with 75% accuracy

Netimperative - latest digital marketing news

#artificialintelligence

A study by Realeyes and Mars, Incorporated has revealed that emotion measurement technology can distinguish ads that deliver a high sales lift from those that deliver zero or low lift, with 75% accuracy. The study involved 149 ads across 35 brands and 22,334 people in six countries. Realeyes measured how people felt while they watched the ads by using artificial intelligence to analyse their facial expressions through their webcams (with their consent). The study was designed in collaboration with the Mars Marketing Laboratory at the Ehrenberg-Bass Institute for Marketing Science. Realeyes' emotion data was cross-referenced with Mars, Incorporated's known sales lift data for each ad to investigate the relationship between emotions and sales performance.
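
The article does not describe Realeyes' model, but the exercise it reports — summarising each ad's viewer emotions as features and testing whether those features separate high-lift ads from zero/low-lift ones — can be sketched in rough outline. The Python sketch below is illustrative only: the feature columns, labels and classifier choice are assumptions, and the data is synthetic, so it stands in for the idea rather than the actual Realeyes/Mars pipeline.

    # Illustrative sketch only: a generic classifier separating high-lift ads
    # from zero/low-lift ads using per-ad emotion summaries. Feature names and
    # data are hypothetical; Realeyes' actual features and model are not public.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_ads = 149  # matches the study's ad count; the feature values are synthetic

    # One row per ad: e.g. mean positive expression, peak surprise, share of
    # viewers showing any reaction. These columns are invented for illustration.
    X = rng.random((n_ads, 3))
    # 1 = high sales lift, 0 = zero/low sales lift (synthetic labels).
    y = rng.integers(0, 2, size=n_ads)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"cross-validated accuracy: {scores.mean():.2f}")
    # The study reports 75% accuracy for its own (proprietary) features and
    # model; with random synthetic data this sketch will hover around 50%.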


Realeyes raises $12.4 million to help brands detect emotion using AI on facial expressions

#artificialintelligence

Artificial emotional intelligence, or "emotion AI," is emerging as a key component of the broader AI movement. The general idea is this: It's all very well having machines that can understand and respond to natural-language questions, and even beat humans at games, but until they can decipher non-verbal cues such as vocal intonations, body language, and facial expressions, humans will always have the upper hand in understanding other humans. Against that backdrop, countless companies are working to improve computer vision and voice analysis techniques to help machines detect the intricate and finely balanced emotions of a flesh-and-bones Homo sapiens. One of those companies is Realeyes, which helps big brands such as AT&T, Mars, Hershey's, and Coca-Cola gauge human emotions through the cameras of desktop computers and mobile devices. The London-based startup, founded in 2007, today announced a fresh $12.4 million round of funding from Draper Esprit; the VC arm of Japanese telecom giant NTT Docomo; Japanese VC fund Global Brain; Karma Ventures; and The Entrepreneurs Fund.


What Is Artificial Emotional Intelligence & How Does Emotion AI Work?

#artificialintelligence

Imagine a world in which machines interpret the emotional state of humans and adapt their behavior to give appropriate responses to those emotions. Well, artificial emotional intelligence, which is also known as emotion AI or affective computing, is already being used to develop systems and products that can recognize, interpret, process, and simulate human affects (with an "a," not an "e"). In psychology, an "affect" is a term used to describe the experience of feeling or emotion. If you've seen "Solo: A Star Wars Story", then you've seen the poster child for artificial emotional intelligence: L3-37. Lando Calrissian's droid companion and navigator (voiced by Phoebe Waller-Bridge) instigates a slave revolt to escape from Kessel, but is severely damaged during the diversion.
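
The recognise-interpret-respond loop the excerpt describes can be made concrete with a toy sketch. In the Python snippet below, the detected emotion label and confidence are assumed to come from some upstream recognition step that is not shown, and the response rules are invented for illustration; no real affective-computing product works exactly this way.

    # Toy illustration of the recognise -> interpret -> respond loop described
    # above. The emotion label is assumed to come from an upstream detector
    # (not shown); the response rules are invented for illustration.

    RESPONSES = {
        "happy": "Keep the current interaction style.",
        "confused": "Slow down and offer a clarifying example.",
        "frustrated": "Apologise, simplify, and offer a human hand-off.",
        "neutral": "Continue normally.",
    }

    def respond_to_emotion(label: str, confidence: float) -> str:
        """Map a detected emotional state to a system behaviour."""
        if confidence < 0.5:
            # Low confidence: don't adapt behaviour on a weak signal.
            return "Insufficient signal; keep default behaviour."
        return RESPONSES.get(label, "Unknown state; keep default behaviour.")

    print(respond_to_emotion("frustrated", 0.82))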


AI can read your emotions. Should it?

The Guardian

It is early July, almost 30C outside, but Mihkel Jäätma is thinking about Christmas. In a co-working space in Soho, the 39-year-old founder and CEO of Realeyes, an "emotion AI" startup which uses eye-tracking and facial expressions to analyse mood, scrolls through a list of 20 festive ads from 2018. He settles on The Boy and the Piano, the offering from John Lewis that tells the life story of Elton John backwards, from megastardom to the gift of a piano from his parents as a child, accompanied by his timeless heartstring-puller Your Song. The ad was well received, but Jäätma is clearly unconvinced. He hits play, and the ad starts, but this time two lines – one grey (negative reactions), the other red (positive) – are traced across the action.
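
The overlay Jäätma is looking at is, in essence, two time series — aggregate positive and negative viewer reactions — plotted against the ad's running time. A minimal matplotlib sketch of that kind of chart, using synthetic values rather than Realeyes' measurements, might look like this:

    # Minimal sketch of an emotion-trace overlay: positive and negative reaction
    # levels plotted against an ad's running time. All values here are synthetic.
    import numpy as np
    import matplotlib.pyplot as plt

    t = np.linspace(0, 150, 300)                  # ad runtime in seconds
    positive = 0.4 + 0.3 * np.sin(t / 20) ** 2    # synthetic positive-reaction level
    negative = 0.15 + 0.05 * np.cos(t / 30) ** 2  # synthetic negative-reaction level

    fig, ax = plt.subplots(figsize=(8, 3))
    ax.plot(t, positive, color="red", label="positive reactions")
    ax.plot(t, negative, color="grey", label="negative reactions")
    ax.set_xlabel("time into ad (s)")
    ax.set_ylabel("share of viewers reacting")
    ax.legend()
    fig.tight_layout()
    plt.show()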


Higher quality screens make watching TV more enjoyable, study suggests

Daily Mail - Science & tech

Bingeing your favourite boxset will be more enjoyable if you're watching it on a modern TV screen, new research suggests. Identical twin brothers were monitored by AI as they sat down to enjoy the same episode of Game of Thrones in separate rooms on different TV sets. Experimenters found that the sibling watching on the most up-to-date screen displayed the greatest physical and emotional responses. Realeyes' AI platform analysed the facial expressions, head movements and body language from more than 144,000 frames of video footage captured of each twin.
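
The mechanics of stepping through a recording frame by frame and locating a face in each frame can be sketched with OpenCV, as below. The expression, head-movement and body-language scoring that Realeyes layers on top is proprietary and is not represented here; "viewer.mp4" is a placeholder path, and the tally is purely illustrative.

    # Sketch of frame-by-frame face analysis with OpenCV. Only face detection is
    # shown; the expression/head-movement/body-language scoring used in the study
    # would require additional models not included here. "viewer.mp4" is a
    # placeholder path.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    cap = cv2.VideoCapture("viewer.mp4")

    frames_processed = 0
    frames_with_face = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break  # end of video
        frames_processed += 1
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            frames_with_face += 1

    cap.release()
    print(f"{frames_with_face}/{frames_processed} frames contained a detectable face")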