Collaborating Authors

Your Attendees' Emotions Can Be Measured, Analyzed, and Visualized


Lightwave, the groundbreaking bioanalytics company, has developed innovative solutions that enable brands to measure emotion and use that insight as a metric of success. Every event, brand activation, gala, or launch can make customers feel something. But how do you measure and visualize emotion? Learn how Lightwave uses physiological signals like heart rate and facial reactions to measure emotion at TIDE--the creative conference June 5 at the Park MGM in Las Vegas--where Rana June, CEO of Lightwave, is a keynote speaker. June is one of a dozen innovative thinkers you'll hear from at TIDE--experts who have created experiences for Uber, Facebook, Nike, Pepsi, Unilever, 20th Century Fox, and many more.

'I felt scared because I lost my emotions for a time.'

National Geographic

Then 13, Takeuchi returned to find cinders where her home had been. Only an iron rice pot survived. The forbidden English dictionary, a gift from her father, was ash. She held a single page, which the wind soon swept away. A second firebombing on March 10 left her with images of running through a maelstrom of debris and smoke, and passing charred bodies--one, a mother who had tried to shield her infant beneath her.

Can AI Map Your Emotions?


A long-standing goal for many artificial intelligence (AI) researchers is a system that can identify human emotion from voice and facial expressions. While some facial-scanning technology is available, reliably identifying emotional states remains difficult because of the nuances of speech and of facial muscle movement. Researchers at the University of Science and Technology of China in Hefei believe they have made a breakthrough. Their paper, "Deep Fusion: An Attention Guided Factorized Bilinear Pooling for Audio-video Emotion Recognition," describes an AI system that achieves state-of-the-art accuracy on a popular emotion-recognition benchmark. In the paper, the researchers write, "Automatic emotion recognition (AER) is a challenging task due to the abstract concept and multiple expressions of emotion. Inspired by this cognitive process in human beings, it's natural to simultaneously utilize audio and visual information in AER … The whole pipeline can be completed in a neural network."
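The fusion step the paper's title refers to can be illustrated in a few lines. The sketch below shows factorized bilinear pooling with toy dimensions and random matrices standing in for learned weights; it is a simplified illustration of the general technique, not the authors' actual architecture (which also includes attention over audio and video frames).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions for illustration, not the paper's sizes).
AUDIO_DIM, VIDEO_DIM = 64, 128
FACTOR_DIM, K = 32, 4  # fused output size and factor rank

# Random projections stand in for learned weight matrices.
U = rng.standard_normal((AUDIO_DIM, FACTOR_DIM * K))
V = rng.standard_normal((VIDEO_DIM, FACTOR_DIM * K))

def factorized_bilinear_pool(audio_feat, video_feat):
    """Fuse two modality vectors with factorized bilinear pooling:
    project each modality, multiply element-wise, then sum-pool every
    K adjacent factors down to FACTOR_DIM outputs."""
    joint = (audio_feat @ U) * (video_feat @ V)        # (FACTOR_DIM * K,)
    pooled = joint.reshape(FACTOR_DIM, K).sum(axis=1)  # sum-pool over K
    # Signed square-root then L2 normalization, a common post-step.
    pooled = np.sign(pooled) * np.sqrt(np.abs(pooled))
    return pooled / (np.linalg.norm(pooled) + 1e-8)

audio = rng.standard_normal(AUDIO_DIM)  # e.g. an audio embedding
video = rng.standard_normal(VIDEO_DIM)  # e.g. a facial-frame embedding
fused = factorized_bilinear_pool(audio, video)
print(fused.shape)  # (32,)
```

The fused vector would then feed a small classifier over the emotion categories; the factorization keeps the cost far below a full bilinear (outer-product) interaction between the two modalities.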

Language is just as important as expressions when reading someone's emotions, study shows

Daily Mail - Science & tech

Language is as important as expressions when reading emotion, a study has found -- meaning that being told someone looks 'grumpy' can make them seem grumpier. Researchers from Australia and the US asked volunteers to rate the emotions of people in either photographs or videos. The team found that when participants were told the subjects were feeling a specific emotion, this biased how they interpreted the expressions on show. The effect was most pronounced for angry, sad, or scared faces -- as opposed to happy, disgusted, embarrassed, proud, or surprised ones. 'The current studies demonstrate that language context alters the dimensional affective foundations that underlie our judgements of others' expressions,' the researchers wrote in their paper.

Detecting emotion with Machine Learning


Machine learning is a hot topic these days, and getting started can be fast and easy. In this video post, I walk through the steps to build a simple Universal Windows Platform (UWP) application that connects to Microsoft Cognitive Services and the Emotion API. The Microsoft Cognitive Services are a set of APIs that let your apps leverage powerful algorithms with just a few lines of code. They work across a wide range of devices and platforms -- including iOS, Android, and Windows -- keep improving, and are easy to set up.
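The post builds a UWP app, but the REST flow is the same from any language. The sketch below, in Python for brevity, shows the shape of a Cognitive Services call: the subscription-key header the endpoints expect, and parsing a response in the Emotion API v1.0 style (an array of detected faces, each with per-emotion "scores"). The endpoint URL and key here are placeholders -- the real values come from your Azure portal -- and no network call is made in this snippet.

```python
import json

# Placeholders (assumptions): substitute your own region and key.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
API_KEY = "<your-subscription-key>"

def build_headers(key):
    """Headers Cognitive Services REST endpoints expect when you
    POST raw image bytes."""
    return {
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/octet-stream",
    }

def top_emotion(response_text):
    """Return the highest-scoring emotion for the first detected face,
    or None if no face was found."""
    faces = json.loads(response_text)
    if not faces:
        return None
    scores = faces[0]["scores"]
    return max(scores, key=scores.get)

# Canned response in the v1.0 format, used instead of a live call.
sample = json.dumps([{
    "faceRectangle": {"top": 1, "left": 1, "width": 50, "height": 50},
    "scores": {"anger": 0.01, "happiness": 0.95, "neutral": 0.04},
}])
print(top_emotion(sample))  # happiness
```

In the UWP app the same two pieces appear as an `HttpClient` request with the subscription-key header and a JSON deserialization of the returned face array.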