Analyzing emotions in video with R


In the run-up to the election last year, Ben Heubl from The Economist used the Emotion API to chart the emotions portrayed by the candidates during the debates (note: auto-play video in that link). In his walkthrough of the implementation, Ben used Python to process the video files and R to create the charts from the sentiment scores generated by the API. Now, the learn dplyr blog has recreated the analysis entirely in R. A detailed walkthrough steps through the process of creating a free Emotion API key, submitting a video to the API using the httr package, and retrieving the emotion scores as an R data frame. For the complete details, including the R code used to interface with the Emotion API, follow the link below.
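The request the walkthrough builds with httr can be sketched in Python as well. The snippet below only assembles the pieces of the POST; the endpoint URL, region, and JSON body shape are assumptions based on general Cognitive Services conventions, not details taken from the post, so check them against the walkthrough before use:

```python
import json

# Hypothetical Emotion API video endpoint; the region ("westus") and path are assumptions.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognizeinvideo"

def build_emotion_request(api_key: str, video_url: str):
    """Assemble the URL, headers, and body that httr::POST() sends in the R version."""
    headers = {
        "Ocp-Apim-Subscription-Key": api_key,  # the free key from the Azure portal
        "Content-Type": "application/json",
    }
    body = json.dumps({"url": video_url})
    return ENDPOINT, headers, body

url, headers, body = build_emotion_request("YOUR_KEY", "https://example.com/debate.mp4")
# e.g. requests.post(url, headers=headers, data=body) would then submit the video
```

The video operations were asynchronous, so the R code also polls an operation-status URL before the scores come back as a data frame.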

Detecting emotion with Machine Learning


Machine Learning is a very hot topic these days, and getting started can be fast and easy. In this video post, I walk through the steps to build a simple Universal Windows Platform (UWP) application that connects to Microsoft Cognitive Services and the Emotion API. The Microsoft Cognitive Services are a set of APIs that let your apps leverage powerful algorithms with just a few lines of code. They work across a wide range of devices and platforms, including iOS, Android, and Windows; they keep improving and are easy to set up.
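The "few lines of code" claim holds up on the response side too: the Emotion API returned a per-face score for each of eight emotions, and picking the dominant one is a one-liner. A minimal sketch, assuming a v1.0-style response (a list of faces, each with a "scores" dict; the field names here are assumptions and should be checked against the API docs):

```python
# Hypothetical response for one detected face, in the Emotion API v1.0 style.
sample_response = [
    {
        "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 97},
        "scores": {
            "anger": 0.01, "contempt": 0.02, "disgust": 0.01, "fear": 0.00,
            "happiness": 0.90, "neutral": 0.04, "sadness": 0.01, "surprise": 0.01,
        },
    }
]

def dominant_emotion(face: dict) -> str:
    """Return the emotion label with the highest score for a single face."""
    return max(face["scores"], key=face["scores"].get)

print(dominant_emotion(sample_response[0]))  # happiness
```

In the UWP app the same logic runs in C#, but the JSON shape the client deserializes is identical.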

Your Attendees' Emotions Can Be Measured, Analyzed, and Visualized


Lightwave, the groundbreaking bioanalytics company, has developed innovative solutions that enable brands to measure emotion and use that insight as a metric of success. Every event, brand activation, gala, or launch can make customers feel something. But how do you measure and visualize emotion? Learn how Lightwave uses human traits like heart rate and facial reactions to measure emotion at TIDE--the creative conference June 5 at the Park MGM in Las Vegas--where Rana June, C.E.O. of Lightwave, is a keynote speaker. June is one of a dozen innovative thinkers you'll hear from at TIDE--experts who have created experiences for Uber, Facebook, Nike, Pepsi, Unilever, 20th Century Fox, and many more.

Survey and Perspective on Social Emotions in Robotics

This study reviews research on social emotions in robotics. Emotion has long been pursued in robotics, spanning recognition, expression, and computational modeling of the basic mechanisms behind it. Research has been guided by well-known psychological findings, such as category and dimension theories, and many studies built on these basic theories address only basic emotions. However, social emotions, also called higher-level emotions, have been studied in psychology, and we believe these higher-level emotions are worth pursuing in robotics for next-generation socially aware robots. In this review paper, we summarize the findings on social emotions in psychology and neuroscience and survey current robotics studies on social emotions. We then discuss research directions toward implementing social emotions in robots.
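The dimension theories mentioned above represent emotion as a point in a continuous space (typically valence and arousal) rather than as a discrete category. A toy sketch of mapping a valence-arousal coordinate to the nearest basic-emotion prototype — the prototype coordinates below are illustrative assumptions, not values from the paper:

```python
import math

# Illustrative valence-arousal prototypes for a few basic emotions (assumed values,
# roughly following the usual circumplex layout: valence on x, arousal on y).
PROTOTYPES = {
    "happiness": (0.8, 0.5),
    "anger": (-0.6, 0.7),
    "sadness": (-0.7, -0.4),
    "calm": (0.4, -0.6),
}

def nearest_emotion(valence: float, arousal: float) -> str:
    """Map a point in valence-arousal space to the closest prototype label."""
    return min(PROTOTYPES, key=lambda e: math.dist((valence, arousal), PROTOTYPES[e]))
```

Category-theory approaches skip the continuous space and classify directly into such labels; the survey's point is that social emotions (e.g. guilt, pride) fit neither scheme cleanly.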

Detecting Emotion Primitives from Speech and their use in discerning Categorical Emotions

Emotion plays an essential role in human-to-human communication, enabling us to convey feelings such as happiness, frustration, and sincerity. While modern speech technologies rely heavily on speech recognition and natural language understanding for speech content understanding, the investigation of vocal expression is increasingly gaining attention. Key considerations for building robust emotion models include characterizing and improving the extent to which a model, given its training data distribution, is able to generalize to unseen data conditions. This work investigated a long short-term memory (LSTM) network and a time-convolution LSTM (TC-LSTM) to detect primitive emotion attributes such as valence, arousal, and dominance from speech. It was observed that training with multiple datasets and using robust features improved the concordance correlation coefficient (CCC) for valence by 30% with respect to the baseline system. Additionally, this work investigated how emotion primitives can be used to detect categorical emotions such as happiness, disgust, contempt, anger, and surprise from neutral speech, and results indicated that arousal, followed by dominance, was a better detector of such emotions.
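The CCC used here is the standard metric for continuous emotion attributes: unlike Pearson correlation, it rewards agreement in mean and scale as well as linear correlation. A small reference implementation of the standard formula (the sample data is made up, not from the paper):

```python
import numpy as np

def ccc(x, y):
    """Concordance correlation coefficient:
    2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()           # population variance
    cov = ((x - mx) * (y - my)).mean()  # population covariance
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Perfect agreement gives 1.0; a shifted copy is penalized
# even though its Pearson correlation is still 1.
print(ccc([0.1, 0.5, 0.9], [0.1, 0.5, 0.9]))  # 1.0
```

The mean-difference term in the denominator is why a 30% CCC gain for valence implies predictions that track both the shape and the absolute level of the annotated ratings.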