Election 2016: Tracking Emotions with R and Python

#artificialintelligence

Temperament has been a key issue in the 2016 presidential election between Hillary Clinton and Donald Trump, and one highlighted in the series of three debates that concluded this week. Quantifying "temperament" isn't an easy task, but The Economist used the Microsoft Emotion API to chart the anger, contempt, sadness, and surprise expressed on the candidates' faces during key sequences of the debates, including the third debate. The Economist data journalist Ben Heubl explains how you can analyze emotions in a video file using Python and R. The Emotion API provides scores for eight attributes of emotion as expressed by a face in a still image or video clip; one expression by Donald Trump, for example, registers mostly anger, with a touch of disgust and a soupçon of contempt. Ben provides Python code for passing a video clip into the Emotion API and retrieving frame-by-frame emotion scores. He then uses R to analyze and chart the scores: mostly happiness for Clinton, mostly sadness for Trump.
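As a rough orientation only (this is not Heubl's actual code), here is a minimal Python sketch of scoring a single extracted frame, assuming the now-retired Cognitive Services Emotion endpoint; the URL, subscription key, and file names are placeholders:

```python
# Minimal sketch: score one video frame with the (now-retired) Microsoft
# Emotion API. Endpoint, key, and file names are illustrative placeholders;
# frame extraction (e.g. with ffmpeg) is assumed to have happened already.
import requests

EMOTION_URL = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"  # assumed endpoint
SUBSCRIPTION_KEY = "YOUR_KEY_HERE"  # placeholder

def score_frame(image_path):
    """Send one still frame and return the per-face emotion scores."""
    headers = {
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",
    }
    with open(image_path, "rb") as f:
        response = requests.post(EMOTION_URL, headers=headers, data=f.read())
    response.raise_for_status()
    # Each detected face comes back with eight emotion scores
    # (anger, contempt, disgust, fear, happiness, neutral, sadness, surprise).
    return [face["scores"] for face in response.json()]

if __name__ == "__main__":
    for scores in score_frame("frame_0042.jpg"):  # hypothetical frame file
        print(max(scores, key=scores.get), scores)
```

Running something like this over frames sampled at a fixed interval yields the frame-by-frame scores that the R side then aggregates and charts.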


R's tidytext turns messy text into valuable insight

@machinelearnbot

Check out "Text Mining with R: A Tidy Approach" to learn about how tidy data principles and the tidytext package can help you perform text mining in R.


Applying Face Emotion Recognition API Technology to Video of Nawaz Sharif's Address to Nation after Panama Papers

@machinelearnbot

Machine learning is being applied to almost everything these days, and the results are impressive. With the introduction of machine learning APIs, developers don't have to train their own algorithms; instead, they can use these APIs to build interesting applications. Recently, I read an article in which Ben Heubl applied the Microsoft Cognitive Services Emotion API to analyze the emotions in a video of Hillary Clinton and Donald Trump's last debate. I was amazed by the accuracy of the results and by the way the emotions were graphed per frame of the video. This inspired me to do the same video emotion analysis for Prime Minister Nawaz Sharif's address to the nation after the Panama Papers were published.
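The post charts each emotion as a line over the video's frames. Below is a minimal illustration of that kind of per-frame plot in Python, assuming the scores have already been collected (for instance with an API loop like the sketch above) and saved to a CSV with one column per emotion; the file name and column layout are hypothetical:

```python
# Sketch: chart frame-by-frame emotion scores from a CSV of collected results.
# File name and column names are assumptions for illustration only.
import pandas as pd
import matplotlib.pyplot as plt

# One row per frame; one column per emotion score plus a frame index.
scores = pd.read_csv("sharif_address_emotions.csv")  # hypothetical file

emotions = ["anger", "contempt", "disgust", "fear",
            "happiness", "neutral", "sadness", "surprise"]

ax = scores.plot(x="frame", y=emotions, figsize=(10, 4), linewidth=1)
ax.set_xlabel("Video frame")
ax.set_ylabel("Emotion score (0-1)")
ax.set_title("Per-frame emotion scores, Nawaz Sharif address")
plt.tight_layout()
plt.savefig("sharif_emotions.png", dpi=150)
```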


A Multi-task Neural Approach for Emotion Attribution, Classification and Summarization

arXiv.org Machine Learning

Emotional content is a crucial ingredient in user-generated videos. However, the emotions expressed in user-generated videos are often sparse, which makes emotion analysis difficult. In this paper, we propose a new neural approach, the Bi-stream Emotion Attribution-Classification Network (BEAC-Net), to solve three related emotion analysis tasks in an integrated framework: emotion recognition, emotion attribution, and emotion-oriented summarization. BEAC-Net has two major constituents, an attribution network and a classification network. The attribution network extracts the main emotional segment that classification should focus on in order to mitigate the sparsity problem. The classification network processes both the extracted segment and the original video in a bi-stream architecture. We contribute a new dataset for the emotion attribution task with human-annotated ground-truth labels for emotion segments. Experiments on two video datasets demonstrate the superior performance of the proposed framework and the complementary nature of the dual classification streams.
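As a very rough illustration of the bi-stream idea only (this is not the authors' architecture; the feature dimensions, mean-pooling, and layer sizes are all assumptions), a classifier that fuses an encoding of the full video with an encoding of the attributed segment might look like this in PyTorch:

```python
# Illustrative bi-stream classifier: one stream pools features from the whole
# video, the other pools features from the attributed segment, and the two
# encodings are fused for emotion classification. All dimensions and the use
# of simple mean-pooling are assumptions, not the paper's actual design.
import torch
import torch.nn as nn

class BiStreamEmotionClassifier(nn.Module):
    def __init__(self, feat_dim=2048, hidden=256, n_emotions=8):
        super().__init__()
        self.video_stream = nn.Sequential(nn.Linear(feat_dim, hidden), nn.ReLU())
        self.segment_stream = nn.Sequential(nn.Linear(feat_dim, hidden), nn.ReLU())
        self.classifier = nn.Linear(2 * hidden, n_emotions)

    def forward(self, video_feats, segment_feats):
        # video_feats:   (batch, n_frames, feat_dim) for the full video
        # segment_feats: (batch, n_seg_frames, feat_dim) for the attributed segment
        v = self.video_stream(video_feats.mean(dim=1))      # pool over all frames
        s = self.segment_stream(segment_feats.mean(dim=1))  # pool over the segment
        return self.classifier(torch.cat([v, s], dim=-1))   # fused emotion logits

# Toy usage with random frame features
model = BiStreamEmotionClassifier()
logits = model(torch.randn(4, 120, 2048), torch.randn(4, 16, 2048))
print(logits.shape)  # torch.Size([4, 8])
```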


Jointly Learning to Detect Emotions and Predict Facebook Reactions

arXiv.org Machine Learning

The growing ubiquity of Social Media data offers an attractive opportunity for improving the quality of machine learning-based models in several fields, ranging from Computer Vision to Natural Language Processing. In this paper we focus on Facebook posts paired with "reactions" of multiple users, and we investigate their relationships with classes of emotions that are typically considered in the task of emotion detection. We are inspired by the idea of introducing a connection between reactions and emotions by means of First-Order Logic formulas, and we propose an end-to-end neural model that is able to jointly learn to detect emotions and predict Facebook reactions in a multi-task environment, where the logic formulas are converted into polynomial constraints. Our model is trained using a large collection of unsupervised texts together with data labeled with emotion classes and Facebook posts that include reactions. An extended experimental analysis that leverages a large collection of Facebook posts shows that the tasks of emotion classification and reaction prediction can both benefit from their interaction.
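A heavily simplified sketch of such a multi-task setup is shown below, assuming a shared recurrent text encoder with separate emotion and reaction heads and a toy penalty term standing in for the logic-derived polynomial constraints; none of the specific sizes, class indices, or the particular reaction-to-emotion rule come from the paper:

```python
# Illustrative multi-task model: a shared text encoder with two heads, one for
# emotion classes and one for Facebook reactions, trained with a joint loss.
# The "constraint" term couples the two outputs in the spirit of logic-derived
# polynomial constraints; the toy rule used here (mass on a "love" reaction
# should not exceed mass on a "joy" emotion) is an assumption, not the paper's.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmotionReactionModel(nn.Module):
    def __init__(self, vocab_size=30000, embed=128, hidden=128,
                 n_emotions=6, n_reactions=6):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed)
        self.encoder = nn.GRU(embed, hidden, batch_first=True)
        self.emotion_head = nn.Linear(hidden, n_emotions)
        self.reaction_head = nn.Linear(hidden, n_reactions)

    def forward(self, token_ids):
        _, h = self.encoder(self.embedding(token_ids))
        h = h.squeeze(0)  # (batch, hidden)
        return self.emotion_head(h), self.reaction_head(h)

def joint_loss(emotion_logits, reaction_logits, emotion_y, reaction_y, lam=0.1):
    p_emotion = F.softmax(emotion_logits, dim=-1)
    p_reaction = F.softmax(reaction_logits, dim=-1)
    # Toy soft constraint: "love" reaction (index 1) implies "joy" emotion (index 0).
    constraint = F.relu(p_reaction[:, 1] - p_emotion[:, 0]).mean()
    return (F.cross_entropy(emotion_logits, emotion_y)
            + F.cross_entropy(reaction_logits, reaction_y)
            + lam * constraint)
```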