Analyzing emotions in video with R

#artificialintelligence

In the run-up to last year's election, Ben Heubl of The Economist used the Emotion API to chart the emotions portrayed by the candidates during the debates (note: auto-play video at that link). In his walkthrough of the implementation, Ben used Python to process the video files and R to create the charts from the emotion scores generated by the API. Now the learn dplyr blog has recreated the analysis entirely in R. A detailed walkthrough steps through creating a free Emotion API key, submitting a video to the API with the httr package, and retrieving the emotion scores as an R data frame. For the complete details, including the R code used to interface with the Emotion API, follow the link below.
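For a feel of the flow before clicking through, here is a minimal sketch of that httr workflow. It is not the blog's actual code: the endpoint URL, header names, file name, and response handling are assumptions based on how Microsoft's Cognitive Services REST APIs generally work, so consult the walkthrough for the real details.

```r
# Minimal sketch: submit a video to the Emotion API with httr and
# flatten the returned emotion scores into an R data frame.
library(httr)
library(jsonlite)

# Placeholder endpoint and key -- substitute the values from your own
# free Emotion API subscription.
endpoint <- "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognizeInVideo"
api_key  <- "YOUR_EMOTION_API_KEY"

# Submit the video. Video operations are asynchronous: the service
# responds with an Operation-Location header to poll for the result.
submit <- POST(
  endpoint,
  add_headers(
    "Ocp-Apim-Subscription-Key" = api_key,
    "Content-Type" = "application/octet-stream"
  ),
  body = upload_file("debate_clip.mp4")  # hypothetical file name
)
operation_url <- headers(submit)[["operation-location"]]

# Poll for the result (a real script would loop with Sys.sleep()
# until the reported status is "Succeeded").
result <- GET(operation_url,
              add_headers("Ocp-Apim-Subscription-Key" = api_key))

# Parse the JSON payload and flatten it into a data frame of
# emotion scores, ready for charting.
scores_df <- fromJSON(content(result, as = "text", encoding = "UTF-8"),
                      flatten = TRUE)
```

From there, the flattened scores are ordinary R data, ready for reshaping and charting as in the original analysis.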


Where logic meets emotion

Science

As she sat in a taxi headed to Cairo International Airport in September 2001, Rana el Kaliouby remembers thinking, "Am I really going through with this?" A married woman and hijab-wearing Muslim, she would be on her own for the next 3 years, pursuing her doctorate in computer science at the University of Cambridge in the United Kingdom. In Girl Decoded, el Kaliouby and coauthor Carol Colman have created a riveting memoir of a "nice Egyptian girl" who, despite cultural conditioning that encouraged her to put her duties as a wife and mother first, went on to pursue her professional dreams.


This Honda concept car will have emotions of its own

Washington Post - Technology News

Chances are you either love or hate your car -- and soon the feeling could be mutual. Japanese automaker Honda will showcase a concept car at the Consumer Electronics Show next month that is capable of understanding the driver's emotions and developing emotions of its own, the company announced this week. The company provided few details as to how the technology will work or alter the driving experience. But we do know that the concept car, called the NeuV, is being touted as an automated electric vehicle that includes an "emotion engine." That's the name for artificial intelligence that Honda says will "enable machines to artificially generate their own emotions."


ICYMI: In the future, rooms will read your emotions

Engadget

Today on In Case You Missed It: MIT's Computer Science and Artificial Intelligence Laboratory created a wireless detector that can read whether people are excited, happy, angry or sad by checking their breathing and heart rate. EQ Radio has an 87% success rate and doesn't use a single body sensor. If you're interested in the Ghost Robotics video, that's here and the hacking video showing how a Tesla Model S was remotely controlled is here. As always, please share any interesting tech or science videos you find by using the #ICYMI hashtag on Twitter for @mskerryd.


Kia AI tailors vehicle interiors to passengers' emotions

Engadget

Kia is preparing for a future with autonomous cars, and at CES it will be showing off its Real-time Emotion Adaptive Driving System, or R.E.A.D. The company says its AI-based system can adapt vehicle interiors to a passenger's emotional state by using sensors to monitor their facial expressions, heart rate and electrodermal activity. Based on its readings, the R.E.A.D. System personalizes the cabin interior, taking into account all five senses. Part of the system also includes music-response vibration seats, which match seat vibrations to the frequencies of whatever music is being played in the cabin. The seats can also provide massages and haptic alerts connected to the vehicle's driver-assist system. Additionally, Kia will be revealing V-Touch -- gesture control technology that uses a 3D camera to track eye and finger motions and allows riders to manage in-car features like lighting, air-conditioning and entertainment systems without the use of physical buttons or touchscreens.