Using Deep Learning To Measure The Facial Emotion Of Television
Deep learning is increasingly capable of assessing the emotion of human faces, scanning an image to estimate how happy or sad the people in it appear to be. What if this capability were applied to television news, estimating the average emotion of all of the human faces seen on the news over the course of a week? While AI-based facial sentiment assessment is still very much an active area of research, an experiment using Google's cloud AI to analyze a week of television news coverage from the Internet Archive's Television News Archive demonstrates that, even within the limitations of today's tools, television news carries a considerable amount of visual sentiment.

To better understand the facial emotion of television, a week of coverage of CNN, MSNBC and Fox News, along with the morning and evening broadcasts of San Francisco affiliates KGO (ABC), KPIX (CBS), KNTV (NBC) and KQED (PBS), from April 15 to April 22, 2019, totaling 812 hours of television news, was analyzed using Google's Vision AI image understanding API with all of its features enabled, including facial detection. A sketch of how a single frame can be submitted to the API appears below.

Note that facial detection is very different from facial recognition: it merely registers that a human face is present in an image and estimates attributes such as its apparent emotion; it does not attempt to discern who that person is.
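For readers who want to experiment themselves, the sketch below shows how a single video frame might be submitted to the Cloud Vision API's face detection feature and its per-face emotion likelihoods read back and averaged. This is a minimal illustration under stated assumptions, not the pipeline used in this analysis: the frame filename, the numeric mapping of the API's likelihood labels to scores, and the averaging step are all assumptions made for the example.

```python
# Minimal sketch: estimate the average facial emotion of one video frame
# using the Google Cloud Vision API's face detection feature.
# Assumes the google-cloud-vision package is installed and that
# application credentials are already configured in the environment.
from google.cloud import vision

# Hypothetical mapping of the API's likelihood labels to numeric scores;
# the actual analysis may have weighted these differently.
LIKELIHOOD_SCORE = {
    vision.Likelihood.VERY_UNLIKELY: 0.0,
    vision.Likelihood.UNLIKELY: 0.25,
    vision.Likelihood.POSSIBLE: 0.5,
    vision.Likelihood.LIKELY: 0.75,
    vision.Likelihood.VERY_LIKELY: 1.0,
}

def frame_emotion(path: str) -> dict:
    """Average joy/sorrow/anger/surprise scores across all faces in one frame."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    faces = client.face_detection(image=image).face_annotations
    if not faces:
        return {}
    totals = {"joy": 0.0, "sorrow": 0.0, "anger": 0.0, "surprise": 0.0}
    for face in faces:
        # Each face annotation reports a likelihood label per emotion;
        # unknown likelihoods fall back to a score of 0.0.
        totals["joy"] += LIKELIHOOD_SCORE.get(face.joy_likelihood, 0.0)
        totals["sorrow"] += LIKELIHOOD_SCORE.get(face.sorrow_likelihood, 0.0)
        totals["anger"] += LIKELIHOOD_SCORE.get(face.anger_likelihood, 0.0)
        totals["surprise"] += LIKELIHOOD_SCORE.get(face.surprise_likelihood, 0.0)
    return {emotion: total / len(faces) for emotion, total in totals.items()}

print(frame_emotion("frame.jpg"))  # "frame.jpg" is a placeholder filename
```

Extending this to a full week of coverage would simply mean sampling frames from each broadcast, scoring them as above, and averaging the results per station and per day.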