Reaction GIFs Offer A New Key To Emotion Recognition In NLP

#artificialintelligence

New research out of Taiwan offers a novel method for Natural Language Processing (NLP) to perform sentiment analysis on social media forums and language research datasets – by categorizing and labeling the animated GIFs that users post in response to text posts. The researchers, led by Boaz Shmueli of National Tsing Hua University in Taiwan, used Twitter's built-in database of reaction GIFs as an index to quantify the affective state of a user's response, obviating the need to negotiate responses in multiple languages, the challenge of detecting sarcasm, or that of identifying the core emotional temperature of ambiguous or excessively brief replies. Clicking the 'GIF' button when composing a Twitter post offers a standard set of labeled animated GIFs that are potentially easier for NLP to parse into 'identified' emotions than plain-text language. The paper characterizes reaction GIFs used in this way as 'a new type of label, not yet available in NLP emotion datasets', and notes that existing datasets use either the dimensional model of emotion or the discrete emotions model, neither of which offers this kind of insight.
An animated GIF response to a user post.
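A minimal sketch of the labeling idea, assuming hypothetical data: each tweet is paired with the category of the reaction GIF posted in reply, and that category serves as a weak emotion label for a plain text classifier. The GIF categories, example tweets and scikit-learn baseline below are illustrative assumptions, not the paper's actual dataset or model.

```python
# Hypothetical sketch: treating reaction-GIF categories as weak emotion labels
# for the tweets they respond to, then fitting a simple text classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# (original tweet text, category of the reaction GIF posted in reply) -- invented examples
reply_pairs = [
    ("Our flight got cancelled again", "eye roll"),
    ("She said yes!!", "applause"),
    ("Monday meetings moved to 7am", "facepalm"),
    ("Finally finished the thesis", "happy dance"),
]

texts = [text for text, _ in reply_pairs]
labels = [gif for _, gif in reply_pairs]

# Bag-of-words baseline: the GIF category stands in for an emotion label.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, labels)

print(model.predict(["The train is delayed for the third time today"]))
```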


Why is AI Considered a Misfit to Read Human Emotions?

#artificialintelligence

AI has been reigning over industries and business ecosystems with its seemingly unending capabilities to accelerate automation and provide business intelligence. Disruptive technologies like artificial intelligence, machine learning, blockchain, etc. have enabled companies to create better user experiences and advance business growth. Emotional AI is a relatively recent development in the field of modern technology, and its proponents claim that AI systems can read facial expressions and analyze human emotions. This method is also known as affect recognition technology. Recently, Article 19, a British human rights organization, published a report describing the increasing use of AI-based emotion recognition technology in China by law enforcement authorities, corporate bodies, and the state itself.


Smile for the camera: dark side of China's emotion-recognition tech

The Guardian

"Ordinary people here in China aren't happy about this technology but they have no choice. If the police say there have to be cameras in a community, people will just have to live with it. So says Chen Wei at Taigusys, a company specialising in emotion recognition technology, the latest evolution in the broader world of surveillance systems that play a part in nearly every aspect of Chinese society. Emotion-recognition technologies – in which facial expressions of anger, sadness, happiness and boredom, as well as other biometric data are tracked – are supposedly able to infer a person's feelings based on traits such as facial muscle movements, vocal tone, body movements and other biometric signals. It goes beyond facial-recognition technologies, which simply compare faces to determine a match. But similar to facial recognition, it involves the mass collection of sensitive personal data to track, monitor and profile people and uses machine learning to analyse expressions and other clues. The industry is booming in China, where since at least 2012, figures including President Xi Jinping have emphasised the creation of "positive energy" as part of an ideological campaign to encourage certain kinds of expression and limit others. Critics say the technology is based on a pseudo-science of stereotypes, and an increasing number of researchers, lawyers and rights activists believe it has serious implications for human rights, privacy and freedom of expression. With the global industry forecast to be worth nearly $36bn by 2023, growing at nearly 30% a year, rights groups say action needs to be taken now. The main office of Taigusys is tucked behind a few low-rise office buildings in Shenzhen. Visitors are greeted at the doorway by a series of cameras capturing their images on a big screen that displays body temperature, along with age estimates, and other statistics. Chen, a general manager at the company, says the system in the doorway is the company's bestseller at the moment because of high demand during the coronavirus pandemic. Chen hails emotion recognition as a way to predict dangerous behaviour by prisoners, detect potential criminals at police checkpoints, problem pupils in schools and elderly people experiencing dementia in care homes. Taigusys systems are installed in about 300 prisons, detention centres and remand facilities around China, connecting 60,000 cameras. "Violence and suicide are very common in detention centres," says Chen. "Even if police nowadays don't beat prisoners, they often try to wear them down by not allowing them to fall asleep.


This AI reads children's emotions as they learn

#artificialintelligence

Hong Kong (CNN Business) Before the pandemic, Ka Tim Chu, teacher and vice principal of Hong Kong's True Light College, looked at his students' faces to gauge how they were responding to classwork. Now, with most of his lessons online, technology is helping Chu to read the room. An AI-powered learning platform monitors his students' emotions as they study at home. The software, 4 Little Trees, was created by Hong Kong-based startup Find Solution AI. While the use of emotion recognition AI in schools and other settings has caused concern, founder Viola Lam says it can make the virtual classroom as good as -- or better than -- the real thing.


China's growing use of emotion recognition tech raises rights concerns

The Japan Times

Technology that measures emotions based on biometric indicators such as facial movements, tone of voice or body movements is increasingly being marketed in China, researchers say, despite concerns about its accuracy and wider human rights implications. Drawing upon artificial intelligence, the tools range from cameras that help police monitor a suspect's face during an interrogation to eye-tracking devices in schools that identify students who are not paying attention. A report released this week by U.K.-based human rights group Article 19 identified dozens of companies offering such tools in the education, public security and transportation sectors in China. "We believe that their design, development, deployment, sale and transfers should be banned due to the racist foundations and fundamental incompatibility with human rights," said Vidushi Marda, a senior program officer at Article 19. Human emotions cannot be reliably measured and quantified by technology tools, said Shazeda Ahmed, a doctoral candidate studying cybersecurity at the University of California, Berkeley, and the report's co-author. Such systems can perpetuate bias, especially those sold to police that purport to identify criminality based on biometric indicators, she added.


Emotion Recognition From Gait Analyses: Current Research and Future Directions

arXiv.org Machine Learning

Human gait is an everyday motion that not only reflects mobility but can also be used to identify the walker, whether by human observers or by computers. Recent studies reveal that gait even conveys information about the walker's emotion: individuals in different emotional states may show different gait patterns. The mapping between various emotions and gait patterns provides a new source for automated emotion recognition. Compared to traditional emotion detection biometrics, such as facial expression, speech and physiological parameters, gait is remotely observable, more difficult to imitate, and requires less cooperation from the subject. These advantages make gait a promising source for emotion detection. This article reviews current research on gait-based emotion detection, particularly on how gait parameters can be affected by different emotional states and how those states can be recognized through distinct gait patterns. We focus on the detailed methods and techniques applied in the whole process of emotion recognition: data collection, preprocessing, and classification. Finally, we discuss possible future developments of efficient and effective gait-based emotion recognition using state-of-the-art techniques in intelligent computation and big data.
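As a rough illustration of the pipeline the survey covers (data collection, preprocessing, classification), the sketch below fits a classifier on simple statistics extracted from synthetic walking-speed traces. The features, the emotion-to-speed mapping and the classifier choice are assumptions made for illustration, not results from the reviewed literature.

```python
# Hypothetical end-to-end sketch: gait data -> feature extraction -> emotion classification.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def gait_features(speed_trace):
    """Summarize a 1-D walking-speed trace into simple statistics."""
    return [speed_trace.mean(), speed_trace.std(), np.abs(np.diff(speed_trace)).mean()]

# Synthetic "recordings": in this toy model, emotional state shifts average walking speed.
X, y = [], []
for emotion, base_speed in [("happy", 1.3), ("sad", 0.9), ("angry", 1.5)]:
    for _ in range(30):
        trace = base_speed + 0.1 * rng.standard_normal(200)  # preprocessed speed samples
        X.append(gait_features(trace))
        y.append(emotion)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # accuracy on the synthetic data
```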


China touts dubious emotion recognition tech to find criminals

#artificialintelligence

What's happening: China says it's rolling out the tech in Xinjiang, where Uighur Muslims are kept in mass detention camps, and in subway stations and airports to "identify criminal suspects," per FT. "At present only a few schools and public security bureaus have products that include this type of technology," Zhen Wenzhuang told FT, adding that emotion recognition has "not been fully developed for commercial use" in China. Between the lines: Even if the tech doesn't track emotions as advertised, being watched or even thinking you're being watched can still have a psychological effect and encourage people to change their behavior, as seen in workplace polling. In the U.S., Microsoft claims that its Face API program can identify emotions like contempt, happiness and disgust. Amazon's Rekognition points out that when its API identifies someone's facial expression, it "is not a determination of the person's internal emotional state."


Emotional intelligence and AI could be a winning combination

#artificialintelligence

This article was first published in the April 2019 China edition of Accounting and Business magazine. Emotional intelligence is an increasingly important success factor for finance professionals, particularly in the era of artificial intelligence (AI). The link between the two was explored at the ACCA Hong Kong CFO Summit, held late last year, at which delegates were asked to consider how finance leaders can thrive in the digital age. 'Our strength in this increasingly digital age is being human, exercising judgment, scepticism and emotional maturity,' said Jane Cheng, head of ACCA Hong Kong. 'Emotional competencies are critical to becoming a trusted and capable professional accountant, someone who can combine analytical figures with emotional maturity.'


EmoSense: an AI-powered and wireless emotion sensing system

#artificialintelligence

Researchers at Hefei University of Technology in China and various universities in Japan have recently developed a unique emotion sensing system that can recognize people's emotions based on their body gestures. They presented this new AI-powered system, called EmoSense, in a paper pre-published on arXiv. "In our daily life, we can clearly realize that body gestures contain rich mood expressions for emotion recognition," Yantong Wang, one of the researchers who carried out the study, told TechXplore. "Meanwhile, we can also find out that human body gestures affect wireless signals via shadowing and multi-path effects when we use antennas to detect behavior. Such signal effects usually form unique patterns or fingerprints in the temporal-frequency domain for different gestures."
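To make the "fingerprint" idea concrete, here is a hedged sketch in which a gesture modulates a simulated received signal, the time-frequency spectrogram of that signal serves as the fingerprint, and a new measurement is matched to the nearest stored template. The signal model, gesture names and nearest-neighbour matcher are assumptions; EmoSense's actual antenna setup, channel-state processing and classifier are not reproduced here.

```python
# Hypothetical sketch: gesture-induced modulation of a wireless signal,
# spectrogram "fingerprints", and nearest-template matching.
import numpy as np
from scipy.signal import spectrogram

rng = np.random.default_rng(1)
fs = 1000  # samples per second at the receiving antenna

def received_signal(gesture_freq_hz):
    """Toy model: a gesture amplitude-modulates the carrier at a characteristic rate."""
    t = np.arange(0, 2.0, 1 / fs)
    return (1 + 0.3 * np.sin(2 * np.pi * gesture_freq_hz * t)) * np.sin(2 * np.pi * 100 * t)

def fingerprint(sig):
    """Temporal-frequency fingerprint: flattened log-spectrogram."""
    _, _, sxx = spectrogram(sig, fs=fs, nperseg=256)
    return np.log1p(sxx).ravel()

# Stored templates for two (invented) gestures with different movement rates.
templates = {"arms_raised": fingerprint(received_signal(2.0)),
             "slumped": fingerprint(received_signal(0.5))}

# Classify a new noisy measurement by nearest template (Euclidean distance).
query = fingerprint(received_signal(2.0) + 0.05 * rng.standard_normal(2000))
best = min(templates, key=lambda g: np.linalg.norm(templates[g] - query))
print(best)
```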


Can AI Map Your Emotions?

#artificialintelligence

An ultimate goal for many artificial intelligence (AI) researchers is the development of a system that can identify human emotion from voice and facial expressions. While some facial scanning technology is available, there is still a long way to go in terms of properly identifying emotional states, given the complexity of nuances in speech as well as facial muscle movement. Researchers at the University of Science and Technology of China, in Hefei, believe that they have made a breakthrough. Their paper, "Deep Fusion: An Attention Guided Factorized Bilinear Pooling for Audio-video Emotion Recognition," describes how an AI system can recognize human emotion, reporting state-of-the-art accuracy on a popular benchmark. In their published paper, the researchers say, "Automatic emotion recognition (AER) is a challenging task due to the abstract concept and multiple expressions of emotion. Inspired by this cognitive process in human beings, it's natural to simultaneously utilize audio and visual information in AER … The whole pipeline can be completed in a neural network."
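For readers unfamiliar with the fusion technique named in the paper's title, the following is a minimal PyTorch sketch of factorized bilinear pooling over audio and visual feature vectors. The attention guidance, the feature extractors, and the dimensions and layer names below are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of factorized bilinear pooling for audio-visual fusion.
import torch
import torch.nn as nn

class FactorizedBilinearPooling(nn.Module):
    def __init__(self, audio_dim=128, video_dim=256, factor_dim=512, groups=8, num_emotions=7):
        super().__init__()
        self.groups = groups
        self.proj_a = nn.Linear(audio_dim, factor_dim)  # project audio features
        self.proj_v = nn.Linear(video_dim, factor_dim)  # project visual features
        self.classifier = nn.Linear(factor_dim // groups, num_emotions)

    def forward(self, audio_feat, video_feat):
        # Element-wise interaction approximates a full bilinear (outer-product) fusion.
        joint = self.proj_a(audio_feat) * self.proj_v(video_feat)
        # Sum-pool over groups of factors, then signed square-root and L2 normalization.
        pooled = joint.view(joint.size(0), -1, self.groups).sum(dim=-1)
        pooled = torch.sign(pooled) * torch.sqrt(torch.abs(pooled) + 1e-8)
        pooled = nn.functional.normalize(pooled)
        return self.classifier(pooled)

model = FactorizedBilinearPooling()
logits = model(torch.randn(4, 128), torch.randn(4, 256))  # a batch of 4 clips
print(logits.shape)  # torch.Size([4, 7]) -- one score per emotion class
```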