Why is AI Considered a Misfit to Read Human Emotions?

#artificialintelligence

AI has been reigning over industries and business ecosystems with its seemingly unending capabilities to accelerate automation and provide business intelligence. Disruptive technologies like artificial intelligence, machine learning, and blockchain have enabled companies to create better user experiences and advance business growth. Emotional AI is a relatively recent development in the field, and its proponents claim that AI systems can read facial expressions and analyze human emotions. This approach is also known as affect recognition technology. Recently, Article 19, a British human rights organization, published a report documenting the increasing use of AI-based emotion recognition technology in China by law enforcement authorities, corporate bodies, and the state itself.


China's growing use of emotion recognition tech raises rights concerns

The Japan Times

Technology that measures emotions based on biometric indicators such as facial movements, tone of voice or body movements is increasingly being marketed in China, researchers say, despite concerns about its accuracy and wider human rights implications. Drawing upon artificial intelligence, the tools range from cameras that help police monitor a suspect's face during an interrogation to eye-tracking devices in schools that identify students who are not paying attention. A report released this week by U.K.-based human rights group Article 19 identified dozens of companies offering such tools in the education, public security and transportation sectors in China. "We believe that their design, development, deployment, sale and transfers should be banned due to the racist foundations and fundamental incompatibility with human rights," said Vidushi Marda, a senior program officer at Article 19. Human emotions cannot be reliably measured and quantified by technology tools, said Shazeda Ahmed, a doctoral candidate studying cybersecurity at the University of California, Berkeley and the report's co-author. Such systems can perpetuate bias, especially those sold to police that purport to identify criminality based on biometric indicators, she added.