UK watchdog warns against AI for emotional analysis, dubs 'immature' biometrics a bias risk


The U.K.'s privacy watchdog has warned against use of so-called "emotion analysis" technologies for anything more serious than kids' party games, saying there's a discrimination risk attached to applying "immature" biometric tech that makes pseudoscientific claims about being able to recognize people's emotions using AI to interpret biometric data inputs.

Such AI systems 'function', if we can use the word, by claiming to be able to 'read the tea leaves' of one or more biometric signals, such as heart rate, eye movements, facial expression, skin moisture, gait tracking, vocal tone etc., and perform emotion detection or sentiment analysis to predict how the person is feeling -- presumably after being trained on a bunch of visual data of faces frowning, faces smiling and so on. But you can immediately see the problem with trying to assign individual facial expressions to absolute emotional states -- because no two people, and often no two emotional states, are the same; hence hello pseudoscience!

The watchdog's deputy commissioner, Stephen Bonner, appears to agree that this high tech nonsense must be stopped -- saying today there's no evidence that such technologies do actually work as claimed (or that they will ever work). "Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever," he warned in a statement. "While there are opportunities present, the risks are currently greater."
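To make the article's objection concrete, here is a minimal, entirely hypothetical sketch (not any vendor's actual system) of the kind of pipeline described above: reduce a couple of biometric signals to a feature vector, then map it to whichever labelled "emotion" it sits closest to in training data. The feature names, centroid values, and nearest-centroid approach are all illustrative assumptions.

```python
# Hypothetical sketch of biometric "emotion detection": nearest-centroid
# matching of a feature vector against per-emotion averages. The features
# and numbers below are invented for illustration.
from math import dist

# Invented "training" centroids: mean (heart_rate_bpm, smile_intensity)
# per labelled emotion.
CENTROIDS = {
    "happy": (72.0, 0.8),
    "stressed": (95.0, 0.2),
}

def predict_emotion(heart_rate: float, smile: float) -> str:
    """Return the emotion label whose centroid is nearest the input."""
    features = (heart_rate, smile)
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

# Two people feeling exactly the same way, but with different physiology,
# get different labels -- the article's core complaint in two lines:
print(predict_emotion(70.0, 0.9))   # calm smiler -> "happy"
print(predict_emotion(98.0, 0.9))   # fast-hearted smiler -> "stressed"
```

The point of the toy example is that the mapping is only as good as the assumption that the same internal state produces the same biometric signature across people, which is precisely the claim the ICO calls pseudoscientific.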
