'Facial-profiling' could be dangerously inaccurate and biased, experts warn

Israeli startup Faception made headlines this year by claiming it can predict how likely a person is to be a terrorist, a pedophile, and more by analyzing faces with deep learning. Experts and research in the field, however, suggest the claim is more fantasy than reality. Faception assigns its ratings after training artificial-intelligence models on the faces of known terrorists, pedophiles, Mensa members, professional poker players, and other groups. Through deep learning, the emerging technique behind everything from AlphaGo to Siri to Netflix recommendations, the system supposedly predicts how likely a new face is to belong to any given group. While this may sound plausible, there is no evidence that face-based personality predictions are meaningfully accurate.
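The claimed pipeline, train a classifier on faces labeled by group and then score new faces, is easy to sketch, and the sketch also shows why experts warn of bias. The example below is entirely hypothetical (all data and feature names are invented): a plain logistic-regression model stands in for a deep network, and one synthetic feature acts as a spurious cue (say, lighting or camera conditions shared by the training photos) that merely correlates with the label. The model achieves high training accuracy by latching onto that cue rather than anything about the person.

```python
import math
import random

random.seed(0)

def make_sample(label):
    """Hypothetical face 'features': feature 0 is a spurious cue that
    correlates with the label in this training set; feature 1 is pure noise."""
    spurious = label + random.gauss(0, 0.3)
    noise = random.gauss(0, 1.0)
    return [spurious, noise], label

train = [make_sample(random.randint(0, 1)) for _ in range(500)]

# Logistic regression trained by stochastic gradient descent
# (a minimal stand-in for a deep-learning classifier).
w = [0.0, 0.0]
b = 0.0
lr = 0.1

def predict(x):
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1 / (1 + math.exp(-z))

for _ in range(200):
    for x, y in train:
        g = predict(x) - y          # gradient of the logistic loss
        w[0] -= lr * g * x[0]
        w[1] -= lr * g * x[1]
        b -= lr * g

accuracy = sum((predict(x) > 0.5) == y for x, y in train) / len(train)
print(f"training accuracy: {accuracy:.2f}")
print(f"weight on spurious cue: {w[0]:.2f}, weight on noise: {w[1]:.2f}")
```

The model looks impressively accurate on its own training set, yet almost all of its weight sits on the spurious feature, which is exactly the failure mode critics describe: a classifier trained on a biased sample of group photos can learn artifacts of how the photos were collected rather than anything about faces.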
