How Artificial Intelligence Can Be Fooled with 3D Printing…and Stickers

#artificialintelligence 

This was, in fact, the reaction the scientists were hoping for. Using subtle alterations imperceptible to the human eye, they changed the objects in a way that made them unrecognizable to artificial intelligence. The technique is known as an adversarial attack: a way to fool AI without the manipulation being evident to humans. Song also described a trick in which a Hello Kitty image was placed in an image-recognition AI's view of a street scene. The cars in the scene simply disappeared.
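To make the idea concrete, here is a minimal sketch of the gradient-based perturbation behind many adversarial attacks (FGSM-style), applied to a toy logistic classifier. This is purely illustrative and is not the method from the article; the model, weights, and step size are all invented for the demo. On real high-dimensional images, the per-pixel step can be far smaller and still flip the prediction, which is why the change is imperceptible to humans.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturb(x, w, b, y, eps):
    """Nudge input x by eps in the sign of the loss gradient.

    Toy logistic model: p = sigmoid(w.x + b), true label y in {0, 1}.
    For cross-entropy loss, the gradient w.r.t. x is (p - y) * w.
    Moving x along sign(grad) increases the loss, pushing the model
    toward a wrong prediction.
    """
    p = sigmoid(np.dot(w, x) + b)
    grad = (p - y) * w
    return x + eps * np.sign(grad)

# Hypothetical classifier and a "clean" input it labels as class 1.
w = np.array([2.0, -1.0])
b = 0.0
x = np.array([1.0, 0.5])
p_clean = sigmoid(np.dot(w, x) + b)  # > 0.5, so class 1

# Perturb the input; the prediction flips to class 0.
# (eps is large here because the toy input has only 2 dimensions.)
x_adv = fgsm_perturb(x, w, b, y=1.0, eps=1.0)
p_adv = sigmoid(np.dot(w, x_adv) + b)  # < 0.5, so class 0
```

The same principle scales to deep networks: compute the gradient of the loss with respect to the input pixels, then shift every pixel a tiny amount in the direction that maximizes the loss.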
