How Artificial Intelligence Can Be Fooled with 3D Printing…and Stickers
This was, in fact, the reaction the scientists were hoping for. Using subtle alterations imperceptible to the human eye, they changed the objects so that artificial intelligence could no longer recognize them. The technique is known as an adversarial attack: a way to fool AI that is not evident to humans. Song also described a trick in which an image of Hello Kitty was placed in an image-recognition AI's view of a street scene; the cars in the scene simply disappeared.
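To make the idea concrete, here is a minimal sketch of how such an imperceptible perturbation can work, in the style of the fast gradient sign method (FGSM). The toy linear classifier, the weights, and the epsilon value are all illustrative assumptions, not details from the article: each input value is nudged by at most a tiny epsilon, yet the classifier's score shifts by epsilon times the sum of the absolute weights, which can be enough to flip the prediction.

```python
import numpy as np

# Illustrative sketch (not the researchers' actual method): an
# FGSM-style perturbation against a toy linear classifier.
rng = np.random.default_rng(0)

w = rng.normal(size=100)   # classifier weights (hypothetical)
x = rng.normal(size=100)   # original input, e.g. flattened image pixels

score = w @ x              # higher score = more confident in the class

# Gradient of (w @ x) with respect to x is just w, so stepping each
# pixel by -epsilon * sign(w) maximally lowers the score under an
# L-infinity budget of epsilon per pixel.
epsilon = 0.05
x_adv = x - epsilon * np.sign(w)

adv_score = w @ x_adv

# No pixel changed by more than epsilon (imperceptible for real images
# with small enough epsilon), yet the score always drops.
print(np.max(np.abs(x_adv - x)))   # at most epsilon
print(adv_score < score)
```

In real attacks the gradient comes from backpropagating through a deep network rather than a linear model, but the principle is the same: many tiny, coordinated changes add up to a large shift in the model's output.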
Jul-25-2018, 01:37:23 GMT