New tool lets artists fight AI image bots by hiding corrupt data in plain sight

Engadget 

From Hollywood strikes to digital portraits, AI's potential to steal creatives' work — and how to stop it — has dominated the tech conversation in 2023. The latest effort to protect artists and their creations is Nightshade, a tool that lets artists add invisible pixel-level changes to their work that can corrupt an AI model's training data, the MIT Technology Review reports. University of Chicago professor Ben Zhao and his team created Nightshade, which is currently undergoing peer review, in an effort to put some of the power back in artists' hands. They tested it on recent Stable Diffusion models and on an AI model they built from scratch. Nightshade essentially works as a poison: images treated with it alter how a machine-learning model trained on them produces content, and what that finished product looks like.
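To make the idea of imperceptible pixel changes concrete, here is a minimal Python sketch of a generic pixel-space perturbation. This is only an illustration of the general concept, not Nightshade's actual algorithm (which optimizes its perturbations specifically to mislead model training); the function name and parameters are hypothetical.

```python
import random

def add_imperceptible_perturbation(image, strength=2, seed=0):
    """Return a copy of an 8-bit grayscale image (list of pixel rows)
    with a tiny random offset added to each pixel, clipped to [0, 255].

    Hypothetical illustration only -- Nightshade's real perturbations
    are optimized against a target model, not random noise.
    """
    rng = random.Random(seed)
    return [
        [max(0, min(255, px + rng.randint(-strength, strength))) for px in row]
        for row in image
    ]

# A shift of a couple of intensity levels is invisible to a human viewer,
# yet many such altered images in a training set can skew what a model learns.
original = [[128] * 4 for _ in range(4)]
poisoned = add_imperceptible_perturbation(original)
```

The key property is that every pixel moves by at most `strength` levels, so the poisoned image looks identical to the original to a person while still differing numerically.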
