This new data poisoning tool lets artists fight back against generative AI
Meta, Google, Stability AI, and OpenAI did not respond to MIT Technology Review's request for comment. Zhao's team also developed Glaze, a tool that allows artists to "mask" their personal style to prevent it from being mimicked by AI models trained on scraped images. It works in a similar way to Nightshade: by changing the pixels of images in subtle ways that are invisible to the human eye but cause machine-learning models to interpret the image as something different from what it actually shows. The team intends to integrate Nightshade into Glaze, and artists can choose whether or not to use the data-poisoning tool. The team is also making Nightshade open source, which would allow others to tinker with it and build their own versions.
Oct-23-2023, 17:26:40 GMT