Poison pill tool could break AI systems that steal unauthorized data, allowing artists to safeguard their works
AI image generators Midjourney and Stable Diffusion trained their models on the works of countless artists without permission or compensation, artists say. A new image-protection tool is designed to poison AI programs trained on unauthorized data, giving creators a way to safeguard their pieces and to harm systems they say are stealing their work.

Nightshade, a new tool from a University of Chicago team, embeds data in an image's pixels that damages AI image generators that scour the web for pictures to train on, causing them to stop working properly. According to the team's research, an AI program might interpret a Nightshade-protected image of a dog as a cat, a photo of a car as a cow, and so on, causing the machine to malfunction.

Photo caption: A visitor takes a picture with his mobile phone of an image designed with artificial intelligence by Berlin-based digital creator Julian van Dieken, inspired by Johannes Vermeer's painting "Girl with a Pearl Earring," at the Mauritshuis museum in The Hague on March 9, 2023.
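To illustrate the general idea described above, here is a minimal, hypothetical sketch of a poisoned training sample. This is not Nightshade's actual algorithm (which uses carefully optimized perturbations to shift a model's learned concept associations); it only shows the two ingredients the article describes: pixel changes small enough to be nearly invisible to a human viewer, paired with a deliberately mismatched concept so that a model scraping the image learns the wrong association (e.g., "dog" image treated as "cat").

```python
import numpy as np

rng = np.random.default_rng(0)

def poison_image(image, epsilon=2):
    """Toy stand-in for a poisoning step: add a small bounded perturbation
    (within +/- epsilon intensity levels out of 255) to every pixel.
    Nightshade's real perturbations are optimized, not random."""
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    return np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)

# A 64x64 grayscale "photo of a dog" (random stand-in data for the demo).
original = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
poisoned = poison_image(original)

# The pixel changes are tiny relative to the 0-255 intensity range...
max_change = int(np.abs(poisoned.astype(int) - original.astype(int)).max())

# ...but the sample is paired with a mismatched label, so a scraper that
# trains on it pushes the model toward the wrong concept.
training_pair = (poisoned, "cat")  # image depicts a dog, labeled as a cat
```

The point of the sketch is the asymmetry the article highlights: a human sees essentially the same picture, while a model ingesting many such samples has its dog/cat association quietly corrupted.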
Oct-26-2023, 20:30:03 GMT