Clothoff


Teen sues AI tool maker over fake nude images

FOX News

A 17-year-old's lawsuit against an AI clothes removal company highlights growing privacy concerns as fake nude images spread through schools and social media.


Using AI to Humiliate Women: The Men Behind Deepfake Pornography

Der Spiegel International

The whistleblower confirmed to DER SPIEGEL that all Clothoff employees work in countries that used to belong to the Soviet Union. That is consistent with the fact that all of the company's internal communications that DER SPIEGEL has in its possession are completely in Russian, and the company's email service is also based in Russia. The four central players declined to respond to attempts by DER SPIEGEL to contact them for the story published in December 2024. A person named Elias did get in touch, however, claiming to be a spokesperson for the app. He said the four people mentioned above were unknown to him.


Artificial Intelligence and Deepfakes: The Growing Problem of Fake Porn Images

Der Spiegel International

In San Francisco, meanwhile, a lawsuit is underway against the operators of a number of nudify apps. In some instances, the complaint identifies the defendants by name, but in the case of Clothoff, the accused is only listed as "Doe," the name frequently used in the U.S. for unknown defendants. According to the website's imprint, Clothoff is operated out of the Argentinian capital Buenos Aires. But the company has concealed the true identities of its operators through the use of shell companies and other methods. For a time, operators even sought to mislead the public with a fake image, presumably generated by AI, of the purported head of Clothoff.


Revealed: the names linked to ClothOff, the deepfake pornography app

The Guardian

The first Miriam al-Adib learned of the pictures was when she returned home from a business trip. "I want to show you something," her daughter said. The girl, 14, opened her phone to show an explicit image of herself. "It's a shock when you see it," said Adib, a gynaecologist in the southern Spanish town of Almendralejo and a mother of four daughters. "The image is completely realistic … If I didn't know my daughter's body, I would have thought that image was real."


The true story of the devastating 2015 Mariana dam disaster

The Guardian

Who is behind the most notorious "deepfake" app on the internet? Trying to answer that question these past few months, for a new Guardian podcast series, Black Box, has been like wandering through a hall of mirrors. The app, ClothOff, has hundreds of thousands of followers and has already been used in at least two cases to generate dozens of images of underage girls – pictures that have left the girls traumatised, their parents outraged and the police baffled at how to stop it. Producers Josh Kelly, Alex Atack and I have followed ClothOff's trail to nondescript addresses in central London that appear to be unoccupied. We have encountered sham businesses, distorted voices and photographs of fake employees.