Nudify


Deepfake 'Nudify' Technology Is Getting Darker--and More Dangerous

WIRED

Sexual deepfakes continue to get more sophisticated, capable, easy to access, and perilous for millions of women who are abused with the technology. Open the website of one explicit deepfake generator and you'll be presented with a menu of horrors. With just a couple of clicks, it offers you the ability to convert a single photo into an eight-second explicit video clip, inserting women into realistic-looking graphic sexual situations. "Transform any photo into a nude version with our advanced AI technology," text on the website says. The options for potential abuse are extensive.


Australia clamps down on 'nudify' sites used for AI-generated child abuse

Al Jazeera

Internet users in Australia have been blocked from accessing several websites that used artificial intelligence to create child sexual exploitation material, the country's internet regulator has announced. The three "nudify" sites withdrew from Australia following an official warning, eSafety Commissioner Julie Inman Grant said on Thursday. Grant said such "nudify" services, which allow users to make images of real people appear naked using AI, have had a "devastating" effect in Australian schools. "We took enforcement action in September because this provider failed to put in safeguards to prevent its services being used to create child sexual exploitation material and were even marketing features like undressing 'any girl,' and with options for 'schoolgirl' image generation and features such as 'sex mode,'" Grant said in a statement. The development comes after Grant's office issued a formal warning to the United Kingdom-based company behind the sites in September, threatening civil penalties of up to 49.5 million Australian dollars ($32.2m) if it did not introduce safeguards to prevent image-based abuse.


Australia moves to stamp out 'nudify' and stalking apps

Al Jazeera

Australia has announced plans to ban apps used for stalking and creating deepfake nudes. Tech platforms will be responsible for preventing access to "nudify" and undetectable online stalking tools under the reforms announced on Tuesday by the Australian government. Minister for Communications Anika Wells said Australia would work with firms to stamp out "abhorrent technologies" while ensuring "legitimate and consent-based" artificial intelligence (AI) and online tracking services were not adversely affected. "Abusive technologies are widely and easily accessible and are causing real and irreparable damage now," Wells said in a statement. "These new, evolving, technologies require a new, proactive, approach to harm prevention – and we'll work closely with industry to achieve this." "While this move won't eliminate the problem of abusive technology in one fell swoop, alongside existing laws and our world-leading online safety reforms, it will make a real difference in protecting Australians," she added.


AI 'Nudify' Websites Are Raking in Millions of Dollars

WIRED

For years, so-called "nudify" apps and websites have mushroomed online, allowing people to create nonconsensual and abusive images of women and girls, including child sexual abuse material. Despite some lawmakers and tech companies taking steps to limit the harmful services, every month, millions of people are still accessing the websites, and the sites' creators may be making millions of dollars each year, new research suggests. An analysis of 85 nudify and "undress" websites--which allow people to upload photos and use AI to generate "nude" pictures of the subjects with just a few clicks--has found that most of the sites rely on tech services from Google, Amazon, and Cloudflare to operate and stay online. The findings, revealed by Indicator, a publication investigating digital deception, say that the websites averaged a combined 18.5 million visitors in each of the past six months and collectively may be making up to $36 million per year. Alexios Mantzarlis, a cofounder of Indicator and an online safety researcher, says the murky nudifier ecosystem has become a "lucrative business" that "Silicon Valley's laissez-faire approach to generative AI" has allowed to persist.


Millions of People Are Using Abusive AI 'Nudify' Bots on Telegram

WIRED

In early 2020, deepfake expert Henry Ajder uncovered one of the first Telegram bots built to "undress" photos of women using artificial intelligence. At the time, Ajder recalls, the bot had been used to generate more than 100,000 explicit photos--including those of children--and its development marked a "watershed" moment for the horrors deepfakes could create. Since then, deepfakes have become more prevalent, more damaging, and easier to produce. Now, a WIRED review of Telegram communities involved with the explicit nonconsensual content has identified at least 50 bots that claim to create explicit photos or videos of people with only a couple of clicks. The bots vary in capabilities, with many suggesting they can "remove clothes" from photos while others claim to create images depicting people in various sexual acts.


AI-powered deepfake nude websites are targeted by San Francisco city attorney's lawsuit

Los Angeles Times

David Chiu announced Thursday that his office is suing the operators of 16 A.I.-powered "undressing" websites that help users create and distribute deepfake nude photos of women and girls. The lawsuit, which city officials said was the first of its kind, accuses the websites' operators of violating state and federal laws that ban deepfake pornography, revenge pornography and child pornography, as well as California's unfair competition law. The names of the sites were redacted in the copy of the suit made public Thursday. Chiu's office has yet to identify the owners of many of the websites, but officials say they hope to find their names and hold them accountable. Chiu said the lawsuit has two goals: shutting down these websites and sounding the alarm about this form of "sexual abuse."


AI-powered 'Nudify' apps that digitally undress fully-clothed teenage girls are soaring in popularity

Daily Mail - Science & tech

Tens of millions of people are using AI-powered 'nudify' apps, according to a new analysis that shows the dark side of the technology. More than 24 million people visited nudify AI websites in September; these sites use deep-learning algorithms to digitally alter images, primarily of women, to make the subjects appear naked in the photo. The algorithms are trained on existing images of women, which allows them to overlay realistic images of nude body parts, regardless of whether the photographed person is clothed. Spam ads directing people to the sites and apps have also increased across major platforms by more than 2,000 percent since the beginning of 2023. The rise in nudify apps is particularly prevalent on social media, including Google's YouTube, Reddit, and X, and 52 Telegram groups were also found to be used to access non-consensual intimate imagery (NCII) services.


'Nudify' Apps That Use AI to 'Undress' Women in Photos Are Soaring in Popularity

TIME - Tech

Apps and websites that use artificial intelligence to undress women in photos are soaring in popularity, according to researchers. In September alone, 24 million people visited undressing websites, the social network analysis company Graphika found. Many of these undressing, or "nudify," services use popular social networks for marketing, according to Graphika. For instance, since the beginning of this year, the number of links advertising undressing apps increased more than 2,400% on social media, including on X and Reddit, the researchers said. The services use AI to recreate an image so that the person is nude.