Child predators are using AI to create sexual images of their favorite 'stars': 'My body will never be mine again'

The Guardian

Predators active on the dark web are increasingly using artificial intelligence to create sexually explicit images of children, fixating especially on "star" victims, child safety experts warn. Child safety groups tracking the activity of predators chatting in dark web forums say they are increasingly finding conversations about creating new images based on older child sexual abuse material (CSAM). Many of these predators obsess over child victims who are referred to as "stars" in predator communities because of the popularity of their images. "The communities of people who trade this material get infatuated with individual children," said Sarah Gardner, chief executive officer of the Heat Initiative, a Los Angeles non-profit focused on child protection. "They want more content of those children, which AI has now allowed them to do."


AI-created child sexual abuse images 'threaten to overwhelm internet'

The Guardian

The "worst nightmares" about artificial intelligence-generated child sexual abuse images are coming true and threaten to overwhelm the internet, a safety watchdog has warned. The Internet Watch Foundation (IWF) said it had found nearly 3,000 AI-made abuse images that broke UK law. The UK-based organisation said existing images of real-life abuse victims were being built into AI models, which then produce new depictions of them. It added that the technology was also being used to create images of celebrities who have been "de-aged" and then depicted as children in sexual abuse scenarios. Other examples of child sexual abuse material (CSAM) included using AI tools to "nudify" pictures of clothed children found online.


The AI-Generated Child Abuse Nightmare Is Here

WIRED

A horrific new era of ultrarealistic, AI-generated child sexual abuse images is now underway, experts warn. Offenders are using downloadable open source generative AI models, which can produce images, to devastating effect. The technology is being used to create hundreds of new images of children who have previously been abused. Offenders are sharing datasets of abuse images that can be used to customize AI models, and they're starting to sell monthly subscriptions to AI-generated child sexual abuse material (CSAM). The details of how the technology is being abused are included in a new, wide-ranging report released by the Internet Watch Foundation (IWF), a nonprofit based in the UK that scours and removes abuse content from the web.


Apple defends scanning iPhones for child abuse images, saying algorithm only identifies flagged pics

Daily Mail - Science & tech

Apple is pushing back against criticism over its plan to scan photos on users' iPhones and in iCloud storage in search of child sexual abuse images. In a Frequently Asked Questions document focusing on its 'Expanded Protections for Children,' Apple insisted its system couldn't be exploited to seek out images related to anything other than child sexual abuse material (CSAM). The system will not scan photo albums, Apple says, but rather looks for matches against a database of 'hashes' - a type of digital fingerprint - of known CSAM images provided by child safety organizations. While privacy advocates worry about 'false positives,' Apple boasted that 'the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year.' Apple also claims it would 'refuse any such demands' from government agencies, in the US or abroad.
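
For readers unfamiliar with hash matching, the sketch below illustrates the general idea of checking an image's fingerprint against a database of known-image hashes. It is not Apple's implementation: Apple's system uses a proprietary perceptual hash (NeuralHash) that tolerates resizing and re-encoding, plus cryptographic protocols that keep the comparison private, whereas this sketch uses SHA-256 as a stand-in and a hypothetical KNOWN_HASHES set, so it only matches byte-identical files.

    import hashlib
    from pathlib import Path

    # Hypothetical fingerprint database of known abuse images, as would be
    # supplied by child safety organizations. The single entry here is a
    # placeholder value for illustration only.
    KNOWN_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def fingerprint(path: Path) -> str:
        # A digest of the raw file bytes serves as the image's fingerprint.
        # A perceptual hash would instead derive the digest from visual
        # features so that near-duplicate copies also match.
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def is_flagged(path: Path) -> bool:
        # An image is flagged only if its fingerprint appears in the
        # database; nothing is inferred about unmatched images.
        return fingerprint(path) in KNOWN_HASHES

The design point this illustrates is that the check is set membership on fingerprints of already-known images, not analysis of photo content, which is the basis for Apple's claim that the system cannot be repurposed to seek out other kinds of material.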