
AI tool Grok used to create child sexual abuse imagery, watchdog says

The Guardian

Criminals have claimed to have used Grok to create the imagery on a dark web forum. Online criminals are claiming to have used Elon Musk's Grok AI tool to create sexual imagery of children, as a child safety watchdog warned the technology risked bringing such material into the mainstream. The UK-based Internet Watch Foundation (IWF) said users of a dark web forum boasted of using Grok Imagine to create sexualised and topless imagery of girls aged between 11 and 13. IWF analysts said the images would be considered child sexual abuse material (CSAM) under UK law.


4 Arrested Over Scattered Spider Hacking Spree

WIRED

WIRED reported this week on public records that show the United States Department of Homeland Security urging local law enforcement around the country to interpret common protest activities and surrounding logistics--including riding a bike, livestreaming a police encounter, or skateboarding--as "violent tactics." The guidance could influence cops to use everyday behavior as a pretext for police action.

An AI hiring bot used on the McDonald's "McHire" site exposed tens of millions of job applicants' personal data because of a group of web-based security vulnerabilities--including use of the classically guessable password "123456" on an administrator account. The site's chatbot, known as Olivia, was built by the artificial intelligence software firm Paradox.ai.

Meanwhile, in the wake of last week's devastating floods in Texas that killed at least 120 people, conspiracy theories about the extreme weather event have gained enough traction among anti-government extremists, GOP influencers, and others with large platforms to produce real-world consequences like death threats.


Child predators are using AI to create sexual images of their favorite 'stars': 'My body will never be mine again'

The Guardian

Predators active on the dark web are increasingly using artificial intelligence to create sexually explicit images of children, fixating especially on "star" victims, child safety experts warn. Child safety groups tracking the activity of predators chatting in dark web forums say they are increasingly finding conversations about creating new images based on older child sexual abuse material (CSAM). Many of these predators using AI obsess over child victims referred to as "stars" in predator communities for the popularity of their images. "The communities of people who trade this material get infatuated with individual children," said Sarah Gardner, chief executive officer of the Heat Initiative, a Los Angeles non-profit focused on child protection. "They want more content of those children, which AI has now allowed them to do."