Pedophiles on dark web turning to AI program to generate sexual abuse content
Kara Frederick, tech director at the Heritage Foundation, discusses the need for regulations on artificial intelligence as lawmakers and tech titans debate the potential risks.

An internet watchdog is sounding the alarm over a growing trend of sex offenders collaborating online to use open source artificial intelligence to generate child sexual abuse material (CSAM).

"There's a technical community within the offender space, particularly dark web forums, where they are discussing this technology," Dan Sexton, chief technology officer at the Internet Watch Foundation (IWF), told The Guardian in a report last week. "They are sharing imagery, they're sharing [AI] models."

Sexton's organization has found that offenders are increasingly turning to open source AI models to create illegal CSAM and distribute it online. Unlike closed AI models such as OpenAI's Dall-E or Google's Imagen, open source AI technology can be downloaded and adjusted by users, according to the report. Sexton said the ability to use such technology has spread among offenders, who take to the dark web to create and distribute realistic images.

"The content that we've seen, we believe is actually being generated using open source software, which has been downloaded and run locally on people's computers and then modified," he said.
Sep-19-2023, 06:00:45 GMT