More than 1 in 10 students say they know of peers who created deepfake nudes, report says
When news broke in February that AI-generated nude pictures of students were circulating at a Beverly Hills middle school, many district officials and parents were horrified. But others said no one should have been blindsided by the spread of AI-powered "undressing" programs. "The only thing shocking about this story," one Carlsbad parent said his 14-year-old told him, "is that people are shocked."

Now a newly released report by Thorn, a tech company that works to stop the spread of child sexual abuse material, shows how common deepfake abuse has become. The proliferation coincides with the wide availability of cheap "undressing" apps and other easy-to-use, AI-powered programs for creating deepfake nudes.
August 29, 2024