Amazon discovered a 'high volume' of CSAM in its AI training data but isn't saying where it came from
The company's reports were not actionable, according to a child safety organization.

The National Center for Missing and Exploited Children (NCMEC) said it received more than 1 million reports of AI-related child sexual abuse material (CSAM) in 2025. The vast majority of that content was reported by Amazon, which found the material in its training data, according to an investigation by .

Amazon said only that it obtained the inappropriate content from external sources used to train its AI services, and claimed it could not provide any further details about where the CSAM came from.

"This is really an outlier," Fallon McNulty, executive director of NCMEC's CyberTipline, told . The CyberTipline is where many types of US-based companies are legally required to report suspected CSAM.
Jan-29-2026, 22:47:49 GMT