UK seeking to curb AI child sex abuse imagery with tougher testing
The UK government will allow tech firms and child safety charities to proactively test artificial intelligence tools to make sure they cannot create child sexual abuse imagery.

An amendment to the Crime and Policing Bill announced on Wednesday would enable authorised testers to assess models for their ability to generate illegal child sexual abuse material (CSAM) prior to their release.

Technology Secretary Liz Kendall said the measures would ensure AI systems can be made safe at the source - though some campaigners argue more still needs to be done.

It comes as the Internet Watch Foundation (IWF) said the number of AI-related CSAM reports had doubled over the past year. The charity, one of only a few in the world licensed to actively search for child abuse content online, said it had removed 426 pieces of reported material between January and October 2025.
Nov-12-2025, 00:10:41 GMT