UK seeking to curb AI child sex abuse imagery with tougher testing

BBC News 

The UK government will allow tech firms and child safety charities to proactively test artificial intelligence tools to make sure they cannot create child sexual abuse imagery.

An amendment to the Crime and Policing Bill announced on Wednesday would enable authorised testers to assess models for their ability to generate illegal child sexual abuse material (CSAM) prior to their release.

Technology Secretary Liz Kendall said the measures would ensure AI systems can be made safe at the source - though some campaigners argue more still needs to be done.

It comes as the Internet Watch Foundation (IWF) said the number of AI-related CSAM reports had doubled over the past year. The charity, one of only a few in the world licensed to actively search for child abuse content online, said it had removed 426 pieces of reported material between January and October 2025.