Artificial Intelligence Toolkit Spots New Child Sexual Abuse Media Online
New artificial intelligence software designed to spot new child sexual abuse media online could help police catch child abusers. The toolkit, described in a paper published in Digital Investigation, automatically detects new child sexual abuse photos and videos in online peer-to-peer networks.

The research behind this technology was conducted as part of the international research project iCOP (Identifying and Catching Originators in P2P Networks), funded by the European Commission Safer Internet Program, by researchers at Lancaster University, the German Research Center for Artificial Intelligence (DFKI), and University College Cork, Ireland.

There are hundreds of searches for child abuse images every second worldwide, resulting in hundreds of thousands of child sexual abuse images and videos being shared every year. The people who produce child sexual abuse media are often abusers themselves: the US National Center for Missing and Exploited Children found that 16 percent of people who possess such media had directly and physically abused children.
Dec-9-2016, 02:36:13 GMT