Apple defends scanning iPhones for child abuse images, saying algorithm only identifies flagged pics

Daily Mail - Science & tech 

Apple is pushing back against criticism over its plan to scan photos on users' iPhones and in iCloud storage in search of child sexual abuse images. In a Frequently Asked Questions document focusing on its 'Expanded Protections for Children,' Apple insisted its system couldn't be exploited to seek out images related to anything other than child sexual abuse material (CSAM).

The system will not scan photo albums, Apple says, but rather looks for matches against a database of 'hashes' - a type of digital fingerprint - of known CSAM images provided by child safety organizations.

While privacy advocates worry about 'false positives,' Apple said 'the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year.' Apple also claims it would 'refuse any such demands' from government agencies, in the US or abroad, to expand the system beyond CSAM.
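To illustrate the hash-matching idea Apple describes, here is a minimal sketch in Python. Note the assumptions: Apple's actual system uses a proprietary perceptual hash ('NeuralHash') combined with cryptographic techniques such as private set intersection, none of which is shown here; this sketch uses a plain SHA-256 digest and an in-memory set purely to show matching against known fingerprints rather than inspecting image content.

```python
import hashlib

# Hypothetical set of known-image fingerprints, standing in for the
# database supplied by child safety organizations. Apple's real system
# uses perceptual 'NeuralHash' values, not cryptographic digests.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint (here, SHA-256) of an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """Return True only if the image's fingerprint is in the known set.

    Nothing else about the image is examined: the check is a pure
    membership test against known fingerprints, mirroring Apple's claim
    that the system only identifies already-flagged images rather than
    scanning photo albums generally.
    """
    return fingerprint(image_bytes) in KNOWN_HASHES

if __name__ == "__main__":
    sample = b"example image bytes"
    print(is_flagged(sample))  # False: this fingerprint is not in the set
```

One consequence of this design, and the basis of Apple's 'one in one trillion' claim, is that an image can only match if its fingerprint corresponds to an entry already in the database; a perceptual hash makes matching robust to minor edits, at the cost of a small false-positive rate that Apple says it further reduces with a match threshold before any human review.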