Action on sexual abuse images is overdue, but Apple's proposals bring other dangers

Ross Anderson

The Guardian 

Last week, Apple announced two backdoors, in the US, into the encryption that protects its devices. The first will monitor iMessages: if any photos sent by or to under-13s seem to contain nudity, the user may be challenged and their parents may be informed. The second will see Apple scan all the images in a phone's camera roll and, if they're similar to known sex-abuse images, flag them as suspect. If enough suspect images are backed up to an iCloud account, they'll be decrypted and inspected. If Apple thinks they're illegal, the user will be reported to the relevant authorities. Action on the circulation of child sexual abuse imagery is long overdue.
