Action on sexual abuse images is overdue, but Apple's proposals bring other dangers

Ross Anderson
Last week, Apple announced two backdoors in the US into the encryption that protects its devices. The first will monitor iMessages: if any photos sent by or to under-13s seem to contain nudity, the user may be challenged and their parents may be informed. The second will see Apple scan all the images in a phone's camera roll and, if they're similar to known sexual abuse images, flag them as suspect. If enough suspect images are backed up to an iCloud account, they'll be decrypted and inspected, and if Apple thinks they're illegal, the user will be reported to the relevant authorities.

Action on the circulation of child sexual abuse imagery is long overdue.
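The second system works by comparing a compact "perceptual hash" of each photo against hashes of known abuse images, and only acting once the number of matches crosses a threshold. The sketch below illustrates that general idea with a deliberately simple average-hash function and an invented distance threshold; it is not Apple's NeuralHash, whose design is proprietary.

```python
# Illustrative sketch of threshold-based perceptual-hash matching.
# NOT Apple's NeuralHash: the hash function, distance bound and
# reporting threshold here are invented for demonstration only.

def average_hash(pixels):
    """64-bit hash of an 8x8 grayscale image (flat list of 64 values):
    each bit is 1 if that pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def count_matches(photo_hashes, known_hashes, max_distance=4):
    """Count photos whose hash lies within max_distance bits of any
    known-image hash; a report would fire only past some threshold."""
    return sum(
        1 for h in photo_hashes
        if any(hamming(h, k) <= max_distance for k in known_hashes)
    )
```

The point of using a perceptual rather than cryptographic hash is that a slightly edited copy of an image (recompressed, resized, brightened) still lands within a few bits of the original, so near-duplicates are caught; the threshold is meant to keep a stray false match from triggering a report on its own.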
Aug-14-2021, 09:00:35 GMT