From RT.com…
US iPhone users’ photos will be scanned by Apple’s automated “neuralMatch” system for images of child abuse, according to reports. Security researchers are alarmed, warning that the scheme threatens user privacy and encryption.
The Financial Times reported on the plan Thursday, citing anonymous sources briefed on Apple’s plans. The scheme was reportedly shared with some US academics in a virtual meeting earlier in the week.
Apple plans to scan US iPhones for child abuse imagery https://t.co/wptzpVjEdN
— Financial Times (@FT) August 5, 2021
Dubbed “neuralMatch,” the system will reportedly scan every photo uploaded to iCloud in the US and tag it with a “safety voucher.” Once a certain number of photos – the FT report did not specify how many – are labeled as suspect, Apple will decrypt the flagged photos and alert human reviewers, who can then contact the relevant authorities if the imagery is verified as illegal. The program is initially intended to roll out in the US only.
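The flow the report describes can be sketched in a few lines. Everything below is an illustrative assumption, not Apple’s actual implementation: the `SafetyVoucher` structure, the `THRESHOLD` value, and the use of a plain cryptographic hash (a real system would use a perceptual hash robust to resizing and recompression) are all stand-ins.

```python
# Hypothetical sketch of the reported flow: each uploaded photo is
# checked against fingerprints of known abuse imagery, tagged with a
# "safety voucher" recording the result, and the account is escalated
# to human review only after matches cross a threshold.
import hashlib
from dataclasses import dataclass

THRESHOLD = 3  # the report says the real threshold is unspecified

# Stand-in for a database of fingerprints of known illegal images.
KNOWN_HASHES = {hashlib.sha256(b"known-image-%d" % i).hexdigest()
                for i in range(5)}

@dataclass
class SafetyVoucher:
    photo_id: str
    matched: bool

def fingerprint(photo_bytes: bytes) -> str:
    # A cryptographic hash stands in for a perceptual one here.
    return hashlib.sha256(photo_bytes).hexdigest()

def tag_upload(photo_id: str, photo_bytes: bytes) -> SafetyVoucher:
    return SafetyVoucher(photo_id, fingerprint(photo_bytes) in KNOWN_HASHES)

def account_flagged(vouchers) -> bool:
    # Decryption and human review happen only past the threshold.
    return sum(v.matched for v in vouchers) >= THRESHOLD

uploads = [(f"photo-{i}", b"known-image-%d" % i) for i in range(3)]
uploads.append(("photo-x", b"harmless vacation photo"))
vouchers = [tag_upload(pid, data) for pid, data in uploads]
print(account_flagged(vouchers))  # three matches meet the threshold
```

The point of the threshold, per the report, is that no single tagged photo triggers decryption or review; only an accumulation of suspect vouchers does.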