Apple to scan iPhones of US customers for CSAM
Apple has announced that it will roll out a system to detect Child Sexual Abuse Material (CSAM) on the iPhones of its US customers. The new versions of iOS and iPadOS, slated for release later this year, are expected to include “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy”.
Before an image is stored in iCloud Photos, Apple said, the technology will search for matches against known CSAM images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations. If a match is found, a human reviewer will assess it and report the user to law enforcement.
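The basic idea is to compare each image against a database of digests of already-known material rather than analysing image content from scratch. The sketch below is only an illustration of that matching pattern under simplifying assumptions: the type names and the pre-upload check are hypothetical, and a plain SHA-256 digest stands in for the perceptual hashing and cryptographic protocol Apple describes, which additionally involves on-device blinding and a match threshold before any human review.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. A cryptographic digest is used here as a
// stand-in for a perceptual image hash; real systems tolerate small
// image edits, which an exact SHA-256 match does not.
struct KnownImageDatabase {
    // Hex-encoded digests of known images, as supplied by child-safety organisations.
    let knownDigests: Set<String>

    func matches(imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownDigests.contains(hex)
    }
}

// Hypothetical pre-upload check: a matching image is flagged for human
// review rather than being reported automatically.
func shouldFlagForReview(_ imageData: Data, against database: KnownImageDatabase) -> Bool {
    return database.matches(imageData: imageData)
}
```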
However, privacy concerns have been raised that the technology could be used by authoritarian governments to spy on their citizens, or expanded to scan phones for other prohibited content or political speech.