Apple has attempted to allay fears that its upcoming photo-scanning feature, designed to detect images of child abuse among photos uploaded to iCloud, threatens user privacy.
Last week Apple announced a new feature for products including iPhones and iPads that will scan users’ iCloud photos for matches against known images of child sexual abuse material (CSAM) provided by the National Center for Missing and Exploited Children (NCMEC).
When the automated system detects such images, they will be reviewed by a human moderator. The user who owns the photos could then be reported to the authorities for CSAM possession.
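As a rough illustration of the general approach, the sketch below checks each uploaded photo against a fixed set of known-image hashes and routes any match to human review. It is a deliberately simplified, hypothetical example: Apple’s actual system uses an on-device perceptual hash (NeuralHash) and cryptographic matching rather than a plain SHA-256 lookup, and every name and value below is made up.

```python
# Simplified, hypothetical sketch of hash-based matching; NOT Apple's NeuralHash
# or its cryptographic private-set-intersection protocol.
import hashlib

# Placeholder digests standing in for hashes of known CSAM images supplied by NCMEC.
KNOWN_IMAGE_HASHES = {
    "0" * 64,  # hypothetical placeholder entry, not real data
}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real one tolerates resizing and re-encoding."""
    return hashlib.sha256(image_bytes).hexdigest()

def needs_human_review(image_bytes: bytes) -> bool:
    """True if an uploaded photo matches a known hash and should go to a moderator."""
    return image_hash(image_bytes) in KNOWN_IMAGE_HASHES
```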
Matthew Green, a professor at Johns Hopkins University who has worked on cryptographic technologies, tweeted that the news that Apple could directly scan customers’ devices was “disturbing”.
He wrote: “So when you tell the general public that you’re going to start rifling through their personal photos *on their own device* they don’t like it. It is, to be blunt and US-centric, unamerican.”
Tim Sweeney, CEO and founder of Epic Games, tweeted: “Inescapably, this is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to government.”
Apple released a document attempting to quell concerns that the function could be abused by governments seeking access to users’ personal information.
The company said: “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”
That promise will be harder to keep in some countries beyond the US, which seemed to be the focus of Apple’s communications. Apple was criticised by tech observers earlier this year for opening a data center in China’s Guizhou province to host iCloud data for Chinese customers.
For that move Apple was compelled to partner with a Chinese company, Guizhou-Cloud Big Data, to operate its Chinese iCloud service. China’s ruling Communist Party, which regularly jails and ‘disappears’ critics and activists, imposes tight compliance rules on Chinese tech firms.
In the document Apple stopped short of saying that its CSAM detection system could not be used to detect anything other than CSAM, instead saying that the “process is designed to prevent that from happening.”
“There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC,” Apple added.
Apple claims there is a “less than one in one trillion per year” chance that the scanning system will incorrectly flag a given account for CSAM images.
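That figure rests on requiring several independent matches before an account is flagged, so that even a modest per-image false-match rate collapses into an astronomically small per-account probability. The sketch below works the binomial tail calculation through with purely hypothetical numbers; the rate, photo count, and threshold are illustrative and are not Apple’s published parameters.

```python
# Hypothetical worked example: per-account false-flag probability when `threshold`
# independent false matches are required. Numbers are illustrative, not Apple's.
from math import lgamma, log, exp

def account_flag_probability(p: float, n: int, threshold: int, extra_terms: int = 200) -> float:
    """Approximate P(X >= threshold) for X ~ Binomial(n, p), summed in log space.

    Summing a few hundred terms past the threshold is ample here because the
    terms shrink extremely quickly once n * p is far below the threshold.
    """
    total = 0.0
    for k in range(threshold, min(threshold + extra_terms, n) + 1):
        log_term = (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
                    + k * log(p) + (n - k) * log(1 - p))
        total += exp(log_term)
    return total

# E.g. a 1-in-a-million per-image false match, 10,000 photos a year, 30 matches required:
print(account_flag_probability(1e-6, 10_000, 30))  # roughly 4e-93, far below one in a trillion
```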