Apple’s iCloud photo-scanning delay welcomed by privacy advocates

Apple has delayed plans to automatically scan Apple devices for CSAM

Apple’s decision to delay the introduction of technology that would scan photos in devices including iPhones has been welcomed by tech privacy advocates, amid a huge backlash against the company.

The US tech and communications giant planned to roll out an automated photo scanning system that would search Apple’s iCloud for images of child abuse, with moderators then potentially reporting photo owners to law enforcement. Some critics called the system spyware in disguise.

Apple’s September announcement that the company would “take additional time” to release the system — following much online criticism about user privacy — came as a relief to many critics of the plan, including tech privacy groups.

Evan Greer, director of digital rights group Fight for the Future, said it’s encouraging that “the backlash has forced Apple to delay this reckless and dangerous surveillance plan, but the reality is that there is no safe way to do what they are proposing. Apple’s current proposal will make vulnerable children less safe, not more safe. They should shelve it permanently.”

Apple’s photo-scanning system would check images uploaded to iCloud against a database of known child abuse imagery supplied by the National Center for Missing and Exploited Children (NCMEC). The company said the move was “intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material.”

“At no point in the rollout does anybody say, ‘Is this objectively the right thing to do?’”

Matthew Green, Johns Hopkins University professor

Following the announcement of the system, floods of criticism appeared online over privacy and the potential for authoritarian governments to abuse the technology. Apple employees reportedly spoke out internally as well, raising concerns that repressive governments might use the system to censor content or detain people.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said.

Following the company’s reverse-ferret, Fight for the Future director Greer said of Apple: “Instead of undercutting encryption, they should be expanding it to protect more people, including children, by encrypting iCloud and addressing security vulnerabilities in iMessage.”

Matthew Green, a cryptography researcher at Johns Hopkins University and prominent Apple critic, suggested that the company’s secrecy and lack of user consultation may have left it blindsided by the negative response.

“At no point in the rollout of these systems does anyone say, ‘Is this objectively the right thing to do?’ Nor do they consult with users. In fact many of these scanning systems require companies to sign NDAs [non-disclosure agreements] prior to deployment,” Green tweeted.

Before Apple’s reversal announcement, Greg Bensinger of the New York Times had already hit the nail on the head by pointing out that “it’s a good indication that things are headed in the wrong direction when your company’s anti-child-pornography initiative gets panned.”



Jamie F

Jamie is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice.
