Apple explains why it backed away from scanning for abuse materials

iCloud CSAM scanner would have been a 'slippery slope of unintended consequences'

In an exchange with a child safety group, Apple has explained why it abandoned its 2021 plans to scan the contents of customers' iCloud accounts for child sexual abuse materials (CSAM).

In a letter published by Wired, the CEO of child safety group Heat Initiative, Sarah Gardener, said she was disappointed that Apple had killed its plans last year.

"We firmly believe that the solution you unveiled not only positioned Apple as a global leader in user privacy but also promised to eradicate millions of child sexual abuse images and videos from iCloud," Gardener wrote.

"The detection of these images and videos respects the privacy of survivors who have endured these abhorrent crimes - a privilege they undeniably deserve."

Gardener said Apple should create a robust reporting system for users to report CSAM, and informed the company that Heat Initiative planned to publicly request these actions from Apple within a week.

In his response the following day, Erik Neuenschwander, Apple's director of user privacy and child safety, said that while Apple shared Gardener's abhorrence of child abuse, "scanning every user's privately stored iCloud content would in our estimation pose serious unintended consequences."

He said that after consulting widely, Apple had come to the conclusion that the planned scanner would not be "practically possible to implement without ultimately imperilling the security and privacy of our users."

With sophisticated attacks increasing, "scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," he added.

"It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types (such as images, videos, text, or audio) and content categories."

Neuenschwander went on to argue that tools built for one type of surveillance can be reconfigured to detect political or religious material, which could lead to the persecution of certain groups.

"Tools of mass surveillance have widespread negative implications for freedom of speech and, by extension, democracy as a whole," he wrote, adding that scanning technologies are "not foolproof".

In place of scanning iCloud, Neuenschwander said, Apple has "deepened [its] commitment" to the Communication Safety feature in Messages. This warns children when they receive or attempt to send messages that contain nudity, and offers them ways to seek help. It has been extended to cover AirDrop, the Photo picker, FaceTime video messages, and Contact Posters in the Phone app. The features are privacy-preserving, he said, and available to developers of third-party apps.
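The letter does not name the interface third-party developers get, but assuming it refers to Apple's SensitiveContentAnalysis framework introduced with iOS 17, a simplified Swift sketch of the on-device check an app might run before displaying an image could look like the following (the app also needs Apple's sensitive content analysis entitlement, and analysis only runs when the relevant safety setting is switched on):

import Foundation
import SensitiveContentAnalysis

// Simplified sketch: ask the on-device analyzer whether an incoming image
// should be blurred before display. No image data leaves the device.
func warnIfSensitive(imageAt url: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // The analyzer does nothing unless the user (or a parent, for child
    // accounts) has enabled Sensitive Content Warnings or Communication Safety.
    guard analyzer.analysisPolicy != .disabled else { return }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        if analysis.isSensitive {
            // Blur the image and show an intervention rather than the content.
        }
    } catch {
        // Analysis failed; fall back to the app's normal handling.
    }
}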

Neuenschwander insisted that Apple is working with the child safety community to make law enforcement easier, and collaborating with other companies to create shared resources to tackle exploitation.

Scanning of cloud services and devices for CSAM has become a key sticking point in legislation such as the UK's Online Safety Bill. Technologists and civil liberties campaigners often argue that selective surveillance of this kind cannot work, while child safety groups insist the issue is so important that online safety should, in this case, be put above privacy concerns.