Apple urged to halt plans to roll out new photo scanning feature in open letter

The feature could be exploited by threat actors in the long run, experts warn

More than 5,000 individuals and organisations have signed an open letter urging Apple to rethink the rollout of its new photo scanning feature, which is designed to identify child sexual abuse material (CSAM) on iPhones and iPads.

Apple announced the feature last week, saying that upcoming versions of iOS and iPadOS will include 'new applications of cryptography', enabling the company to identify CSAM images as they are uploaded to iCloud Photos, Apple's online storage service.

However, the open letter from industry experts and privacy advocates cautions that the upcoming changes have the potential to bypass the end-to-end encryption that would otherwise safeguard users' privacy.

"While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products," reads the letter that was posted on Friday and has already been signed by over 5,000 tech executives, privacy supporters, legal experts, researchers, professors and more.

The letter cautions that the photo-scanning feature amounts to creating backdoors in Apple's software, which could be exploited by threat actors in the long run.

It requests that Apple halt implementation of the photo-scanning feature and issue a statement "reaffirming their commitment to end-to-end encryption and to user privacy".

On Thursday, the Electronic Frontier Foundation (EFF) published a blog post warning that Apple was "opening the door to broader abuses".

"Apple's compromise on end-to-end encryption may appease government agencies in the US and abroad, but it is a shocking about-face for users who have relied on the firm's leadership in privacy and security," the EEF said.

It argued that it was impossible to create a client-side scanning system that "can only be used for sexually explicit images sent or received by children".

"That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change," it added.

The Center for Democracy and Technology said it was "deeply concerned that Apple's changes in fact create new risks to children and all users, and mark a significant departure from long-held privacy and security protocols".

In a series of tweets, Will Cathcart, CEO of WhatsApp, said that WhatsApp would never adopt such an image-scanning system, although the company remains committed to fighting CSAM.

While announcing the new feature on Thursday, Apple said its system ensures that nobody can learn anything about the images stored on a device unless they match known CSAM content.

Before an image is stored in iCloud Photos, an on-device matching process is performed against a database of hashes of known CSAM images compiled by the US National Center for Missing and Exploited Children (NCMEC).

The image being checked is converted into a hash, a unique string of numbers, and the system then attempts to match that hash against NCMEC's database using cryptographic techniques.

If the system flags an image, a human reviewer will examine it to confirm the match. If the image is found to contain child abuse material, the user's account will be disabled and the findings reported to NCMEC.
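Conceptually, the matching step can be pictured as a set-membership check against a list of known hashes. The sketch below is a deliberately simplified illustration in Python, not Apple's actual implementation: it substitutes an ordinary cryptographic hash for Apple's proprietary perceptual 'NeuralHash', leaves out the on-device blinding and private set intersection protocol Apple describes, and the function and variable names are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of hashes of known CSAM images compiled
# by NCMEC. In Apple's design this database ships on-device in a blinded form,
# and matching uses a private set intersection protocol rather than a plain
# set lookup.
KNOWN_CSAM_HASHES: set[str] = set()


def image_hash(path: Path) -> str:
    """Hash an image file.

    SHA-256 over the raw bytes is a simplification: Apple's NeuralHash is a
    perceptual hash, so near-identical images (resized or re-encoded copies)
    produce the same value, which a cryptographic hash cannot guarantee.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_before_upload(paths: list[Path]) -> list[Path]:
    """Return the images whose hashes match the known-hash database.

    In the flow described in the article, matches are flagged for human review;
    only a confirmed match leads to the account being disabled and a report to
    NCMEC, while non-matching images reveal nothing about the user's library.
    """
    return [p for p in paths if image_hash(p) in KNOWN_CSAM_HASHES]
```

A plain byte-level hash like the one above would miss any re-encoded copy of an image, which is precisely why a perceptual hash is used in the real system.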

Obviously, Apple cannot check images for users who have iCloud Photos disabled on their devices. Similarly, images that are stored in iCloud backups will not be scanned.

The only time Apple will run its CSAM image-scanning tools is when an image is being uploaded to iCloud Photos.

Apple claims that its system has an error rate of 'less than one in 1 trillion' per year, and that it does not breach users' privacy.