Apple delays rollout of photo scanning feature after backlash

The company wants additional time to collect input before releasing the feature

Apple has delayed plans to roll out an automated scanning feature in iCloud Photos that, it says, will help protect children from predators.

The company said that, based on feedback from advocacy groups, researchers, customers and others, it has decided to take additional time to make improvements before releasing the new child safety feature.

Apple announced the scanning tech last month, saying it was intended to limit the spread of child sexual abuse material (CSAM) online.

The original plan was to build 'new applications of cryptography' into future versions of iOS and iPadOS, enabling Apple to detect CSAM images as they are uploaded to iCloud Photos, Apple's online storage service.

Before an image is stored in iCloud Photos, an on-device matching process would compare it against a database of hashes of known CSAM images compiled by the US National Center for Missing and Exploited Children (NCMEC).

However, the plan met pushback from security and privacy advocates, who said it could open a backdoor into iPhones - one that authoritarian governments and hackers could exploit to access devices without permission.

Apple argued that its use of NeuralHash technology - which looks for digital 'fingerprints' to match against the CSAM database, rather than directly scanning photo content - would protect users' privacy.

The system analyses an image and converts it to a hash key - a unique string of numbers - then matches that key against NCMEC's database using cryptography.

The system is designed so that nobody can learn about images that don't match the database. Any image the system flags will be examined by a human reviewer, who confirms the match and passes the information to law enforcement if necessary.
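To illustrate the matching step in the abstract, here is a minimal Swift sketch of an on-device hash lookup. It is not Apple's implementation: NeuralHash is a proprietary perceptual hash designed to survive resizing and recompression, whereas this sketch substitutes CryptoKit's SHA-256, and the database contents and image bytes below are placeholders invented for illustration.

```swift
import Foundation
import CryptoKit

// Minimal sketch only: SHA-256 stands in for Apple's proprietary NeuralHash,
// and `knownHashes` holds hypothetical placeholder data, not an NCMEC database.
struct HashMatcher {
    // Hex-encoded digests of known images (placeholder values).
    let knownHashes: Set<String>

    // Hash the image bytes and check the digest against the database.
    func matches(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Usage: test a photo's bytes before a (hypothetical) iCloud upload.
let matcher = HashMatcher(knownHashes: ["placeholder-digest"])
let photoBytes = Data([0x01, 0x02, 0x03]) // stand-in for real image data
print(matcher.matches(photoBytes))        // prints "false" for this stand-in
```

Apple's published design differs in one important respect: the comparison itself runs under encryption, so the device never learns the outcome of any individual match - a plain set lookup like the one above is only a conceptual approximation.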

Despite those clarifications, more than 5,000 individuals and organisations signed an open letter urging the tech firm to rethink its decision.

"While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products," read the letter.

The Center for Democracy and Technology said it was 'deeply concerned that Apple's changes in fact create new risks to children and all users, and mark a significant departure from long-held privacy and security protocols'.

The Electronic Frontier Foundation (EFF) warned that Apple was 'opening the door to broader abuses'.

Cindy Cohn, EFF's executive director, told the BBC: "The company must go further than just listening and drop its plans to put a backdoor into its encryption entirely."

"The enormous coalition that has spoken out will continue to demand that user phones - both their messages and their photos - be protected, and that the company maintains its promise to provide real privacy to its users," she added.

Apple's photo scanning feature was supposed to go live for customers this year. Following Friday's announcement, it is unclear how long Apple will delay it.