Apple sued over child sexual abuse material
The company failed to protect victims, the lawsuit alleges
Apple is facing legal action over its decision to shelve a planned system designed to detect and remove child sexual abuse material (CSAM) from iCloud.
The class-action lawsuit, filed by a survivor of child sexual abuse, alleges that Apple knowingly allowed images and videos documenting her abuse to be stored on iCloud and other Apple platforms.
The suit claims that Apple possessed the technology to detect and remove such material but deliberately chose not to deploy it.
In 2021, Apple announced plans to implement a "CSAM Detection" system that would use digital signatures from the US National Center for Missing and Exploited Children to identify and flag known CSAM content within iCloud Photo Library.
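In broad terms, detection systems of this kind compare a fingerprint of each uploaded photo against a database of digital signatures of already-identified abuse imagery supplied by a clearinghouse such as the National Center for Missing and Exploited Children. The sketch below is a minimal, conceptual illustration of that known-hash matching idea only, not Apple's actual design: the names (KNOWN_SIGNATURES, fingerprint, scan_upload) are hypothetical, and SHA-256 stands in for the perceptual hashing a real system would use so that matches survive resizing and re-encoding.

import hashlib
from pathlib import Path

# Hypothetical set of digital signatures (hashes) of already-identified abuse
# imagery, of the kind a clearinghouse would supply. Left empty by design.
KNOWN_SIGNATURES: set[str] = set()

def fingerprint(image_path: Path) -> str:
    """Compute a stand-in signature for an image file (SHA-256 of its bytes)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def scan_upload(image_path: Path) -> bool:
    """Return True if the upload matches a known signature and should be flagged."""
    return fingerprint(image_path) in KNOWN_SIGNATURES

Apple's 2021 proposal went further than this sketch: matching was to happen on the user's device, and a threshold number of matches was required before any human review, design choices that were central to the ensuing privacy debate.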
However, the company faced significant backlash from privacy advocates who expressed concerns about potential government surveillance and the erosion of user privacy.
Under mounting criticism, Apple ultimately scrapped the initiative in late 2022.
The plaintiff, a 27-year-old woman represented by Marsh Law Firm, argues that Apple's negligence has resulted in continued harm to survivors. She says she continues to receive law enforcement notices nearly every day related to the distribution of her abuse images, which remain accessible on iCloud.
Heat Initiative, an organisation which supports survivors of child sexual abuse, is providing legal and advocacy support to the plaintiffs. The group says it is important to hold technology companies accountable for their role in the proliferation of child sexual abuse material.
Unlike competitors such as Google and Facebook, which reported millions of instances of CSAM in recent years, Apple has faced criticism for its relatively low reporting numbers – just 267 cases in 2019. The lawsuit alleges this discrepancy reflects Apple's prioritisation of user privacy over child safety.
"Today, thousands of brave survivors are coming forward to demand accountability from one of the most successful technology companies on the planet," said Margaret E. Mabie, Partner at Marsh Law Firm, representing the plaintiffs.
"Apple has not only rejected helping these victims, it has advertised the fact that it does not detect child sex abuse material on its platform or devices thereby exponentially increasing the ongoing harm caused to these victims."
The lawsuit seeks significant changes to Apple's practices. James Marsh, a lead attorney in the case, said more than 2,600 victims are eligible to join the class.
The potential damages stem from a federal law that awards a minimum of $150,000 per victim in cases involving child sexual exploitation.
This is the second lawsuit filed against Apple in recent months regarding its handling of CSAM.
In August, a nine-year-old girl and her guardian sued the company for failing to address child sexual abuse material on its platform.
Apple says it remains committed to combating child sexual abuse while prioritising user privacy and security.
"Child sexual abuse material is abhorrent, and we are committed to fighting the ways predators put children at risk," Fred Sainz, an Apple spokesman, told The New York Times.
"We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users."