Apple Facing Lawsuit for Inadequate Protection of Sexual Abuse Victims

Apple is being sued for failing to protect victims of sexual abuse. But is this an industry-wide problem?

Apple is being sued over its failure to implement a system to detect child sexual abuse material (CSAM) among users’ iCloud photos. The lawsuit argues that by failing to implement adequate detection measures, Apple is forcing victims to relive their trauma.

The tech giant announced a crackdown on CSAM in 2021, promising to roll out a new system that uses digital signatures to stop the spread of exploitative imagery. However, perhaps prompted by fears over privacy and surveillance, the company appeared to shelve these plans.

This issue has long pervaded the Big Tech space, with President Biden enacting measures a few months ago aimed at curbing sexually violent online imagery. It is not the first time Apple has fallen foul of child protection advocates and other activists, with the company demonstrating a somewhat patchy approach to safeguarding its users against abusive material.

Apple Hit With Lawsuit Over Safeguarding Failures

Apple is being sued by a 27-year-old woman over its failure to prevent the spread of sexually abusive images of her as a child. The victim alleges that she is notified every time someone is charged with possessing the images, which happens nearly every day, forcing her to continuously relive her trauma.

The suit attacks Apple for announcing "a widely touted improved design aimed at protecting children" but failing to "implement those designs or take any measures to detect and limit" sexually exploitative material.

 


Representing the plaintiff, attorney James Marsh claims there could be as many as 2,680 people entitled to compensation.

“Apple has not only rejected helping these victims, it has advertised the fact that it does not detect CSAM on its platform or devices thereby exponentially increasing the ongoing harm caused to these victims.” – Margaret E. Mabie, Marsh Law Firm Partner

Tech Giant’s Actions Leave a Lot to be Desired

The lawsuit stems from Apple’s decision not to enact abusive imagery reform that it previously promised. In 2021, the company announced a series of measures to combat CSAM, including a new system for detecting such content with the help of digital signatures from the National Center for Missing and Exploited Children.

However, Apple has failed to deliver on its promise. It is thought that growing pushback from privacy advocates may have deterred the iPhone maker. At the time of the announcement, concerns were raised that the company would be opening a "backdoor" into its users' private lives.

The lawsuit points to a wider dissatisfaction with Apple’s approach to combating CSAM. In September 2023, Heat Initiative, a child advocacy group, made headlines when it put up posters in New York and San Francisco that read: “Child sexual abuse material is stored on iCloud. Apple allows it.”

Tech Sector Not Doing Enough, Report Finds

Big Tech has long faced accusations of failing to adequately protect its users from abusive imagery. In response to this mounting pressure, it would appear that some leading players are starting to take people’s concerns seriously.

In September this year, a number of businesses from across the space committed to stopping the spread of explicit imagery. Notably, the Elon Musk-helmed X was not among them. TikTok, meanwhile, has introduced new restrictions for its younger users, while Roblox has made similar forays.

However, with the Internet Watch Foundation reporting that CSAM has "more than doubled" since 2020, these actions may be a case of "too little, too late." As generative AI continues to progress at a scarcely believable rate, the threat of sexually abusive imagery will only grow.


Written by:
Gus is a Senior Writer at Tech.co. Since completing his studies, he has pursued a career in fintech and technology writing which has involved writing reports on subjects including web3 and inclusive design. His work has featured extensively on 11:FS, The Fold Creative, and Morocco Bound Review. Outside of Tech.co, he has an avid interest in US politics and culture.