Apple is being sued over its failure to implement a system to detect child sexual abuse material (CSAM) among users’ iCloud photos. The lawsuit argues that by failing to implement adequate detection measures, Apple is forcing victims to relive their trauma.
The tech giant announced a crackdown on CSAM in 2021, promising to roll out a new system that uses digital signatures to stop the spread of exploitative imagery. However, perhaps prompted by fears over privacy and surveillance, the company appeared to shelve these plans.
This issue has long pervaded the Big Tech space, with President Biden enacting measures just a few months ago aimed at curbing sexually violent online imagery. It is not the first time Apple has fallen foul of child protection advocates and other activists, with the company demonstrating a somewhat patchy approach to safeguarding its users against abusive material.
Apple Hit With Lawsuit Over Safeguarding Failures
Apple is being sued by a 27-year-old woman over its failure to prevent the spread of sexually abusive images of her as a child. The victim alleges that she is notified every time someone is charged with possessing the images, which happens nearly every day, forcing her to continually relive her trauma.
The suit attacks Apple for announcing “a widely touted improved design aimed at protecting children” but failing to “implement those designs or take any measures to detect and limit” sexually exploitative material.
Representing the plaintiff, attorney James Marsh claims there could be as many as 2,680 people entitled to compensation.
“Apple has not only rejected helping these victims, it has advertised the fact that it does not detect CSAM on its platform or devices thereby exponentially increasing the ongoing harm caused to these victims.” – Margaret E. Mabie, Marsh Law Firm Partner
Tech Giant’s Actions Leave a Lot to Be Desired
The lawsuit stems from Apple’s decision not to enact the abusive-imagery reforms it previously promised. In 2021, the company announced a series of measures to combat CSAM, including a new system that would detect such content by matching users’ photos against digital fingerprints of known abusive images supplied by the National Center for Missing and Exploited Children (NCMEC).
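For illustration only, the sketch below shows the general idea behind this kind of fingerprint matching: an image’s digest is checked against a database of hashes of known abusive material. The function name and sample entries are hypothetical, and a plain cryptographic hash is used purely to keep the example self-contained; Apple’s proposed system relied on a perceptual hash (NeuralHash) computed on the device, which can match images that have been resized or re-encoded.

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of known abusive images, of the kind a
# clearinghouse such as NCMEC maintains. Real deployments use perceptual
# hashes rather than exact SHA-256 digests; these entries are placeholders.
KNOWN_FINGERPRINTS = {
    "placeholder-digest-1",
    "placeholder-digest-2",
}

def matches_known_material(image_path: str) -> bool:
    """Return True if the file's digest appears in the known-fingerprint set."""
    digest = hashlib.sha256(Path(image_path).read_bytes()).hexdigest()
    return digest in KNOWN_FINGERPRINTS
```

Under Apple’s 2021 proposal, this matching was to happen on the device before photos were uploaded to iCloud, with a threshold of matches required before anything was flagged for human review.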
However, Apple has failed to deliver on that promise. Growing pushback from privacy advocates is thought to have deterred the iPhone maker: at the time of the announcement, concerns were raised that the company would be opening a “backdoor” into its users’ private lives.
The lawsuit points to a wider dissatisfaction with Apple’s approach to combating CSAM. In September 2023, Heat Initiative, a child advocacy group, made headlines when it put up posters in New York and San Francisco that read: “Child sexual abuse material is stored on iCloud. Apple allows it.”
Tech Sector Not Doing Enough, Report Finds
Big Tech has long faced accusations of failing to adequately protect its users from abusive imagery. In response to this mounting pressure, it would appear that some leading players are starting to take people’s concerns seriously.
In September this year, a number of businesses from across the sector committed to stopping the spread of explicit imagery. Notably, the Elon Musk-helmed X was not among them. TikTok, meanwhile, has introduced new restrictions for its younger users, and Roblox has made similar moves.
However, with the Internet Watch Foundation reporting that CSAM has “more than doubled” since 2020, these actions may be a case of “too little, too late.” As generative AI continues to advance at a remarkable pace, the threat of sexually abusive imagery will only grow.