A landmark lawsuit has been filed against Apple, alleging the company’s failure to adequately address the storage of child sexual abuse material (CSAM) on its platforms. The class action, filed on behalf of thousands of survivors, accuses Apple of negligence for not implementing proactive measures to detect and remove such material, despite possessing the technology to do so. The plaintiffs, represented by Marsh Law Firm with support from the Heat Initiative, argue that Apple’s actions—or lack thereof—have perpetuated the ongoing harm caused by these images and videos.
The lawsuit centers on Apple’s decision in 2021 to develop a “CSAM Detection” feature using NeuralHash technology, which could have identified known abuse material in iCloud accounts by matching uploads against a database of previously identified images. However, Apple ultimately abandoned the program after pushback from privacy advocates who feared it might create vulnerabilities for government surveillance. The plaintiffs contend that this decision left Apple as one of the few major technology companies not engaging in proactive CSAM detection, even as other firms report millions of instances of such material annually. By contrast, Apple reported just 267 instances in 2023, underscoring the gap in its efforts.
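For readers unfamiliar with how such detection generally works: systems of this kind compute a fingerprint of each uploaded image and check it against a list of hashes of previously identified abuse material. The sketch below is only a generic illustration of that matching step under assumed placeholders; it is not Apple’s NeuralHash implementation, and the hash function, database, and review step shown here are hypothetical.

```python
# Illustrative sketch only: generic hash-matching against a known-material hash list.
# NOT Apple's NeuralHash; the hashing step and database here are placeholders.
import hashlib
from typing import Set

def fingerprint(image_bytes: bytes) -> str:
    # Real systems use a perceptual hash so near-duplicate images still match;
    # a cryptographic hash is used here only to keep the example self-contained.
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_material(image_bytes: bytes, known_hashes: Set[str]) -> bool:
    # Flag an upload only if its fingerprint appears in the known-hash database.
    return fingerprint(image_bytes) in known_hashes

# Hypothetical usage: a provider would load hashes supplied by a clearinghouse,
# then route any match to human review and mandatory reporting.
known_hashes = {"<hash of a previously identified image>"}
upload = b"...image bytes..."
if is_known_material(upload, known_hashes):
    print("Match found: route to human review and reporting.")
```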
Among the plaintiffs is Jane Doe, who expressed the anguish of knowing that images of her abuse remain available online. She emphasized Apple’s responsibility to implement available technology to protect survivors, framing the company’s inaction as a failure to uphold basic human dignity. Margaret E. Mabie, representing the plaintiffs, echoed these sentiments, criticizing Apple for prioritizing its corporate image over the safety of vulnerable individuals:
Our clients have endured unimaginable abuse, and yet Apple’s top executives continue to ignore their pleas, fully aware that this illegal contraband remains on their platform. By abandoning their state-of-the-art detection program without offering an alternative, Apple has chosen to prioritize its own corporate agenda over the lives and dignity of survivors. This lawsuit is a call for justice and a demand for Apple to finally take responsibility and protect these victims.
Sarah Gardner, CEO of the Heat Initiative, further condemned Apple’s stance, challenging its justification that privacy concerns outweigh the fundamental rights of abuse survivors:
The plaintiffs and countless other survivors of child sexual abuse are forced to relive the worst moments imaginable because Apple refuses to implement common sense practices that are standard across the tech industry. They will argue that this is a privacy issue, but they are failing to acknowledge the privacy and basic humanity of the children being raped and sexually assaulted in the videos and images Apple stores on iCloud.
The lawsuit seeks injunctive relief, compelling Apple to implement child safety measures to detect and remove CSAM on its platforms. The plaintiffs argue that Apple’s failure to act constitutes a violation of its duty of care, perpetuating trauma for survivors whose abuse is continually revisited through the dissemination of these materials. Attorney James Marsh estimates that over 2,600 victims may qualify for compensation under the case, highlighting the potential scale of the issue.
Apple, for its part, maintains that it is “urgently and actively innovating” to address the issue without compromising user security and privacy. Critics counter that the company’s abandonment of the CSAM Detection program shows it prioritizing corporate interests over survivor safety. The lawsuit thus sharpens the tension between privacy and accountability, challenging Apple to reconcile its policies with the urgent need to protect vulnerable individuals from further exploitation.
The full complaint can be found here.