West Virginia has filed a lawsuit against Apple, accusing the company of failing to adequately detect and prevent the storage of child sexual abuse material (CSAM) on its iCloud platform, Fox News’ Ashley Oliver reported Thursday. The case marks the first time a state has brought legal action against Apple over its cloud storage practices related to abuse material.
Attorney General JB McCuskey is leading the lawsuit, which was filed in Mason County Circuit Court. The complaint alleges that Apple has not implemented sufficient scanning or filtering technologies to identify and report CSAM stored within iCloud.
In an interview, McCuskey described Apple as an “outlier in the marketplace,” arguing that other major technology companies generate substantially more reports to law enforcement concerning CSAM.
“They’re producing millions and millions and millions of reports for federal and state law enforcement officials about people trying to store child pornographic images in their clouds,” McCuskey said. “Apple, on the other hand, their total number of reports is in the hundreds.”
McCuskey further contended that Apple’s emphasis on encryption and user privacy can give offenders cover to conceal illegal material. He suggested that Apple’s business model, which monetizes cloud storage, reduces the company’s incentive to monitor stored content aggressively.
“Every single byte of data that you’re using to store in the iCloud is a way for Apple to make money,” McCuskey said. “And so they’re using user privacy as a guise for what is really a bonanza for them to make money as child predators store their images, distribute their images through the Apple cloud.”
The lawsuit seeks to compel Apple to adopt technologies capable of detecting CSAM within its cloud infrastructure.
Apple responded in a statement emphasizing its safety and privacy features, particularly those designed to protect children. The company did not directly address its handling of potential abuse material stored by adult users.
“At Apple, protecting the safety and privacy of our users, especially children, is central to what we do,” a spokesperson said. “We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.”
Apple highlighted existing tools such as parental controls and Communication Safety, which intervenes when nudity is detected across Messages, shared Photos, AirDrop, and FaceTime.
The complaint references internal communications attributed to Eric Friedman, Apple’s former anti-fraud chief. In text messages cited in the filing, Friedman allegedly described iCloud as “the greatest platform for distributing child porn.”
When asked by a colleague whether there was “a lot of this in our ecosystem,” Friedman responded, “Yes.”
In another message, Friedman reportedly wrote, “But — and here’s the key — we have chosen to not know in enough places where we really cannot say.”
The lawsuit emerges amid broader national debates over technology company liability and content moderation. Apple has previously invoked Section 230 of the Communications Decency Act in civil litigation, arguing that courts cannot require technology companies to design their software in specific ways.
Section 230 has faced sustained scrutiny in Congress, where lawmakers have proposed reforms or outright repeal. Supporters of reform argue that broad legal immunity discourages companies from implementing stronger protections, while critics warn that weakening the statute could undermine online privacy and chill free expression.
Privacy advocates have raised concerns about proposals requiring device or cloud scanning technologies, arguing such measures risk expanding digital surveillance and increasing government pressure on technology companies.
McCuskey said West Virginia’s legal action reflects particular vulnerabilities within the state’s child welfare system.
“There is a direct and causal link between children who are in and out of the foster care system and children who end up being exploited in so many of these dangerous and disgusting ways,” McCuskey said.
The case adds to a growing number of legal challenges confronting technology platforms over their role in preventing online exploitation and the dissemination of CSAM.