West Virginia sues Apple over allegations of child abuse material in iCloud


West Virginia brought a lawsuit against Apple on Thursday over allegations the big tech giant lets predators easily hide child sexual abuse material in its iCloud storage, marking the first time a state has sued the company over the issue.

Attorney General JB McCuskey, who is leading the lawsuit, told Fox News Digital in an interview that Apple is an “outlier in the marketplace” for cloud-based storage because the company has long declined to run adequate detection filters on its storage, unlike Meta and Google, though those companies differ from Apple in that they operate massive social media platforms.

“They’re producing millions and millions and millions of reports for federal and state law enforcement officials about people trying to store child pornographic images in their clouds,” McCuskey said. “Apple, on the other hand, their total number of reports is in the hundreds.”



West Virginia Attorney General John McCuskey speaks outside the U.S. Supreme Court on Jan. 13, 2026, in Washington, D.C. (Oliver Contreras / AFP via Getty Images)

McCuskey argued that Apple, which prides itself on iCloud’s encryption features, lauded by privacy hawks, is incentivized to manage iCloud data in a way that is lucrative for the company.

“Every single byte of data that you’re using to store in the iCloud is a way for Apple to make money, and so they’re using user privacy as a guise for what is really a bonanza for them to make money as child predators store their images, distribute their images through the Apple cloud,” McCuskey said.

West Virginia’s complaint against Apple, filed in Mason County Circuit Court, demands that the company begin employing detection measures that scan cloud storage for child sexual abuse material.

An Apple spokesperson said in a statement to Fox News Digital that its products effectively shield young users from harmful content, though the spokesperson did not address how the company manages possible child sexual abuse material in iCloud that adults could access.


The Apple Fifth Avenue store in New York, US, on Tuesday, Oct. 28, 2025. (Michael Nagle/Bloomberg via Getty Images)

“At Apple, protecting the safety and privacy of our users, especially children, is central to what we do. We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids,” the spokesperson said. “All of our industry-leading parental controls and features, like Communication Safety — which automatically intervenes on kids’ devices when nudity is detected in Messages, shared Photos, AirDrop and even live FaceTime calls — are designed with the safety, security, and privacy of our users at their core.”

Central to West Virginia’s complaint are internal text messages attributed to Eric Friedman, Apple’s former anti-fraud chief, describing iCloud as “the greatest platform for distributing child porn.”

In an iMessage exchange about Apple’s perceived emphasis on privacy over child safety, Friedman wrote, “which is why we are the greatest platform for distributing child porn, etc.”

When asked by a colleague whether there was “a lot of this in our ecosystem,” Friedman responded, “Yes.”

In another message, Friedman described Apple’s approach to oversight: “But — and here’s the key — we have chosen to not know in enough places where we really cannot say.”

Friedman’s messages underscore a defense Apple has raised in similar lawsuits brought by alleged victims. One of the major cases is still pending, though a judge dismissed some of the claims after accepting Apple’s argument that it was protected by Section 230 of the Communications Decency Act. Under that defense, Apple asserted that Section 230 grants it immunity and that courts cannot force tech companies to design their software in a specific way.

Section 230 has been a top source of scrutiny in Congress for years as lawmakers spanning the political spectrum grapple with how to regulate big tech companies and artificial intelligence platforms in a rapidly evolving industry. Sens. Lindsey Graham, R-S.C., and Dick Durbin, D-Ill., recently introduced a bill to repeal Section 230 altogether to force tech giants to negotiate new protections.

Privacy advocates have argued that proposals to deploy child sexual abuse detection systems on Apple products represent a dangerous shift toward surveillance, because scanning software running on users’ devices would leave the company more vulnerable to government pressure to scan for ever-broader categories of data.


McCuskey noted that West Virginia’s circumstances are distinct: the state, located in the heart of Appalachia, struggles with child welfare shortfalls, and children there are at higher risk of exploitation than in other states.

“There is a direct and causal link between children who are in and out of the foster care system and children who end up being exploited in so many of these dangerous and disgusting ways,” McCuskey said.


