
Meta’s court loss could cost far more than $375 million

Summary

– New Mexico AG Raúl Torrez won a $375 million child safety case against Meta and now seeks court-ordered changes to Meta’s platforms, including age verification, limits on encryption for minors, and caps on daily use.
– Torrez argues the financial penalty alone won’t change Meta’s behavior, as the company may view it as a cost of doing business, making the upcoming public nuisance trial more consequential.
– The proposed changes, such as age verification and encryption bans, raise privacy and security concerns, with critics warning they could push users to less regulated platforms.
– Meta opposes the mandates, calling them ill-informed and infringing on parental rights, while noting it has already implemented some safety measures.
– A court ruling could influence other tech cases and policy debates, though its direct impact is limited to Meta’s operations in New Mexico.

New Mexico Attorney General Raúl Torrez secured a historic $375 million verdict against Meta earlier this year in a landmark child safety case, but the real battle is just beginning. Starting Monday, both sides will return to a Santa Fe courthouse for a three-week public nuisance trial that could reshape how the social media giant operates, not just in New Mexico but potentially nationwide.

The trial will focus on the sweeping injunctive remedies the AG wants a judge to impose on Facebook, Instagram, and WhatsApp. These include mandatory age verification for New Mexico users, a ban on end-to-end encryption for users under 18, a 90-hour monthly usage cap for minors, and limits on engagement-driving features like infinite scroll and autoplay. The state also wants Meta to detect 99 percent of new child sexual abuse material (CSAM).

“From the outset, our goal was to try and change the way the company’s doing business,” Torrez told The Verge during a recent visit to Washington, DC, where he pushed for new kids’ safety legislation. “I recognize that even at $375 million for a company this big and this profitable, it’s not enough in and of itself to change the way they’re doing business. In fact, there’s probably some folks in that company who think of it as the cost of doing business.”

While any court-ordered changes would technically apply only to Meta’s operations in New Mexico, the company could voluntarily adopt them elsewhere for simplicity. Alternatively, it has threatened to shut down its services in the state entirely. A ruling could also send a powerful signal to other tech platforms that courts are willing to impose structural changes when companies are found liable for public harm.

During the trial, New Mexico will argue that Meta’s platforms constitute a public nuisance by creating a public health hazard. The AG’s office plans to call roughly 15 witnesses, including experts on the feasibility of the proposed remedies and fact witnesses on Meta’s alleged harms. After Meta presents its defense, Judge Bryan Biedscheid will evaluate which proposals are both relevant and achievable, a process that may take longer than the rapid jury verdict delivered in March.

A sweeping victory for New Mexico could energize Torrez and the thousands of other plaintiffs currently pursuing similar cases against tech companies. Conversely, a narrow ruling could be a significant setback. While the outcome won’t directly affect other lawsuits, it will almost certainly influence settlement negotiations across the industry.

Several of Torrez’s demands touch on deeply contentious tech policy debates. Age verification would likely require Meta or a third party to collect more personal information from adults and minors alike, a move privacy advocates warn could make users less safe. Don McGowan, a former board member of the National Center for Missing and Exploited Children (NCMEC), argued that blocking encrypted communications on platforms like Facebook “is a great way to make sure that nobody uses Facebook Messenger anymore and just moves their activity to other platforms that aren’t touched by this lawsuit.”

The encryption mandate may have limited practical impact, however. Meta recently announced it was removing end-to-end encrypted messaging from Instagram, a feature it said “very few people” actually used.

Peter Chapman, associate director of the Knight-Georgetown Institute, noted there could be “significant tradeoffs” to banning encryption, and that other changes might be more effective. For instance, evidence presented by the state showed that Meta’s own profile recommendation algorithms were connecting adults and minors, a feature with clear potential for harm and little benefit. Torrez is also asking the court to stop that practice. “There’s an opportunity to intervene at that level and try to prevent more of these harmful interactions from taking place without having to tackle encryption,” Chapman said.

No single feature change is likely to solve the entire child and teen safety problem, Chapman added, which is why Torrez’s multi-layered approach is notable. But the effectiveness of any remedy will depend heavily on how it is implemented and monitored. For example, what methodology would Meta use to report a 99 percent detection rate of new CSAM? How does it count what it hasn’t caught? The same questions apply to the accuracy of any mandated age verification.

Meta has seized on these uncertainties in its legal filings. “Regardless of where the accuracy threshold is set, Meta would never be able to prove that the system met that standard, because doing the calculation would require that Meta detect 100% of CSAM to use as the denominator,” the company wrote. Torrez’s chief deputy, James Grayson, said on a press call that the court and an appointed independent monitor would have discretion over tracking, though the office has not yet identified who that monitor would be.

“The demands that are being made in New Mexico are ill-informed and provide massive additional exposure for other kinds of exploitation,” said Maureen Flatley, president of Stop Child Predators, a group that advocates for stronger enforcement of criminal laws against child predators and has received funding from the Meta-backed trade group NetChoice. “This notion that the platforms have to be responsible for pushing all these people out would be like saying to the US Bankers Association, ‘By the way, you are responsible for all the bank robberies from now on,’ which is ludicrous.”

“The New Mexico Attorney General’s focus on a single platform is a misguided strategy that ignores the hundreds of other apps teens use daily,” Meta spokesperson Chris Sgro said in a statement. “The state’s proposed mandates infringe on parental rights and stifle free expression for all New Mexicans. Regardless, we remain committed to providing safe, age-appropriate experiences and have already launched many of the protections the state seeks, including 13 safety measures this past year.”

Torrez has also taken aim at the broader tech industry. During his recent trip to Washington, DC, he advocated for new online protections for kids and an overhaul of Section 230, the law that shields tech platforms from liability for user-generated content. “While we were able to prevail in our district court in Santa Fe, I still think the law as it currently exists creates a lot of ambiguity,” he said. “If Section 230 were not something that these companies could hide behind, then it increases the chances that they’re going to have to actually make their case to a jury.”

Chapman noted that regulation through lawsuits is not unprecedented in the US. “Whether that’s tobacco, opioids, e-cigarettes, there is precedent for legal action moving a broader policy conversation.”

(Source: The Verge)
