Meta warns it may pull apps from New Mexico over impractical demands

Summary
– Meta warns it may pull Facebook, Instagram, and WhatsApp from New Mexico if forced to comply with the state attorney general’s proposed changes, which it calls impossible.
– New Mexico Attorney General Raúl Torrez seeks a court order prohibiting end-to-end encryption for minors, requiring age verification, and mandating 99% detection of new child sexual abuse material.
– Meta argues the demands are vague, violate due process, and are technologically infeasible, stating it cannot prove a 99% CSAM detection rate without detecting all CSAM first.
– Torrez contends Meta’s resistance shows it prioritizes profit over child safety, noting the company has made changes for other governments and can do so here.
– Meta suggests minor changes like improving age assurance and funding law enforcement training, while Torrez warns that other states are pursuing similar actions against social media companies.

Meta is warning that it could be forced to pull Facebook, Instagram, and WhatsApp from New Mexico entirely if the state’s attorney general succeeds in pushing forward what the company calls impractical and overly burdensome demands. The dispute stems from a legal battle in which a jury awarded the state $375 million after finding that Meta misled users about the safety of its products. Now, Attorney General Raúl Torrez is asking the court to impose sweeping changes on the platforms, including a ban on end-to-end encryption for minors, mandatory age verification, and a requirement to detect 99 percent of new child sexual abuse material (CSAM) uploaded to its services.
Meta argues in a new court filing that many of these requests are “so hopelessly vague or ambiguous” that enforcing them would violate the company’s due process rights. The company describes several proposed mandates as “technologically or practically infeasible,” warning that compliance would essentially require building New Mexico-specific versions of its apps. “Therefore, granting this onerous relief could compel Meta to entirely withdraw Facebook, Instagram, and WhatsApp from the State as the only feasible means of compliance,” the filing states.
One of the most contentious demands, according to Meta, is the requirement to achieve a 99 percent accuracy rate for detecting new CSAM and rejecting underage accounts. The company argues that no detection system can ever prove it meets such a threshold, because calculating that rate would require knowing the total amount of CSAM on the platform, a number that is inherently unknowable. “Demanding a specific level of accuracy in detection appears to be based on the false premise that any system or tool can rid any social application or website with billions of users of all abuse or all CSAM,” Meta adds. It points out that CSAM is an internet-wide problem, not unique to its platforms, and that no other social app or website has achieved a zero-CSAM environment, a point multiple state witnesses conceded during the trial.
Meta also pushes back on the state’s call for more stringent age verification methods, such as requiring ID uploads or facial scans. The company claims these approaches could actually be less accurate than its current system, which relies on asking for birthdays at sign-up, implementing protections when users attempt to change their age, and using predictive models. Meta warns that more invasive methods could lead to widespread circumvention and may not perform as well in real-world conditions as they do in controlled tests. Additionally, the company argues that federal children’s privacy law would prevent it from retaining the data needed to classify users under 13 in the state.
Torrez, however, dismisses Meta’s resistance as a clear sign of unwillingness. “Meta’s refusal to follow the laws that protect our kids tells you everything you need to know about this company and the character of its leaders,” he says in a statement. “We know Meta has the ability to make these changes. For years the company has rewritten its own rules, redesigned its products, and even bent to the demands of dictators to preserve market access. This is not about technological capability. Meta simply refuses to place the safety of children ahead of engagement, advertising revenue, and profit.”
Meta counters that recent initiatives like Teen Accounts already address many of New Mexico’s concerns. The company proposes far more modest changes, such as tweaking its age assurance models and funding law enforcement training in the state for internet crimes against children for a limited period. “In targeting a single platform, the State ignores the hundreds of other apps teens use, leaving parents without the comprehensive support they actually deserve,” Meta spokesperson Chris Sgro says in a statement.
Torrez warns that even if Meta were to “take their ball and leave the state,” it may soon find fewer places in the US to operate. During a call with reporters, he noted that dozens of attorneys general across the country are pursuing similar actions against social media companies. “It feels, to me, like a shortsighted and temporary attempt to deflect and delay the inevitable,” he says. “And it would be better for them, it would be better for our community, but [also] communities all over the country if they just started to do the real work to prioritize safety.”
(Source: The Verge)