
Meta Seeks to Block Mental Health, Zuckerberg Mentions in Child Safety Trial

Summary

– Meta is on trial in New Mexico for allegedly failing to protect minors from sexual exploitation and is aggressively seeking to exclude certain information from the proceedings.
– The company has requested the judge exclude research on youth mental health, a recent teen suicide case, its financial resources, employee activities, and Mark Zuckerberg’s Harvard years.
– The state’s complaint alleges Meta’s platforms served pornographic content to minors and failed to act on fake underage accounts receiving explicit messages or being targeted for trafficking.
– Meta argues the jury should only consider whether it violated New Mexico’s Unfair Practices Act, not other issues like election interference or privacy violations.
– The company defends its child safety efforts, citing tools like Teen Accounts, while dismissing the state’s arguments as sensationalist and irrelevant.

As Meta prepares for a high-stakes trial in New Mexico over allegations it failed to protect children from sexual exploitation, the company is aggressively seeking to limit the scope of evidence presented to the jury. The social media giant has filed pretrial motions asking the judge to exclude a wide range of information, including studies on youth mental health, references to its vast financial resources, and even mentions of CEO Mark Zuckerberg’s personal history. These legal maneuvers aim to narrow the focus strictly to whether Meta violated state consumer protection laws through its handling of child safety, while preventing what it calls “irrelevant and prejudicial” details from influencing the proceedings.

The case, brought by New Mexico Attorney General Raúl Torrez, represents a significant legal challenge. The state alleges that Meta’s platforms, Facebook and Instagram, enabled the online solicitation, trafficking, and sexual abuse of minors. Investigators claim they easily created fake accounts posing as underage girls, which were then bombarded with explicit messages and algorithmically promoted pornographic content. In a particularly disturbing test, a fabricated account for a mother offering to traffic her daughter reportedly attracted suggestive comments without intervention, even after other users flagged the violating accounts.

Meta’s pretrial requests, known as motions in limine, are a standard legal tool to shape a trial’s narrative. However, the breadth of the company’s demands has drawn attention. Beyond blocking mental health research, Meta wants to exclude any mention of a recent high-profile teen suicide case linked to social media, discussions about its AI chatbots, and details regarding employee activities or Zuckerberg’s time at Harvard. The company argues that introducing such topics would unfairly bias the jury and distract from the core legal questions.

In response to the allegations, a Meta spokesperson emphasized the company’s long-term efforts to protect young users. The statement highlighted initiatives like Teen Accounts with enhanced safety settings and parental control tools, asserting a commitment to ongoing improvement. The spokesperson characterized New Mexico’s arguments as sensationalist, maintaining that Meta’s focus remains on demonstrating its substantive work in this area.

The outcome of these pretrial motions will critically influence the trial’s direction. By attempting to wall off discussions about broader industry impacts and its own corporate power, Meta seeks to keep the jury’s attention fixed on specific, narrow legal interpretations. Meanwhile, the state aims to present a wider picture of alleged systemic failures. How the judge rules on these requests will determine which narrative the jury ultimately hears, setting the stage for a pivotal courtroom battle over accountability in the digital age.

(Source: Wired)
