
Zuckerberg Faces Court in Ray-Ban Meta Glasses Case

Summary

– Mark Zuckerberg testified in a trial where plaintiffs allege Meta’s platform designs caused mental health harms, which the company denies.
– He defended Meta’s decisions by arguing they balance user expression against potential harms, citing a review of AR filters as an example.
– Zuckerberg was questioned about alleged contradictions between public statements on protecting young users and internal documents discussing their value.
– He acknowledged internal disagreement on safety decisions but maintained that a lack of compelling data justified not restricting user expression.
– Parents of children who died from harms they attribute to social media attended the trial, hoping their presence would foster empathy and change.

Arriving at a Los Angeles courthouse for a pivotal legal proceeding, Meta CEO Mark Zuckerberg was accompanied by a group wearing the company’s Ray-Ban smart glasses, a detail that stood out among the gathered legal teams and observers. His path to the courtroom took him past families who hold social media platforms, including Meta’s, responsible for contributing to their children’s deaths. For roughly eight hours, Zuckerberg delivered testimony in his characteristically measured tone, consistently rejecting the assertion that his company’s platforms are legally responsible for the alleged harms.

The morning questioning was led by plaintiffs’ attorney Mark Lanier, who represents a young woman claiming that design features on Meta and Google apps fostered compulsive use and mental health issues, allegations the companies dispute. Lanier’s expressive style, informed by his background as a pastor, contrasted sharply with Zuckerberg’s more technical responses. The Meta CEO frequently sought to add nuance to internal discussions about safety decisions, at times directly challenging Lanier’s interpretations. Meanwhile, the presiding judge issued a warning against wearing Meta’s AI-enabled glasses in the courtroom, stating that failing to delete any recordings could result in contempt charges.

Zuckerberg faced intense scrutiny over his past statements and corporate decisions. He was questioned about apparent discrepancies between public assurances regarding keeping young children off Facebook and Instagram and internal documents discussing the strategic value of attracting youthful users. Another line of inquiry focused on specific product choices, such as the decision not to institute a permanent ban on augmented reality filters that digitally alter facial features to mimic cosmetic surgery outcomes.

In defending the choice on AR filters, Zuckerberg outlined a recurring theme of his testimony: the need to balance user expression against potential risks. He described a 2019 executive discussion about lifting a temporary ban on such filters, testifying that after reviewing the available research, he found the evidence of harm insufficient to warrant restricting that form of speech. “On some level you don’t really build social media apps unless you care about people being able to express themselves,” Zuckerberg stated, emphasizing that he required clear evidence before limiting user expression. His ultimate decision allowed third-party creators to make certain filters, with exceptions such as those simulating surgical marks, but prohibited Instagram from creating or promoting them itself.

Lanier suggested Meta’s true priority was maximizing user engagement over wellbeing, but Zuckerberg countered that the company has deliberately shifted its internal goals toward enhancing product value, even at the potential cost of reduced usage time. While internal documents indicated some employee concern that banning filters might affect engagement, Zuckerberg maintained this was not a significant factor in his decision, noting the filters’ limited popularity.

The CEO acknowledged internal dissent regarding the filter policy. He described a division between wellbeing experts who had concerns but could not present compelling data, and his own assessment that the evidence did not justify restricting expression. Lanier presented an email from a Meta executive who respectfully disagreed with Zuckerberg’s call, citing potential risks and personal experience with a daughter facing body dysmorphia, and noting that definitive causal data might take years to emerge.

When Zuckerberg again stated the research was not compelling enough for a ban, Lanier inquired if he held any professional degrees. “I don’t have a college degree in anything,” Zuckerberg replied.

His marathon testimony concluded a portion of a trial expected to extend for several more weeks. The jury will next hear from former Meta employees, including some who disagreed with the company’s teen safety strategies, and executives from YouTube, which is also named in the lawsuit. Parents observing from the gallery expressed that the testimony revealed little new information, but many stressed the importance of their visible presence. One mother, Amy Neville, whose son died from fentanyl poisoning at age 14 in an incident allegedly connected to Snapchat, shared her hope that the sight of grieving families would foster empathy and ultimately drive change, remarking that the day’s impact remained to be seen.

(Source: The Verge)
