
Meta Faces Juries Over Child Safety Concerns

Summary

– Two juries are deliberating separate cases that could result in significant legal and financial consequences for Meta.
– In New Mexico, Meta is accused of facilitating child predators and misleading the public about platform safety for young users.
– In California, a case concerns whether Meta and Google should be held liable for making addictive products that harmed a young woman.
– Prosecutors presented evidence from internal Meta discussions and undercover operations showing the platform failed to protect minors.
– Meta denies the allegations, arguing it works hard to protect users and that the plaintiffs’ evidence is misleading or misapplied.

Two simultaneous jury deliberations could determine whether Meta faces a historic legal penalty or continues its current operational path relatively unchanged. The outcomes hinge on whether the courts find the company liable for alleged harms to young users, with potential damages reaching into the billions of dollars. These cases represent a critical test for applying consumer protection laws to social media platforms.

In New Mexico, closing arguments concluded in a trial where the state accuses Meta of enabling child predators on Facebook and Instagram. The company firmly denies these allegations. Separately, a Los Angeles jury is nearing a verdict in a case examining whether Meta and Google should be held responsible for creating addictive products that allegedly harmed a young woman. A ruling against Meta could trigger a wave of similar lawsuits, breaking a pattern of stalled legal challenges against major tech firms.

This litigation is only the beginning for Meta and its peers, with more trials scheduled throughout the year. The company’s platforms are frequently central to debates about online child safety, scrutiny amplified by whistleblowers such as former employee Frances Haugen. Meta maintains that user safety is integral to its business model.

A company spokesperson previously stated that Meta remains focused on its “longstanding commitment to supporting young people,” dismissing New Mexico’s arguments as sensationalist. Regarding the California lawsuits, Meta “strongly disagree[s]” with the allegations and is confident the evidence supports its position. The Los Angeles jury has deliberated for over a week after a five-week trial.

During the New Mexico closings, state attorney Linda Singer argued that Meta neglected to implement sufficient safeguards for minors and misled the public about platform safety. The six-week trial featured evidence from internal company discussions and undercover investigations. Singer asserted that algorithmic design choices prioritize engagement over safety. “Meta could choose to program its algorithm to get better at safety, to get better at integrity, to get better at things that keep kids safe,” she told jurors. She characterized Meta’s added safety features as akin to “adding a filter to a cigarette,” arguing they don’t address the product’s fundamental risks.

Both juries reviewed comparable evidence, including testimony from ex-Meta employees about internal worries over platform safeguards, strategies to attract young users, and knowledge of potential harms. Singer claimed Meta ignored obvious signs of under-13 users, citing a principal who wrote to Instagram’s head stating nearly all her students were on the app.

State investigators presented evidence from operations that led to three arrests. Using decoy accounts posing as minors, they were inundated with adult friend requests and explicit chats, even after stating their age. The state contends Meta’s systems flagged these violating accounts repeatedly but only suspended them after arrests were made public.

In Meta’s closing statement, attorney Kevin Huff countered that the company transparently disclosed its safety systems’ limitations and acted decisively. He accused the state of focusing on a “small amount of bad content” and using “cherry-picked” statements. “We believe the evidence has shown that Meta works incredibly hard to protect users including teens,” Huff said. He also challenged the investigation’s methods, arguing decoy accounts using “hacked and stolen” or AI-generated images did not replicate an authentic teen experience.

Singer rebutted these claims, clarifying that one decoy used an age-regressed photo of an investigator and another used an AI-generated image. She argued that Meta’s attempt to question the investigation’s integrity was audacious given the scale of the alleged failures, including the company’s not detecting that a decoy 13-year-old account was contacted by sex offenders.

A central legal obstacle for the plaintiffs is Section 230, which typically shields platforms from liability for third-party content. Singer clarified the state’s case is not about the content itself but about Meta’s “misrepresentations” regarding what it knew about harmful content on its platforms. Huff repeatedly directed the jury’s attention to Section 230, contending the misrepresentation claim “doesn’t even get out of the starting gate.”

Singer urged the jury to impose maximum civil penalties if they find Meta willfully misled the public and engaged in “unconscionable trade practices.” If every teen user in New Mexico were awarded the maximum $5,000 for not being properly informed of risks, the total could surpass $2 billion.

Huff argued the state provided “zero evidence” that teens use Instagram due to a lack of risk awareness and called the user calculation “a fake number.” He stated there was no proof any teen in the state saw the alleged misstatements, and therefore no basis for any penalty. The state attorney countered that the user count originated from Meta’s own data.

(Source: The Verge)
