
Meta’s Child Safety Leak: A Turn for the Worse

Summary

– Former Meta researchers testified that children under 13 routinely use Meta’s VR platforms despite official age restrictions, exposing them to risks such as predators.
– Whistleblowers alleged Meta suppressed internal research on harms to children, with lawyers discouraging data collection to avoid legal liability and creating a “legal surveillance” system.
– Meta disputed the claims, stating it has approved numerous studies on youth safety, but critics called this misleading due to alleged manipulation of research methods and findings.
– The company reportedly responded to past whistleblowing by tightening internal controls on research rather than addressing safety concerns, further limiting transparency.
– Although the Kids Online Safety Act passed the Senate in response to these issues, the bill has stalled in the House, leaving regulatory efforts incomplete.

The recent Senate hearing on child safety in virtual reality has cast a stark new light on Meta’s internal practices, revealing what whistleblowers describe as a deliberate corporate strategy to conceal evidence of harm to young users. Former Meta user experience researcher Cayce Savage testified that the company has shifted from addressing known risks to actively suppressing research that might expose its platforms’ dangers to children.

Savage and another former researcher, Jason Sattizahn, appeared before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law, building on a Washington Post report that Meta used its legal team to intimidate researchers and bury findings related to child safety in VR. Lawmakers expressed frustration that despite earlier promises of reform, the company appears to have refined its methods of evasion rather than improving protections for minors.

Both witnesses described an environment where children under 13 routinely access Meta’s VR social platforms, despite official age restrictions. Savage emphasized the uniquely invasive risks posed by immersive technology, noting, “In VR, someone can stand behind your child and whisper in their ear, and your child will feel their presence as though it’s real.” She stressed that because VR tracks physical movement, harmful interactions in virtual spaces can mirror real-world violations.

According to their testimony, Meta’s legal department systematically discouraged, and at times explicitly forbade, research into youth safety. Sattizahn stated that attorneys monitored studies from their inception, restricting topics, questions, and methodologies to avoid creating a “paper trail” that could establish legal liability. He recalled being told to delete data related to emotional and psychological harm because possessing such information would prove that Meta was aware of the dangers.

A Meta spokesperson responded to earlier reporting by claiming the company had “approved nearly 180 Reality Labs-related studies on social issues, including youth safety.” Sattizahn dismissed this as misleading, calling it a “lie by avoidance,” and insisted that relevant research is “being pruned and manipulated” to align with corporate interests rather than scientific integrity.

This pattern echoes the aftermath of the 2021 disclosures by Frances Haugen, which exposed internal studies showing Instagram’s harmful impact on teen mental health. Rather than reforming its practices, Savage argued, Meta learned to stop generating incriminating documents altogether. Senator Richard Blumenthal concurred, noting that the company had “learned the wrong lesson.”

The whistleblowers expressed concern that CEO Mark Zuckerberg is fully aware of these issues. Savage remarked, “The only way that he would not be aware is if he had never used his own headset.”

Despite heightened congressional attention following Haugen’s revelations, legislative progress has stalled. The Kids Online Safety Act (KOSA), introduced in 2022, passed the Senate with overwhelming bipartisan support but never reached a House vote. Advocates like Maurine Molak, whose son died by suicide after experiencing cyberbullying, continue to press for action. Molak reaffirmed her commitment after Senator Ted Cruz pledged to continue championing the bill.

Savage admitted she hesitated before coming forward, fearing retaliation against former colleagues and the broader field of user research at Meta. “Meta responded to Frances Haugen’s disclosure in 2021 by cracking down on research internally,” she explained. “Researchers were subjected to sudden censorship under the guise of protection.” Her testimony underscores a troubling corporate culture that prioritizes legal safety over the well-being of its youngest users.

(Source: The Verge)

Topics

whistleblower testimony, child safety, research suppression, tech policy, VR dangers, legal liability, Senate hearings, internal research, KOSA legislation, corporate accountability