Whistleblowers Allege Meta Suppressed Research on Children’s Safety in VR
Four current and former employees of Meta have disclosed documents to Congress, claiming the company suppressed internal research on children’s safety, according to a report by The Washington Post. These allegations come amid ongoing scrutiny of Meta’s handling of youth protection across its platforms.
Policy Changes Following Instagram Teen Mental Health Leak
The whistleblowers assert that six weeks after Frances Haugen’s 2021 leak exposed Meta’s internal research linking Instagram to mental health harms among teen girls, the company revised its policies governing research on sensitive topics such as politics, children, gender, race, and harassment. These changes reportedly limited researchers’ ability to discuss findings openly, encouraging them to involve legal counsel so that communications would be shielded by attorney-client privilege, and to write reports in vaguer language that avoided terms like “not compliant” or “illegal.”
Allegations of Suppression in Meta’s Virtual Reality Research
Jason Sattizahn, a former Meta virtual reality researcher, told The Washington Post that he was ordered to delete recordings of an interview in which a teenager described how his ten-year-old brother had been sexually propositioned on Meta’s Horizon Worlds platform. The whistleblowers claim their documents reveal a broader pattern of discouraging research into, and discussion of, how children under 13 use Meta’s VR applications.
Meta responded to TechCrunch by emphasizing its compliance with global privacy regulations, which require it to delete data collected from minors under 13 without verifiable parental consent. The company also stated that since early 2022 it has approved nearly 180 Reality Labs studies on social issues, including youth safety and well-being, and dismissed the whistleblowers’ examples as isolated and misleading.
Related Legal Actions Highlight Ongoing Safety Concerns
In a separate lawsuit filed in February, Kelly Stonelake, who spent 15 years at Meta, raised similar concerns. She alleged that while she led go-to-market strategies for bringing Horizon Worlds to teenagers and international markets, the app lacked adequate safeguards to keep out users under 13 and suffered persistent problems with racial harassment. The lawsuit claims leadership was aware that users with Black avatars faced racial slurs within an average of 34 seconds on the platform.
Stonelake has also filed lawsuits against Meta alleging sexual harassment and gender discrimination.
Broader Criticism of Meta’s Child Safety Measures
Beyond VR, Meta faces criticism over other products that affect minors. Reuters reported last month that Meta’s AI chatbot guidelines previously permitted “romantic or sensual” conversations with children, deepening concerns about the company’s approach to youth safety.
FinOracleAI — Market View
The whistleblower allegations add to the regulatory and reputational pressures on Meta, particularly as it works to expand its VR platforms among younger users. While Meta denies any systematic suppression, ongoing lawsuits and public scrutiny could invite further regulatory intervention and erode user trust.
Investors should monitor developments in child safety regulation and Meta’s responses, as these issues pose risks to adoption of its VR products and to its broader corporate reputation. The company’s ability to demonstrate robust safety measures will be critical to sustaining growth in its metaverse ambitions.
Impact: negative