Court filings accuse Meta of hiding crucial evidence

- US lawsuit accuses Meta of hiding internal findings linking Facebook and Instagram to mental health harms
- Allegations that Meta halted a research project showing users' mental health improved after they stopped using its platforms
US court documents accuse Meta of halting internal research after it found a causal link between Facebook and Instagram use and worsening mental health among users. The allegations are part of a class action filed in the United States by school districts against Meta and other social media platforms.
According to the filings, Meta researchers conducted an internal study in 2020 called “Project Mercury” to measure the effect of deactivating Facebook and Instagram accounts for a set period.
The results showed that users who stopped using the platforms for a week reported lower levels of depression, anxiety, loneliness, and social comparison.
The lawsuit alleges that instead of publishing the findings or continuing the research, the company halted the project, claiming the study had been influenced by the “negative media narrative” surrounding Meta.
The complaint adds that, despite having internal evidence of harmful effects, Meta told Congress it lacked the ability to determine whether its platforms harmed teenage girls.
In a statement, Meta spokesperson Andy Stone said the study was canceled due to “methodological flaws,” asserting that the company has worked for years to improve user safety, especially for young people.
The lawsuit, which also names Google, TikTok, and Snapchat, accuses the companies of concealing internally known risks from users, parents, and educators, including implicitly allowing children under 13 to use their platforms, failing to address child exploitation content, and promoting teen engagement during school hours.
The court documents allege that TikTok sought to influence child-focused organizations through sponsorships and funding. Internal Meta files cited in the filings are said to show that the company designed ineffective safety tools, blocked testing of new protection features, continued to expose teens to harmful content, ignored internal warnings, and did not prioritize child safety.
Meta has denied the allegations and defended the effectiveness of its safety measures. A court hearing is scheduled for January 26 in the US District Court for the Northern District of California.
