Unredacted court filings in a class action lawsuit brought by U.S. school districts against Meta and other social media giants allege that Meta shut down internal research after finding causal evidence that its products, Facebook and Instagram, harmed users’ mental health.
According to internal Meta documents obtained via discovery, a 2020 research project code-named “Project Mercury” worked with the survey firm Nielsen to gauge the effect of deactivating the platforms. The internal documents reportedly showed that, to the company’s disappointment, “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison.”
Rather than publishing these findings or continuing research, the filing alleges, Meta halted further work and internally declared that the negative findings were tainted by the “existing media narrative” surrounding the company.
However, staff privately assured Nick Clegg, Meta’s then-head of global public policy, that the conclusions were valid. One unnamed staff researcher allegedly wrote, “The Nielsen study does show causal impact on social comparison.” Another staffer reportedly worried that suppressing the findings was akin to the tobacco industry “doing research and knowing cigs were bad and then keeping that info to themselves.” Despite this internal documentation of a causal link to negative mental health effects, the filing claims Meta told Congress it could not quantify whether its products harmed teenage girls.
In a statement on Saturday, Meta spokesman Andy Stone disputed the allegations, asserting the study was stopped due to flawed methodology and that the company works diligently to improve safety. “We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions,” Stone said.
Plaintiffs Allege Intentional Concealment
The lawsuit, filed by Motley Rice on behalf of school districts across the U.S. against Meta, Google, TikTok, and Snapchat, broadly alleges that the companies intentionally concealed internally recognized risks of their products from users, parents, and teachers.
Allegations against Meta and its rivals include encouraging use by children under 13, failing to address child sexual abuse content, and seeking to expand use by teens even while they are at school. The plaintiffs also accuse the platforms of attempting to pay child-focused organizations to publicly vouch for the safety of their products. The filing points to an instance in which TikTok sponsored the National PTA, with internal officials boasting about their ability to influence the organization and stating that the PTA would “do whatever we want going forward.”
The internal documents cited in the filing make several specific claims against Meta:
Meta allegedly allowed accounts to be caught 17 times attempting to traffic people for sex before removing them, a policy described internally as a “very, very, very high strike threshold.”
Meta reportedly recognized that optimizing products for teen engagement resulted in serving them more harmful content but proceeded anyway.
In a 2021 text message, Mark Zuckerberg allegedly said he would not claim child safety was his top concern “when I have a number of other areas I’m more focused on like building the metaverse.”
Stone responded to these specific claims, stating the company’s teen safety measures are effective and that its current policy is to remove accounts as soon as they are flagged for sex trafficking. He maintained that the suit misrepresents the company’s safety work.
The underlying Meta documents are not public, and Meta has filed a motion to strike them from the court record, though Stone clarified that the objection concerns only the scope of what the plaintiffs are seeking to unseal. A hearing on the filing is scheduled for January 26 in the U.S. District Court for the Northern District of California. (Reuters & DDN)