
The documents reveal that in 2020, under an initiative code-named Project Mercury, Meta researchers partnered with Nielsen to study the effects of temporarily “deactivating” Facebook and Instagram accounts.
To Meta’s surprise, internal files show that users who stayed off Facebook for a week reported significantly lower levels of depression, anxiety, loneliness, and social comparison—outcomes that contradicted Meta’s public stance.
Instead of releasing the findings or commissioning further research, the company shut the project down, claiming the results were influenced by the prevailing “negative media narrative” surrounding social media.
Privately, however, Meta employees told Nick Clegg, then the company’s global policy chief, that the study’s conclusions were solid.
One researcher reportedly wrote that the Nielsen study “does show causal impact on social comparison,” followed by an unhappy emoji.
Another staff member warned internally that burying the results would be comparable to the tobacco industry hiding evidence of the harms of smoking.
Despite having internal evidence of mental health risks—particularly for teens—the filings say Meta went on to tell U.S. lawmakers that it had no way of measuring whether its platforms harmed young girls.
In response, Meta spokesperson Andy Stone said Saturday that the study was discontinued due to flawed methodology and insisted the company has taken extensive steps to improve user safety.
“For over a decade, we have listened to parents, researched the issues that matter most, and made significant changes to protect teens,” Stone said.
'Hidden product risks'
The allegation of Meta burying evidence of social media harms is just one of many in a late Friday filing by Motley Rice, a law firm suing Meta, Google, TikTok and Snapchat on behalf of school districts around the country.
Broadly, the plaintiffs argue the companies have intentionally hidden the internally recognised risks of their products from users, parents and teachers.
TikTok, Google and Snapchat did not immediately respond to a request for comment.
Allegations against Meta and its rivals include tacitly encouraging children below the age of 13 to use their platforms, failing to address child sexual abuse content and seeking to expand the use of social media products by teenagers while they were at school.
The plaintiffs also allege that the platforms attempted to pay child-focused organisations to defend the safety of their products in public.
In one instance, TikTok sponsored the National PTA and then internally boasted about its ability to influence the child-focused organisation.
Per the filing, TikTok officials said the PTA would "do whatever we want going forward in the fall […] they'll announce things publicly, their CEO will do press statements for us".
By and large, however, the allegations against the other social media platforms are less detailed than those against Meta. The internal documents cited by the plaintiffs allege:
Meta intentionally designed its youth safety features to be ineffective and rarely used, and blocked testing of safety features that it feared might be harmful to growth.
Meta required users to be caught 17 times attempting to traffic people for sex before it would remove them from its platform, which a document described as "a very, very, very high strike threshold".
Meta recognised that optimising its products to increase teen engagement resulted in serving them more harmful content, but did so anyway.
Meta stalled internal efforts to prevent child predators from contacting minors for years due to growth concerns, and pressured safety staff to circulate arguments justifying its decision not to act.
In a text message in 2021, Mark Zuckerberg said that he wouldn’t say that child safety was his top concern "when I have a number of other areas I'm more focused on like building the metaverse". Zuckerberg also shot down or ignored requests by Clegg to better fund child safety work.
Meta's Stone disputed these allegations, saying the company's teen safety measures are effective and that the company's current policy is to remove accounts as soon as they are flagged for sex trafficking.
He said the suit misrepresents its efforts to build safety features for teens and parents, and called its safety work "broadly effective".
"We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions," Stone said.
The underlying Meta documents cited in the filing are not public, and Meta has filed a motion to strike the documents.
Stone said the objection was to the over-broad scope of what the plaintiffs are seeking to unseal, not to unsealing in its entirety.
A hearing regarding the filing is set for January 26 in the U.S. District Court for the Northern District of California.