Facebook owner Meta Platforms Inc. should have an “ethical and moral obligation” to establish independent oversight and follow high scientific standards in its research on young people’s mental health, says a global group of psychology and social media researchers.
In an open letter to the company released late Sunday, professors and researchers from Britain, the United States, Canada and elsewhere argue that the secrecy with which Meta conducts its internal research on mental health is “misguided” and “in its present state, doomed to fail.”
But they further argue that, if the company followed accepted standards and ethical procedures, “data collected by Meta could inform how we understand digital technology use and its influence on mental health in unprecedented ways.”
Better data and research from the platform with independent oversight, they suggest, would help the world understand digital life much more clearly. “What we’ve been able to gather from the outside is just very limited,” said Andrea Howard, an associate professor focused on emotion and addictions development at Carleton University in Ottawa, and one of the contributors to the letter.
“It’s very trendy to talk about the digital age ruining our youth. But there simply isn’t good evidence that it’s true.”
Meta’s Facebook, Instagram and WhatsApp platforms, however, hold plenty of data that could provide clearer evidence of the true effects of time spent using apps. Dr. Howard and the letter’s co-authors argue that sharing such data, in ways that protect users’ privacy, could enhance not just global research but perhaps even Meta’s own reputation.
The letter was released after revelations in recent months from whistle-blower Frances Haugen, who has provided journalists, politicians and regulators with a series of internal documents and studies about the company. Those documents, first obtained by The Wall Street Journal, included internal research suggesting those services could harm the mental health of adolescents and teens, and acknowledging that they can worsen body-image issues for teen girls.
Yet when Dr. Howard looked at the leaked research on youths’ body images, she found it to be “extremely low-quality.” The way questions were asked may have misled users, she said, leading to inaccurate findings. This research was one of the major reasons, she said, that the co-authors decided to release the letter.
Asked about the letter, Meta spokesperson Lisa Laventure did not comment on its content as described by The Globe and Mail, or say whether the company plans to follow its recommendations.
“This is an industrywide challenge,” she said in an e-mailed statement. “A survey from just last month suggested that more U.S. teens are using TikTok and YouTube than Instagram or Facebook, which is why we need an industrywide effort to understand the role of social media in young people’s lives.”
The survey Meta referred to was released last month by the market research company Forrester.
The letter’s contributors were largely marshalled by Andrew Przybylski, an experimental psychologist and director of research at the Oxford Internet Institute in Britain. The letter argues that Meta should keep its research and methodologies open to scrutiny by independent experts to ensure the results are trustworthy and follow acceptable standards of data collection. This could further engender public trust in the company, the authors write.
“You and your organizations have an ethical and moral obligation to align your internal research on children and adolescents with established standards for evidence in mental-health science,” the letter says.
Because the company has access to such a massive trove of user data, the authors also say that combining the company’s research with long-term studies following young people’s mental health over time could help the world better understand how technology affects them. They add that Meta’s data are truly global, whereas much academic research on youth mental health is not – meaning that more transparent research would also help scientists get a clearer picture of how technology affects mental health worldwide, both for good and bad.
In 2018, when the company was called Facebook, it established an independent “oversight board” to oversee significant decisions on content moderation – such as whether to keep former U.S. president Donald Trump suspended from its platforms after his incitement of the Capitol riots earlier this year. (The board agreed to uphold the ban for a six-month period, and the company later extended the ban until early 2023.)
The psychology professors said in their letter that Meta could extend this model to provide scientific oversight to the company’s research in the form of an independent, global trust. The trust could focus on mental-health science for young people, they write, both partnering with existing researchers and expanding research capacity in regions where little currently exists.