Toward the end of Thursday’s hearing on Facebook and teen mental health, Senator Dan Sullivan (R-Alaska) alluded to the Chinese government’s recent decision to impose strict limits on kids’ video game time. “They have told teenagers to take a real break,” he said, addressing Facebook’s head of global safety, Antigone Davis. “Do you think the United States government needs to look at doing something like that?”
The moment was revealing. A law like China’s would be unthinkably draconian in the US. Yet Sullivan seemed almost wistful. Can you imagine? A country that actually regulates its technology sector?
You can see where Sullivan was coming from, because we sure don’t live in that country. Congress has been hauling in Facebook executives to testify since early 2018, at the height of the Cambridge Analytica scandal. In those three and a half years, it has passed precisely zero laws significantly regulating the conduct of social media platforms. Instead, with some notable exceptions, it tends to do what it spent most of the latest hearing doing: browbeating the companies into fixing things themselves.
Thursday’s hearing was prompted by a series in The Wall Street Journal based on a trove of leaked internal research, and one story in particular: “Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show.” The hearing was styled as a cross between Watergate—what did Facebook know, and when did it know it?—and the corporate exposés of yesteryear. In his opening remarks, Senator Richard Blumenthal (D-Connecticut) accused the company of hiding its own research and lying about what it knows. “Facebook,” he declared, “has taken Big Tobacco’s playbook.”
In fact, the research about teen mental health was hardly revelatory. Facebook sits on a vast store of data about its recommendation algorithms, policy enforcement, and user behavior that is inaccessible to outside researchers, and some of the documents leaked to the Journal appear to contain just that sort of data. One article in the series revealed that millions of people around the world are subject to white-glove enforcement under Facebook’s “XCheck” system, which has allowed high-profile users to get away with flagrant violations of the platform’s policies. Another described researchers’ findings that certain changes to the News Feed algorithm had inadvertently rewarded “misinformation, toxicity, and violent content,” and that Mark Zuckerberg had resisted fixing the problem. Yet another provided horrifying detail about Facebook’s underinvestment in platform safety outside the US, a choice that potentially affects 90 percent of the company’s 3 billion users.
That’s the kind of internal research that provides fresh insights into Facebook’s effect on the world. The teen mental health research, not so much. The documents, which the Journal made public the night before the hearing, aren’t based on data that only Facebook has access to. The company simply asked teens how they think its product affects them. That’s something anyone can do, and something that has indeed been done too many times to count. It’s also not very revealing. While the Journal’s headline stems from troubling statistics (most notably, one-third of teen girls who struggle with body image issues said Instagram makes those issues worse), the top-line finding of one document is that most teens say Instagram improves their mental health. Either way, people’s subjective accounts of their own experiences are unreliable, and many of the teens surveyed were surely aware of the argument that Instagram is bad for them, which may have colored their answers. As Robbie Gonzalez noted for WIRED in 2018, even big-picture correlations between social media use and mental health outcomes don’t prove anything about causality.