The fallout from The Wall Street Journal’s Facebook Files series continues. On Sunday, the company published a point-by-point rebuttal to the Journal’s story on Instagram’s effects on teenage girls — and then on Monday morning, the company said it would “pause” plans to build Instagram Kids while it consults with more outside groups. A Senate hearing looms on Thursday.
I spent the weekend talking to people in and around Facebook about the situation, and today I want to talk about what I think the company ought to do.
Inside Facebook, some people I’ve spoken with are feeling exasperated. They argue that the Journal series uses relatively few data points to paint Facebook in the worst possible light. To them, it’s more evidence of bias from a press working to bring the company to its knees, reaching predetermined conclusions with whatever scraps of information they can find.
For others, though, particularly those who have worked on research and integrity initiatives, the Facebook Files have been a welcome opportunity to discuss their greatest fear: that even when researchers surface their most worrisome findings, Facebook lacks the organizational structure and leadership necessary to act on them and head off a wide range of avoidable harms.
Last week I said this situation represents Facebook’s most significant challenge since the Cambridge Analytica data privacy scandal. It’s not as big as Cambridge Analytica; the Journal series has gotten less coverage overall. (Though that Senate hearing means the balance will continue to shift.) But if another story has generated a news cycle this intense or sustained since 2018, it’s news to me.
In the internal divisions over the Facebook Files, though, I find another echo of Cambridge Analytica. Then, too, there was a set of executives determined to fight back against what they perceived as an almost entirely bogus narrative — and another set that, while mostly in agreement with their peers, understood that the story had raised real fears about the company’s power and influence that would have to be addressed.
Last week I argued here that Facebook ought to address this situation by committing to doing more research like that found in the Facebook Files, rather than less. We know Facebook executives believe that the company has positive overall benefits for the world, and we also know that they are meticulous students of their own data. It’s hard to understand why, if the data is so positive, Facebook is often so reluctant to share it.
So why is that the case? One possibility is suggested by the Facebook Files: that the data about Facebook’s effects on societal issues like polarization, vaccine hesitancy, and children’s self-esteem are substantially negative, and must therefore be hidden. Another is that the data is substantially positive but must be hidden anyway, whether out of run-of-the-mill corporate secrecy or a desire to deploy it more strategically for public-relations purposes.
Whatever the case, it seems clear that the current state of affairs is making everyone miserable. So today I want to expand my argument: Not only should Facebook commit to doing more research like the Facebook Files, it should release the Facebook Files, period. And not just the Instagram-related ones, as Nick Clegg suggested Monday. Whatever documents the Journal relied on, Facebook should make them publicly available. Redact them as needed to protect users’ privacy. Add context where context is missing.
But release them, and soon.
Here’s my rationale.
One, the files are in the public interest. Among other things, according to the Journal, they contain discussions of political parties that changed their policies in response to changes in Facebook’s algorithms, they document negative effects of using Instagram on mental health, and they reveal that the company devotes vastly more moderation resources to the United States than to the rest of the world. On these subjects and more, the public simply has a right to know what Facebook does. One frustration I’ve had over the past week is that Facebook continues to focus on the public-relations dimension of the story, when the public interest is much more important.
Two, the files will likely come out soon anyway: the whistleblower who leaked them to the Journal is apparently cooperating with Congress. Copies were shown in advance of publication to various researchers. The Journal may yet release them itself (I wish it would). In any case, it seems likely that they are going to be available for all of us to read soon. Facebook could generate some (admittedly minor) amount of goodwill by doing it voluntarily. (Company spokesman Andy Stone told me the company is sharing the decks with Congress this week.)
Three, Facebook’s primary complaint about the series is that reporters allegedly took key points out of context. The only way to credibly make that charge is to provide people with the full context. It’s not enough for the company’s head of research to describe one set of slides; to have an honest conversation about all this, we should all be looking at the same set of documents. If, as Facebook says, the majority of the research shows benign or even positive effects, the company has all the more reason to want us to read the documents in full.
To be sure, the people inside Facebook arguing against the documents’ release have compelling points on their side, too. As soon as the files are made public, every tech reporter on earth will scour them in an effort to find angles that the Journal missed, extending the life of the story and perhaps even worsening the damage. Even if there are positive angles to be found within the data, there’s no guarantee that reporters will actually write them. And a narrow-minded focus on these documents crowds out a larger and equally important discussion of why we aren’t demanding similar research from YouTube, Twitter, TikTok, and all the rest.
Moreover, the company was taken aback by the largely negative response that its Sunday night blog post received, I’m told. (I was one of the people responding negatively. So was Samidh Chakrabarti, Facebook’s recently departed head of civic integrity efforts, who pointed out that the blog post would have been more credible if it had been signed by the actual researchers who did the analysis.)
I’d find blog posts like this much more persuasive if they were co-authored by the individual researchers who actually did the analysis. That would be a signal that they are willing to stand behind the comms team’s characterization of their work. https://t.co/2sO0UqedDQ
— Samidh (@samidh) September 27, 2021
The Sunday blog post by Pratiti Raychoudhury, the company’s head of research, is detailed and thoughtful in the way that it reflects on both the good and bad news in the company’s studies on how young Instagram users feel about themselves after using the app. The data is mixed, and people will draw different conclusions from it. The fact that so many critics dismissed her report out of hand, though, may have made the company reluctant to share more. If this is the response we get, the argument goes, what’s the point?
But none of these complaints is more important than the fact that sharing this data with the public is ultimately the right thing to do. And it will be better for Facebook to share it on its own terms than on Congress’.
And if Facebook really wanted to change perception, it could go a step further. Releasing the Facebook Files quickly is the company’s least-bad option. But the company knows that outside researchers will be skeptical of any findings they contain, because they can’t see the raw data. Even to the extent that the files exonerate Facebook from some criticisms, the underlying data is likely to remain under a cloud of suspicion.
That’s why, in addition to making the files public, Facebook should share the underlying data with qualified independent researchers in a privacy-preserving way. Let’s get a second, third, and fourth opinion of what the data shows about Instagram and teenagers. Given the recent revelation that political-science data shared with researchers in 2020 was fatally flawed due to a bug, an unexpected gift of important new research material could help the company rebuild trust with researchers.
Not everyone thinks this would be much of a gift: anyone can survey teens about their experiences on Instagram, after all, and among other things an independent study could recruit a larger sample. But to the extent that data in the Facebook Files can’t be easily accessed or replicated by independent researchers, Facebook should share as much as it can. The company’s efforts to share data with researchers to date have been halting and ineffectual. More transparency is coming to the platform one way or another; there’s still value in staking out a leadership position while the rest of the industry cowers.
I say release the Facebook Files for short-term goodwill, and release at least some of the data to qualified researchers for long-term credibility. Since it was founded, Facebook has relentlessly analyzed our actions and behavior, to its great benefit. However unjust it may feel today, it’s only fair that the company now take its turn under the microscope.
This column was co-published with Platformer, a daily newsletter about Big Tech and democracy.