The whistleblower behind the leak of an enormous cache of Facebook documents to the Wall Street Journal, Frances Haugen, went public on 60 Minutes on Sunday, revealing more of the inner workings of the most powerful social media platform in the world. Revealing her identity on national television, Haugen described a company so committed to product optimization that it embraced algorithms that amplify hate speech.
“It’s paying for its profits with our safety,” Haugen told 60 Minutes host Scott Pelley.
According to a since-deleted LinkedIn profile, Haugen was a product manager at Facebook assigned to the Civic Integrity group. She chose to leave the company in 2021 after the group was dissolved. She said she didn’t “trust that they’re willing to invest what actually needs to be invested to keep Facebook from being dangerous.”
Consequently, she leaked a cache of internal research to the SEC in the hopes of driving better regulation of the company. She noted that she had worked at a number of companies, including Google and Pinterest, but that “it was substantially worse at Facebook” due to the company’s desire to put its profits over the welfare of its users.
“There was conflict… between what was good for the public and what was good for Facebook,” Haugen told Pelley, “and Facebook chose over and over again to optimize for its own interests — like making more money.”
While the company repeatedly claims it is helping stop hate speech, at least on its own products, one internal Facebook document leaked by Haugen says, “We estimate that we may action as little as 3-5% of hate and ~0.6% of V&I [Violence and Incitement] on Facebook despite being the best in the world at it.”
Another document was even more blunt. “We have evidence from a variety of sources that hate speech, divisive political speech, and misinformation on Facebook and the family of apps are affecting societies around the world.”
Haugen claims the root of the problem is the algorithms rolled out in 2018 that govern what you see on the platform. According to her, they are meant to drive engagement, and the company has found that the best engagement is the kind that instills fear and hate in users. “It’s easier to inspire people to anger than it is to other emotions,” Haugen said.
At the time, Mark Zuckerberg presented the algorithm changes as positive. “We feel a responsibility to make sure our services aren’t just fun to use, but also good for people’s well-being.”
But according to the Wall Street Journal’s reporting on Haugen’s concerns, the result was a sharp turn towards anger and hate. “Misinformation, toxicity, and violent content are inordinately prevalent among reshares,” said one internal memo quoted by the Journal, assessing the effects of the change.
The Wall Street Journal began publishing its findings from the cache under the name “The Facebook Files” in September. One report alleging Facebook had research proving Instagram harmed teenage girls has since led to a Congressional hearing. Ahead of the hearing, Facebook attempted to change the narrative in a blog post, which reproduced two of the reports referred to in the Journal’s reporting.
Ahead of the 60 Minutes report, Facebook attempted the same deflections in a different form. Facebook Vice President of Global Affairs Nick Clegg appeared on CNN’s Reliable Sources to defend the company on Sunday afternoon, just hours before Haugen’s interview aired.
“I think that’s ludicrous,” Clegg said of the allegation that social media was responsible for the January 6 riots. “I think it gives people false comfort to assume that there must be a technological, or technical, explanation for the issues of political polarization in the United States.”
Haugen ended the interview by calling for regulation of social networks more broadly, something Facebook itself has called for in more limited form. She is scheduled to appear before a Senate Commerce panel on Tuesday.