Facebook reportedly fielded complaints from political parties saying a major News Feed change pushed them toward negative, polarizing posts. Today, The Wall Street Journal published leaked internal Facebook reports written after the company boosted “meaningful social interactions” on the platform. While Facebook framed the move as helping friends connect, the internal reports said it had “unhealthy side effects on important slices of public content, such as politics and news,” calling these effects an “increasing liability.”
The news is part of a larger Wall Street Journal series based on internal Facebook research. Today’s report delves into the fallout of a 2018 decision to prioritize posts with lots of comments and reactions. Facebook allegedly made the change after noticing that comments, likes, and reshares had declined throughout 2017 — something it attributed partly to people viewing more professionally produced video. Publicly, CEO Mark Zuckerberg described it as a way to increase “time well spent” with friends and family instead of passive video consumption.
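The leaked reports don’t disclose the actual ranking formula, but the 2018 change can be pictured as a weighted engagement score in which comments and reshares count far more than passive signals. Below is a minimal, purely illustrative sketch in Python; the `Post` fields, the `msi_score` function, and every weight are hypothetical stand-ins, not Facebook’s real system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    comments: int
    reshares: int

# Hypothetical weights for illustration only. The leaked reports describe
# comments, reactions, and reshares being rewarded heavily, but Facebook's
# actual coefficients are not public.
WEIGHTS = {"likes": 1.0, "comments": 15.0, "reshares": 30.0}

def msi_score(post: Post) -> float:
    """Toy 'meaningful social interactions' score: a weighted sum of
    engagement signals that favors posts drawing comments and reshares."""
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["reshares"] * post.reshares)

# A post that provokes comment threads and reshares outranks one that is
# merely liked, which is the dynamic the internal research flagged.
calm = Post(likes=200, comments=5, reshares=2)
angry = Post(likes=50, comments=120, reshares=40)
assert msi_score(angry) > msi_score(calm)
```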
After the change, internal research found mixed results. Daily active users increased, and users found content shared by close connections more “meaningful,” but reshared content (which the change rewarded) contained “inordinate” levels of “misinformation, toxicity, and violent content.” People tended to comment on and share controversial posts, and in the process they apparently made Facebook an angrier place overall.
One report flagged concerns from unnamed political parties in the European Union, including one in Poland. “Research conducted in the EU reveals that political parties ‘feel strongly that the change to the algorithm has forced them to skew negative in their communications on Facebook, with the downstream effect of leading them into more extreme policy positions,’” it says. Facebook apparently heard similar concerns from parties in Taiwan and India.
In Poland, “one party’s social media management team estimates that they have shifted the proportion of their posts from 50/50 positive/negative to 80 percent negative, explicitly as a function of the change to the algorithm.” And “many parties, including those that have shifted strongly to the negative, worry about the long-term effects on democracy.”
News publishers, frequent victims of Facebook’s algorithm tweaks, unsurprisingly also weren’t happy with the change. One report noted that BuzzFeed CEO Jonah Peretti complained the change promoted things like “junky science” and racially divisive content.
Facebook frequently tweaks the News Feed to promote different types of content, often in clear response to public concern as well as financial considerations. (The “time well spent” movement, for instance, had harshly stigmatized “mindless scrolling” on social media.) Facebook engineering VP Lars Backstrom told the Journal that “like any optimization, there’s going to be some ways that it gets exploited or taken advantage of.”
But the Journal writes that when Facebook’s researchers proposed fixes, Zuckerberg was hesitant to implement them if they threatened to reduce user engagement. Ultimately, Facebook did reduce the weight of commenting and sharing in the News Feed algorithm, putting more emphasis on what people actually said they wanted to see.
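The Journal doesn’t publish the revised formula either, but the fix it describes amounts to shrinking the engagement weights and blending in a signal for what users say they want to see. Here is a hedged sketch of that idea; the `blended_score` function, its inputs, and its weights are invented for illustration, not drawn from the reports.

```python
def blended_score(engagement_score: float, survey_score: float,
                  engagement_weight: float = 0.3,
                  survey_weight: float = 0.7) -> float:
    """Toy re-ranking: down-weight raw engagement (comments, reshares)
    and lean more on direct user feedback about what they want to see.
    Both inputs are assumed pre-normalized to comparable scales; all
    weights are illustrative guesses, not Facebook's."""
    return engagement_weight * engagement_score + survey_weight * survey_score

# Example: a provocative post with high engagement but low stated appeal
# now ranks below a post users say they actually want to see.
provocative = blended_score(engagement_score=0.9, survey_score=0.2)
wanted = blended_score(engagement_score=0.4, survey_score=0.9)
assert wanted > provocative
```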