Facebook’s Oversight Board, an independent body that reviews Facebook moderation decisions, has accepted its first cases. The six appeals involve content removed under Facebook’s hate speech rules, nudity ban, and misinformation policies. They’re now open for seven days of public comment, after which the board will determine whether the posts should have been removed.
Most of the cases involve users outside the US posting non-English content — a known weak point for Facebook moderation — and at least two hinge on the nuance of someone publishing hate content to implicitly criticize it. One user posted screenshots of offensive tweets from former Malaysian Prime Minister Mahathir Mohamad, for instance, allegedly to raise awareness of “horrible words.” In another case, a user shared a quote attributed to Joseph Goebbels, then appealed the removal by arguing they were comparing Goebbels’s words to a “fascist model” in US politics.
Each case will be referred to a five-member panel that includes one person from the same region as the original content. These panels will make their decisions — and Facebook will act on them — within 90 days. The oversight board, whose first members were announced in May, includes digital rights activists and former European Court of Human Rights judge András Sajó. Their decisions will be informed by public comments.
Five of the six cases were submitted by users, who have appealed over 20,000 decisions since the option opened in October. The sixth was referred by Facebook itself and deals with coronavirus-related misinformation — one of the platform’s touchiest subjects. Moderators removed a video that criticized French health officials for not authorizing the unproven COVID-19 treatment hydroxychloroquine, which the video inaccurately referred to as a “cure.” The company later submitted it as “an example of the challenges faced when addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic.”
Facebook CEO Mark Zuckerberg has compared the Oversight Board to a Supreme Court for Facebook. It’s supposed to offer a fair appeals process for users who get their content removed — something that often feels missing on social networks, especially as they take stricter steps to remove false information or offensive speech. At the same time, it eases the pressure on Facebook to make moderation calls. The pandemic video case, for instance, will set an independently decided precedent for when Facebook removes similar content in the future.
The Oversight Board — similar to the US Supreme Court — is largely supposed to interpret policies, not make new ones. Facebook has said it may also turn to the board for policy recommendations in the future, however.
Many of Facebook’s problems involve the speed and scale of content moderation, not the exact nuances of interpreting its policies. The Oversight Board obviously can’t hear every appeal, and we don’t know exactly how rank-and-file moderators will apply its rulings to everyday decisions. But it’s the start of a long-awaited experiment in managing Facebook (a little) more like a government.