Let’s conclude what turned out to be Free Speech Week on The Interface with a look at a case involving the co-chairman of Facebook’s new Oversight Board, a Zoom recording of his law school class, and the N-word.
Can you use a racist slur on Facebook? The answer is probably not, but also maybe. The company’s community standards prohibit “direct attacks” on people based on their race. But the company also published a blog post in 2017 laying out some of the nuances involved in deciding whether a slur is, in fact, an attack, which often depends heavily on context that goes beyond the written word.
Later this year, some of the hardest decisions about whether a post should stay up on Facebook will be made by an independent Oversight Board. The board, whose initial members were announced last month, will allow Facebook and Instagram users to appeal when they believe that their posts have been removed in error. Facebook says it will abide by the board’s decisions, and will also ask it to issue advisory opinions on emerging policy questions.
In the wake of this week’s controversy over President Trump’s Facebook posts about nationwide protests, which Twitter and Snap concluded promote violence, some asked the Oversight Board to weigh in. On Wednesday, the board published a somewhat apologetic note explaining why it couldn’t: for a bunch of reasons, it just isn’t ready.
But one of the things we know about the Oversight Board is that its initial members were selected for their commitment to free speech values. Visit the board’s website and the first message you see reads: “Ensuring respect for free expression, through independent judgment.”
In fact, one of the co-chairs of the board was involved in a speech controversy just this week. Here’s Nick Anderson in the Washington Post:
Stanford University law professor Michael W. McConnell was nearing the end of a course on the creation of the Constitution last week when he decided to read a quote attributed to Patrick Henry from an 18th-century debate in Virginia.
But first, McConnell paused the Zoom video recording, according to one of his students, who spoke on the condition of anonymity, fearing backlash. Then the professor read the statement, which he said was intended to stoke racist opposition to ratification of the Constitution.
The quote included the n-word. McConnell, who is white, resumed recording and turned to other topics, the student said.
The incident took place the same week that global protests against the recent murders of George Floyd, Breonna Taylor, and Ahmaud Arbery, among others, galvanized the country — and brought a heightened sensitivity to ongoing racist oppression in the United States.
I spoke with some Stanford law students about McConnell’s class, and they told me they found his reading of the N-word painful in part because a similar incident had taken place at the law school in November. Then, a guest lecturer from Stanford’s history department read the N-word out loud while discussing racist cigarette marketing. And just last month, another Stanford professor apologized after using a form of the N-word twice while discussing the hip-hop group N.W.A.
“There is a feeling of exasperation and people just feeling a lot of pain,” said Donovan Hicks, co-president of the Black Law Students Association at Stanford.
In his initial note to students, McConnell defended reading what he referred to as “the horrific Henry quotation” out loud: “I do not think history should be stripped of its ugliness.”
In response, the Black Law Students Association sent a letter complaining about the incident to Stanford law students and faculty. “If there is one thing black students know, it’s our own history,” the letter reads. “Ahmaud Arbery is our history. Breonna Taylor is our history. George Floyd is our history. White men refusing to stop saying ‘n—’ is our history.”
The students noted that, in the wake of the November incident, McConnell had written a note to students saying “it is hard for me to see the pedagogical purpose” of using what he called “this most extreme racial epithet known to our language.” He went on:
To my mind, political correctness as it exists in the modern university is a problem, because it can stifle discussion and silence minority views. But that does not mean that all standards of civility should be dismissed as examples of “political correctness.” The use of some terms, especially when blatant, intentional, extreme, or devoid of legitimate context, can also stifle discussion and silence minority views. We should not be quick to censure the speech of others, but we should not let worries about freedom of speech and political correctness stop us from condemning what should be condemned.
McConnell sent an email to students and faculty May 29th saying he had made the decision to read the passage with “good will,” and noted he had placed the speech in its proper historical context and condemned the use of the words. He said he would not use the word again in class, but stopped short of apologizing.
McConnell told the Post he would have no further comment. Jamal Greene, one of the other Oversight Board co-chairs, told Protocol he “might have made a different choice” but did not condemn McConnell. Issie Lapowsky writes:
McConnell’s co-chair, Jamal Greene, wrote that he has “tremendous respect for [McConnell] as a person and a scholar.”
“Striking the right tone in surfacing the ugliness of our constitutional history is a difficulty I myself have struggled with,” Greene wrote. “While I might have made a different choice in this instance, I take professor McConnell at his word that he has learned from his experience, as we all must strive to do as educators.”
This incident seems relevant to anyone looking to understand what the Oversight Board is, and how it might act. When I saw tweets this week from people begging the board to weigh in on Facebook’s decision about the Trump posts, the implication was that the board would step in and remove what Facebook would not.
In fact, Facebook has said from the beginning that initially the board will only restore posts that it concludes Facebook removed in error. Eventually the board will issue opinions on what Facebook ought to take down; one person who is closely involved told me that could come within a few months. But it’s not clear any of that will be up and running before, say, the 2020 US presidential election.
More than that, though, the McConnell incident — and his co-chairman’s reaction to it — helps us understand how the board is likely to see the world. For some vocal subset of Facebook’s user base, the primary concern is that the platform allows too much speech. The board’s initial makeup and starting assignment reflect the opposite fear: that Facebook might not be allowing enough speech.
This is, not coincidentally, something Mark Zuckerberg fears. He told employees on Tuesday:
Over time in general, we just we tend to add more policies to restrict things more and more. And I think that this, while each one is thoughtful and good and we’re articulating specific harms — and I think that’s important — I do think that expression and voice is also a thing that routinely needs to be stood up for because it has the property that, you know, when something is uniformly positive, no one argues for taking it down. It’s always only when there’s something that’s controversial. Every time there’s something that’s controversial, your instinct is, “Okay, let’s restrict a lot,” then you do end up restricting a lot of things that I think will be eventually good for everyone.
For some, the McConnell classroom incident shouldn’t even qualify as an “incident” at all — a professor simply taught history, using the language of history, while condemning it to his students. To others, though, including some of his students, McConnell failed a basic test of empathy: can you avoid using a word you know to be harmful, as a show of support to Stanford’s black community and its allies?
It all feels related to a question Facebook is being confronted with more and more — and increasingly, by its own employees. Will it be a simple mirror for society, warts and all, or will it put a thumb on the scale for progressive change — including anti-racism? Zuckerberg has long said that, given his near-total control over Facebook as a company, he wants to avoid rigging the company’s services in favor of any particular viewpoint. Instead, whenever possible, he hopes to fight bad speech with more speech.
McConnell is just one member of the board, which will eventually include 40 members. He won’t hear most, or even many, of the cases brought to the board. And perhaps, as he begins to review cases involving offensive and dangerous speech later, he’ll find reason to vote for their removal from Facebook. But until then I’ll probably find myself thinking of the decision he made last week — the moment in his Stanford classroom, with protests raging in the world around him, when he shut off his Zoom recording to leave no record of his words.
Pushback
After I wrote about yesterday’s decision by Snap to remove Trump from promotion in its Discover tab, a reader asked whether the president had violated policy. “There’s a big difference between randomly deciding something is a good idea because it’s in a news cycle versus doing it based on a general policy that people under you can do, too,” the reader wrote.
I checked in with Snap, and the company said the decision was not based on a violation of Snapchat’s community standards. Instead, the decision was based on Spiegel’s Sunday memo to his team, which said the company “cannot promote accounts in America that are linked to people who incite racial violence, whether they do so on or off our platform.”
The Ratio
Today in news that could affect public perception of the big tech platforms.
⬆️ Trending up: Apple CEO Tim Cook published a blog post to address the senseless killing of George Floyd and the long history of racism. Apple is also donating to organizations like the Equal Justice Initiative, which challenge racial injustice and mass incarceration. (Apple)
Trending sideways: Amid the pandemic, Amazon is funding research into potential COVID-19 treatments, developing its own testing capabilities, and backing a study on immunity. But these efforts aren’t doing much to quell the fears of warehouse workers. (Emily Mullin / OneZero)
Virus tracker
Total cases in the US: More than 1,872,000
Total deaths in the US: At least 108,117
Reported cases in California: 120,896
Total test results (positive and negative) in California: 2,131,294
Reported cases in New York: 379,977
Total test results (positive and negative) in New York: 2,229,473
Reported cases in New Jersey: 162,530
Total test results (positive and negative) in New Jersey: 837,420
Reported cases in Illinois: 124,279
Total test results (positive and negative) in Illinois: 959,175
Data from The New York Times. Test data from The COVID Tracking Project.
Governing
⭐ Campaign staffers on Donald Trump and Joe Biden’s presidential campaigns were targeted with phishing attacks, according to Google. The attacks came from Iran and China, respectively. Here’s Robert McMillan at The Wall Street Journal:
The attacks don’t appear to have been successful, Google, a unit of Alphabet Inc., said. The company has notified federal authorities and the targeted users of the attacks, said Shane Huntley, who runs Google’s in-house counterespionage group, known as the Threat Analysis Group.
The Biden campaign was targeted by a China-based group, known as APT 31, Mr. Huntley said. This group has been linked by security firms to the Chinese government. The Trump campaign was targeted by an Iranian group called APT 35, he said. APT stands for advanced persistent threat, a shorthand used by cybersecurity professionals for sophisticated adversaries that are backed by nation-states.
Facebook started labeling media outlets that are “wholly or partially under the editorial control of their government,” following an announcement of the policy in 2019. It will start labeling ads from these outlets later this year, and ban state-controlled media from advertising inside the US. (Adi Robertson / The Verge)
Facebook, Google, and Twitter each have their own rules that govern political advertising. Here’s how each company decides what you can and can’t see on their platform. (Patience Haggin and Emily Glazer / The Wall Street Journal)
Trump’s campaign spent $1.48 million on Google ads in the first week of May. It’s the highest weekly total of the 2020 campaign. (Eric Newcomer and Mark Bergen / Bloomberg)
A review of President Trump’s 139 tweets from Sunday, May 24th, to Saturday, May 30th, found at least 26 contained clearly false claims, underscoring the challenge for Twitter CEO Jack Dorsey in policing him. These included five about mail-in voting that were not flagged, five promoting the false conspiracy theory about Joe Scarborough and three about Twitter itself. (Linda Qiu / The New York Times)
Rumors about antifa storming Idaho to spread violence during the protests prompted residents to take up arms and stand watch. Now, local officials across the state have acknowledged that not a single participant in the protests was known to have defiled a home or storefront in the name of antifa. (Isaac Stanley-Becker and Tony Romm / The Washington Post)
Three men alleged to be members of the far-right extremist “Boogaloo” movement have been charged with trying to incite violence at protests in Minnesota and Texas. The movement has been growing in Facebook groups. (Andrew Blankstein, Tom Winter and Brandy Zadrozny / NBC)
Industry
⭐ Nextdoor says it supports the Black Lives Matter movement. But many of its volunteer moderators have been stifling conversations about race, police, and protests while removing posts that mention Black Lives Matter. Brianna Sacks and Ryan Mac at BuzzFeed have the scoop:
In California, other people using Nextdoor posed the same question. While private companies have no obligation to allow for untrammeled speech on their platforms, some of those who had been moderated said it was hypocritical of Nextdoor to publicly say that it supported diversity while its own moderators were aggressively using the site’s rules to clamp down on discussions about race.
On Tuesday, Dylan Hailey, a 26-year-old security engineer from Alamo, California, wanted to show his support for the protests happening in his community. After reading a post on Nextdoor asking for people of “wealth and privilege” to acknowledge the systemic racism highlighted by demonstrators following the police killing of George Floyd, he commented “#BlackLivesMatter.”
Within an hour, his comment was removed.
Nextdoor users are sharing posts about the George Floyd protests, not realizing that police are probably watching. For years, the company has aggressively recruited law enforcement onto its platform, coaching departments on how to build a friendly appearance on the app. (Sarah Emerson / OneZero)
The George Floyd protests have turned the police scanner app Citizen into an overnight hit. But it’s unclear whether the app is helping people stay safe or stoking their fears. We think it’s stoking their fears! (Jared Newman / Fast Company)
Signal announced a new face-blurring tool that will be incorporated into the latest versions of the app. Users sharing pictures through the app will be able to quickly blur faces, adding another layer of privacy to pictures. Also useful for protest photography. (James Vincent / The Verge)
People are calling on Instagram to let anyone share links in their stories amid the protests. Currently, the company only allows users who have at least 10,000 followers or who are verified to use the link feature in their stories. (Lauren Strapagiel / BuzzFeed)
Health misinformation is spreading quickly during the coronavirus pandemic — and it’s becoming more sophisticated. Here’s how to spot it. (Christina Farr / CNBC)
Amazon is considering buying a stake worth at least $2 billion in Indian mobile operator Bharti Airtel. If it goes through, it would give Bharti a boost as it seeks to compete against the number one player in India, Reliance Jio. (Aditya Kalra and Sumeet Chatterjee / Reuters)
Things to do
Stuff to occupy you online during the quarantine.
Donate to support the fight for justice. There are some new suggestions in this updated guide from The Verge.
Prepare your phone for a protest. The Markup has some good tips for you before you leave the house.
Those good tweets
Leaving you with an extended selection of these today in light of how hard this week was and also my dad telling me he has started to read these first.
Black Mirror really outdid themselves this time. Having us EXPERIENCE season 6 instead of watching it on Netflix? Remarkable really
— Blacks Rule (@ThatgyalKrys) June 1, 2020
Thank you for your interest in wiping out half of humanity. We are experiencing a high volume of apocalypse right now. Your apocalypse is very important to us. The next available group of self-identified experts with no experience in the field will be with you shortly. https://t.co/OhgU49X7tJ
— Rebecca Metz (@TheRebeccaMetz) June 1, 2020
Marshall law?? You think I give a fuck if trump sends Eminem out here??
— stream my suicidal thoughts (@Yoshi_tokuGAYwa) June 2, 2020
Talk to us
Send us tips, comments, questions, and cases for the Oversight Board: casey@theverge.com and zoe@theverge.com.