For people who stumbled onto One America News Network’s (OANN) YouTube channel over the last few days, it might be easy to think President Trump won the 2020 election.
Videos titled “Trump won. Dems try to pull a fast one” and “Trump Won. MSM hopes you don’t believe your eyes” started appearing on OANN’s YouTube channel on November 4th, one day after people cast their votes in the federal election. The videos are full of lies that prey on people’s fears and would trigger moderation on another network, including:
“It appears that Trump won by such a large margin, now they’re actually pumping out illegal ballots into the battleground states to actually beat him.”
“They think that they can add 100,000 votes that no one gets to see or review, and all 100,000 votes are for Joe Biden and that Republicans will just accept those votes as fact? They think they can just play Republicans.”
“Joe Biden didn’t campaign much — he’s senile, and is expected to be removed from office if elected.”
To be clear: there has been no public evidence of voter fraud so far, and no evidence whatsoever to support the claim that Biden will be removed from office after the election. The videos have more than 500,000 combined views at the time of this writing.
OANN has become a poster child for Trump-focused misinformation on YouTube, but the problem is much larger than a single video or a single channel. It’s a network-wide issue, the result of YouTube staking out a position as one of the most permissive major social networks. That stance has served YouTube well for years, but it’s become a liability in the face of the Trump campaign’s ongoing scramble to delegitimize the results of the 2020 election. While Facebook and Twitter tighten their moderation rules and supercharge enforcement, YouTube is still struggling with how to respond to misinformation around the election.
YouTube’s response to the OANN videos shows the company’s policy in practice: even after high-profile coverage of the videos’ demonstrably false claims, YouTube has decided to leave them live. The videos simply don’t violate YouTube’s content policy, a spokesperson told The Verge. Instead, the company will remove ads from them, because alleging who won an election before the election is called is “in the scope of our demonstrably false policy.” YouTube will also include an information box noting that the election results aren’t final and pointing people to Google’s main election hub, but the box only appears once a viewer clicks on the video.
It’s not just OANN using YouTube to spread misinformation, either. Steven Crowder, a controversial right-wing pundit who lost his ability to monetize his channel for more than a year after he incited harassment against another YouTube creator, spread misinformation about ballot counting during one of his livestreams.
During the livestream, which drew millions of views, Crowder and his fellow hosts talked about suspicious-looking activity outside a polling center in Detroit. The theory Crowder helped spread insinuated that a man wheeling in a device was carrying ballots, according to BuzzFeed. When the man loaded a box onto the device, Crowder said, “you wouldn’t be able to trust it because some ballots could fly off the back.”
The man in question turned out to be a photographer for a local media organization; he was carrying equipment, not ballots, according to BuzzFeed.
Elsewhere, former White House adviser and right-wing commentator Sebastian Gorka used a different YouTube livestream to spread misinformation about Trump winning certain states. In the stream, titled “They’re trying to steal the President’s victory,” Gorka suggested that counting stations in Democrat-controlled states stopped counting votes while trying to find new mail-in ballots to help Biden win. At one point early on, Gorka claimed that “every single one of those states saw the president victorious as the snapshot was taken yesterday evening.” No evidence has been produced for any of these claims.
In a livestream earlier today covering a range of topics related to the election, Steve Bannon, the former White House strategist and former executive chairman of Breitbart News, said he’d not only fire FBI Director Christopher Wray and Dr. Anthony Fauci, the director of the National Institute of Allergy and Infectious Diseases, but “put the heads on pikes.” He added that he’d put them “at the two corners of the White House as a warning to federal bureaucrats — you either get with the program, or you’re gone.”
YouTube later removed the video for “violating our policy against inciting violence,” a spokesperson told The Verge. Around the same time the video came down, Twitter banned Bannon’s podcast-specific account for “violating the Twitter Rules, specifically our policy on the glorification of violence,” a Twitter spokesperson said.
YouTube has policies covering election misinformation, but organizations like OANN have become adept at evading them. The company will step in when a video misleads people about voting. This can include “content aiming to mislead voters about the time, place, means or eligibility requirements for voting, or false claims that could materially discourage voting,” according to a spokesperson. By that definition, it appears that content claiming premature wins in certain states, outright false claims about who won the election, or insinuations that one party tampered with the other’s votes during counting would not violate YouTube’s policies.
YouTube’s team prepared for Election Day but wasn’t ready for what came after, says Angelo Carusone, president of Media Matters for America, a watchdog group that has been reporting on YouTube’s misinformation struggles.
“That’s been their biggest failure,” Carusone said. “How they choose to label content really comes down to what’s inside the video, but the most critical part of a video oftentimes is that straight thumbnail, and how it’s framed in both the headline and the description, because that’s going to give people an interpretive lens for the rest of what they see.”
YouTube tends to be strictest about titles and thumbnails, which aren’t permitted to mislead users about the nature of a video. The policy prohibits creators and organizations from “using the title, thumbnails, description, or tags to trick users into believing the content is something it is not.” Under those rules, you might think videos with titles claiming point-blank that Trump won would violate YouTube’s policy, since a winner has not yet been declared. But YouTube has not been enforcing the policy by that logic, and could not tell The Verge why the videos’ titles didn’t trigger moderation.
YouTube has faced criticism for its lax policies in the past, but never during such a high-stakes political crisis. Controversial videos, like anti-vaccination content, are often classified as “borderline content”: ineligible for advertising and often downgraded in search results, but still present on the platform for audiences that seek them out. As a result, outright bans are rare, and content like OANN’s recent videos can rack up hundreds of thousands of views even under the limitations placed on borderline content.
YouTube’s lack of action is particularly noticeable because other social networks like Facebook and Twitter have taken a more aggressive approach to preventing the spread of misinformation. Since election night, Twitter has flagged more than a third of the president’s tweets for misleading readers about the ongoing election, although it has resisted calls to ban the president from the platform entirely. More recently, Facebook shut down a 300,000-person group called “Stop the Steal,” aimed at halting the ongoing vote-counting efforts, after observing “worrying calls for violence from members of the group.”
In contrast, YouTube’s response has critics like Carusone wondering whether the election caught the company off guard. “I can’t tell how much of it is that they didn’t do the homework or want to put the work in,” he said. “I think they made an intentional decision not to do this. They were taking a temporary approach to a limited set of content on their platform.”
Update November 5th, 6:37pm ET: The story has been updated to note that YouTube removed a video where Steve Bannon made comments that violated the company’s policy against inciting violence.
Update 2 November 5th, 7:25pm ET: The story has been updated to include Twitter’s statement.