As it had predicted might happen, YouTube removed more videos in the second quarter of 2020 than in any quarter before, as the company leaned more on its algorithms in place of most of its human content moderators. That’s according to the Community Guidelines Enforcement report the company released Tuesday (via Protocol), which shows it took down more than 11.4 million videos between April and June. In the same period last year, YouTube removed just under 9 million videos.
“When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement,” the company wrote in a blog post. “Because responsibility is our top priority, we chose the latter — using technology to help with some of the work normally done by reviewers.”
YouTube parent company Google told employees in March it was extending its work-from-home policy until the end of 2020 due to the coronavirus. The company warned that the measures meant it would rely more on technology than on human reviewers, and that videos that would normally be fine on the platform might end up being removed in error. Its human moderators work from offices set up specifically for review; allowing that work to happen outside a controlled environment would risk inadvertently exposing user data and sensitive videos.
The company knew that removing more videos that didn’t violate its rules would also mean more appeals from content creators, so it added staff to its appeals process to handle requests as quickly as possible. The number of appeals for content takedowns went from 166,000 in the first quarter of 2020 to more than 325,000 in the second. It also meant YouTube reversed itself and reinstated more videos in the second quarter: more than 160,000, compared to just over 41,000 in the first quarter (although YouTube noted in its blog post that some of those reinstated videos may have been appealed in an earlier quarter).
YouTube said in its blog post that for sensitive policy areas such as child safety and violent extremism, it saw more than triple the usual number of removals during the second quarter, but it viewed the temporary inconvenience for creators as worth the end result. “We accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible.”