Facebook and Instagram removed more than 20 million posts for violating their policies on COVID-19-related misinformation in the second quarter, Facebook said in its community standards enforcement report on Wednesday.
In addition, the social media giant said that it deleted more than 3,000 accounts, pages and groups for repeatedly violating its rules against spreading COVID-19 and vaccine misinformation. The company placed warnings on more than 190 million COVID-19-related posts on Facebook.
Facebook reported that the prevalence of hate speech has declined over the last three quarters, a drop it attributed largely to improved detection and to ranking changes in News Feed. In Q2, the prevalence of hate speech was measured at 0.05 percent, meaning there were five views of hate speech for every 10,000 views of content. That is down from a range of 0.05 to 0.06 percent, or five to six views per 10,000, in Q1.
Facebook said that it took action on a total of 6.2 million pieces of organized hate content in Q2, down from 9.8 million pieces in Q1. Elsewhere, the company said it took action on 34.1 million pieces of violent and graphic content, up from 30.1 million in Q1.
On Instagram, the company took action on 367,000 pieces of organized hate content and 7.6 million pieces of violent and graphic content in Q2.