Facebook moderation is lacking for gender-based violence


A loophole within Meta’s bullying and harassment policy enabled a Facebook post depicting a victim of domestic violence to remain on the platform for almost two years.


Illustration by Nick Barclay / The Verge

Meta’s Oversight Board is urging the company to take a stricter stance against content that normalizes gender-based violence after discovering a Facebook post mocking an injured woman had remained on the platform for almost two years without being viewed by a human moderator. On Tuesday, the board asked Meta to address a gap within its bullying and harassment policy that seemingly permits content promoting gender-based violence by “praising, justifying, celebrating, or mocking it” to slip through its moderation practices.

An image depicting a woman with “visible marks of a physical attack, including bruises on her face and body” was published on Facebook in May 2021, alongside a caption in Arabic that alludes to her husband being responsible for her injuries. The caption said that the woman “got what she deserved,” according to the board, and was accompanied by several laughing and smiling emoji. The woman is not named in the post, but her face is visible within the image.

The post depicting the injured woman was reported three times for violating Meta’s violence and incitement community standard in February 2023. However, the board says these reports were closed without human review, as flagged content that isn’t reviewed within 48 hours is automatically closed. The issue was later reported to the Meta Oversight Board directly, and the post has since been removed after Meta agreed that it did, in fact, violate its bullying and harassment policy. Meta has yet to respond to the Oversight Board’s request; The Verge has also reached out for comment.

The board said that the post should have been removed because mocking a serious physical injury violates Meta’s policy on bullying and harassment. It notes, however, that the current rules wouldn’t apply to posts where the victim isn’t identifiable or where a fictional character is depicted — loopholes that could allow this type of content to spread. The board is now asking Meta to establish a policy that more effectively bans content normalizing gender-based violence and to make explicit that its bullying and harassment community standard prohibits calls for, or celebration of, “serious physical injury.”
