Roblox is struggling to moderate re-creations of mass shootings


For over a year, Anti-Defamation League researcher Daniel Kelley has been finding re-creations of a horrific mass shooting on Roblox — and every time he looks, he says he finds more. Kelley told The Verge it has happened three times: first in January 2020, again in May 2021, and most recently on August 13th, as he was preparing a presentation on how to report offending content.

“I would like one time,” Kelley said on Twitter after the last incident, “to search for ‘Christchurch’ on Roblox and not find a new recreation of the 2019 Christchurch mosque shooting on a game platform aimed at very young children.”

The mosque shooting, which killed more than 50 people, has been condemned as one of the most violent single acts of religious hatred undertaken in recent years. In the wake of the shooting, Prime Minister Jacinda Ardern called it “one of New Zealand’s darkest days.”

The Verge confirmed Kelley’s finding, locating two different Roblox experiences focused on the shooting through a simple text search for “Christchurch.” Both rooms had logged more than two hundred visits.

Roblox proactively monitors for terrorist content, but the scenarios flagged by Kelley seem to have slipped through. Reached for comment, Roblox insisted that it aggressively moderates re-creations of mass shootings.

“We promptly removed this experience from Roblox after it was brought to our attention and suspended the user responsible for violating our Community Rules,” a company representative said in a statement. “We do not tolerate racism, discriminatory speech, or content related to tragic events. We have a stringent safety and monitoring system that is continuously active and that we rigorously and proactively enforce.”

References to the Christchurch shooting are particularly difficult to block through automatic text searches, the company said, since a catchall filter would also block references to the city. “In this case our proactive detection includes human review to balance allowing references to the geographic location (the largest city on New Zealand’s South Island) but not uses that violate our policies,” the representative continued.

Part of the issue stems from Roblox’s sheer size — the result of rapid and unprecedented growth. The game currently boasts over 40 million daily users, resulting in a flood of user-generated rooms too massive to manually scan. The distributed structure of the game, which relies on users generating their own scenarios and spaces, makes it particularly challenging to monitor.

Roblox filed to go public in November 2020, claiming in a filing that the game had been played by more than half the children under 16 in the US. The company is currently valued at roughly $45 billion.

The same IPO filing specifically names community moderation issues as a potential risk to the business. “The success of our business model is contingent upon our ability to provide a safe online environment for children to experience,” the filing reads. “If we are not able to continue to provide a safe environment, our business will suffer dramatically.”

As an extremism researcher, Kelley is less worried about the health of Roblox’s business, and more worried that the platform could become a vector for radicalization.

“Each game on Roblox is potentially a social platform in and of itself, and can potentially give refuge to players of all ages who are flirting with or fully engaged with hateful ideologies online,” Kelley says. “Every space that allows for the veneration of hateful ideologies … contributes to the normalization of these ideologies and their spread.”

Adi Robertson contributed reporting to this article.