A woman has sued Reddit for allowing an ex-boyfriend to repeatedly post pornographic images of her as a 16-year-old. The lawsuit applies controversial measures instituted in 2018 under FOSTA-SESTA to a site that’s drawn particular criticism for child sexualization. The resulting case will test the limits of platforms’ legal shields amid ongoing efforts to pare back the law behind them.
The woman, identified under the pseudonym Jane Doe, argues that “Reddit knowingly benefits from lax enforcement of its content policies, including for child pornography.” She claims that in 2019, an abusive ex-boyfriend posted sexual photos and videos that he’d taken without her knowledge or consent. But when she alerted Reddit moderators, it could take them “several days” to remove the content, while Reddit administrators allowed the man to keep posting and to create a new account when his old one was banned.
“Because Reddit refused to help, it fell to Jane Doe to monitor no less than 36 subreddits — that she knows of — which Reddit allowed her ex-boyfriend to repeatedly use to repeatedly post child pornography,” the complaint reads. “Reddit’s refusal to act has meant that for the past several years Jane Doe has been forced to log on to Reddit and spend hours looking through some of its darkest and most disturbing subreddits so that she can locate the posts of her underage self and then fight with Reddit to have them removed.”
The woman is seeking class action status to represent anyone who had similar photos or videos posted on Reddit while they were under 18 years of age. She’s accusing Reddit of distributing child pornography, failing to report child sexual abuse material, and violating the Trafficking Victims Protection Act. It’s unclear whether the ex-boyfriend — who would be fully liable for posting the images — has been sued or criminally charged.
The complaint cites FOSTA-SESTA, an amendment to Section 230 of the Communications Decency Act, as a central element of the lawsuit. Section 230 provides a broad legal shield for “interactive computer services” like Reddit, limiting their liability if users post illegal content. But the 2018 FOSTA-SESTA bill removed protections for sex trafficking-related material. The lawsuit apparently argues Jane Doe’s case meets that definition because Reddit’s advertising revenue turned the material into a “commercial sex act.”
The lawsuit argues that Reddit knew its site was a hub for illegal photos and videos, based on news coverage and tips from users themselves, and it should have done more to protect victims. “Reddit has itself admitted that it is aware of the presence of child pornography on its website,” the complaint reads. Among other questionable content, it lists several now-removed subreddits with titles referencing “jailbait,” including an infamous forum that was removed in 2011 after media controversy. (That subreddit did not allow nude images, but it encouraged sexually suggestive ones.)
Reddit denies that it condoned abuse. “Child sexual abuse material (CSAM) has no place on the Reddit platform. We actively maintain policies and procedures that don’t just follow the law, but go above and beyond it,” a Reddit spokesperson said in a statement to The Verge. “We deploy both automated tools and human intelligence to proactively detect and prevent the dissemination of CSAM material. When we find such material, we purge it and permanently ban the user from accessing Reddit. We also take the steps required under law to report the relevant user(s) and preserve any necessary user data.”
Reddit banned child sexualization in 2012, saying that interpreting “vague and debated legal guidelines” over what counted as illegal child pornography risked pulling Reddit into a “legal quagmire.” But the suit alleges that Reddit fails to enforce its guidelines, particularly because it outsources moderation to unpaid volunteers who manage individual subreddits. “Reddit’s internal security has been compromised by its choice to rely on unpaid moderators, which fail to enforce the standards that are supposed to protect Reddit users and others,” it says.
FOSTA-SESTA had immediate and damaging repercussions for sex workers, who — thanks to US laws that conflate consensual sex work with trafficking — faced a broad online crackdown. Meanwhile, anti-trafficking lawsuits relying on FOSTA-SESTA have appeared more gradually, and their results have been less clear. Facebook is currently embroiled in a long-running Texas case claiming the site allowed traffickers to recruit victims. University of Notre Dame law professor Alex Yelderman wrote about some other early cases in January of 2020, including complaints targeting online marketplace Craigslist and email marketing company MailChimp.
FOSTA-SESTA was aimed at taking down escort sites like Backpage, which hosted ads for illegal sexual services. (Notably, Backpage’s operators were arrested before FOSTA-SESTA became law.) But this lawsuit covers a very different and less directly trafficking-related situation: a site failing to remove nonconsensual pornography involving a minor. The firm behind the case, Susman Godfrey LLP, used a similar argument against porn giant MindGeek in February — saying video site Pornhub violated trafficking laws by hosting underage sexual content.
If courts don’t agree that sharing the images constitutes trafficking, Section 230 would likely protect Reddit from civil liability. Section 230 doesn’t protect sites from federal charges, including ones related to child sexual exploitation, but there’s no indication Reddit is in danger of criminal prosecution in connection with this incident. And there is no FOSTA-SESTA-style carveout exempting child sexual abuse material from Section 230 civil protections.
Some lawmakers have pushed for broader limits to Section 230. The EARN IT Act, for instance, would make services potentially liable for child sexual abuse material unless they hewed to a set of best practices. But its open-ended language alarmed privacy advocates who saw it as a potential attack on encryption, and it hasn’t proceeded to a vote on the House or Senate floor. Changing (or repealing) Section 230 could also drag sites and apps into protracted court battles over moderation decisions, particularly smaller services without the legal resources of tech giants.
Much of the debate over Section 230 has focused on Facebook, Alphabet, and other massive Silicon Valley companies. Suits like this one highlight FOSTA-SESTA’s potential impact on mid-sized web services. They also, however, highlight harmful failures in services like Reddit — whose hybrid moderation system has created gaps that abusers can easily exploit.