The spread of fake photos is “alarming” and “obviously” shows that legislation is needed, the White House said on Friday.
Legislation needs to be passed to protect people from fake sexual images generated by AI, the White House said this afternoon. The statement, from White House press secretary Karine Jean-Pierre, came in response to a question about the spread of fake sexualized photos of Taylor Swift on social media this week.
Jean-Pierre called the incident “alarming” and said it’s among the AI issues the Biden administration has been prioritizing.
“Of course Congress should take legislative action,” Jean-Pierre said. “That’s how you deal with some of these issues.” She did not refer to any specific legislation that the White House was backing.
The images spread across X in particular on Wednesday night, with one hitting 45 million views before being taken down. The platform was slow to respond, with the post staying up for around 17 hours. The images later spread to smaller accounts and are still available on X.
Jean-Pierre said social media platforms “have an important role to play in enforcing their own rules” to prevent this type of material from spreading. “We know that lax enforcement disproportionately impacts women and also girls, sadly, who are the overwhelming targets of online harassment and also abuse,” she said in a briefing with reporters.
The White House previously launched a task force to address online harassment, Jean-Pierre said, but she made clear that it amounted to a patchwork approach. “There should be legislation, obviously, to deal with this issue,” she said.
Congress has spent years criticizing social media platforms over their moderation practices but has so far been unable to agree on and pass regulations in response. Support for Taylor Swift may be bipartisan, but it’s not clear that will be enough to pull together an actual bill.