
Content moderation issues are taking center stage in the presidential election campaign


Happy Worldwide Developer Conference to those who celebrate. I spent the weekend watching three stories unfold at the intersection of social networks and politics. Let’s take them in turn.

I.

During the initial George Floyd protests earlier this month, a big question was whether the social networks would take action against President Trump for one of his bad posts. Twitter applied some warning labels, and Snapchat removed him from promotion in the Discover tab. But Facebook declined to take action, and a lot of people got mad: current employees, former employees, advertisers. There was even a virtual walkout. Facebook not acting in this case led some people to believe it would not act in any case, and it’s still dealing with the fallout.

Around that time, I predicted that by July or August, Trump would post something else that so clearly violated Facebook’s policies that it would have to be removed. (Scrub to about 1:03:45 of this Vergecast episode.) Instead it took about two weeks. Here’s Russell Brandom in The Verge:

Facebook has removed more than 80 ads placed by the Trump campaign for use of imagery linked to Nazism. The ads used the imagery of an inverted triangle, which the Trump campaign has argued is a “symbol widely used by antifa.” The same symbol was used to identify political prisoners in Nazi death camps, leading Media Matters to call it an “infamous Nazi symbol” with no place in political rhetoric.

Facebook agreed, ultimately removing the ads because of the Nazi-linked imagery. “Our policy prohibits using a banned hate group’s symbol to identify political prisoners without the context that condemns or discusses the symbol,” said Facebook’s Andy Stone in a statement.

A day later, both Facebook and Twitter removed an organic post from Trump’s account. In this case, the video was a deceptively edited piece taken from CNN, and it was removed under the Digital Millennium Copyright Act. That’s a little different from removing a post for violating a rule about speech, but it still signals that Facebook is willing to act, and I have no real doubt the company will continue removing presidential posts that violate its standards.

In fact, my cynical read here is that everyone is getting something that they want. Trump and the right get ammo for their ongoing bad-faith allegations that social networks are “biased” against them, even as their posts get more distribution than anyone else’s; and Facebook gets to point to enforcement action as evidence it’s not in bed with the administration. (Even if it’s an occasional dinner guest.) That ought to help with morale, and may discourage other brands from showily, temporarily pulling their advertisements.

At the same time, there is a clear pattern of escalation here. Questionable posts stayed up; that led to the posting of Nazi imagery; surely worse is to come. I’ve never thought it was even plausible that a big social network would ban one of the president’s accounts. But if he continues in this vein, one or more of them may feel as if they don’t have a choice.

II.

Another thing the president did over the weekend was to hold an indoor, mask-optional rally at the near-height of the ongoing pandemic. At one point, the campaign expected that as many as 1 million people would try to attend the rally. It turned out that they had been played. Here are Taylor Lorenz, Kellen Browning and Sheera Frenkel in the New York Times:

TikTok users and fans of Korean pop music groups claimed to have registered potentially hundreds of thousands of tickets for Mr. Trump’s campaign rally as a prank. After the Trump campaign’s official account @TeamTrump posted a tweet asking supporters to register for free tickets using their phones on June 11, K-pop fan accounts began sharing the information with followers, encouraging them to register for the rally — and then not show.

The trend quickly spread on TikTok, where videos with millions of views instructed viewers to do the same, as CNN reported on Tuesday. […] Thousands of other users posted similar tweets and videos to TikTok that racked up millions of views.

Discussion of this story dominated my feeds on Sunday, for two reasons. One, the people in my feeds mostly like to see the president made to look foolish. Two, the culprits — younger music fans organizing on an upstart social network — made the story irresistible.

It also raised some provocative questions.

  • How much of the low turnout should be attributed to the teens, and how much should be attributed to other factors — like that pandemic, for example?
  • Was this an example of what Facebook would call, on its platform, “coordinated inauthentic behavior”? Or was it something else? (Nathaniel Gleicher, who runs security policy at Facebook, said it’s something else, because the TikTok teens appeared to be using their real accounts and were not working to hide their coordination.)
  • How will this behavior be weaponized in the future against the rest of us? “When one group uses these algorithms effectively, supporters tend to celebrate,” Zeynep Tufekci tweeted. “In 2012, it fell on deaf ears when a few of us tried to warn that the role Facebook was playing in elections wasn’t healthy for democracy. It took 2016 to realize tools don’t stay in one side’s hands.”

In the moment, Trump versus the TikTok teens offered a canvas that anyone could project their hopes and LOLs onto. And a lot of good comes out of online protests and organizing. But if you worried about Russians tricking Americans into showing up to fake events in 2016, it seems to me you might also worry about the implications of Americans tricking campaigns into doing, uh, whatever they tricked Trump into doing this weekend. A country where nothing is true and everything is possible looks, to be clear, more like Russia than the one I grew up in. It seems like a dangerous path to go down, even if I realize we’re already well on our way.

III.

Snap won some praise earlier this month when CEO Evan Spiegel announced that Trump would be removed from Snapchat Discover. As part of that announcement, he wrote an unusually personal blog post in which he reflected on his privilege and advocated for reparations for black folks.

Some former employees read the blog post and then tweeted about how when they worked at Snap, that commitment to racial justice hadn’t always been apparent. Then some current employees were like hey, if you want to show your commitment to black folks, how about releasing a public diversity report, like all of Snap’s peers? And Spiegel said no, because it would “only [reinforce] the perception that tech is not a place for underrepresented groups.” The idea apparently being that not knowing how many underrepresented minorities work at Snap would make it more attractive to underrepresented minorities than knowing.

Anyway, then Friday was Juneteenth, and as it often does, the company released a special augmented-reality filter to commemorate the occasion. It was, unfortunately, a disaster. Kim Lyons and I wrote about it at The Verge:

Snapchat is apologizing for a controversial Juneteenth filter that allowed users to “smile and break the chains,” saying the filter had not gone through its usual review protocols. The filter was panned by critics on Friday morning shortly after its release for its tone deafness, and was disabled by about 11AM ET. […]

Atlanta-based digital strategist Mark S. Luckie demonstrated the filter on Twitter, calling it “interesting.” The filter showed what appeared to be an approximation of the Pan-African flag, and prompted the user to smile — a common trigger for animated Snapchat filters — causing chains to appear and then break behind the user.
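For readers curious how this kind of effect is built: the “smile” prompt is a trigger wired into the Lens itself, not something users opt into. Below is a minimal, hypothetical sketch of how a smile-triggered effect is typically set up in Lens Studio, Snap’s filter-building tool — the scene object and animation names are my own illustrative assumptions, not anything from Snap’s actual Juneteenth Lens.

```typescript
// Hypothetical sketch of a smile-triggered effect in Lens Studio scripting.
// The object and animation names below are illustrative assumptions, not Snap's code.

//@input SceneObject chainsEffect            // the "breaking chains" visual, hidden until triggered
//@input Component.AnimationMixer chainsAnim // animation that plays the chains breaking apart

// Keep the effect hidden until the trigger fires.
script.chainsEffect.enabled = false;

// SmileStartedEvent fires when face tracking detects the user starting to smile.
var smileEvent = script.createEvent("SmileStartedEvent");
smileEvent.bind(function () {
    script.chainsEffect.enabled = true;
    // Play the chain-breaking animation a single time from the start.
    script.chainsAnim.start("BreakChains", 0, 1);
});
```

The point is only that the smile prompt is a deliberate design choice baked into the Lens, which is why the review questions described in the company’s email below mattered so much.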

Snap has a history of releasing disastrous filters. It also has a history of bragging about how it uses human curators to weed out bad stuff from Discover and other surfaces of its app. So this one hurt.

The company investigated what happened and emailed its findings to the team. We published that email on Sunday, and it’s worth reading in full. Oona King, the company’s head of diversity and inclusion, said that black and white employees had collaborated on the filter, but that it had not gone through the usual review processes. She wrote (emphasis hers):

For the record, and the avoidance of all doubt: the two Snap team members who on separate occasions specifically questioned if the “smile” trigger was appropriate for Juneteenth were two White team members. The Snap team members who suggested the smile trigger to begin with, and said it was acceptable to use, were Black Snap team members, and / or members of my team.

Speaking on behalf of my team, clearly we failed to recognize the gravity of the “smile” trigger. That is a failure I fully own. We reviewed the Lens from the standpoint of Black creative content, made by and for Black people, so did not adequately consider how it would look when used by non-Black members of our community. What we also did not fully realize was a) that a ‘smile’ trigger would necessarily include the actual word “smile” on the content; and b) that people would perceive this as work created by White creatives, not Black creatives.

I asked people to share their thoughts, and heard from more than 50, including from some black folks who work in tech. I encourage you to read them.

On one hand: mistakes happen. I’ve spoken to people at Snap about this incident, and they’re clearly pained by it. Sometimes at a company you try to do a nice thing, and despite your best efforts it blows up in your face. And it can be hard to sort out the real lessons to be learned from the conversation on Twitter, which is a powerful amplifier for schadenfreude.

On the other: Snap had been setting itself up to take this fall for years. It talks a big game about inclusion and platform integrity, but the reality has often lagged far behind the hype. If you want to take the credit, you’ve got to do the work. Snap ought to hire and retain more underrepresented minorities, release its diversity report, and stop distributing filters to its entire user base without a proper review. And until it does, the least it can do is stop patting itself on the back.

The Ratio

Today in news that could affect public perception of the big tech platforms.

Trending up: Google is adding fact-checking labels to Google Images search results. The move is meant to curb the spread of misleading photos and images. (Sara Fischer / Axios)

Governing

Over 1,650 Google employees have signed an open letter to CEO Sundar Pichai demanding the company stop selling its technology to police forces across the US. “We should not be in the business of criminalizing Black existence while we chant that Black Lives Matter,” they said. Zoe Schiffer wrote about it today at The Verge:

Employees are specifically calling out Google’s ongoing Cloud contract with the Clarkstown Police Department in New York, which was sued for allegedly conducting illegal surveillance on Black Lives Matter protestors in 2015. They’re also highlighting the company’s indirect support of a sheriff’s department in Arizona tracking people who cross the US-Mexico border.

To workers, the partnerships stand in sharp contrast to the external statements of racial equity that executives like Pichai have been making. While the company has pledged $175 million to support black business owners and job seekers, and YouTube created a $100 million fund to amplify the voices of black creators, it continues to profit from police contracts.

Nextdoor is discontinuing a feature that allows users to forward their posts directly to local police departments. The move comes as the company faces scrutiny over its role as a platform for racial profiling and its increasingly cozy partnerships with law enforcement. (Sarah Holder / Bloomberg)

Mark Zuckerberg and Donald Trump have forged an uneasy alliance. While the two sometimes speak out against each other publicly, in reality the status quo benefits them both. (Ben Smith / The New York Times)

Companies, including the outdoor apparel brand The North Face, are committing to an advertising boycott of Facebook in light of the platform’s handling of misinformation and hate speech. (Brian Fung / CNN)

Rumors about buses full of anti-fascist activists known as antifa coming to cities across the United States spread through local Facebook groups as well as on Nextdoor and community texting networks. While the information turned out to be false, it was hard to track down and stop. (Davey Alba and Ben Decker / The New York Times)

Twitter added a warning label to one of President Trump’s tweets after the company determined it violated its policies on manipulated media. The president tweeted a doctored version of a popular video that went viral in 2019 that showed two toddlers, one black and one white, hugging. (Cat Zakrzewski / The Washington Post)

Both the Trump and Biden presidential campaigns are using custom apps to speak directly to voters. But Trump’s app is also gathering a ton of personal information on users, including their identity and location. It also asks to control the phone’s Bluetooth function. (Jacob Gursky and Samuel Woolley / MIT Technology Review)

The US Supreme Court rejected an appeal of a ruling that tech companies like Facebook and Google say will cost them billions of dollars in taxes by limiting deductions for stock payments to employees. The appeal challenged an Internal Revenue Service regulation that forces companies to allocate some of those stock expenses to foreign subsidiaries. (Greg Stohr and Laura Davison / Bloomberg)

A French court dismissed Google’s appeal against a $57 million fine issued by France’s data watchdog last year for not making it clear enough to Android users how it processes their personal information. (Natasha Lomas / TechCrunch)

Three days after surveillance firm NSO Group announced its new human rights policy, an agency likely linked to the Moroccan government hacked the phone of a human rights defender using NSO malware. The company’s new policy is supposed to rule out this type of behavior. (Joseph Cox / Vice)

The UK government abandoned plans to build its own contact tracing app, after spending three months and millions of pounds on the technology. It will use the system developed by Apple and Google instead. (Dan Sabbagh and Alex Hern / The Guardian)

Industry

Microsoft is shutting down its streaming platform Mixer on July 22nd. It plans to move existing partners over to Facebook Gaming, starting today. Here’s The Verge’s Tom Warren:

Microsoft has struggled to reach the scale needed for Mixer to compete with Twitch, YouTube, and even Facebook Gaming which has led to today’s decision. “We started pretty far behind, in terms of where Mixer’s monthly active viewers were compared to some of the big players out there,” says Phil Spencer, Microsoft’s head of gaming, in an interview with The Verge. “I think the Mixer community is really going to benefit from the broad audience that Facebook has through their properties, and the abilities to reach gamers in a very seamless way through the social platform Facebook has.”

Apple announced iOS 14, its next major software update, coming to iPhones later this year. The news came during WWDC 2020, the company’s first all-digital keynote. Memoji are also getting some improvements, including accessories like face masks and new emotions, like blushing. (Cameron Faulkner / The Verge)

Facebook acquired the virtual reality studio behind “Lone Echo.” Ready at Dawn Studios has been working with Facebook and Oculus for a while as a publishing partner. This acquisition brings the team into the Oculus fold as they prep the release of a full sequel to Lone Echo. (Lucas Matney / TechCrunch)

Apple is acting like a monopolist and a bully, according to the chairman of the House antitrust subcommittee. “Because of the market power that Apple has, it is charging exorbitant rents — highway robbery, basically — bullying people to pay 30 percent or denying access to their market,” said Rep. David Cicilline (D-RI) last week. (Nilay Patel / The Verge)

Apple approved a new version of the subscription email app Hey after rejecting an update last week. But the approval isn’t permanent. It’s meant to give Hey developer Basecamp time to develop a version of the app more in line with Apple’s policies. (Nilay Patel / The Verge)

Apple will start removing thousands of mobile games that lack government approval from its App Store in China next month. The move closes a loophole that some companies have relied on for years. (Zheping Huang / Bloomberg)

TikTok published a blog post explaining how its recommendation algorithm works. The post includes tips for personalizing the feed to avoid being served random videos you might not be interested in. (Julia Alexander / The Verge)

The Pizzagate conspiracy theory has found a new home on TikTok, after YouTube spent years rooting it out. On the video sharing app, the #Pizzagate hashtag has more than 69 million views, while related hashtags have earned several million more. (Will Sommer / Daily Beast)

Things to do

Stuff to occupy you online during the quarantine.

Read this satire of Hey. Bye.fyi “is the first email service to automatically respond with an insult, and then delete every email sent to you.” I’m sold.

Read this satire of Apple’s App Store policies. After Apple marketing chief Phil Schiller said “You download the app and it doesn’t work, that’s not what we want on the store,” Lanny Bose put together a beautiful page of apps in the App Store that absolutely do not work unless you go sign up elsewhere.

Subscribe to a newsletter chock full of good tweets. If the last section of The Interface is your favorite, check out Subtweets, a hilarious occasional newsletter sent out by two Twitter employees that consistently leaves me cry-laughing.

And finally…

Talk to us

Send us tips, comments, questions, and fake rally registrations: casey@theverge.com and zoe@theverge.com.

