The child safety problem on platforms is worse than we knew

Millions of young children are using platforms years before they turn 13, and their presence could put many of them in danger. An alarming new study has found that minors in the US often receive abuse, harassment, or sexual solicitation from adults on tech platforms. For the most part, though, children say they are not telling parents or other trusted adults about these interactions. Instead, they are turning to the platforms themselves for support, and the platforms' limited blocking and reporting tools have failed to address the threats they face.

The report from Thorn, a nonprofit organization that builds technology to defend children from sexual abuse, identifies a disturbing gap in efforts by Snap, Facebook, YouTube, TikTok, and others to keep children safe. Officially, children are not supposed to use most apps before they turn 13 without adult supervision. In practice, though, the majority of American children are using these apps anyway. And even when they block and report bullies and predators, the majority of children say they are quickly re-contacted by the same bad actors, either via new accounts or on separate social platforms.

Among the report’s key findings:

  • Children are using major platforms in large numbers long before they turn 13: 45 percent of children ages 9-12 say they use Facebook daily; 40 percent use Instagram; 40 percent use Snapchat; 41 percent use TikTok; and 78 percent use YouTube.
  • Children report having online sexual interactions at high rates, both with their peers and with people they believe to be adults: 25 percent of kids 9-17 reported having had a sexually explicit interaction with someone they thought was 18 or older, compared with 23 percent of participants who had a similar experience with someone they believed to be a minor.
  • Children are more than twice as likely to use platform blocking and reporting tools as they are to tell parents and other caregivers about what happened: 83 percent of 9- to 17-year-olds who reported having an online sexual interaction responded by reporting, blocking, or muting the offender, while only 37 percent said they told a parent, trusted adult, or peer.
  • The majority of children who block or report other users say those same users quickly find them again online: More than half of children who blocked someone said they were contacted again by the same person, either through a new account or a different platform. This was true both for people children knew in real life (54 percent) and people they had only met online (51 percent).

Children who identify as LGBTQ+ experience all of these harms at higher rates than their non-LGBTQ+ peers: 57 percent of youth who identify as LGBTQ+ said they have had potentially harmful experiences online, compared to 46 percent of non-LGBTQ+ youth. They also had online sexual interactions at much higher rates than their peers.



Thorn’s research comes at a time when regulators are examining platforms’ child safety efforts with increasing scrutiny. In a March hearing, US lawmakers criticized Facebook and Google for the potential effect their apps could have on children. And this week, 44 attorneys general wrote a letter to Facebook CEO Mark Zuckerberg urging him to abandon a plan to create a version of Instagram for children.

The Thorn report could trigger heightened scrutiny of the way young children use platforms, and of the relatively few safeguards tech companies currently have in place to protect them or offer them support. It could also push Facebook, Google, Snap, and others to collaborate on the cross-platform solutions that Thorn says are needed to fully address the threats children face.

Taken together, the findings suggest that platforms, parents, and governments need to work harder to understand how children are using technology from the time they are in elementary school, and develop new solutions to protect them as they explore online spaces and express themselves.

“Platforms have to be better at designing experiences,” said Julie Cordua, Thorn’s CEO, in an interview. “Adults need to create safe spaces for kids to have conversations … And then on the government and policy side, lawmakers need to understand kids’ online experiences at this level of detail. You’ve got to dive into the details and understand the experiences of youth.”

Data on these subjects is difficult to collect for several reasons, many of them obvious: the topics involved are extremely sensitive, and much of the best information is held within the private companies that run the platforms. To do its work, Thorn had 1,000 children ages 9 to 17 fill out a 20-minute online survey with their parents' consent. The survey was conducted from October 25 to November 11, and the margin of sampling error is +/- 3.1 percent.
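For readers curious where that +/- 3.1 percent comes from: it matches the standard 95 percent confidence margin of error for a simple random sample of 1,000 respondents (the sampling design beyond what Thorn disclosed is an assumption here). A quick sketch of the arithmetic in Python:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Normal-approximation 95% margin of error for a sample proportion.

    Uses the worst case p = 0.5, which maximizes the variance p * (1 - p).
    """
    return z * math.sqrt(p * (1 - p) / n)

# 1,000 respondents, as in the Thorn survey
print(f"+/- {margin_of_error(1000):.1%}")  # prints: +/- 3.1%
```

In other words, a reported figure like "45 percent of children ages 9-12 use Facebook daily" should be read as roughly 42 to 48 percent.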

The findings of the survey tell multiple stories about the current relationship between children and technology. The most pressing one may be that because young children are not supposed to use platforms, companies often do not build safety tools with them in mind. The result is a generation of children, some of them still in elementary school, attempting to navigate blocking and reporting tools that were designed for older teens and adults.

“Kids see platforms as a first line of engagement,” Cordua said. “We need to lean into what kids are doing, and beef up the support mechanisms so they’re not putting the onus on the child. Put the onus on the platform that’s building the experience and make it safe for all.”



An obvious thing platforms can do is rewrite the language they use for user reports so that it more accurately reflects the harms taking place on their services. Thorn found that 22 percent of minors who attempted to report that their nudes had been leaked felt that the platforms' reporting systems gave them no way to do so. In Silicon Valley terms, it's a user experience problem.

“While minors say they are confident in their ability to use platform reporting tools to address their concerns, when given a series of commonly available options from reporting menus, many indicated that none of the options fit the situation,” the Thorn report says. “Nearly a quarter said they ‘don’t feel like any of these choices fit the situation’ of being solicited for (self-generated nudes) by someone they believe to be an adult (23 percent) or someone they believe to be under 18 (24 percent).”

Thorn has other ideas for what platforms in particular should do here. For example, they could invest more heavily in age verification. Even if you accept that kids will always find ways into online spaces meant for adults, the sheer numbers can be surprising. Thorn found that 27 percent of 9- to 12-year-old boys have used a dating app; most dating apps require users to be at least 18.

Platforms could integrate crisis support phone numbers into messaging apps to help kids find resources when they have experienced abuse. They could share block lists with each other to help identify predators, though this would raise privacy and civil rights concerns (one possible shape such a system could take is sketched below). And they could invest more heavily in detecting ban evasion, so that predators and bullies can't easily create alternate accounts after they are blocked.
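To make the block-list idea concrete: one privacy-conscious design (purely an illustration; neither Thorn nor any platform has proposed this exact scheme, and every name and identifier below is hypothetical) would have platforms exchange keyed hashes of blocked accounts' stable identifiers rather than the identifiers themselves, so matches can be flagged without publishing raw user data. A minimal Python sketch:

```python
import hashlib
import hmac

# Purely illustrative: a consortium-agreed key lets member platforms compare
# blocked accounts without exposing raw identifiers to one another. Key
# management, consent, and due process are the hard parts, not modeled here.
SHARED_KEY = b"consortium-secret-key"  # hypothetical placeholder

def fingerprint(identifier: str) -> str:
    """Keyed hash of a blocked account's stable identifier (e.g. a verified email)."""
    normalized = identifier.strip().lower().encode()
    return hmac.new(SHARED_KEY, normalized, hashlib.sha256).hexdigest()

# Platform A contributes fingerprints of accounts banned for predatory behavior...
shared_blocklist = {fingerprint("blocked.user@example.com")}

# ...and Platform B checks new signups or reports against the shared set.
def is_flagged(identifier: str) -> bool:
    return fingerprint(identifier) in shared_blocklist

print(is_flagged("blocked.user@example.com"))   # True
print(is_flagged("someone.else@example.com"))   # False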

In their current state, Cordua told me, platform reporting tools can feel like fire alarms that have had their wires cut.

“The good thing is, kids want to use these tools,” she said. “Now we need to make sure they work. This is an opportunity for us to ensure that if they pull that fire alarm, people respond.”

Too often, though, no one does. One in three minors who have reported an issue said the platform took more than a week to respond, and 22 percent said they had made a report that was never resolved. “This is a critical window during which the child remains vulnerable to continued victimization,” the report authors say.

The Thorn report found children on every platform where it looked. But some stood out for the frequency with which children reported experiencing harm. The platforms where the most minors reported potentially harmful experiences were Snapchat (26 percent), Instagram (26 percent), YouTube (19 percent), TikTok (18 percent), and Messenger (18 percent). The platforms where the most minors said they had an online sexual interaction were Snapchat (16 percent), Instagram (16 percent), Messenger (11 percent), and Facebook (10 percent).

I shared Thorn’s findings with these platforms and asked them to respond. Here’s what they told me.

  • Snap: “We really appreciate the extensive findings and related recommendations in Thorn’s research. The prevalence of unwanted sexual contact is horrific, and this study will help inform our ongoing efforts to combat these behaviors on Snapchat. In recent months, we have been increasing our in-app education and support tools for Snapchatters, working to revamp our in-app reporting tools, putting in place additional protections for minors, and expanding resources for parents. After reviewing this research, we are making additional changes to make us be even more responsive to the issues raised by the report.”
  • Facebook and Instagram: “We appreciate Thorn’s research and value our collaboration with them. We’ve made meaningful progress on these issues, including restricting Direct Messages between teens and adults they don’t follow on Instagram, helping teens avoid unwanted chats with adults, making it harder for adults to search for teens, improving reporting features, and updating our child safety policies to include more violating content for removal. However, Thorn’s research, while good, stopped far short of the global view on this issue when they excluded Apple’s iMessage, which is used heavily by teens, is preloaded on every iPhone, and is bigger than Messenger and Instagram Direct combined.” (Thorn responded to the swipe at Apple by saying it focused on social platforms because that is where many of these inappropriate relationships begin — typically they migrate to messaging apps from elsewhere.)
  • YouTube: “YouTube doesn’t allow content that endangers the emotional and physical well-being of minors and we have strict harassment policies prohibiting cyberbullying or content that threatens individuals in any way. Additionally, because YouTube has never been for people under 13, we created YouTube Kids in 2015 and recently announced a supervised account option for parents who have decided their tweens or teens are ready to explore YouTube.”
  • TikTok: “Protecting minors is vitally important, and TikTok has taken industry-leading steps to promote a safe and age-appropriate experience for teens. These include setting accounts ages 13-15 to private by default, restricting direct messaging for younger teens, and committing to publish information regarding removals of suspected underage accounts in our Transparency Reports.”

While platforms have implemented a variety of partial solutions to underage access and the harms that can follow, there are still no industry-wide standards and little collaboration on solutions. Meanwhile, parents and lawmakers seem largely unaware of the challenges.

For too long, tech companies have been able to set aside problems related to the presence of children on their platforms simply by saying that kids are not allowed there, and that the companies work hard to remove them. The Thorn report illustrates the degree to which harms persist on these platforms in spite of, and even partly because of, this stance. To truly protect children, companies will need to take decisive and more coordinated action.

“Let’s deal with the reality that kids are in these spaces, and re-create it as a safe space,” Cordua said. “When you build for the weakest link, or you build for the most vulnerable, you improve what you’re building for every single person.”


This column was co-published with Platformer, a daily newsletter about Big Tech and democracy.

