Since TikTok rose to the forefront of social media, viral challenges have contributed significantly to the app's popularity, often spreading like wildfire among younger users in particular. While some of these challenges are harmless and can drive engagement, the company is now looking to curb the more dangerous ones, along with toxic hoaxes.
In a new survey of more than 10,000 teenage users, parents and teachers across numerous countries, including the U.S. and U.K., TikTok found that 31 percent of teens have participated in some form of online challenge. While 48 percent of those users described the challenge as completely safe and harmless, 32 percent said their experience involved a little risk, and 14 percent described their challenge as risky or outright dangerous. Three percent went as far as calling theirs "very dangerous."
The study also revealed that 46 percent of teenagers would prefer to have access to more information before engaging in viral challenges, while 31 percent said that they've "felt a negative impact" from hoaxes related to self-harm or suicide.
In light of these results, TikTok is now updating its policies and terms and conditions in the hope of creating a safer environment for its users. The platform already removes hoaxes and works to limit their reach, and it now plans to actively remove "alarmist warning" videos as well, since they can give the impression that a hoax is real.
In addition, its safety teams will be alerted when violating content flares up in the community, and users who come across posts revolving around dangerous challenges and hoaxes will be shown a warning label written in consultation with a clinical child psychiatrist and a behavioral scientist. For users who actively search for posts relating to self-harm or suicide, TikTok will also provide access to preventative resources such as the National Suicide Prevention Lifeline.
Elsewhere in tech, Instagram will now let you “rage shake” your phone to report an issue.