TikTok’s algorithm promotes videos about self-harm and eating disorders, according to a report published Wednesday that adds to concerns about social media and its impact on youth mental health.
Researchers at the charity Center for Countering Digital Hate created TikTok accounts for fictional teen personas in the United States, United Kingdom, Canada, and Australia. Running the accounts, the researchers “liked” videos about self-harm and eating disorders to see how TikTok’s algorithm would respond.
Within minutes, the immensely popular platform was recommending videos about losing weight and self-harm, including images of models and idealized body types, images of razor blades, and discussions of suicide.
When the researchers created accounts with usernames suggesting a particular vulnerability to eating disorders, such as names containing the phrase “lose weight,” the accounts were served even more harmful content.
“It’s like being stuck in a hall of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said the center’s CEO Imran Ahmed, whose organization has offices in the US and UK. “It is literally pumping the most dangerous possible messages to young people.”
Social media algorithms work by identifying topics and content a user is interested in, then sending them more of the same to maximize their time on the site. Critics of social media argue, however, that the same algorithms that promote content about a particular sports team, hobby, or dance craze can send users down a rabbit hole of harmful content.
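The feedback loop critics describe can be sketched in a few lines of Python. Everything below is a hypothetical illustration: the `Video` and `ToyFeed` classes and their `like` and `recommend` methods are invented for this example and bear no relation to TikTok’s actual system. The sketch shows only the general principle of an engagement-driven recommender.

```python
from collections import defaultdict

# Hypothetical sketch of an engagement-driven recommender.
# All names here are invented for illustration; this is not TikTok's system.

class Video:
    def __init__(self, video_id, topics):
        self.video_id = video_id
        self.topics = set(topics)

class ToyFeed:
    def __init__(self, catalog):
        self.catalog = catalog              # all available videos
        self.interest = defaultdict(float)  # per-topic engagement score

    def like(self, video):
        # Each "like" boosts every topic attached to the video,
        # so future recommendations drift toward those topics.
        for topic in video.topics:
            self.interest[topic] += 1.0

    def recommend(self, n=3):
        # Rank videos by how strongly their topics match accumulated interest.
        def score(video):
            return sum(self.interest[t] for t in video.topics)
        return sorted(self.catalog, key=score, reverse=True)[:n]

catalog = [
    Video("v1", ["cooking"]),
    Video("v2", ["fitness", "weight-loss"]),
    Video("v3", ["dance"]),
    Video("v4", ["weight-loss"]),
]
feed = ToyFeed(catalog)
feed.like(catalog[1])  # one "like" on a weight-loss video...
print([v.video_id for v in feed.recommend(2)])  # ...pulls similar videos to the top: ['v2', 'v4']
```

In this toy model, a single “like” is enough to push related videos to the top of the feed; the report argues that the same dynamic, operating at scale, is what steers vulnerable teens toward harmful content.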
That is a particular problem for teens and children, who tend to spend more time online and are more vulnerable to bullying, peer pressure, and harmful content about eating disorders or suicide, according to Josh Golin, executive director of Fairplay, a nonprofit that advocates for greater online protections for children.
He added that TikTok is not the only platform that fails to protect young users from harmful content and aggressive data collection.
“All of these harms are linked to the business model,” Golin explained. “It doesn’t make any difference what the social media platform is.”
TikTok disputed the findings in a statement from a company spokesperson, saying the researchers did not use the platform the way typical users do and that the results were skewed as a result. The company also said a user’s account name should have no bearing on the kind of content the user receives.
TikTok does not allow users under the age of 13, and its official guidelines forbid videos that promote eating disorders or suicide. TikTok users in the United States who search for eating disorder content receive a prompt offering mental health resources and contact information for the National Eating Disorder Association.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” said TikTok, which is owned by ByteDance, a Chinese company now based in Singapore.
Despite the platform’s efforts, researchers at the Center for Countering Digital Hate found that eating disorder content had been viewed billions of times on TikTok. In some cases, they found, young TikTok users were using coded language about eating disorders to evade TikTok’s content moderation.
The sheer volume of harmful content available to teens on TikTok shows that self-regulation has failed, Ahmed said, adding that government rules are needed to push platforms to do more to protect minors.
Ahmed noted that the version of TikTok offered to domestic audiences in China is designed to promote math and science content to young users, and that it limits how long 13- and 14-year-olds can spend on the site each day.
A proposal before Congress would impose new rules limiting the data that social media platforms can collect about young users and would create a new office within the Federal Trade Commission dedicated to protecting the privacy of young social media users.
Senator Edward Markey, D-Mass., one of the bill’s supporters, said Wednesday that he believes lawmakers from both parties can agree on the need for stricter rules on how platforms access and use the information of underage users.
“Data is the raw material that big tech uses to track, to manipulate, and to traumatize young people in our country every single day,” Markey added.