- A new report finds that harmful content on TikTok can appear within minutes of creating an account.
- Within 2.6 minutes of joining, users were recommended content related to suicide, the report found.
- Eating disorder content was recommended within 8 minutes.
TikTok recommends potentially harmful content related to self-harm and eating disorders to some young teenagers within minutes of creating an account, a new report released Dec. 15 found.
In the study, researchers from the non-profit Center for Countering Digital Hate (CCDH) set up TikTok accounts posing as 13-year-old users in the United States, Canada, the United Kingdom, and Australia.
Within 2.6 minutes of joining, users were recommended content related to suicide, the report found. Eating disorder content was recommended within 8 minutes.
On average, TikTok recommended videos about body image and mental health to teens every 39 seconds, the report shows.
Researchers also found eating disorder content on the platform hosted under 56 hashtags, many of which appeared designed to evade moderation; videos under these hashtags had amassed 13.2 billion views.
“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,” Imran Ahmed, CEO of the CCDH, said in the report.
TikTok was launched globally in 2018 by the Chinese company ByteDance. A number of U.S. states are cracking down on the app over concerns about data security and about mature content that is accessible to teens.
In September 2021, the app reached a billion active monthly users worldwide.
Over two-thirds of U.S. teens say they use TikTok, with 1 in 6 saying they use it almost constantly, according to a survey by the Pew Research Center.
The app’s algorithm recommends content to users apparently based on their likes, follows, watch time, and interests, producing an endlessly scrolling, personalized “For You” feed.
To test the algorithm, researchers set up two new accounts in each of the United States, Canada, the United Kingdom, and Australia, all registered at TikTok’s minimum user age of 13.
One “standard” account had a female username created with a random name generator. The second, “vulnerable,” account contained the term “loseweight” in its username, indicating a concern about body image.
For all accounts, researchers paused briefly on videos about mental health and body image, and liked them.
Researchers gathered data for the first 30 minutes of use.
They found that the vulnerable accounts received 12 times more recommendations for videos related to self-harm and suicide than the standard accounts.
Dr. Sourav Sengupta, an associate professor of psychiatry and pediatrics at the University at Buffalo, told Healthline that these results are concerning for all teenagers, “but even more so for teens who are at risk because of existing emotional health challenges.”
In the study, researchers found that of the 39 videos about suicide shown to vulnerable accounts, six discussed plans or desires to attempt suicide.
One video that had amassed over 380,000 likes had the caption: “Making everyone think your [sic] fine so that you can attempt [suicide] in private.”
Melissa Huey, PhD, an assistant professor of psychology at New York Institute of Technology, told Healthline that the study is also disturbing because it shows that even young teens are exposed to this content on TikTok.
“Adolescence is the second most transformative period in your life — after the initial 2 to 3 years,” she said. “So these teens are vulnerable, and they’re highly susceptible to peer influence.”
In addition, “they’re struggling a lot with depression and anxiety — due to social media and the internet. So these platforms are just helping to facilitate issues that teens are already struggling with,” she said.
A TikTok spokesperson questioned the study’s methodology, CNN reported:
“This activity and resulting experience does not reflect genuine behavior or viewing experiences of real people. We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.”
“We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”
However, CCDH’s Ahmed said the report underscores the urgent need for reform of online spaces.
“Without oversight, TikTok’s opaque algorithm will continue to profit by serving its users — children as young as 13, remember — increasingly intense and distressing content without checks, resources, or support,” he said.
Sengupta said that while social media has many positive uses, such as enabling teens to stay connected during the pandemic, the report shows how easily negative content can be amplified in teens’ social media feeds, with potential consequences.
“For young people’s brains to get accustomed to content at the extremes as being ‘normal’ is really risky,” he said.
Huey thinks greater accountability for social media companies could reduce some of the harm currently being done.
“Instead of pushing things that exacerbate an eating disorder, [social media platforms] should provide resources that help, like eating disorder or suicide prevention helplines,” she said.
However, “at the end of the day, I think parents really need to step in and have a deeper involvement in what their child is doing online,” she said.
“I’m not blaming parents,” she added, “because I’m a parent too — and it’s hard.”
Huey said because teens are highly susceptible to what’s online, a big part of regulating their internet use involves limiting the time they spend there.
“The longer you spend on the internet, the deeper you can get into it,” she said. “I’m sure we’ve all been there with Facebook or whatever [social media platform], where you just go down a ‘rabbit hole.’”
In addition, being online too often, with few limits, can worsen a teen’s depression and anxiety, said Sengupta.
“A teen in their room all day on social media is not a healthy pattern,” he said. “Parents have to find ways to help teens regulate their use of the internet.”
However, he pointed out that many adults struggle with their own internet use, which shows how challenging this can be.
Still, “if we’re asking our teens to put limits on their use of social media, we need to regulate our own use,” he said.
This role modeling by parents is not just about limiting time on social media. Parents should make an effort to engage with family and friends “in real life,” said Sengupta — which means putting down your phone at the dinner table.
While setting time limits can be helpful, Huey doesn’t think blocking teens’ use of the internet entirely is the best way to go — or even possible.
“At this point in our society — the end of 2022 — teens are going to access the internet,” she said. “So they need to learn how to process the information they find there.”
While this task may seem daunting, Huey said parents who put in the time and effort can really have an impact on their children.
“If children have more parental support and guidance, they have lower susceptibility to peers,” she said. “And in this day and age, the internet is really where teens find their peers.”