13-year-olds can be pushed videos about suicide within minutes of joining TikTok
TikTok actively pushes suicide and eating disorder content to children's accounts, according to a new study.
The study, conducted by the Center for Countering Digital Hate (CCDH), a nonprofit that advocates for the deplatforming of hurtful and hateful content on social media, found that TikTok is quick to push harmful content to children. The organization created fresh accounts in the U.S., United Kingdom, Canada, and Australia and registered each at the lowest age TikTok allows: 13. The accounts paused briefly on videos about body image and mental health and liked them.

Within 2.6 minutes, the study found, TikTok recommended suicide content. Within eight minutes, the app was recommending eating disorder content, and it was serving videos about body image and mental health every 39 seconds. Some of those videos were unmarked ads for weight-loss products.
“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,” said Imran Ahmed, CEO of the Center for Countering Digital Hate, in a press release.
The group created two accounts in each country, both registered as 13-year-olds. One was set up as a “standard” teen account, while the other was designed to mimic a “vulnerable” teen: its username included “loseweight,” a detail researchers found influenced the kind of content pushed to the account.
The standard teen accounts were shown mental health or body image content every 39 seconds, with 185 videos shown to the accounts over 30 minutes. These accounts were also shown suicide, self-harm, or eating disorder content every 206 seconds, according to the report.
Unsurprisingly, the report said that accounts intentionally set up to mimic vulnerable teens were shown more harmful content, not less. The “vulnerable” accounts were shown content about suicide, self-harm, and eating disorders every 66 seconds, more than three times the rate of the standard accounts. In one case, a “vulnerable” account was shown three videos of users discussing suicide plans within a single minute.
The CCDH recommends that countries enact legislation, arguing that regulation would have the most direct impact.
“Without legislation, TikTok is behaving like other social media platforms and is primarily interested in driving engagement while escaping public scrutiny and accountability. The TikTok algorithm does not care whether it is pro-anorexia content or viral dances that drive the user attention they monetize. TikTok just wants to collect users’ data and keep them on the platform, viewing advertisements,” the report said.
Ahmed said the time for legislation to prevent harmful content on TikTok, especially content shown to teens, is now.
“This report underscores the urgent need for reform of online spaces,” Ahmed said. “Without oversight, TikTok’s opaque algorithm will continue to profit by serving its users—children as young as 13, remember—increasingly intense and distressing content without checks, resources, or support.”
TikTok did not respond to a request for comment.
For more information about suicide prevention or to speak with someone confidentially, contact the National Suicide Prevention Lifeline (U.S.) or Samaritans (U.K.).
For more information about eating disorders or to speak with someone confidentially, contact the National Eating Disorders Association.