Some might find information in this Press Release triggering.
- Technical research in partnership with the Algorithmic Transparency Institute and AI Forensics using automated accounts showed that after 5-6 hours on the platform, almost 1 in 2 videos were mental health-related and potentially harmful, roughly 10 times the volume served to accounts with no interest in mental health.
- There was an even faster “rabbit hole” effect when researchers manually rewatched mental health-related videos suggested to “sock puppet” accounts mimicking 13-year-old users in Kenya, the Philippines and the USA.
- Between 3 and 20 minutes into our manual research, more than half of the videos in the ‘For You’ feed were related to mental health struggles with multiple recommended videos in a single hour romanticizing, normalizing or encouraging suicide.
- TikTok’s very business model is inherently abusive: it privileges engagement to keep users hooked on the platform in order to collect ever more data about them, and it applies protections for users unequally around the world.
TikTok’s content recommender system and its invasive data collection practices pose a danger to young users of the platform by amplifying depressive and suicidal content that risks worsening existing mental health challenges, two companion reports released today by Amnesty International show.
The two reports – Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation and I Feel Exposed: Caught in TikTok’s Surveillance Web – highlight the abuses experienced by children and young people using TikTok, and the ways in which these abuses are caused by TikTok’s recommender system and the underlying business model.
The findings of a joint technical investigation, with our partners – the Algorithmic Transparency Institute (ATI) at the National Conference on Citizenship and AI Forensics – show how children and young people who watch mental health-related content on TikTok’s ‘For You’ page are quickly being drawn into “rabbit holes” of potentially harmful content, including videos that romanticize and encourage depressive thinking, self-harm and suicide.
“The findings expose TikTok’s manipulative and addictive design practices, which are intended to keep users engaged for as long as possible. They also show that the platform’s algorithmic content recommender system, credited with enabling the rapid global rise of the platform, exposes children and young adults with pre-existing mental health challenges to serious risks of harm,” said Lisa Dittmer, Amnesty International Researcher.