TikTok caught in the act: Amnesty International denounces suicide videos shown to 13-year-olds

Chilling investigation: despite European laws, the social network continues to recommend content about self-harm and depression to minors. The phenomenon, known as the “rabbit hole effect,” plunges young people into a dangerous psychological spiral.

In a new investigation released this Tuesday, the NGO Amnesty International highlights the abuses of the TikTok application, accused of “encouraging self-harm or suicide” among its youngest users. A look at a social phenomenon that hits our adolescents hard.

What a 13-year-old can see on TikTok

Back in 2022, the Center for Countering Digital Hate (CCDH) had already alerted families: after creating fake TikTok accounts and liking videos about various mental disorders (anorexia, depression, etc.), it took only a few minutes for the application to flood the feed with content on the theme of suicide. Since then, little has changed. According to Amnesty International, TikTok “still floods the screens of vulnerable teenagers with dangerous content,” despite the European legislation in force.

To support their claims, Amnesty experts spent several days stepping into the shoes of teenagers.

“We created three fake accounts: a boy and two girls aged 13, the minimum age to register on the platform. The instructions: scroll through the videos on the ‘For You’ feed for three to four hours and watch every video related to mental health or sadness twice. Without liking, commenting or sharing anything, just watching,” they emphasize.

The results?

  • “In less than 20 minutes, feeds are saturated with videos on mental health”;
  • “After 45 minutes of the experiment, explicit messages about suicide appear”;
  • “Three hours later, all accounts are flooded with dark content, sometimes directly expressing a desire to end one’s life.”

© Amnesty International

Examples of videos recommended by the TikTok algorithm during Amnesty’s experiment on fake accounts of 13-year-olds.

“Rabbit hole effect”: a vicious spiral of harmful content

Justine Payoux, campaign manager at Amnesty International France, confirms having experienced this “infernal spiral of harmful content” – also called the “rabbit hole effect”.

“In just 40 minutes, my feed was flooded with melancholy content composed of dark landscapes and sad music. It very quickly led me into a vicious spiral of depressive or suicidal content. During the third hour, I went from an AI video with the cry of a woman in distress, to a video of a crying teenager saying he wanted to end his life, to videos sometimes alluding to methods of suicide,” she warns.

In concrete terms, the platform offers no way to choose what you wish to view. “Very quickly, users find themselves locked into these recommendations,” the report states.

Amnesty International’s study reveals that, despite current regulations, risky videos continue to spread on TikTok. They are not hidden; on the contrary, the algorithm highlights them on the “For You” feeds of young people who show an interest in mental health issues. “I had difficulty viewing some of the content, so imagine the effect this could have on vulnerable teenagers as young as 13,” underlines Justine Payoux at the end of the experiment.

What should you do as a parent to protect your child?

While it is impossible to filter content on TikTok, talking to your children, and warning them of these risks, remains the best approach, according to the Center for Countering Digital Hate (CCDH).

  • Tip #1: Talk to Kids About Social Media. It’s as important in their lives as television and cable were to generations past.
  • Tip #2: Ask them what they see on their feed, but also what interests them.
  • Tip #3: If a child shows worrying signs or behaviors, the help of a specialist such as a child psychiatrist, or of online child protection associations, may be necessary.

For its part, Amnesty International has asked TikTok to “stop seeking to maximize user engagement at the expense of their health and other human rights.”

So that things can finally change.