The first time Gaëlle heard about TikTok, she wasn’t particularly wary of it. She simply found the dance challenges that a colleague’s daughter performed on the platform “surprising” for a child of her age. So when her own daughter got her first cell phone upon entering fifth grade, Gaëlle “did not pay much attention” to the installation of the Chinese application on the smartphone – even though she set up parental controls and made sure the Wi-Fi in the family home was turned off “from 9 p.m.” A few months later, the mother was contacted by her daughter’s dance teacher, who alerted her to the “impressive scars” streaking the schoolgirl’s arms. Within a few months, her daughter’s mental health deteriorated violently: the following year, the girl attempted suicide by ingesting a nearly lethal quantity of paracetamol. She then lost “more than 15 kilos in three months”, going from 50 to 35 kilos in 2023 – to the point of being hospitalized for more than a year.
“We realized that she was recounting a lot of stories about people who were really not doing well. We wondered where she had met them, then how she had learned that paracetamol could be lethal, or how she had gotten the idea of cutting herself,” says Gaëlle. Suspecting a link with social networks, she finally cut off her daughter’s access to TikTok, and observed a “clear change” in her child’s behavior. “We then spoke to her about social networks, and she told us about the spiral into which she had fallen on the platform,” sighs Gaëlle, who describes the videos “which depict, romanticize and glamorize self-harm and anorexia”, the “encouragement to stop eating” running through the content, and “the loops of videos of adolescents describing their suicidal and depressive thoughts in detail”. “There is an extremely perverse mechanism that locks adolescents into harmful content without any moderation. It is a dangerous weapon,” the mother concludes.
A member of the victims’ relatives collective “Algos victima”, Gaëlle is one of seven French families to have contacted Laure Boutron-Marmion, a lawyer specializing in digital law, who announced on November 4 that she had filed a class action against TikTok before the civil court of Créteil – a first in Europe. The social network is accused, among other things, of being partly responsible for the deterioration of the mental and physical health of seven teenage girls, two of whom took their own lives at the age of 15 and four of whom attempted suicide. “TikTok is specifically accused of two things: having designed a deliberately addictive application which pushes dangerous content through its algorithm, and failing to sufficiently moderate that content,” the lawyer explains to L’Express.
“It’s an escalation of horror”
Faced with the Chinese giant, Me Boutron-Marmion is well aware that the fight will be tough. “I obviously expect the predictable argument that this content is not directly created by TikTok, that the platform is merely a host and therefore cannot be incriminated, for example,” the lawyer anticipates, judging this argument “largely insufficient”. “The problem is not so much the existence of the videos per se, but rather their lack of moderation by TikTok, and the virality of this content enabled by the platform’s algorithmic dynamics,” she underlines. The entire complexity of the case thus rests on proving the causal link between the adolescents’ suicides, suicide attempts or general psychological state (self-harm, episodes of anorexia, etc.) and the loops of videos propelled by TikTok onto these young users’ feeds.
“In this case, the platform could be held responsible for the bodily harm caused to these adolescents, and be ordered to pay compensation to the families. Such a decision could also push it to change its commercial policy, particularly regarding its algorithms,” the lawyer hopes. For its part, TikTok tells L’Express that it has received “no notification relating to this legal procedure”, and specifies that its community rules state “very clearly” that “showing, promoting or sharing plans for suicide or self-harm” is not permitted. Between April and June 2024, among the videos deleted by the platform for violating these community rules, “98.8% were removed proactively, and 91% of these videos had not been viewed”, it adds, while assuring that it “endeavours to carefully apply limits” to certain content that “may have an impact on the viewing experience if viewed repeatedly”, particularly videos on the themes of “sadness, extreme exercise or diets”.
And for good reason. Since the start of the 2020s, several international reports have highlighted “the dangerousness” of TikTok’s famous algorithms with regard to certain types of videos. In December 2022, a study by the British NGO Center for Countering Digital Hate showed that a 13-year-old user who spent a few seconds in front of content related to self-image and mental health would be offered “twelve times more videos related to suicide”, would be directed to videos about suicide after “2.6 minutes on average”, and to content about eating disorders after “eight minutes”. A study conducted by AI Forensics, the Algorithmic Transparency Initiative and Amnesty International, published in November 2023, made the same kind of observation: “children and young people who view mental health-related content on their TikTok feed are quickly drawn into ‘spirals’ of content that idealize and encourage depressive thoughts, self-harm and suicide.” “It’s an escalation of horror: we’re talking about challenges over the width of cuts and where on the body to inflict them, a trivialization of suicidal desire, a glamorization of dark thoughts and of food deprivation,” insists Me Boutron-Marmion.
“Systemic risk”
To the point that the recent European Digital Services Act (DSA), established by the European Commission and in force since August 2023, specifies in its Article 34 that providers of very large online platforms – such as TikTok – are required to “diligently identify, analyze and assess any systemic risks within the Union arising from the design or operation of their […] algorithmic systems”, including “any actual or foreseeable negative effect linked […] to the protection of public health and minors and serious negative consequences on the physical and mental well-being of people”.
“There is no specific text in France on the responsibility of platforms and their algorithms with regard to mental health, but European law prevails, and a breach of the DSA could directly influence this procedure in France… if, indeed, the breach can be proven,” analyzes Constantin Pavléas, a lawyer specializing in digital law. Under the DSA, online platform providers are also required to quickly remove or block content flagged as “manifestly illegal”. “Here again, there may be a pitfall: certain videos evoking suicide or self-harm on TikTok are not clearly illicit, and fall into a sort of gray zone where there is no obligation to remove them,” deciphers Alexandre Archambault, another lawyer specializing in digital law.
Investigations opened by the European Commission
The operation of TikTok is nonetheless firmly monitored by the European Commission, which announced in February 2024 that it was opening an investigation into the platform for alleged breaches of the DSA, specifically concerning the protection of minors. The Commission cited in particular “the risks linked to the addictive design” of TikTok and the “harmful content” disseminated there. This infringement procedure, still ongoing, should allow the Commission to ensure that TikTok takes the necessary measures “to protect the physical and emotional well-being of young Europeans”, commented Thierry Breton, the European Commissioner for the Internal Market at the time. “The Commission does not turn a blind eye to TikTok’s practices, and is taking a close interest in the functioning and addictive nature of its algorithms and in whether or not the platform attempts to mitigate these risks,” Thomas Regnier, the European Commission’s spokesperson for the digital economy, research and innovation, confirms to L’Express.
“The investigation is ongoing, and TikTok is collaborating and cooperating. If the preliminary findings are confirmed, the procedure may lead to a non-compliance decision accompanied by a financial penalty of up to 6% of the platform’s global turnover,” he specifies. Over the past year, Thomas Regnier adds, “six different requests for information” have been sent by the Commission to TikTok – most of them linked “to the protection of minors, the functioning of the algorithms and the platform’s recommendation systems”. Two formal investigations – including one dating from 2022 – have been opened.
In such a context, the action filed by Me Laure Boutron-Marmion will “perhaps set a precedent and raise awareness of the risks linked to this type of social network”, the lawyer concludes, referring to the British case of Molly Russell, a 14-year-old who took her own life in 2017. After a long inquest, Meta – owner of Facebook and Instagram – and Pinterest were found partly responsible for her death in the fall of 2022. The investigators charged with establishing the causes of death specified that “the negative effects of online content”, which the teenager had consumed massively via algorithms, “significantly contributed” to her suicide. According to the inquest, in the six months preceding her death, Molly had viewed 138 videos discussing suicide or self-harm on social networks.