Social media algorithms promote misogynistic content among teenagers


    TikTok’s algorithm has been caught amplifying misogyny. In just five days, it recommended four times as many misogynistic videos to test accounts, according to a UCL study.

    A study conducted among more than 1,000 adolescents aged 13 to 17 by UCL, the University of Kent and the Association of School and College Leaders (ASCL) highlights a worrying reality: exposure to misogynistic content on social networks creates a vicious circle in which misogynistic attitudes are reinforced and spread.

    “Initial suggested content aligned with each archetype’s stated interests, for example with videos exploring themes of loneliness or self-improvement, but then increasingly focused on anger and blame directed at women. After five days, TikTok’s algorithm showed four times as many videos with misogynistic content, such as objectification, sexual harassment or denigration of women (rising from 13% of recommended videos to 56%),” the study explains.

    The results of the study are clear: adolescents exposed to more misogynistic content on social media are more likely to adopt misogynistic attitudes themselves. Worse yet, this exposure makes them more vulnerable to online sexual harassment, highlighting the deleterious impact of misogyny on adolescent well-being. Dr Kaitlyn Regehr (UCL Information Studies), lead researcher of the study, said: “Algorithmic processes on TikTok and other social networks target people’s vulnerabilities – like loneliness or feelings of loss of control – and gamify harmful content. Microdoses on topics like self-harm or extremism are seen by young people as entertainment.”

    Complicit algorithms

    Researchers point the finger at social media algorithms, which, by promoting the dissemination of content similar to what has already been viewed, lock adolescents into bubbles of misogyny. This echo chamber amplifies hateful messages and limits exposure to contrary opinions, depriving young people of an open and inclusive view of the world. “Negative views and themes are normalized among young people today. Online consumption impacts young people’s offline behaviours, as we see these ideologies move off screens and into schoolyards. Additionally, adults are often unaware of how harmful algorithmic processes work, or even how they might fuel their own social media addictions, making parental education around these issues difficult,” the researcher underlined.
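The “more of the same” dynamic the researchers describe can be sketched as a simple feedback loop. The function below is a toy model only: the function name, the daily video count, the starting share and the `boost` factor are all illustrative assumptions, not a description of TikTok’s actual system. It shows how a feed that nudges a category’s recommendation probability upward each time that category is watched can snowball from a small initial share.

```python
import random

def simulate_feed(days=5, videos_per_day=100, start_share=0.13, boost=0.5, seed=0):
    """Toy model of an engagement-driven feed (illustrative, not TikTok's system).

    Each day the feed serves videos; each watch of the tracked category
    nudges the probability of recommending that category upward, so
    exposure compounds over time.
    """
    random.seed(seed)
    share = start_share   # probability the next recommendation is in the tracked category
    history = []
    for _ in range(days):
        watched = sum(random.random() < share for _ in range(videos_per_day))
        # Feedback step: raise the share in proportion to how much was watched,
        # capped at 1.0 so it remains a probability.
        share = min(1.0, share + boost * (watched / videos_per_day) * (1 - share))
        history.append(round(share, 2))
    return history
```

Even with modest parameters, the share only ever ratchets upward, which is the echo-chamber effect the study describes: each view makes the next similar recommendation more likely.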

    “This is very worrying in general, but particularly when it comes to the amplification of messages about toxic masculinity and its impact on young people, who need to be able to grow and develop their understanding of the world without being influenced by such appalling material,” explained Geoff Barton, general secretary of the Association of School and College Leaders. “We welcome the call to involve young people, particularly boys, in the conversation to combat this issue with their peers and families. We call on TikTok in particular and social networks in general to urgently review their algorithms and strengthen safeguards to prevent this type of content, and for the Government and Ofcom to consider the implications of this issue under the auspices of the new online safety law. It’s time to act rather than continue talking about action.”

    A call to action

    Faced with this alarming observation, UCL researchers are calling on social media platforms to take concrete measures to combat misogyny on their platforms. Their recommendations include holding social media companies accountable, teaching young people “healthy digital consumption”, setting up peer mentoring, and educating parents and the wider community about social media algorithms, so that they can better understand how these algorithms influence young people and take steps to protect them.
