Is social media harmful to mental health? While YouTube would probably not answer “yes” to this question, the platform has nevertheless adjusted its recommendation algorithm for adolescents aged 13 to 17, after being alerted by its advisory committee of health experts, as well as by numerous researchers, over more than three years. YouTube has decided to limit exposure to potentially harmful content, such as videos promoting unrealistic body ideals or showing scenes of violence, aggression or intimidation. This preventive measure, rolled out in September in the United Kingdom and then gradually extended to the rest of the world, is good news, because it should reduce the risk of the algorithm’s spiral effect among young people.
To understand the importance of this decision, it is essential to understand how the YouTube algorithm works. It relies on five signals to recommend videos to users. The first is the user’s search and watch history. The second is the relevance of the video to the keywords used in the search, including the title, tags, description and the content itself. The third is user engagement with previous videos, such as watch time and number of likes. The fourth concerns signals indicating the quality of a channel, used to assess its expertise and reliability on a given subject. The fifth encompasses contextual factors such as geographic area, seasonality and timing.
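YouTube’s actual ranking model is proprietary, so the following minimal Python sketch only illustrates how five such signals might be combined into a single recommendation score. Every name, weight and formula here is invented for illustration, not taken from YouTube.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    relevance: float        # fit between video metadata and the search keywords (0-1)
    engagement: float       # watch time, likes, etc., normalized to 0-1
    channel_quality: float  # expertise/reliability signal for the channel (0-1)
    context: float          # geographic area, seasonality, timing (0-1)

def history_affinity(video: Video, watched_titles: list[str]) -> float:
    """Crude stand-in for the history signal: the share of past titles
    sharing at least one word with this video's title."""
    if not watched_titles:
        return 0.0
    words = set(video.title.lower().split())
    hits = sum(1 for t in watched_titles if words & set(t.lower().split()))
    return hits / len(watched_titles)

def score(video: Video, watched_titles: list[str],
          weights=(0.30, 0.20, 0.25, 0.15, 0.10)) -> float:
    """Weighted sum of the five signals; the weights are made up."""
    signals = (
        history_affinity(video, watched_titles),  # 1. search/watch history
        video.relevance,                          # 2. keyword relevance
        video.engagement,                         # 3. engagement
        video.channel_quality,                    # 4. channel quality
        video.context,                            # 5. geography/seasonality/timing
    )
    return sum(w * s for w, s in zip(weights, signals))

# Rank a small candidate pool for a user whose history leans toward fitness videos.
history = ["10 minute fitness routine", "fitness tips for beginners"]
candidates = [
    Video("Daily fitness challenge", 0.9, 0.7, 0.8, 0.6),
    Video("Cooking pasta at home", 0.8, 0.9, 0.9, 0.6),
]
for v in sorted(candidates, key=lambda v: score(v, history), reverse=True):
    print(f"{score(v, history):.2f}  {v.title}")
```

Even in this toy version, the history signal pulls the ranking toward whatever the user has already watched: the fitness video wins despite the cooking video’s higher engagement and channel quality. That pull is what matters for the rest of this article.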
The algorithm thus offers personalized recommendations based on each user’s interests and history. These suggestions are further adjusted according to the performance and quality of the videos. The method aims to maximize the time users spend watching while collecting personal data for advertising purposes, in line with the principles of captology [editor’s note: the study of digital technologies as tools for influencing and persuading individuals].
The algorithm, an accelerator of difficulties among vulnerable teenagers
But how can the workings of YouTube’s algorithm cause mental health problems? The crux of the problem is that this tool is designed to sort millions of videos and surface the content deemed most useful and relevant to the user. Yet the way this technology is perceived, interpreted and used can produce both positive and negative effects.
British researchers propose analyzing this paradox by assessing the impact of application (and digital software) design no longer through what that design is supposed to do, but through the way users perceive, interpret and use it. For example, the algorithm is programmed to adapt content to the user’s identity, preferences and expectations, making this personalization one of the site’s key features. In practice, this gives adolescents access to highly relevant content, which contributes to the development of their self-concept, that is, the beliefs they hold about their own qualities and characteristics.
But this same mechanism can also exploit vulnerabilities around self-esteem or self-image, giving individuals access to content that resonates with pre-existing difficulties. All it takes is for a vulnerable teenager to spend a little too much time on content related to those vulnerabilities to be offered a multitude of similar videos. This is what is known as a negative spiral effect.
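The spiral is, at bottom, a feedback loop: watching strengthens the history signal, which raises the score of similar videos, which increases watching. Here is a minimal sketch of that loop, with invented topics and numbers, to show how even a slight initial lean can compound:

```python
import random

random.seed(42)  # reproducible run

# Two content topics; the user starts with only a slight lean toward one.
preferences = {"body_image": 0.55, "other": 0.45}

for _ in range(20):
    # The platform samples a recommendation in proportion to the history signal...
    topics, weights = zip(*preferences.items())
    watched = random.choices(topics, weights=weights)[0]
    # ...and each watch further strengthens that topic's weight.
    preferences[watched] += 0.05

total = sum(preferences.values())
for topic, weight in preferences.items():
    print(f"{topic}: {weight / total:.0%} of recommendation weight")
```

The rich-get-richer dynamic is the point: each watch makes the next similar recommendation more likely, so a topic that starts barely ahead tends to end up dominating the feed.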
The problem is that adolescence is a pivotal developmental period, marked by major behavioral, cognitive and neurobiological changes. Adolescents are still building their self-concept; they are also more sensitive to what their peers think, and in particular to negative judgments. They fear being rejected by others and are more sensitive to stress. All of these developmental specificities make their mental health more fragile.
When social media amplifies exposure to body ideals
In this context, YouTube’s recommendation algorithm, and more specifically the way it is perceived, interpreted and used, may interact with these developmental changes and amplify adolescents’ mental health vulnerability.
For example, American researchers argue that social media could interfere with a sensitive period of social learning and friendship formation, creating a “perfect storm” that exacerbates girls’ concerns about their body image and leads to mental health problems. In other words, young girls with strong body concerns may, because of the way the algorithm works, be subjected to a proliferation of images or videos promoting the “ideal body”, which could influence the perception of their own body.
The promotion of body ideals is not a recent phenomenon. It already existed long before the Internet, in women’s magazines and on television. However, the problem with personalized recommendation algorithms like YouTube’s is that they intensify exposure to unrealistic body standards and can quickly lock users into a spiral effect. This leads to an increase in situations where young women compare themselves to the images they see on these platforms.
Through such comparisons, young women face greater mental health risks, including depression, negative emotions and body dissatisfaction, that is, negative thoughts and feelings about one’s own physical appearance. Researchers currently hypothesize that heavy use of highly visual social media could be a major risk factor for body dissatisfaction and contribute to the development of eating disorders. Further research is needed, however, to clarify these links.
The flaws in YouTube’s new measures
The new measures put in place by YouTube to protect young people therefore seem to represent excellent progress. It is regrettable, however, that this adjustment of the algorithm only concerns adolescents aged 13 to 17 who have an account on the platform. The protection should be extended to all YouTube users, in order to address at least two loopholes: first, content will not be filtered for young people who have lied about their age; second, a negative spiral effect will remain possible for young people who are not signed in. Furthermore, adults, men and women alike, can also be vulnerable to the spiral effect of ideal-body content, particularly those who are concerned about their weight and who also present psychological vulnerabilities. These users will not benefit from the protection put in place by the platform.
Finally, a question arises: how can problematic recommendations be identified precisely? While pro-anorexia content is easy to identify, it is harder to distinguish recommendations for a healthy exercise routine from recommendations that excessively promote weight loss through sport. This adjustment to the recommendation algorithm nevertheless remains real progress for the mental health of adolescents. Other highly visual social media platforms, such as Instagram or TikTok, would do well to draw inspiration from these measures. Because there is still a long way to go before social networks become safe spaces for our teenagers.