Twitter, Instagram, Tinder… In the secret of algorithms
“What are your hobbies?” “Do you want to see fewer posts like this?” The best magic trick of the Web giants is to have given Internet users the illusion of controlling what scrolls across their screens. Tell us what interests you and, we promise, we’ll serve it up to you, they swear in unison. Elon Musk’s unprecedented decision to reveal the workings of Twitter’s recommendation system is a stark reminder: Internet users are not in charge. In reality, it is the network’s algorithm that decides whether a message will be seen by three people or by 300,000. The blue bird network, acquired in October by Musk, takes many parameters into account to make this difficult choice.

For example, the presence of a photo or video in a message, its age, or the ratio between the number of people who follow the author and the number of people the author follows in turn. The reactions a tweet is likely to trigger also play a crucial role. Will the user “like” it? Share it? Reply to it? The platform estimates the probability of each of these actions by analyzing the user’s previous interactions and those of people who like the same kinds of tweets. It then weights each probability with a coefficient: 0.5 for the probability that a user “likes” a tweet, 1 that they retweet it, 27 that they reply to it, or –74 for the probability that they react negatively to it (by blocking the author, for example).
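The weighted sum described above can be sketched in a few lines. The coefficients are those cited in the article; the probability values in the example, and the function and variable names, are made-up illustrations, not Twitter's actual code.

```python
# Illustrative sketch of a Twitter-style engagement score.
# Coefficients are the weights reported in the article; the
# probabilities below are invented for the example.
WEIGHTS = {
    "like": 0.5,       # probability the user "likes" the tweet
    "retweet": 1.0,    # probability the user retweets it
    "reply": 27.0,     # probability the user replies to it
    "negative": -74.0, # probability of a negative reaction (e.g. a block)
}

def score_tweet(probabilities: dict) -> float:
    """Weighted sum of predicted engagement probabilities."""
    return sum(WEIGHTS[action] * p for action, p in probabilities.items())

# Hypothetical tweet: likely to be liked, unlikely to draw replies.
example = {"like": 0.30, "retweet": 0.05, "reply": 0.01, "negative": 0.002}
print(round(score_tweet(example), 3))  # 0.322
```

The lopsided weights make the trade-off visible: a single predicted reply outweighs dozens of predicted likes, and a small chance of a negative reaction can erase a tweet's entire score.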

Facebook’s Top Secret Algorithm

Meta (Facebook, Instagram) keeps its recommendation algorithm jealously secret. Whistleblowers and patient experimentation by researchers have, however, made it possible to identify certain major principles. “Social networks all seek to keep the Internet user on the platform as long as possible. A debatable but effective way to achieve this is to highlight content that arouses strong emotions, to the detriment of more moderate publications,” analyzes Nicolas Kayser-Bril, reporter for the NGO AlgorithmWatch.

In the Web giants’ defense, sorting the incredible quantity of publications on the Internet is a delicate task. Every day, on Twitter alone, 500 million new tweets are written and 150 billion publications are distributed to members. All of them undergo a long series of evaluations and screenings before being dispatched. “No engineer, internally, has a precise vision of the entire system,” continues Nicolas Kayser-Bril.

These tools, which decide no more and no less what makes the front page of the Internet and what is relegated to a footnote, deserve serious scrutiny. Because the sorting carried out by the platforms is far from exemplary. By focusing too heavily on emotion and buzz, social networks quickly end up pushing divisive and outrageous publications that hook Internet users by making them angry. Not to mention the dubious recommendations they sometimes make to their users. “I’m a teacher, and I’ve watched serious Apollo 11 documentaries. But now my YouTube recommendations are full of conspiracy videos about 9/11, Hitler’s escape, aliens, and anti-American propaganda,” said one user polled some time ago by Mozilla in a survey of YouTube recommendations.

The book Love Under Algorithm by Judith Duportail likewise shows the disproportionate influence of the Tinder algorithm, which in a sense decides whom we have the right to meet and love. And how it can be steered. In a patent, the group’s teams noted, for example, that it would theoretically be possible to weight a person’s age and income according to their sex, so that a man would regularly be introduced to women younger than him and with lower incomes than his own, while a woman would very rarely be introduced to younger men with lower incomes. The Web giants, of course, brandish trade secrecy as soon as the curious come asking embarrassing questions about these sensitive arbitrations. Politicians, however, are beginning to grasp the importance of what is at stake. Europe, in particular, has kicked the anthill: its Digital Services Act, approved in 2022 and coming into force next year, will require large online platforms to open their algorithmic black box.

Forcing Internet users to watch what they do not like

The devil is in the details: will the platforms play the game? And, if not, what will the punishment be? The case of Twitter reveals the ambivalence of certain transparency operations. The platform did not deign to release its code as an executable program that can be run on a computer for experimentation, as is customary. “We end up with a mountain of lines of code that are difficult to read and whose behavior we cannot really test,” notes the AlgorithmWatch expert. Moreover, Twitter’s recommendation system will soon be profoundly modified, since Elon Musk has made the contested choice to give less visibility to people who do not subscribe to his new paid offering, Twitter Blue.

Are the platforms willing to accommodate their members’ preferences? Nothing is less certain. On YouTube, for example, Internet users can flag videos they do not like. But doing so reduces the likelihood of being shown similar videos by only 12%, according to a study published by the Mozilla Foundation in late 2022. And a platform that ignores its audience’s wishes too often runs a great risk of wearing it out.
