“Do not amplify”. These three innocuous words can condemn you to wander in Twitter’s limbo for eternity… When they are attached to a profile in the internal tool used by Twitter employees, it means that this user’s posts may still appear, but should not be highlighted. This is known as a “shadow ban”, a form of phantom banishment: the person is unknowingly subjected to filtering that sharply limits the reach of their tweets. The practice has been hotly debated in recent days. Who lit the fuse? Elon Musk, of course, who handed certain journalists internal Twitter documents containing examples of accounts targeted by this “shadow ban”.
Since buying Twitter, Elon Musk has constantly criticized how the social network operates. It began with scathing tweets aimed at certain employees (most of whom have since been fired). In recent days, he has carefully orchestrated, with a handful of journalists, the release of internal documents that he grandly calls the “Twitter Files”. These “revelations” are rather relative, however, where the “shadow ban” is concerned: Twitter had very publicly announced in 2018 that it was implementing this type of filtering. At the time, Slate’s American tech journalist Will Oremus humorously dubbed it “Twitter purgatory”.
In the world of social networks, the practice is now fairly widespread. It is part of the arsenal of sanctions platforms can impose when an account violates their terms of service, for example by spreading disinformation about Covid or posting racist or violent messages. Depending on the severity and frequency of these breaches, a user may see their message deleted or published with a warning label. A May 2020 tweet from Donald Trump casting doubt on the reliability of postal voting was, for example, displayed with a warning that it might contain misinformation.
Twitter users condemned to invisibility
The most severe punishment is banning a person, temporarily or permanently. Between the two sits the famous “shadow ban”, which does not prevent a user from writing (or those who follow them from finding their posts), but does not surface their messages to people who did not explicitly look for them. Many social networks use broadly similar sanctions. In August 2022, Mark Zuckerberg (Facebook, Instagram, etc.) confirmed that if a fact-checker determined a post contained misinformation, it could be made “less visible” on those networks, and that repeated offenses of this kind could trigger “more comprehensive measures”.
The most ironic part of this story is that while Elon Musk feigns astonishment as he “reveals” that the previous management practiced the “shadow ban”, he himself acknowledged on November 18 that the technique had its virtues: “Twitter’s new policy is freedom of speech, not freedom of reach. Hateful tweets will not be featured on the network and they will be demonetized, so Twitter will not make money from these posts. It will not be possible to find this type of tweet unless you deliberately search for it.”
This rather artificial controversy does have one merit, however: it reminds us that the giant social networks are no longer companies like any other. The smallest of their micro-decisions has a massive impact on the global conversation, and such power demands adequate checks and balances. If Elon Musk offers much greater transparency about the decisions taken (were you “shadow banned”? Why?), that is a very good thing. It was a real mistake for Twitter’s previous management not to be transparent on this subject. It is important that users be informed of the sanctions targeting them and be able to appeal those decisions, all the more so when filtering is applied to everything they publish.
On the puzzle of content moderation (deciding what can be published and what should be deleted), it is nonetheless crucial that the red lines be drawn not only internally but collectively, with outside actors from civil society (academics, lawyers, associations, etc.). Unfortunately, that is not the path Elon Musk is taking. The new owner of Twitter has announced that he is disbanding the Trust and Safety Council, which brought together around 100 independent researchers and human rights activists to work on these thorny issues. Thorny issues that Elon Musk evidently prefers to settle with hastily organized Twitter polls. When you have one of the largest follower bases on the platform, it is admittedly easy to come away with the impression that your idea has been endorsed.