The unbearable dictatorship of algorithms

Under the pretext of preventing all kinds of abuse, are social network algorithms gradually taking control of our ideas and opinions? A case in point: Covid-19.

For the past twenty years, social networks have imposed themselves on us. In two or three taps on a smartphone, you can communicate in real time from one end of the planet to the other, exchanging text, sound, images and video. It is a genuine revolution in mass communication, one that has made print newspapers obsolete and overtaken every other medium, from the telephone to radio and television…
Social networks (or social media) quickly came to be seen as a vast space of freedom. Used for personal or professional purposes, to meet people, or to organize a mass gathering under the nose of the authorities, they have become indispensable. Nearly 50 million French people (8 out of 10) have an account on one social network or another, chosen according to their age and interests.
In the wrong hands, however, these means of mass communication can become dangerous. Complaints of cyberharassment, threats, racist remarks, obscene content, blackmail, sexual abuse and digital theft are beyond counting…
Hence the need to moderate content.

They know everything about us!

That is the job of algorithms, those mathematical formulas that constantly mine our digital data. The algorithms of GAFAM (Google, Apple, Facebook, Amazon and Microsoft) hold 80% of humanity’s personal digital information. They aggregate it, cross-reference it, profile each of us and sort us into boxes. This data, of rare precision, is sold at a high price, most often for marketing and commercial purposes, offering advertisers surgically precise targeting. By analyzing the different variables in our messages, especially their lexical fields, the algorithms try to decipher our ideas and opinions. They know everything about our friends, our enemies, our community… Everything!
To be convinced of this, one need only read the article published in September 2020 in the Journal of Epidemiology and Public Health, entitled “Design of an algorithm to detect human papillomavirus vaccine hesitation within messages from social networks.”
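To give a sense of how crude such lexical profiling can be, here is a minimal sketch in Python. It is purely illustrative and is not the algorithm from the article cited above: the keyword list, the threshold and the scoring rule are invented for the example.

```python
# Minimal, illustrative sketch of lexical-field scoring.
# NOT the published detection algorithm: lexicon and threshold are invented.
HESITATION_LEXICON = {"doubt", "side effects", "not sure", "risky", "refuse"}

def hesitation_score(message: str) -> float:
    """Fraction of lexicon terms that appear in the message."""
    text = message.lower()
    hits = sum(1 for term in HESITATION_LEXICON if term in text)
    return hits / len(HESITATION_LEXICON)

def flag_as_hesitant(message: str, threshold: float = 0.2) -> bool:
    """Classify a message from its vocabulary alone, with no real understanding."""
    return hesitation_score(message) >= threshold

print(flag_as_hesitant("I doubt the side effects were studied enough"))  # True
print(flag_as_hesitant("Booked my second dose for Friday"))              # False
```

A few keywords and a threshold are enough to put a person in a box, which is precisely the point: the machine does not understand the message, it merely matches its vocabulary.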
What is more, algorithms don’t just monitor our opinions and behaviors. They direct them.

Censorship and excommunication

Take the handling of data concerning the Covid-19 pandemic, a subject that affects every inhabitant of the planet. With tests and vaccines, we are all on file. Since the emergence of SARS-CoV-2 at the end of 2019, billions of messages have been exchanged every day on social networks. We wonder about the virus, the usefulness of the health pass, the reliability of vaccines, the strategy deployed by the authorities…
Yet in our globalized world, questioning the official line is not allowed, because it is subversive. All those who think, doubt or question are suspected of conspiracism. They are the new heretics destined for the stake. They must therefore be muzzled, banned and prevented from spreading their harmful ideas on social networks, lest they contaminate other members.
How are these miscreants identified? It is enough to configure the algorithms, put the right filters in place, bluntly censor their messages and, in doing so, excommunicate them from right-thinking society.

From the Inquisition to the Nazis

Thousands of tweets, posts and videos are deleted from social networks every day, often without any explanation, sometimes with a notice such as: “This content has been deleted because it goes against our professional community policy.” Gobbledygook worthy of the Inquisition. All the more so since the filters set up to remove certain conspiracy content are crude and arbitrarily penalize posts that have nothing subversive about them.
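To see how such blunt filtering produces arbitrary results, consider this minimal sketch. The blocked terms, the rule and the sample posts are hypothetical; they are not taken from any platform’s real moderation system.

```python
# Illustrative sketch of a blunt keyword filter.
# Hypothetical terms and rule, not any platform's actual moderation system.
BLOCKED_TERMS = {"plandemic", "microchip", "5g towers"}

def should_remove(post: str) -> bool:
    """Flag a post containing any blocked term, regardless of context."""
    text = post.lower()
    return any(term in text for term in BLOCKED_TERMS)

# A conspiratorial post is caught...
print(should_remove("Wake up, the plandemic was planned!"))        # True
# ...but so is a post that merely debunks the same rumour.
print(should_remove("Fact check: vaccines contain no microchip"))  # True
```

The filter cannot tell a rumour from its rebuttal: both contain the forbidden word, so both disappear.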
In the same breath, these same algorithms invite us to accept the good word of the WHO, the Haute Autorité de Santé and other official bodies.
“Will algorithms end up governing humanity?” asks RTBF.
We are getting there. “Rohingya refugees, a Muslim ethnic minority who fled persecution in Burma, have filed a lawsuit against Meta Platforms Inc, formerly known as Facebook,” Le Monde reveals in its December 7, 2021 edition. “The class-action complaint claims that Facebook’s algorithms push certain user profiles toward groups even more extremist than those they already belong to.”
Censorship, self-censorship and the control of ideas and opinions take us back to dark pages of history. “The driving force of an ideological movement is not a question of understanding, but of faith,” said Joseph Goebbels, head of Nazi propaganda, in the 1930s. “Christ gave no proof for his Sermon on the Mount. He merely made assertions. It is not necessary to prove what is obvious.”
Goebbels had his own idea of what was “obvious” and of freedom of expression. He said that “public opinion is manufactured” by distributing an “official” version of the news on a daily basis.
We know what happened to the Nazi gospel.
The dictatorship of algorithms, though of an entirely different nature, is no less despicable. It operates on a planetary scale as a great brainwashing machine, all the more effective because the younger generations, in particular, are addicted to social networks.
An addiction with tragic consequences if we are not careful.
But isn’t it already too late?
