False information manufactured abroad targets, on a daily basis, European citizens who are preparing to elect their parliamentarians. And with generative AI, these campaigns are becoming more realistic and more viral. Which ones are most likely to influence citizens? And how can we protect ourselves? These are the questions studied by Laurent Cordonier, a sociologist and research director at the Descartes Foundation.
L’Express: Seeing fake news does not necessarily change our minds. But being immersed in it on a daily basis is surely not without consequence. What impact does disinformation have on the population?
Laurent Cordonier: Indeed, being exposed to false information does not mean you will automatically believe it. Research shows that the more often we encounter a piece of information, the more we tend to judge it to be true. This is linked to a cognitive mechanism: information is processed more fluidly the second time we receive it. We also know that there are resistance mechanisms. Studies show that the average citizen in a democracy tends to distinguish the informational wheat from the chaff quite well. We more easily accept information that is consistent with our stock of knowledge and beliefs. As a result, fake news is unlikely to have an effect on well-educated, well-informed populations. However, when it comes to topics about which we have little prior knowledge, we are more likely to believe false information. Few people have, for example, a good knowledge of the geopolitics of Eastern Europe or of military armament. Distinguishing fact from fiction is therefore harder when confronted with fake news circulating about the conflict, with its photos of weapons or fighter planes. The great unknown is the penetration of narratives pushed by foreign actors. We have launched a study on this subject, because the impact of these narratives on the French population has not yet been assessed.
What is the difference between fake news and a narrative pushed by foreign actors?
Fake news is a one-off piece of false information. A narrative is a way of telling a story. When Vladimir Putin says that he invaded Ukraine to protect Russia from NATO expansion on its borders, that is a narrative, a particular reading of the conflict. We can also adhere to a narrative without adhering to all the fake news created to support it. Motivated belief is a well-known phenomenon: we more easily believe what suits our point of view.
Format, tone… do certain types of fake news work better than others?
You might think that the most sophisticated fake news has the most impact. That is not always the case. During Covid, some very popular pieces of misinformation were extremely simple. For example, the viral photo of an empty hospital corridor, accompanied by a caption that said something like “my cousin who is a nurse sent me this photo and tells me that we are being lied to, that the hospitals are empty.” That is the zero degree of sophistication. Yet it went viral. Low-tech fake news can be very successful. False information is often anxiety-provoking, and our cognitive system is especially sensitive to danger signals. This is a consequence of the evolutionary history of our species, which favored the individuals most attentive to potential dangers. Fake news arouses fear, but also indignation. And on social networks, when Internet users react and express their emotions, with angry emojis for example, this makes the content go viral.
Are disinformation campaigns becoming more sophisticated?
Fake news dating from the Middle Ages, those old anti-Semitic images of the hook-nosed Jew poisoning the well, unfortunately still works. But the circulation techniques are becoming more sophisticated. In troll factories, people create fake accounts to like and reshare false information en masse, in a very coordinated manner. They play on the algorithmic workings of the platforms to gain visibility. AI makes it possible to build bots that mimic the behavior of a human on social media, making them harder to detect. These bots interact with certain content, share it and produce their own, building up a virtual personality over time. They are then mobilized when the time comes to share false information massively.
What factors generally encourage humans to share this false information?
Some relay it out of ideology. Others, to gain notoriety. The most diligent spread disinformation even better than a Kremlin employee would; it is their hobby. A study by David Chavalarias (Editor’s note: research director at the CNRS) showed that the people who shared the most climate-sceptical content were also those who had spread a great deal of disinformation about Covid and, today, Kremlin propaganda. These individuals do not receive a single kopeck. They do it because they think they are defending the truth; they see themselves as whistleblowers, as members of a resistance. Studies have shown that those who share false information sometimes have high levels of altruism. So in reality they do it with the best intentions in the world, thinking they are warning their loved ones, or the whole world, about serious and important matters.
For some experts, the risk of disinformation is not so much that citizens believe fake news, but that they end up doubting everything, even the truth. What ultimately benefits purveyors of disinformation is something called the liar’s dividend. Is this indeed a phenomenon to be feared?
Yes, it is the post-truth regime that Donald Trump embodied, the idea that the relationship to reality does not matter. Post-truth is not just lying; it is being indifferent to the truth. This is the scenario that scares me the most. The risk is not that people will massively start believing crazy things, but that they will slide into a relativism that is dangerous for democracy. If we start to consider that everything is of equal value, that the words of an IPCC climatologist are put on the same footing as Bolsonaro’s on the climate, then we will have a real problem. A report published by Michel Dubois shows that the French public’s level of trust in science remains high. In France, we are therefore not in post-truth. But we must remain vigilant, because the danger is there. A point of view is not the same as a fact. We must not give in to those who would have us believe that it is, nor stop teaching, for example, the Shoah or the theory of evolution on the grounds that it might offend certain points of view.
What do we know about the impact of disinformation in countries, such as Russia or China, where it is omnipresent?
The problem in these countries is often less misinformation as such than the lack of quality information. When people have reliable, independent sources at their disposal, they tend to turn to them. In 2021, the Descartes Foundation followed the digital habits of 2,300 French people for thirty days. We noticed that they overwhelmingly consulted the websites of the major national and regional media. We are lucky in France to have a quality information environment. And during Covid, it was reliable sources that benefited most from the explosion in information consumption, as shown by a study covering France, the United States, Germany and England. But in countries where journalists are jailed or killed and where the media relay propaganda, you do not trust your media, and rightly so.
A study carried out in 26 countries, which we have just published, shows that the factor that best explains differences in attitudes towards conspiracy theories is the level of corruption in the public sector. The more corrupt a country’s public sector, the more its population believes in conspiracy theories. Our interpretation is that when you live in a country where you see civil servants being bribed, where you yourself have to slip someone a banknote to get access to a doctor or a police officer, your level of social trust is necessarily at its lowest. So when you are faced with content claiming that governments are lying about the Holocaust or the Moon landings, your distrust inclines you to think it is true. France has major conspiracist figures whose influence sometimes extends worldwide. Fake news about Brigitte Macron’s gender has, for instance, been exported to the United States. But the French population as a whole resists conspiracy theories quite well.
What can the European Union do to better protect itself from disinformation spread by malicious actors?
The Digital Services Act is a step in the right direction. It requires social networks to be accountable for their strategy to combat disinformation. The idea is not that they should censor more, but rather that they should break patterns of artificial virality. We must move forward on these issues with great caution, so as not to take measures that are too restrictive. When we fight against interference, we are trying to defend democracy. If, in defending it, we damage it by encroaching too much on freedom of expression, we have missed the objective. And this can generate distrust within the population. The important thing, in my opinion, is to take measures adapted to the state of the threat. And the threat is not the intensity with which fake news circulates, in other words how much we are attacked, but how much our population is actually affected by it.
What role can the educational sphere play in the fight against disinformation?
An essential role. It is our stock of knowledge that helps us detect false information. The PISA results are not encouraging in this regard. When you lose twenty points in mathematics in a year, that is a problem, because some fake news relies on misleading graphs and distorted scales. The second important point is training in critical thinking.
What is critical thinking? At what point can it become excessive and tip over into conspiracism?
I defend a precise conception of critical thinking. First, we must realize that every human being is in a state of epistemic dependence. Everything we know, we know through others. If I know that a country has been invaded, it is because a journalist told me so. Through my own senses and experience, I know nothing of the world. The question of knowledge is therefore a question of trust. The first pillar of critical thinking is to grant epistemic trust on rational grounds. Why do I decide to believe, or not to believe, this Facebook post, this media outlet, this web source? What are my criteria? The second pillar of critical thinking is being aware of how our own cognitive system works. Often, when we are wrong, it is because we have deceived ourselves. Exercises can help us identify our own biases, confirmation bias for example. Statistically speaking, biases are not stupid. They can be useful to us. The bias that pushes us to believe what our friends believe has a good evolutionary reason: if you see your loved ones running in one direction, it is better to run with them and think about it afterwards! But in the informational world, this can play tricks on us.