“Nature abhors a vacuum,” said Aristotle. The same could be said of the contemporary cognitive marketplace, where competing representations of the world confront one another on a scale without equal in human history. Since the advent of the digital world, the ways we access these representations, and information in general, have been profoundly disrupted. To find answers, we now turn more often to search engines than to conventional journalistic sources.
In this context, a given reading of the world is far more likely to reach us if it appears on the first page of Google results than if it languishes on the tenth. Quite often, this is not a problem: if you ask the engine about the location of a restaurant or the geography of a country, you will find reliable, detailed information. However, it is possible to exploit what Michael Golebiewski and Danah Boyd, experts in digital communication, have called data voids, that is, search terms for which the available data is limited or non-existent.
Two types of situation conducive to fake news
When the volume of information on a subject is large, it is difficult to manipulate. An algorithm governs its ranking, drawing on a complex set of variables, not the least of which is the popularity of the content. This creates a form of epistemic virtue that ensures a certain reliability of the sites that are put forward. This is not the case with data voids, which allow pernicious forms of manipulation. They can create two types of situation, distinct but equally conducive to the dissemination of false information or the spread of dubious interpretations of a current event.
The first occurs when nothing is yet known about an event – for example, when gunshots are heard in a school before their origin is established, which is not an extremely rare occurrence in the United States. The goal of some disinformation actors is then to offer a narrative before any official interpretation is given by the conventional media. Under these conditions, such stories are likely to become anchored in people’s minds. On this subject, two Belgian researchers, Jonas De Keersmaecker and Arne Roets, have shown that first impressions persist where fake news is concerned… even when the individual exposed to it later learns that it is false.
In the second, the editorial filtering that search engines apply to certain themes is circumvented by tagging those themes with rare, little-known or obsolete words, so as to lead the curious into “epistemic bubbles”. The association of certain search terms with events creates data gaps, which are quickly filled by linking those themes to specific content. This is shown by the work of Francesca Tripodi, professor at the UNC School of Information: where “KKK” or “Nazi” are terms blocked or treated with suspicion by the search engine, slogans like “you will not replace us” act as secret passages to supremacist groups. Propagandists are thus now using strategic terms that give access to spaces where the quality of information escapes any control.
A query on certain specific themes developed in conspiratorial circles, for example, will automatically lead to content that may be problematic. A recent illustration is the term “adrenochrome”, a little-known word that designates a compound produced by the oxidation of adrenaline. The conspiratorial imagination, and in particular that of QAnon, has turned it into a drug supposedly made from the blood of children and credited with rejuvenating powers! Before the reckless Cyril Hanouna gave it unexpected visibility during the February 10 edition of his show Touche pas à mon poste, the word was a perfect hook for drawing curious minds into paranoid nets.
The idea that we live in the same society but no longer quite in the same world finds a new illustration here. Depending on how we use the tools that help us find our way through the information maze, we do not end up facing the same landscapes, and propagandists know how to exploit this new cognitive geography.