The new government intends to generalize the algorithmic video surveillance deployed experimentally during the 2024 Olympic Games, a prospect that greatly worries the CNIL and civil-liberties associations, which fear for citizens’ privacy.
Video surveillance has become part of our daily lives, and that is not about to change. While video surveillance systems initially relied on analog devices (cameras, video recorders, etc.) and human operators, they increasingly rely on automated digital technologies that are more efficient but also more worrying, at a time when artificial intelligence (AI) is becoming ubiquitous. For the 2024 Olympic Games, the previous government authorized the use of AI for augmented video surveillance, also called algorithmic video surveillance. The decision was long debated and particularly worried the National Commission for Information Technology and Liberties (CNIL), which saw it as a serious risk to citizens’ privacy.
The experiment was initially scheduled to run until the end of March 2025, but previous governments had been careful to leave the door open for this temporary surveillance system to be extended and even made permanent (see our article). Michel Barnier, the new Prime Minister, rushed through that door in his general policy declaration on October 1, announcing the “generalization of the method experimented with during the Olympic Games”. And this despite the absence of the evaluation report provided for by the law on the 2024 Olympics…
Algorithmic video surveillance: an experiment set to last
As a reminder, algorithmic video surveillance (VSA) uses artificial-intelligence algorithms to analyze video surveillance footage continuously and in real time, in order to alert the police to suspicious situations. These may include abandoned objects or weapons, crowd surges, gatherings of people, a vehicle driving against the direction of traffic, the presence of a person or vehicle in a prohibited or sensitive area, a person on the ground following a fall, or the outbreak of a fire.
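To make the scope of the system concrete, here is a purely conceptual sketch (not any actual VSA product, and all names are illustrative) of the kind of rule layer the law describes: alerts are raised only for the authorized event categories listed above, operating on anonymous detection labels rather than on identities.

```python
# Conceptual sketch of the legal filter described above. The category
# names and the dict format are illustrative assumptions, not a real API.

# The event categories permitted by the 2024 Olympics law, as listed above.
AUTHORIZED_EVENTS = {
    "abandoned_object",
    "weapon_presence",
    "crowd_surge",
    "large_gathering",
    "wrong_way_vehicle",
    "restricted_area_intrusion",
    "person_on_ground",
    "fire_outbreak",
}

def filter_alerts(detections):
    """Keep only detections matching an authorized category.

    `detections` is a list of dicts such as
    {"camera": "cam-12", "event": "fire_outbreak", "confidence": 0.91}.
    Any other label (e.g. a hypothetical "face_match") is discarded,
    mirroring the law's ban on biometric identification.
    """
    return [d for d in detections if d["event"] in AUTHORIZED_EVENTS]

alerts = filter_alerts([
    {"camera": "cam-3", "event": "abandoned_object", "confidence": 0.87},
    {"camera": "cam-7", "event": "face_match", "confidence": 0.99},  # blocked
])
print([a["event"] for a in alerts])  # → ['abandoned_object']
```

The point of the sketch is that the legal restriction is a whitelist of event types, not a technical limit on what the cameras capture, which is precisely what the critics quoted below object to.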
The law on the Olympic Games nevertheless prohibits, within the framework of the system, any biometric identification system, any use of biometric data, and any facial recognition, which consists of precisely identifying a filmed individual. This is one of the red lines drawn by the CNIL.
Algorithmic video surveillance was therefore used during the Olympic Games, but also at the Roland-Garros tournament, at concerts such as those of Depeche Mode, the Black Eyed Peas and Taylor Swift, and at the Cannes Film Festival. According to the authorities, the results of this experiment are positive, even if the system needs improvement on certain points. A committee responsible for evaluating the experiment is due to deliver its conclusions in a report by the end of the year.
Until now, the Ministry of the Interior had assured that the experiment would not extend beyond the period set by the law. Evidently, the Prime Minister is determined to press ahead without waiting for that report. The government is thereby responding favorably to a request from the Paris police prefect, Laurent Nuñez. We will have to wait for the decision to be made official to learn the terms of its application.
Video surveillance with AI: the risks of political abuse
Even at the time, algorithmic video surveillance deeply worried the CNIL, the political left and many associations. Defenders of rights and freedoms feared that the Olympic Games were merely a pretext for keeping the algorithmic video surveillance system in place beyond the end of the experiment. That now appears to be the case.
In a press release published in January 2023, the CNIL asked for guarantees to prevent the system from spiraling out of control and France ending up resembling the Chinese model, whose biometric identification makes it possible to identify individuals directly in the street, which is very practical for targeting minorities, the marginalized, whistleblowers and any other opponents of the regime. The question also arises of who defines the norm; depending on the answer, previously harmless behavior can end up criminalized.
Facial recognition, which matches a human face to a digital image using scans and video surveillance cameras, and which has already been adopted by eleven European Union countries, particularly in judicial contexts, is especially feared. Many organizations worry that algorithmic video surveillance could open Pandora’s box and lead to the use of facial recognition. France’s data protection watchdog therefore recommended “the absence of biometric data processing” and of “cross-referencing with other files”.
“By legalizing some uses of VSA, the State wishes to legitimize a state of affairs and initiate a much broader project of surveillance of public space,” warns La Quadrature du Net. “Behind this unprecedented legalization, which concerns a small number of use cases (fire outbreaks, individuals walking in the wrong direction, etc.), lie other applications that could in turn be legalized. Many political leaders publicly state that they want to authorize biometric monitoring and categorization of the population, including through facial recognition or even emotion recognition.”
The same message comes from Human Rights Watch, whose director of the Technology and Human Rights division, Frederike Kaltheuner, declared in March 2023: “The surveillance provision in the bill would pose a serious threat to civil liberties and democratic principles. It would increase the risk of racial discrimination in law enforcement and would be another step toward normalizing exceptional surveillance measures under the pretext of ensuring the security of major events.”
Furthermore, she considers that this surveillance system will inevitably involve facial recognition, despite the government’s promises to the contrary. “If the use of cameras equipped with algorithms is intended to detect specific suspicious events in public spaces, these cameras will necessarily capture and analyze the physiological traits and behaviors of people present in these spaces. This may involve the posture of their bodies, their gait, their movements, their gestures or their appearance. The act of isolating people from their environment, which is essential to fulfilling the system’s objective, constitutes a unique identification,” she warns. Recognition, after all, means providing a description detailed enough for agents on the ground to spot a person. And remember: once the foot is in the door, it is impossible to close it.