What are we talking about, and why are we worried?

Four months before the opening of the Olympic and Paralympic Games, a close-up look at “augmented” video surveillance, which France has authorized on an experimental basis. A technology that intrigues as much as it worries.

An exceptional event calls for an extraordinary security apparatus. To ensure the safety of the Paris 2024 Games and the 15 million visitors expected, the State intends to pull out all the stops. Every day, some 35,000 police officers and gendarmes will be deployed, including members of the elite GIGN, RAID and BRI units, along with 20,000 soldiers and as many private security agents… They will also be backed by surveillance cameras boosted with artificial intelligence. How many? That remains a mystery. Asked recently about the question, Interior Minister Gérald Darmanin replied that it was “too early” to give a number.

Two tests were carried out on March 3 and 5, during concerts by the British group Depeche Mode at the Accor Arena in Paris. The aim was not yet to test algorithmic video surveillance strictly speaking, but only to “test and configure software solutions” in real time, the Interior Ministry (Place Beauvau) had specified. “All lights are green,” rejoiced Paris police prefect Laurent Nuñez after the first test. That does little to reassure civil liberties associations, which consider that by authorizing experimentation with this technology, the French State is opening the Pandora’s box of mass surveillance.

■ What are we talking about?

Algorithmic video surveillance (VSA, from the French acronym) consists of coupling surveillance cameras with software whose algorithms are supposed to detect predefined events automatically and in real time within a continuous stream of images, in order to ease the work of agents in control rooms. In Nice, VSA is used to detect Highway Code violations and gatherings, and to count pedestrians, e-scooters and motor scooters along a given road… In Aulnay-sous-Bois, in the Paris region, it is used to identify abandoned objects, illegal dumping, fire outbreaks or crowd movements.
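To make the principle concrete, here is a minimal, purely illustrative sketch of such a detection loop in Python. It assumes the OpenCV library for reading the video stream; the `detect_events` function, the event names and the stream handling are hypothetical placeholders and do not describe any vendor’s actual software.

```python
# Illustrative sketch of the general principle behind "augmented" video surveillance:
# a detector runs on each frame of a continuous stream and raises an alert only
# when one of a fixed list of predefined events is recognized.
import cv2  # OpenCV, assumed to be available

PREDEFINED_EVENTS = {"abandoned_object", "crowd_movement", "fire_outbreak"}  # illustrative names

def detect_events(frame):
    """Placeholder for a trained computer-vision model returning the events it recognizes."""
    return set()  # a real system would analyze `frame` here

def monitor(stream_url):
    capture = cv2.VideoCapture(stream_url)
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        # Only the predefined event categories are surfaced to a human operator;
        # everything else appearing in the frame is ignored by design.
        for event in detect_events(frame) & PREDEFINED_EVENTS:
            print(f"ALERT: {event} detected; flagging the sequence for review")
    capture.release()
```

The point of the sketch is that the software does not “understand” the scene; it only flags a short, preconfigured list of situations for a human operator to verify.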

“Augmented” video surveillance is already being trialled at the municipal level in several dozen French municipalities. At the national level, however, this is a first, at least officially. Last November, the investigative outlet Disclose revealed that the national police had been using video surveillance image analysis software developed by the Israeli company Briefcam, in complete secrecy, since 2015.

■ In what legislative framework is it used?

This time, the use of algorithmic video surveillance is provided for “as an experiment” by Article 10 of the law on the Olympic and Paralympic Games promulgated in May 2023. The text states that VSA may be used in the context of sporting, recreational or cultural events until March 31, 2025, well after the period of the Games. It stresses that the algorithmic processing uses “no biometric identification system, does not process any biometric data and does not implement any facial recognition techniques”. “A red line,” Gérald Darmanin affirmed. Each use must also be authorized by a prefectural decree, after an opinion from the National Commission for Information Technology and Liberties (Cnil).

A decree issued during the summer specifies the eight abnormal events that the software must look for in the images captured by the cameras: the presence or use of a weapon, the outbreak of fire, a person on the ground, an abandoned package, the crossing of a prohibited zone, a crowd movement, excessive crowd density and failure to respect the direction of traffic.
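As an illustration only, the closed list defined by the decree could be encoded as a simple enumeration, one entry per authorized event category; the identifiers below are hypothetical English renderings of the decree’s French wording, not part of any official specification.

```python
from enum import Enum

class DecreeEvent(Enum):
    # Hypothetical identifiers for the eight categories listed in the decree.
    WEAPON_PRESENCE_OR_USE = "presence or use of a weapon"
    FIRE_OUTBREAK = "outbreak of fire"
    PERSON_ON_GROUND = "person on the ground"
    ABANDONED_PACKAGE = "abandoned package"
    PROHIBITED_ZONE_CROSSING = "crossing of a prohibited zone"
    CROWD_MOVEMENT = "crowd movement"
    EXCESSIVE_DENSITY = "excessive crowd density"
    WRONG_WAY_TRAFFIC = "failure to respect the direction of traffic"

# Anything a model might detect outside this closed list must not be flagged.
```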

■ Where will it be deployed?

Algorithmic video surveillance can be carried out via cameras installed inside or around the venues hosting the events, in public transport, or via cameras on board the drones that will fly over the various sites. It should “probably be deployed in a limited manner” this summer, the interministerial delegate for the Games, Michel Cadot, declared in January. According to him, these so-called “smart” cameras should above all be used “in high-density areas, for example in the center of Paris, around the venues, and ahead of security checks”.

The VSA contract was divided between four French companies, Wintics, Videtics, ChapsVision and Orange Business, each with its own geographical area. Wintics will deploy its tools in Ile-de-France and in public transport, Videtics in three regions in the South and overseas, and ChapsVision in the rest of France. Orange Business will take over the monitoring of transport should Wintics fail to deliver.

■ Why is its use criticized?

The criticism focuses primarily on the chilling effect this type of technology can have on freedoms. “When we know we are being watched, we tend to modify our behavior, to censor ourselves, perhaps not to exercise certain rights,” observes Katia Roux, technology and human rights specialist at the French branch of the NGO Amnesty International. “Any surveillance of public space is an interference with the right to privacy. Under international law, it must be necessary and proportionate to a legitimate objective,” she recalls. “It is up to the authorities to demonstrate that there is no less freedom-restricting way to guarantee security. That demonstration has not been made.”

Another criticism concerns the very workings of the artificial intelligence on which algorithmic video surveillance is based: a seemingly neutral technology, but one developed on data that may carry discriminatory biases it could end up amplifying. “In other countries that have developed this type of surveillance of public space, we see a use that disproportionately targets certain groups of the population that are already marginalized,” reports Katia Roux.

Above all, civil liberties organizations fear that the experiment with algorithmic video surveillance will pave the way for more intrusive uses. “It’s a foot in the door that heralds more problematic applications, such as facial recognition in the short term,” warns Félix Tréguer, associate researcher at the CNRS and member of the association La Quadrature du Net.

■ What future after the Olympics?

While the experiment with algorithmic video surveillance is due to end on March 31, 2025, the government has indicated that the technology could be made permanent. “If it proves itself, and with guarantees, it could be used for major events,” the Minister of Sports, Amélie Oudéa-Castéra, announced in September. The Senate, for its part, has already prepared the next step. Last June, the upper house adopted a bill on biometric recognition in public spaces, which opens the way to a three-year experiment with facial recognition for the purposes of judicial investigations and the fight against terrorism. The text has yet to be examined by the National Assembly.

Is France preparing to follow the example of other countries that have hosted major sporting events by strengthening its security apparatus after the Olympics? In 2012, the London Olympic Games led to a massive deployment of surveillance cameras in the streets of the capital. Six years later, the Football World Cup in Russia was an opportunity to experiment with facial recognition, which remains in place today. In 2020, the Tokyo Games were preceded by a widely criticized tightening of security legislation.

On the eve of the Paris Games, Amnesty International is therefore calling for more than promises. The NGO is demanding a law banning facial recognition for identification purposes in public spaces. “Since it was presented as a red line in the debates, it cannot be a dotted line; it must be robust,” insists Katia Roux. Failing that, she warns, the risk is a slide toward generalized surveillance.
