Facial recognition: a tool already accessible to the general public


This Monday, June 12, the Senate is examining a bill aiming to create a legal framework for the use of biometric technologies, at a time when facial recognition is set to be authorized on an experimental basis to secure the 2024 Olympic Games in Paris. The text aims “to lay down clear red lines in the law in order to remove the risk of a surveillance society”.

It proposes to prohibit “any categorization and scoring” of people “on the basis of their biometric data” and, more generally, “any remote recognition” of people on the basis of this data… But it opens up the possibility, for a period of three years, of using this technology in certain cases, such as investigations into “the most serious crimes” or “the fight against the risk of terrorism”.

The European Parliament is also considering outlawing facial recognition in public spaces in the version of its artificial intelligence bill, the AI Act, drawn up in May. But what will be the real impact of these laws when this technology is already widely available and used by private companies?

Facial recognition just a click away

The well-known firm Clearview AI has been providing advanced facial recognition services to states and companies for several years. Websites accessible to the general public already offer, for a few euros, to identify faces across the web. This is the case, for example, of PimEyes, a Polish application launched in 2017.

“You can take a picture with your device’s camera. Don’t worry, we won’t store it!” the homepage promises. But when you select an image from your computer, a short sentence informs you that your photo’s biometric data will be used to search for it across the web.

In what appears to be a way to guard against potential lawsuits, PimEyes asserts that its services are to be used “solely for personal use”, that is to say to “identify the cases where [one’s own] image has been published online without their consent”. Its stated ambition is to “reduce the risk of fraudulent activities, such as identity theft, catfishing, or other scams”. A user can also submit a request to have their image removed from a site where it appears.

The ease of use of the service, however, is disconcerting. Nothing, of course, prevents searching for images of other people without their consent. A free trial is offered; the basic option is priced at just over 33 euros per month, rising to almost 335 euros for an unlimited number of searches. With two clicks and a single photo fed to the algorithm, the author of this article discovered, for example, several blog posts, but also one of her articles copied and pasted by a far-right site and embellished with conspiracy-minded comments.

Unlike Clearview, PimEyes seems restricted to searching sites and blogs, even if it turns up a few photos from social networks. The system is also still quite imprecise and returns many photos of wrongly identified people. And although, in theory, only the premium, paid version allows an “in-depth search” that also scans pornographic content, several images of a sexual nature appeared via the free search.

900 million faces stored

The development of this type of service, which crawls and sometimes stores huge amounts of data, raises real questions in terms of copyright, personal data security and privacy.

PimEyes also claims that the “faceprints” detected on the web are indexed only for search purposes. This biometric data is, in theory, never linked to a particular person or to personal information, but simply to the URL where it was found. In 2021, its database contained approximately 900 million faces.
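The architecture described above can be pictured as a simple index that maps biometric vectors to URLs, with no names attached. The sketch below is purely illustrative, not PimEyes’ actual system: the class, the toy three-dimensional “faceprints”, and the similarity threshold are all assumptions made for the example.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

class FaceprintIndex:
    """Hypothetical index: each faceprint is linked only to a URL, never to an identity."""

    def __init__(self):
        self._entries = []  # list of (embedding, url) pairs; no personal data stored

    def add(self, embedding, url):
        self._entries.append((embedding, url))

    def search(self, query, threshold=0.9):
        """Return URLs whose stored faceprint closely resembles the query vector."""
        return [url for emb, url in self._entries
                if cosine_similarity(query, emb) >= threshold]

index = FaceprintIndex()
index.add([0.98, 0.10, 0.05], "https://example.com/blog-post")
index.add([0.10, 0.95, 0.20], "https://example.org/news-photo")

# A query vector close to the first faceprint matches only that URL.
print(index.search([0.97, 0.12, 0.06]))  # → ['https://example.com/blog-post']
```

In such a design, deleting a person’s data amounts to removing the (embedding, URL) pairs that match their face, which is consistent with the opt-out request mechanism the service offers.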

The facial recognition giant Clearview AI is said to hold more than 20 billion images scraped from all corners of the web, without the approval of the sites they come from or of the people identified. This sensitive data is also vulnerable to hacking, as has already happened in the past.

In May 2022, the firm was fined £7.5 million (€8.85 million) by the UK data protection authority and ordered to delete the personal data of UK residents. France followed suit a few months later, in October, fining the company 20 million euros, later adding a 5.2 million euro penalty payment for failing to comply.
