Last Wednesday, the European Commission published a bill to strengthen the fight against child pornography and the sexual abuse of minors. The new text would force service providers – and in particular social networks and messaging services – to integrate new mechanisms for detecting illegal content.
In principle, this detection would be targeted and carried out only at the request of a court or an independent national authority. But the scope of illegal content that providers could be required to detect is very broad. It covers not only known, already-referenced child pornography, but also content that has not yet been identified as such. That presupposes a general scan of shared audiovisual content using artificial intelligence algorithms.
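By way of illustration, here is a minimal Python sketch of what such a two-tier detection pipeline could look like: known content is matched by comparing perceptual hashes against a reference database, while not-yet-referenced content would need a trained classifier. Everything here (the average-hash variant, the `KNOWN_HASHES` database, the distance threshold, the `classify_unknown_content` stub) is a hypothetical assumption for the sake of the example; the bill itself does not prescribe any particular technique.

```python
# Hypothetical sketch of the two kinds of detection the draft describes:
# (1) matching media against a database of hashes of *known* illegal material,
# (2) flagging *unknown* material with a machine-learning classifier.

from PIL import Image  # pip install Pillow


def average_hash(path: str, size: int = 8) -> int:
    """Compute a simple 64-bit perceptual ("average") hash of an image."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical reference database of hashes of already-referenced content.
KNOWN_HASHES: set[int] = set()


def is_known_content(path: str, max_distance: int = 5) -> bool:
    """Tier 1: match against known, referenced material (threshold is illustrative)."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= max_distance for known in KNOWN_HASHES)


def classify_unknown_content(path: str) -> float:
    """Tier 2: placeholder for the AI model the proposal implies (risk score in [0, 1])."""
    raise NotImplementedError("a trained image classifier would be plugged in here")
```

The key difference is that the first tier only recognizes material that has already been catalogued, whereas the second tier has to make a statistical judgment about new content, which is where false positives become hard to avoid.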
But that’s not all. The Commission also wants to oblige providers to detect “grooming”, that is, an adult contacting a minor for sexual purposes, in order to prevent a crime before it is committed. The Commission acknowledges that such detection is particularly intrusive, since it presupposes “automatic text scanning of interpersonal communications”. It nevertheless minimizes the impact by stressing that the underlying processing is not intended to “understand the content of communications” but rather to “search for known and pre-identified patterns”. Small consolation for human rights and privacy activists.
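In concrete terms, the distinction the Commission draws could look something like the hypothetical sketch below: each outgoing message is checked against a fixed list of pre-identified patterns rather than being “understood”. The pattern list is a deliberate placeholder, not anything defined in the bill.

```python
import re

# Hypothetical illustration of "searching for known, pre-identified patterns"
# in interpersonal messages. The patterns below are placeholders; a real system
# would rely on vetted, regularly updated patterns or a statistical model.
PREDEFINED_PATTERNS = [
    re.compile(r"placeholder pattern 1", re.IGNORECASE),
    re.compile(r"placeholder pattern 2", re.IGNORECASE),
]


def scan_message(text: str) -> bool:
    """Return True if an outgoing message matches any pre-identified pattern."""
    return any(p.search(text) for p in PREDEFINED_PATTERNS)
```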
The bill does not impose any particular technological solution. However, it stresses that these obligations must apply everywhere, including to end-to-end encrypted communications. And in that case, as we know, there are only two options: create a cryptographic backdoor, or scan the content locally on the device before encryption. The latter is the more likely candidate, because it has the advantage of preserving the integrity of the encryption itself (even if its value then becomes largely illusory). And that is precisely what the Chaos Computer Club hackers fear; they had already issued an alert on the subject earlier this week.
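To see why critics describe this as a bypass rather than a break of encryption, here is a minimal, hypothetical Python sketch of client-side scanning: the message is inspected in the clear on the sender’s device, and only afterwards end-to-end encrypted. The `scan_content` check, the `report_to_authority` hook and the simplified key handling are assumptions made for the illustration, not anything specified in the bill or used by any particular messenger.

```python
# Hypothetical sketch of client-side scanning: content is analysed on the
# sender's device *before* it is end-to-end encrypted, so the encryption
# stays mathematically intact while its confidentiality guarantee is bypassed.
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography


def scan_content(plaintext: bytes) -> bool:
    """Placeholder for the local detector (hash matching, classifier, ...)."""
    return False  # pretend nothing was flagged


def report_to_authority(plaintext: bytes) -> None:
    """Hypothetical hook: in the proposal, flagged content would be reported."""
    ...


def send_message(plaintext: bytes, key: bytes) -> bytes:
    # 1. Scanning happens here, on the device, while the message is still readable.
    if scan_content(plaintext):
        report_to_authority(plaintext)

    # 2. Only then is the message encrypted and handed to the network.
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext


if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=256)  # simplified: real messengers derive per-session keys
    send_message(b"hello", key)
```

The encryption step is untouched, which is why proponents can say end-to-end encryption is “preserved”; the objection is that the confidential content has already been read before that step.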
“The most terrifying thing I’ve ever seen”
Among security researchers and human rights activists, the reactions are particularly strong. “In case you missed it, today the European Commission declared war on end-to-end encryption and demands access to anyone’s private messages in the name of child protection”, tweeted Alec Muffett, a cryptographer who led the implementation of end-to-end encryption in Facebook Messenger. For Matthew Green, another cryptographer, the European text is “the most terrifying thing I have ever seen”. According to him, the text describes nothing less than “the most sophisticated surveillance system ever deployed, outside of China and the USSR. Without exaggeration”. In his view, the use of such a technology will inevitably lead to abuse.
Many civil liberties associations have also come out against the text.
“This would be a massive new surveillance system, as it would require an infrastructure for detailed analysis of user messages. This new proposal is too broad, disproportionate and infringes on everyone’s privacy and security”, says the American association Electronic Frontier Foundation.
“Such detection measures will inevitably lead to dangerous and unreliable client-side scanning practices, undermining the essence of end-to-end encryption. (…) Such interference with the encryption process will also make everyone’s phones vulnerable to attack by malicious actors”, writes the European association EDRi.
“No technology can reliably detect this content without countless false positives. Once the integrity of messaging communications is compromised — whether through backdoors in end-to-end encryption or filtering of messages on users’ devices — no one will be able to rely on the privacy of their conversations. There are no backdoors that can only be used for the noble purpose of protecting children”, explains the German association GFF.
The text must now be debated in the European Parliament and the Council of the European Union, which will each formulate their own positions. The three institutions will then have to reach an agreement on a final text.