This is news that should delight Emmanuel Macron, as France currently holds the Presidency of the Council of the European Union (EU). In the midst of an electoral battle against Marine Le Pen, candidate of the National Rally, he has on many occasions shown his attachment to digital issues. A loyal ally of the President, Thierry Breton, the European Commissioner for the Internal Market, is currently working on a single regulation intended to govern the digital space. What is it really about?
Disinformation, online hate, counterfeits… The European Union regulation, which should be finalized on Friday, April 22, aims to ensure that digital platforms, in particular social networks and service providers, comply with European law. Discussions are continuing between representatives of the Member States and Members of the European Parliament. Several points are at the heart of the negotiations: the obligations placed on digital service providers, the constraints weighing on the platforms, and the question of sanctions in the event of breaches of these rules.
What is the Digital Services Act (DSA)?
The Digital Services Act (DSA) is a piece of digital services legislation that aims to regulate the internet. In short: make it a safer space for European citizens and strengthen the protection of minors by improving moderation on social networks. Drawn up by Margrethe Vestager, Executive Vice-President of the European Commission, and Thierry Breton, the proposal was unveiled by the European Commission in December 2020. It was adopted by Parliament in January.
Now the Member States and Parliament must agree on a single text. France hopes it will be adopted in June 2022, under its presidency. That would send a clear signal about the country's role in building Europe as a digital power, as envisioned by Emmanuel Macron. Once the text enters into force, several obligations will apply.
Obligations on platforms and providers
First, online service providers will have to appoint a legal representative in one of the 27 Member States. They will also have to publish an annual report detailing the actions taken on content moderation within their company. The goal? To make providers and online platforms accountable. Facebook, Twitter, YouTube and all social networks will have to set up a reporting button that is easy to access and easy to use. No more endless forms to report a violent video or cyberbullying.
Platforms are also asked to work upstream to rid social media news feeds of illegal content, and they will have to suspend users who "frequently" post such content. They must also ensure that every user can find out which parameters are used to target them. The use of sensitive data, such as political opinions, for targeting will be prohibited, in order to prevent manipulation and interference in electoral processes.
What about penalties?
The European Union will now have the power to sanction companies that fail to meet these obligations, with fines of up to 6% of annual turnover. A procedure for direct user complaints is also in the process of being adopted. As for handing over user information to the courts when requested, any refusal will henceforth be penalized, with fines of up to 1% of worldwide turnover.
In addition, very large platforms with more than 45 million active users in the European Union, such as Facebook or Twitter, will be subject to audits by independent bodies to verify compliance with these obligations. Each EU Member State will designate a competent authority with the power to investigate and impose sanctions. The 27 authorities will cooperate with one another to enforce the legislation throughout Europe.