The control of online content has become even more urgent during the war in Ukraine. EU countries, Parliament and the Commission are close to agreeing on a new law on digital services. They are negotiating tonight.
April 22, 15:50 • Updated April 22, 16:16
If Facebook, YouTube, Twitter or Google, for example, fail to remove illegal content from their platforms, they may face fines of up to 6% of their turnover, payable to the EU Commission. These are large sums, as the turnover of the digital giants can run into the tens of billions of euros.
This is at the heart of the EU's new Digital Services Act, which representatives of the Council of Member States, the Parliament and the Commission are currently finalising in so-called trilogue negotiations. According to preliminary information, an agreement is close.
The law's purpose is to curb the spread of false information online. Controlling online content has become even more urgent during the war in Ukraine, as Russia pushes war propaganda and disinformation onto the internet.
EU regulators want to use every available means to combat propaganda. Just a few days after the Russian invasion, the EU not only banned the Russian state outlets Russia Today and Sputnik from operating in its territory, but also barred accounts from sharing their content.
Above all, the Digital Services Act increases the responsibility of digital giants in controlling illegal online content.
An emergency mechanism would be set up in the EU obliging digital companies to disclose disinformation on their platforms to the authorities. They would also have to show what steps they have taken to eliminate propaganda and misinformation.
The Digital Services Act is the world's most far-reaching attempt to date to curb the power of the digital giants.
Former US presidential candidate Hillary Clinton, for example, called on the EU to reach an agreement during the negotiations.
“For too long, digital platforms have amplified the spread of misinformation and extremism with no accountability,” Clinton wrote on Twitter.
Obligation to inform consumers
It is not yet clear how the controls would be implemented technically. Most likely, the digital giants would have to develop a system through which users could notify them of illegal content.
Users would therefore be asked for more feedback in future, for example through simple report-or-dismiss buttons. The service provider would have to remove illegal material immediately. Users would, of course, have the right to appeal against the removal of content, and it could be reinstated if the post was removed on incorrect grounds.
The law would apply to large digital companies
The EU wants to make sure that the digital giants actually take action in emergencies such as terrorist attacks or war.
“Very large” online platforms would have specific obligations to remove illegal material, as they bear greater responsibility than small service providers for the distribution of illegal and harmful content.
According to the model initially presented, such a “gatekeeper” is defined as a service provider with an annual turnover of more than EUR 8 billion, more than 45 million monthly users, and operations in at least three EU countries.
Opening up the algorithms
Under the new law, providers of social media, e-commerce and streaming services would have to share information about their algorithms with users. Users would have the right to be informed about the logic behind the various recommendation algorithms on these sites.
In addition, service providers would have to offer consumers at least one recommendation system that does not profile the user. At present, websites know their users and recommend content based on their profiles.
The key people in the evening's talks are the European Commission's Executive Vice-President for Digital Affairs Margrethe Vestager, Commissioner for the Internal Market Thierry Breton and, for the French Presidency, Secretary of State for Digital Affairs Cédric O.
Finland's Henna Virkkunen (National Coalition Party) is the lead negotiator for ITRE, Parliament's Committee on Industry, Research and Energy. She anticipates the negotiations will drag on late into the evening.
The biggest remaining disagreements going into Friday's talks concerned the rules on how companies must protect internet users from manipulation, and exactly how the crisis mechanism would work during a terrorist attack, a pandemic or a war.