The Internet giants before the Supreme Court: the “Section 230” that protects them


Are the Internet giants responsible for the content posted by their users? The question has been around for a quarter of a century, and this week it is formally examined for the first time by the Supreme Court of the United States. By assessing the scope of a law widely regarded as a pillar of the Internet’s rise, the Court could revolutionize the law of the web.

At the center of the debate is “Section 230”, a provision of a 1996 law that protects platforms like Facebook or Google from being sued over content posted on their sites and applications. On Tuesday, February 21 and Wednesday, February 22, the American high court is assessing its scope by examining two cases brought by victims of jihadist attacks, who accuse Google and Twitter of having “helped” the Islamic State (IS) group by spreading its propaganda.

Are the algorithms responsible?

During the first day of hearings, on Tuesday, February 21, the Supreme Court did not make clear whether it thinks the law should be rewritten. “We are in a delicate situation, because this text was written in another era, when the Internet was completely different,” summed up Justice Elena Kagan. The law was designed to ensure the legal immunity of digital companies that, unlike “publishers”, merely “host” content posted by users of their platforms.

But many voices have demanded for years that this law be amended or repealed, arguing that Google, YouTube, Facebook or Twitter should be held responsible when they facilitate the spread of so-called “problematic” content that can have serious repercussions in real life.

The first complaint, against Google, was brought by the relatives of Nohemi Gonzalez, a young American killed in the November 2015 attacks in Paris. They accuse the company of having supported the growth of the Islamic State (IS) group by recommending its videos to certain users. Their complaint has so far been dismissed by the courts in the name of “Section 230”. But in their appeal to the Supreme Court, they argue that Google is not a “publisher” protected by this provision, since the algorithms it created “recommended” the IS videos.

“The web could slowly die”

For the platforms, it is impossible to be held responsible for the endless volume of content they host. “There are 3.5 billion queries on the search engine every day. The answers are different for each person and could all be considered recommendations,” argued Google lawyer Lisa Blatt during Tuesday’s hearing.

“The internet would never have taken off if everyone could sue all the time,” she continued. “The web would slowly die.” The justices noted it as well: changing the case law could “crash the digital economy, with all kinds of consequences for workers and pension funds, etc.”, observed Chief Justice John Roberts.

The idea of lawmakers in the 1990s was indeed to protect the then embryonic sector from cascading lawsuits and allow it to flourish, while encouraging it to remove problematic content. But this provision no longer enjoys consensus: in the United States as elsewhere, many on the left criticize social networks for hiding behind this immunity to let hate speech, racism and conspiracy theories flourish, while the American right, outraged in particular by the banning of Donald Trump from several platforms, accuses them of “censorship” under cover of their right to moderate.

AI complicates matters

The justices expressed their doubts about the validity of “Section 230” during this first hearing, but also their frustration with the complex subject of artificial intelligence, which has taken another leap in recent months with interfaces like ChatGPT. “In a post-algorithmic world, AI can generate content, including by following neutral rules. It can generate poetry, it can generate controversy,” said Justice Neil Gorsuch.

The temple of American law continues its deliberations this Wednesday, February 22, with a closely related case: if “Section 230” did not exist, could the platforms be held liable under anti-terrorism laws, even without direct support for an attack? The Court is due to deliver its rulings in both cases by June 30. Though it remains very cautious for the moment, by agreeing to hear these cases when it dismisses the vast majority of those submitted to it, the high court has hinted that it may be ready to change the case law.
