In Gaza, the Israeli army’s AI at the center of criticism – L’Express


The debate over the use of artificial intelligence has erupted in the theater of war in Gaza. The alarm was raised by the UN Secretary-General, who said on Friday, April 5, that he was "deeply disturbed" by reports of Israel's use of AI to identify targets in Gaza, refusing to accept that "life and death decisions" be delegated to algorithms.

"I am deeply disturbed by reports that the Israeli military's bombing campaign includes artificial intelligence as a tool to identify targets, particularly in densely populated residential areas, leading to a high number of civilian casualties," Antonio Guterres told the press.

READ ALSO: Fabrice Balanche: “Sooner or later, Israel will want to clean house and strike Hezbollah”

“No portion of life or death decisions that impact entire families should be delegated to the cold calculation of algorithms,” he insisted.

An Israeli army program

What is the UN chief referring to? An investigation published by the outlets +972 Magazine and Local Call, picked up by several American publications this week, describes an Israeli army program called Lavender that uses artificial intelligence to identify targets in Gaza, with a certain margin of error.

"They collected information on a few hundred known Hamas fighters and, using their data, asked the machine to identify Palestinians with similar data, who then became potential assassination targets," reports Meron Rapoport, editor-in-chief of the Israeli site Local Call, whose comments were picked up by Courrier International.

READ ALSO: Gabriel Weimann: “Hamas is much more effective than Russia in psychological warfare”

In this way, 37,000 people have been designated as "targets" in Gaza. The investigation also indicates that the accuracy of this targeting was sacrificed in favor of speed. The "human verification" step, meant to ensure that the person targeted is indeed the right one, was reduced to a minimum, "no more than twenty seconds in certain cases," the journalist explains. "Consequently, more and more civilians are affected, and no longer just members of Hamas," says Meron Rapoport, who estimates the army's margin of error at around "10%".

Israel tries to ‘reduce harm to civilians’

For its part, an Israeli army spokesperson told CNN that AI was not used to "identify suspected terrorists", while not disputing the existence of the Lavender system, which he described as "mere tools for analysts in the process of identifying targets". Analysts "must conduct independent reviews, during which they verify that identified targets meet relevant definitions in accordance with international law and additional restrictions stipulated in IDF guidelines". "Israel is trying to reduce the harm caused to civilians to the extent possible under the operational circumstances in effect at the time of the strike," he told the American channel.

READ ALSO: Behind the calls to boycott Israel, a real business

These explanations have done little to convince the head of the UN. "I have been warning for years about the dangers of weaponizing artificial intelligence and reducing the essential role of human intervention," Antonio Guterres stressed. "AI should be used as a force for good, for the benefit of the world, and not contribute to war at an industrial level, blurring accountability," the UN Secretary-General said.

Amid the humanitarian catastrophe in Gaza, the death on Monday, April 1, of seven workers from the NGO World Central Kitchen (WCK) in Israeli strikes, together with these revelations about the use of AI, is unlikely to quell international discontent.
