The French Navy wants to win the acoustic war

Artificial intelligence is being used in every segment of modern combat, notably to read the battlefield better, and now it is also being used underwater. The French Navy calls them the "golden ears": the analysts capable of identifying sounds captured beneath the surface of the sea. AI is set to revolutionize their profession. The objective: move fast to win the acoustic war. Rebroadcast from May 19, 2024.

Tac tac tac tac tac tac… This regular beat is the sound of an oil tanker heard underwater, a characteristic signature. A submarine's golden ear could tell that this ship's propeller has five blades and that its shaft line turns at 120 revolutions per minute. Crucial information for the Navy, and for submarines in particular, stresses frigate captain Vincent Magnan, commander of the acoustic interpretation and reconnaissance center, CIRA, in Toulon.

« There is a lot going on under the dioptre, as we say in our field. To give you a very concrete example: a merchant ship is heard by the sonar of a submarine or a frigate through what is called its radiated noise, which is made up of several types of sounds. One characteristic sound is the number of shaft revolutions per minute, that is, the rotation speed of the shaft line that propels the ship, which is also associated with a number of propeller blades. When we master this information, we know the speed of the vessel we are looking for. And depending on that speed, we can devise a maneuver. So the real idea is that passive acoustic warfare allows us, discreetly and without raising the level of crisis, to capture technical information from which decisive tactical conclusions can be drawn for operations. »
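To make the shaft-rate and blade-count relationship concrete, here is a minimal Python sketch, not the Navy's or CIRA's actual processing, that synthesizes broadband noise amplitude-modulated at the blade rate of the tanker described above (five blades, shaft at 120 rpm) and recovers that rate from the envelope spectrum, a standard way such modulation is read out of radiated noise. The sample rate, signal length, and modulation depth are arbitrary choices for the example.

```python
import numpy as np
from scipy.signal import hilbert

# Values taken from the article's example: a 5-blade propeller,
# shaft line turning at 120 revolutions per minute.
FS = 8_000          # sample rate in Hz (arbitrary for this sketch)
SHAFT_RPM = 120
N_BLADES = 5

shaft_hz = SHAFT_RPM / 60.0      # 2 Hz shaft rate
blade_hz = shaft_hz * N_BLADES   # 10 Hz blade rate (cavitation modulation)

# Synthesize broadband cavitation-like noise, amplitude-modulated at the blade rate.
rng = np.random.default_rng(0)
t = np.arange(0, 30.0, 1.0 / FS)                 # 30 s of signal
noise = rng.standard_normal(t.size)
signal = (1.0 + 0.5 * np.cos(2 * np.pi * blade_hz * t)) * noise

# Envelope analysis: the spectrum of the noise envelope shows a line
# at the blade rate, from which shaft rpm follows once the blade count is known.
envelope = np.abs(hilbert(signal))
envelope -= envelope.mean()
spectrum = np.abs(np.fft.rfft(envelope))
freqs = np.fft.rfftfreq(envelope.size, d=1.0 / FS)
low = freqs < 20.0                               # inspect only the 0-20 Hz band
peak_hz = freqs[low][np.argmax(spectrum[low])]

print(f"shaft rate : {shaft_hz:.1f} Hz ({SHAFT_RPM} rpm)")
print(f"blade rate : {blade_hz:.1f} Hz expected")
print(f"peak found : {peak_hz:.2f} Hz in the envelope spectrum")
```

Run as-is, the script prints a peak at 10 Hz, which divided by the five blades gives back the 2 Hz (120 rpm) shaft rate quoted in the article.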

This is all the more important for a submarine, which by definition is blind. But acoustic sensors are increasingly powerful, and the golden ears are consequently facing a flood of data, Commander Magnan points out.

« In the early 2000s, a sonar operator had equipment that let him hear out to about 20 km and process around ten acoustic contacts simultaneously. Today we have sonars capable of detecting out to nearly 200 km and of processing close to a hundred acoustic tracks at once. That means the volume of data to be processed has increased considerably. The direct consequence is that, for the golden ears at sea, analyzing all these acoustic contacts demands a far greater human commitment than before. »

Preligens algorithms

Artificial intelligence will make it possible to discriminate between sounds much more quickly. That is where the French startup Preligens comes in. Best known for its satellite imagery analysis, the company has put its algorithms at the service of acoustic warfare. A demonstrator was built last year, along with an initial experiment: for twelve days, the Navy recorded all the sounds of the sea off Toulon.

« These twelve days of recordings had to be annotated in order to train the artificial intelligence algorithms. It took us almost forty days to annotate those twelve days of work, » emphasizes Vincent Magnan. « Now, with the algorithm and the demonstrator we obtained, we feed twelve days of acoustic recordings into the machine, and in about four hours it gives us the segments on which the analysts can apply their professional expertise. That means we have gone from an initial forty days down to five or six. The goal is to be able to analyze more and more data. In 2020, CIRA received about one terabyte of data a year. In 2024, it is more like ten terabytes of acoustic data. We will certainly exceed one hundred terabytes by 2030. »
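As an illustration of the workflow Magnan describes, where the machine proposes segments and the analysts then bring their expertise to bear on them, here is a hypothetical pre-screening sketch in Python. The energy-threshold heuristic and every parameter value are assumptions made for the example; Preligens' actual models are not public.

```python
import numpy as np

def flag_segments(samples: np.ndarray, fs: int,
                  window_s: float = 10.0, threshold_db: float = 6.0):
    """Flag time windows whose energy rises above the recording's median level.

    A hypothetical pre-screening heuristic: the machine proposes a short list
    of segments, and analysts only listen to those instead of the full tape.
    """
    win = int(window_s * fs)
    n = samples.size // win
    frames = samples[: n * win].reshape(n, win)
    energy_db = 10 * np.log10(np.mean(frames ** 2, axis=1) + 1e-12)
    baseline = np.median(energy_db)
    flagged = np.flatnonzero(energy_db > baseline + threshold_db)
    # Return (start_s, end_s) intervals an analyst would review.
    return [(i * window_s, (i + 1) * window_s) for i in flagged]

if __name__ == "__main__":
    fs = 4_000
    rng = np.random.default_rng(1)
    sea = 0.1 * rng.standard_normal(fs * 120)        # 2 min of ambient noise
    sea[fs * 60 : fs * 70] += 0.5 * np.sin(          # a 10 s tonal contact
        2 * np.pi * 300 * np.arange(fs * 10) / fs)
    for start, end in flag_segments(sea, fs):
        print(f"review segment {start:.0f}-{end:.0f} s")
```

On this synthetic two-minute recording the script flags only the 60-70 s window containing the tonal contact, which is the kind of reduction, from days of tape to a handful of candidate segments, that the article describes.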

But AI cannot do everything; the golden ears will always be decisive, Vincent Magnan insists. « The goal is that once we have seen a vessel, we will be able to recognize it again each time it enters our detection volume. The nuance, which is very important and which makes applying artificial intelligence quite complex, is that the same vessel, heard in the Mediterranean in January and in the North Atlantic in December, will not make the same noise. Because the acoustic environment will have changed, because the bearings of its shaft line may have worn or corroded, or because marine growth on its hull will have modified its cavitation. So the radiated noise will not be quite the same. That is why, even if artificial intelligence can detect a vessel's main characteristics overall, human know-how is still needed to look for the elements that differ from a previous interception, for example. »

Golden ears are rare: there are no more than thirty such analysts in the Navy. AI will allow them to focus on the sounds of interest, while the machine filters out the noise of shrimp and sperm whales.
