Always imaginative and creative, crooks are beginning to exploit artificial intelligence to trick their victims by cloning the voices of their loved ones. An inventive technique that is likely to become widespread…

Technological advances are incredible, but they lead to as many abuses as benefits. Just look at deepfakes (photos or videos that use artificial intelligence to superimpose one face onto another, and thus fabricate "fake" people) used for revenge porn or fake news, the hijacking of ChatGPT as a cheating tool in schools, or the creation of copycat applications designed to scam users. Along the same lines, many hackers seek to use tools like ChatGPT to easily create malicious code and engineer new attacks, while others use AI to run fake recruitment campaigns on LinkedIn or to embellish their fake profiles on dating sites. This time, it is generative voice AIs, which can create or copy voices from a simple text description, that are at the heart of new scams, as reported by the Washington Post. Formidable tools: accessible to all, easy to use and, above all, very convincing.

Voice cloning: the AI-powered family emergency scam

The American newspaper reports several cases of attempted, and sometimes successful, voice cloning scams. A 73-year-old Canadian woman nearly lost a substantial sum when scammers cloned the voice of her grandson, supposedly being held in prison. What finally convinced her was the fear she heard in the voice. She and her husband then went to several banks to withdraw the money. Fortunately, an employee at the second bank quickly spotted that something was fishy, as another customer had already been targeted by the same scam not long before. But not everyone was so lucky. One family received a call from a man claiming that their 39-year-old son had just killed an American diplomat in a car accident and that funds had to be raised to get him out of prison. The parents then had what sounded like their son on the phone, who confirmed the story. So they paid the scammers $15,000, a sum they never saw again: the scammers converted the money into bitcoins before vanishing into thin air.

In itself, the ploy is not new. It is the classic but effective "family emergency" technique, in which the scammer pretends to be a relative in a delicate situation. Once a victim has taken the bait, the scammer asks for money quickly to get out of trouble: a problem abroad, illness, theft, assault… Normally, the scam is carried out via instant messaging applications, emails or text messages. Using a cloned voice makes it much harder to detect. "Two years ago, even a year ago, you needed a lot of audio to clone a person's voice," Hany Farid, a professor at the University of California, Berkeley, explained to the Washington Post. "Now, if you have a Facebook page or if you've recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice."

Criminals use voice generation software like VALL-E, which analyzes what makes a person's voice unique (age, gender, accent, etc.) in order to reproduce it, and needs only a short audio extract of the voice to be cloned. It is therefore extremely easy to set up and relatively accessible. In early February, the start-up ElevenLabs released to the public, free of charge, the beta version of its new AI, which is capable of creating voices from a simple written description. Better (or worse) yet, it can take the voice of an existing person, who can then be made to say whatever you want. In other words, things did not take long to get out of hand: audio clips in which famous personalities can be heard making violent, racist or homophobic remarks quickly began to circulate on the Web. One, for example, used Emma Watson's voice to read excerpts from Mein Kampf. The company therefore suspended general public access while putting some safeguards in place (see our article).
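
To give an idea of how little is needed, here is a minimal sketch of few-shot voice cloning using the open-source Coqui TTS library and its XTTS v2 model. This is a stand-in for illustration only, not the tools named above (VALL-E and ElevenLabs are not publicly scriptable this way), and the file names are placeholders:

    # Minimal voice-cloning sketch with the open-source Coqui TTS library.
    # Assumes: pip install TTS, plus a short reference recording of the
    # target voice (file names below are placeholders).
    from TTS.api import TTS

    # XTTS v2 is a multilingual model that supports few-shot voice cloning
    # from only a few seconds of reference audio.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # Generate speech that imitates the reference speaker's voice.
    tts.tts_to_file(
        text="Hi, it's me. I'm in trouble and I need your help.",
        speaker_wav="reference_clip.wav",  # ~10-30 seconds of the target voice
        language="en",
        file_path="cloned_message.wav",
    )

That a working clone fits in a dozen lines is precisely what makes the "family emergency" variant so cheap to run, and why researchers like Farid warn that 30 seconds of public audio can be enough.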
