A call from a loved one asking for emergency help… What if it wasn't really them? Thanks to AI, scammers can clone a voice and manipulate you. Here is our advice to avoid falling into the trap.
Online fraud is constantly evolving, and it has now reached a new level of realism. With only a few seconds of audio taken from a video or voice message, scammers can recreate a voice using artificial intelligence and use it to deceive their victims. A fake call, an urgent request for money, and the deed is done. The goal? To exploit your panic and emotion so that you act without thinking.
So far, this type of scam has mainly been reported in the United States, but it is starting to claim victims in France. A man in his fifties recently received a call from his "mother", explaining that she had lost her bag and needed his bank card number. He almost fell into the trap before noticing inconsistencies in the conversation.
Voice cloning, a formidable weapon for scammers
Online scams now rely on voice cloning technologies powered by artificial intelligence. It takes only a few seconds of recording for a tool like OpenAI's Voice Engine to reproduce a voice with disturbing realism. Intonation, emotion, accent… everything is there, to the point of making the hoax almost undetectable.
But how do scammers obtain these voice samples? They often simply turn to social networks. Many people post videos or stories in which they speak, without imagining that these recordings could be misused. Other methods exist: recovering voice messages sent via WhatsApp, Messenger or Telegram, or exploiting interviews available online.
Once the voice is copied, the scammers comb the internet for their victim's relatives, then set up a trap call to extort money from them. The scam is simple, but formidably effective. The scammer calls pretending to be a family member or friend. They speak quickly, sound panicked, claim to have had an accident or to be in trouble, and say they need help right away. Caught up in emotion and fear, the person on the other end of the line often doesn't have the reflex to doubt.
A growing phenomenon
A study by the British neobank Starling Bank shows that nearly a third of the people surveyed had been targeted by this type of scam in the previous 12 months. Even more worrying, 46% of respondents were completely unaware that this threat even existed. This lack of awareness is a real asset for scammers, who exploit it to trap as many victims as possible.
And worst of all, it works. According to the same study, 8% of respondents admit they might send money anyway, even if they found the call suspicious. That may seem like a small share, but scaled up to an entire country, it represents millions of potential victims.
Beyond the man who narrowly avoided the scam, more and more accounts are circulating. In the United States, a mother believed her daughter had been kidnapped after receiving a call in which she recognized her voice, crying and begging for help. A ransom demand followed, and only by checking with those around her did she realize it was a hoax.
How to avoid falling into the trap?
Faced with this threat, it is essential to adopt simple but effective reflexes. The first rule is never to give in to panic. Scammers play on urgency to keep you from thinking. If a call seems strange, immediately try to verify the caller's identity.
One of the most effective methods is to call the relative back directly on their usual number. If the initial call came from an unknown number, that is already a warning sign. If in doubt, asking a question that only the real person could answer can help confirm whether they are who they claim to be.
It is also advisable to establish a safe phrase with your loved ones, a spoken password known only to the family circle. It is a simple precaution that can make all the difference against an attempted scam.
Cybersecurity companies are starting to develop solutions to counter these frauds. Bitdefender Scam Copilot is one of the tools designed to identify such scams. Through advanced analysis of fraudulent patterns and suspicious behavior, it can detect voice cloning attempts and alert the user in real time. This type of solution, combined with personal vigilance, offers effective protection against this new form of cybercrime.
A threat that is only just beginning
With advances in artificial intelligence, these scams are likely to become even more sophisticated. Voice deepfakes could be combined with fake videos, making the schemes even more convincing. Some experts are already warning of fake calls from banks, government agencies or even public figures being used to manipulate victims.
The voice cloning scam is a striking illustration of the dangers of artificial intelligence when it falls into the wrong hands. As technology evolves, our safety reflexes must keep pace. Before responding to a request for money or giving out sensitive information, taking a few moments to verify the call can make all the difference. When it comes to scams, distrust is often the best shield.