The person on the phone sounded exactly like Ruth Card’s grandson Brandon.
He needed money fast and Ruth, recognizing his voice, went to the bank.
Benjamin Perkins’ parents received a call from a man who said he was a lawyer. Benjamin was alleged to have killed another man and needed bail money. The lawyer handed the phone to someone who sounded exactly like Benjamin.
Both cases, as The Washington Post reported, involved a new form of fraud – voices cloned using artificial intelligence.
“Impossible to imagine a year ago”
In the past, reliably cloning someone’s voice required large amounts of audio material, but AI development is moving at a breakneck pace – today it is faster, easier and more accessible than ever.
– What was impossible to imagine a year ago is free today. The companies that make and sell AI want a lot of data – that’s why they give away their services, says Tobias Falk, lecturer at the Department of Computer and Systems Sciences at Stockholm University, to SVT Nyheter.
“We have reached a point where the machines can imitate us”
– Algorithms are trained to imitate how the human brain works. It’s moving really fast now, but it has taken decades to reach the point where machines can mimic us as well as they do today.
The police’s national fraud center writes in an email to SVT Nyheter that it is “aware of the phenomenon” but has not yet seen any cases in Sweden. “We monitor this and have communication with international contacts,” writes the police spokesperson.
How many seconds of audio are enough to clone someone’s voice? And how well does it actually work in Swedish?
Watch the video above for our test.