FBI Warns of New Porn Scam – “Will Ruin Lives”

The FBI warns of fake pornography created with AI for extortion purposes.
Both celebrities and ordinary people are depicted in pornographic situations.
– It will destroy so many people’s lives, says Jan Olsson of the police’s national IT crime centre.

The FBI has issued a warning about pornographic images and videos, so-called deepfakes, created using AI for blackmail purposes. The material can depict both celebrities and ordinary people in explicit situations.

Hard to tell what is real

Manipulated pornographic images are nothing new, in the US or in Sweden, but as the technology becomes ever more advanced it is increasingly difficult to distinguish what is real from what is not.

– The development is moving furiously fast. The technology will become much faster, easier to use and cost almost nothing – and then I can see how the kind of criminals who want to publicly expose an ex or cheat someone out of money will turn to this, says Jan Olsson on Nyhetsmorgon.

He hopes that software will be developed that can show what is real and what is fake.

The police are always a step behind

At the same time, he believes that the police are always one step behind the criminals.

– We try to keep up, but we will always come second. But this is not something that only the Swedish police are looking at – both Interpol and Europol have think tanks examining how we should respond to this new type of crime, says Jan Olsson.

Deepfake porn has not yet been seen in Sweden, but Jan Olsson believes that it is “only a matter of time”.

– It is a troubling and distressing development, because it will destroy so many people’s lives. I really hope that powerful filtering is put in place and that the platforms take responsibility and remove this material immediately.
