AI-faked nude photos spread in schools: “Panicked”

With just a few clicks, anyone can be undressed in a photo.
The fake nude photos can then be spread via social media.

– There is a great risk that it will be used among young people to harass or blackmail, says researcher Kristina Hunehäll Berndtsson.

Kristina Hunehäll Berndtsson, senior lecturer at the University of Gothenburg, studies digital sexual harassment among young people. In her research, she has encountered students who manipulated images from porn sites to make them look like someone else.

– It is deeply problematic that AI is used to invade our privacy. Nude pictures are incredibly personal, and an incredibly sensitive subject, she says.

In the past six months, several free tools have appeared online, marketed on their ability to, among other things, undress people in a photo.

– Even if everyone knows it’s an AI-generated nude image, for the person who is exposed, it will still be incredibly difficult, embarrassing, sensitive and private, she says.

Police: Growing problem

The spread of fake nude photos is becoming an increasingly common problem for the justice system, reports SR. But the fact that few victims choose to report makes the police’s work more difficult.

– We can prioritize and get better conditions if more people report, so that we can demonstrate how big the problem is, says Jan Olsson, detective inspector at the national IT crime centre, to P3 Nyheter.

One of those affected is the 23-year-old influencer Linnea Løtvedt. When she woke up one morning, several followers had gotten in touch to tell her they had found nude photos of her.

– I was scared and shaken. There was panic and frustration, but it turned out that the images were not genuine, she says.

Although the images are fake, they affected Linnea deeply.

– It was incredibly unpleasant to suddenly have a lot of pictures that are not of me being spread online. It’s not my body, but the fact that people think it is gives me an incredibly unpleasant feeling.
