The use of deepfake technology is increasing. In an age where advanced technologies are used so heavily, that is hardly surprising. However, deepfakes are also used to create malicious videos, leaving many people victims. One of these victims is a woman named Kate Isaacs.
SHE SAW HER FACE IN A SEXUALLY EXPLICIT VIDEO
According to the BBC’s report, Kate Isaacs stumbled upon a disturbing video among her notifications while scrolling through Twitter one evening. The video gave Isaacs the shock of her life: her face appeared in a sexually explicit video. The reality, however, was very different. Someone had taken her face and pasted it into a sexually explicit video using deepfake technology, which has allowed many people’s faces to appear in fake sexual videos. In other words, someone had used artificial intelligence to digitally stitch her face onto someone else’s body.
“I COULDN’T THINK CLEARLY, IT WAS HORRIBLE”
Kate Isaacs appeared to be having sex in a deepfake video of her face made using footage from her TV interviews. “My heart felt like it was going to stop. I couldn’t think clearly. I remember feeling like this video was going everywhere, it was horrible,” Isaacs said.
SHE RECEIVED THREATS
The report noted that the victim of the deepfake had no idea who was behind the incident or who might have seen the video. Isaacs also shared that although she knew the video was fake, it was so convincing that she worried others wouldn’t realize it wasn’t real. She added that she had received threats of assault and rape in the comments below the video.
The video was reported to Twitter and removed from the platform. However, once a deepfake video has been shared on the internet, it is very difficult to remove it from circulation completely.
CELEBRITIES ARE ALSO TARGETS
Celebrities and politicians have also been targets of deepfake videos in the past. For example, a deepfake video featuring Elon Musk’s face circulated widely on social media in recent months. The video was fake, and scammers were using Musk’s face and voice to try to trap people. “Definitely not me,” Musk said of the video.
Yikes. Def not me.
— Elon Musk (@elonmusk) May 25, 2022
The report also noted that, according to data from the cybersecurity company Deeptrace, 96 percent of deepfake videos contain non-consensual sexual content.