Women the main victims of AI-generated deepfake porn


    Between photo applications that digitally undress women, sexualized images of girls generated by artificial intelligence, and manipulated images used in “sextortion” rackets, deepfake pornography is booming, outpacing lawmakers.

    The spread of such artificial intelligence (AI)-altered content has outstripped US and European efforts to regulate this emerging area of technology.

    The terrible rise of AI-generated porn

    Several sophisticated fake photographs have gone viral in recent months. One, which drew smiles, showed Pope Francis in an enormous white puffer jacket. Another, more serious, purported to show the arrest of former US President Donald Trump.

    But far more widespread, experts say, is the use of these algorithms to generate pornographic content without the knowledge of the people targeted, whose lives are sometimes destroyed. And women are the primary victims.

    “The rise of AI-generated porn normalizes the use of a woman’s image or likeness without her consent,” Sophie Maddocks, a researcher at the University of Pennsylvania who studies image-based sexual abuse, tells AFP.

    “As a society, what message are we sending about consent when you can virtually undress any woman?” she asks.

    QTCinderella, an American Twitch streamer, learned this the hard way: she recounted in a video that she was harassed after her image was used pornographically, lamenting in tears the “constant exploitation and objectification” of women. She said on Twitter that the experience had “devastated” her.

    Numerous testimonies from victims

    These deepfakes can ruin a reputation and be used for intimidation or harassment. They illustrate the threat of AI-driven disinformation.

    According to a 2019 study by the Dutch AI company Sensity, 96% of deepfake videos online are non-consensual pornography, and most of them depict women: celebrities such as singer Taylor Swift and actress Emma Watson, but also many women who are far from the media spotlight.

    American and European media are full of testimonies from women, academics and activists among them, who were shocked to discover their faces in deepfake porn.

    “The very private sexual fantasy once confined to a person’s imagination is now transferred to technology and content creators in the real world,” Roberta Duffield, director of intelligence at Blackbird.AI, tells AFP.

    “Ease of access and lack of oversight – along with the growing professionalization of the industry – entrench these technologies in new forms of exploitation and disempowerment of women,” she adds.

    Among the new text-to-image generators are free apps that can create “hyperreal AI girls” – avatars built from real photos, which can be customized with prompts such as “dark skin” or “thigh belt”.

    New technologies such as Stable Diffusion, an open-source AI model developed by Stability AI, make it possible to generate realistic images from text descriptions.

    “It’s right under our noses”

    Advances in technology have given rise to what Roberta Duffield calls a “blooming cottage industry” around AI-generated porn, with many providers offering to generate paid content featuring whomever the customer chooses.

    Last month, the FBI warned of sextortion schemes in which images posted on social media are manipulated and then used to extort money from victims, some of them children, or from their families.

    Faced with this proliferation of AI tools, regulation is lagging behind.

    “The law must catch up,” says Dan Purcell, CEO and founder of Ceartas, a company specialized in brand protection. “This is not some dark corner of the internet where these images are created and shared. It’s right under our noses,” he told AFP, advocating for harmonized international laws. “The internet is a jurisdiction without borders.”


    In Britain, a bill is being prepared to criminalize the sharing of pornographic content manipulated in this way.

    Four US states, including California and Virginia, have already banned the distribution of such content, but victims often have little recourse if the perpetrators live outside those jurisdictions.

    In May, a US lawmaker introduced the Preventing Deepfakes of Intimate Images Act, which would make it illegal to share such content without consent. Popular sites such as Reddit have also sought to regulate their burgeoning AI porn communities.
