It’s getting worse on X! Elon Musk’s social network is now running ads for apps that let you strip photos of women using AI. There are no limits anymore: anything goes to bring in money!

Elon Musk hits rock bottom, but keeps digging… Since the billionaire bought X (formerly Twitter), the platform’s revenues have kept declining, largely because of the flight of advertisers. Many brands, fearing for their image and refusing to appear alongside content deemed problematic, have stopped running their ads on the social network – especially since Elon Musk publicly told them to “go fuck yourself”. So, to replenish the coffers, X is hosting more and more unsavory advertisements, whether for dangerous products, dropshipping schemes or… applications offering to digitally undress women. Yes, you read that correctly.

Ads that undress: X has never lived up to its name so well!

The specialized site 404 Media noticed, on December 15, ads on the social network – which has clearly never lived up to its name so well – offering to “strip any girl using artificial intelligence”. You can’t stop progress! The ad shows a young woman in a swimsuit, followed by the same photo, except that this time she is in her birthday suit, with the private parts blurred. This is what’s known as a deepfake. Worse still, the photo appears to be sent to the young woman in question, who asks her interlocutor, in a panic, whether he hacked her. In other words, no need for the consent of the person depicted to satisfy one’s basest desires…

© 404 Media

This is the heart of the problem. Applications of this kind can be used for sexual harassment and tarnish the image of the people whose likeness has been hijacked, potentially going so far as to ruin their lives. A few weeks ago, eleven-year-old Spanish schoolgirls received completely naked photos of themselves on their phones – photos they had never taken. Classmates had simply used artificial intelligence to merge pictures of their faces onto naked bodies, producing a disturbingly realistic result (see our article). We’ll let you imagine their shock… Things can escalate all the way to sextortion and financial blackmail… And, unfortunately, X is taking no action against ads of this type, which will continue to spread, claiming more and more victims…