Updated 06.17 | Published 06.15
Taylor Swift is one of several celebrities who have been targeted by so-called deepfakes, with fake nude pictures of her spread online. Archive image. Photo: Ed Zurga/AP/TT
Millions of Telegram users are letting AI bots "undress" women and create fake nude photos, according to a new investigation. The platform has tried to shut it all down, but new accounts keep popping up.
Creating fake nude photos of anyone with digital technology is nothing new, but the problem has exploded with the development of AI. The US magazine Wired has investigated Telegram and found about 50 bot accounts that promise to remove the clothes from people in a picture, or even to generate images of people performing sexual acts. According to the investigation, the accounts had more than 4 million monthly users combined.
– It is worrying that these tools – which destroy lives and create nightmarish scenarios, above all for young girls and women – are so easily and openly accessible on one of the world's biggest apps, says an expert interviewed by Wired.
Several public figures have recently been targeted by fake nude photos, so-called deepfakes, among them the artist Taylor Swift and Italian Prime Minister Giorgia Meloni. But the images that the AI bots on Telegram "undress" could just as easily depict someone's partner, colleague, daughter, or classmate.
Telegram shut down the majority of the bots, and the channels used to promote them, after Wired contacted the company for comment. However, several similar accounts have appeared since then.
Telegram has not commented on the existence of the AI bots on the platform.
Telegram's founder, Pavel Durov, was recently arrested in France, accused of allowing criminal content on the app, including child sexual abuse material, drug and arms trafficking, and incitement to violence and terrorism.