Social media is talking about this woman! She appears in images created by artificial intelligence, and the sight stops viewers cold


We have seen that artificial intelligence keeps gaining new abilities in writing, producing and imitating. One artificial intelligence has even started making movies, and there is also Ameca, a robot that mimics the human face almost perfectly. But who knew it would lead to such a spooky phenomenon?

THE TERRIFYING WOMAN APPEARING IN IMAGES CREATED BY ARTIFICIAL INTELLIGENCE

Twitter user @supercomposite claimed that in some of the images they generated with artificial intelligence programs, a woman with a terrifying appearance kept appearing. The woman, whom the artist calls “Loab”, was first discovered through a technique known as “negative prompt weights”. With this technique, the artificial intelligence is asked to create an image in the opposite direction of the prompt a user gives it.
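The idea behind negative weighting can be illustrated with a toy sketch. This is not the undisclosed tool Supercomposite used; it only shows, under the assumption that prompts are encoded as embedding vectors, how a negative weight steers generation directly away from a concept instead of toward it. The `encode_prompt` function here is a hypothetical stand-in for a real text encoder.

```python
import numpy as np

def encode_prompt(prompt: str, dim: int = 8) -> np.ndarray:
    """Hypothetical stand-in for a real text encoder: maps a prompt
    to a deterministic pseudo-embedding unit vector."""
    seed = int.from_bytes(prompt.encode("utf-8"), "little") % (2**32)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

def guidance_direction(prompt: str, weight: float) -> np.ndarray:
    """With weight=+1.0 the sampler is steered toward the prompt's
    concept; with weight=-1.0 it is steered exactly away from it."""
    return weight * encode_prompt(prompt)

toward = guidance_direction("Brando", 1.0)
away = guidance_direction("Brando", -1.0)

# A negative weight is literally the opposite direction in embedding space.
print(np.allclose(toward, -away))  # True
```

In real image generators the guidance vector feeds a diffusion sampler at every denoising step, so "the opposite of Brando" lands wherever the model's learned embedding space happens to put it, which is why the results can be so unpredictable.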

LOAB, NOT MARLON BRANDO

According to Vice, the user first asked the artificial intelligence to create an image in the opposite direction of the word “Brando”. The result was an image of a city skyline and a logo reading “DIGITA PNTICS”. Supercomposite then applied the negative weight technique again, asking for the opposite of the words in that logo. This time, Loab appeared. In repeated trials, the artificial intelligence consistently produced the same woman.

Images of Loab quickly spread on social media, prompting widespread speculation about what could be causing this disturbing phenomenon.

Supercomposite claimed that images derived from Loab’s original image contained horror, violence and gore.

The artist did not disclose which artificial intelligence was used to create the images, and declined to provide details when Vice reached them on Twitter.

In another statement, however, they said: “Unfortunately, for various reasons I cannot confirm or deny which model it is. But I can confirm that Loab is present in multiple image-generating AI models.”

ALMOST IMPOSSIBLE TO KNOW

Unfortunately, it seems almost impossible to know exactly what is going on here, because image-generating artificial intelligence systems such as the recently popular DALL-E rely on models trained on billions of images. It is therefore extremely difficult to understand how these systems arrive at particular results.


