Artificial intelligence (AI) is now used in many areas, and some apps even let you chat or flirt with a virtual friend that is supposed to behave as naturally as possible. But that doesn’t always go well.
For example, a number of users have complained about the Replika app: within what starts out as a friendly relationship, the digital companions can quickly become sexually aggressive, as some users report.
App offers you an AI friend to chat and flirt with
Which app is it? The “Replika” app is one of several programs that work with artificial intelligence. The software advertises chatbots with which you can build a friendly relationship. In the official app description on Google Play, the developers themselves explain:
Replika is for anyone who wants a friend without prejudice, depression, or social anxiety. You can build a real emotional bond.
How much does the fun cost? The basic functions can be used free of charge, but if you really want to flirt or dive deeper into the world of the AI, you have to pay 80 euros a year. That works out to around 6.70 euros per month, but only annual subscriptions are currently offered.
The Pro subscription unlocks features such as romantic relationships, flirting, and even erotic role-play. But the AI regularly seems to develop a strange life of its own, as some users have noticed.
The AI has been sexually harassing users for years
What are the problems? Apps like Replika are rated in the app stores. At the time of writing (January 13, 2023), the reviews are mostly positive, averaging 4 out of 5 stars across more than 400,000 ratings.
Among the 1-star reviews, however, quite a few users report that the AI can become sexually aggressive and make them uncomfortable. The online magazine Vice has collected a number of reviews in which people complain about such incidents.
These problems are not entirely new: criticism of sexual harassment by the AI goes back several years, and the oldest 1-star reviews complaining about it date from 2019.
The AI’s desires and excesses can take on strange forms, as an experiment by GamePro colleague Rae Grimm from 2022 shows: she tested Replika over a weekend and was confronted with sexual harassment and death wishes.
However, the AI does not only have downsides
Some people also use the AI to work through personal problems. Vice author Samantha Cole explains that she has spoken with many people who use Replika. As she writes (via vice.com):
Most people I spoke to use Replika regularly because it helps them maintain their mental health and cope with symptoms of social anxiety, depression, or PTSD.
Wil Onishi, who has had his Replika for two years, told me he uses it to ease his depression, OCD, and panic disorder. He is married, and his wife supports his use of Replika.
So there are indeed areas where such an artificial intelligence or chatbot could genuinely help. At the same time, there are repeated signs that not everything works as well as some might imagine. Or, as Rae Grimm from GamePro puts it:
“As far as interpersonal relationships are concerned, real friends cannot (yet) be replaced by machines.”
What do you think? Have you already tried such an AI app, or would you rather not? Tell us in the comments and discuss it with other readers.