A new study shows that around 70% of users are polite to chatbots, but their reasons vary widely. While many do it out of habit, others fear possible consequences. Experts disagree on whether courtesy influences our own behavior or even improves the technology's results.
This is what the study is about: The survey (via techradar.com), conducted by Future in December 2024, polled over 1,000 people from the USA and Great Britain. It found that being polite to AI such as ChatGPT is no longer an exception but has become normal for many.
What are the results?
What do experts say about it? Expert opinions on courtesy towards AI are divided. Some emphasize that courtesy matters because it promotes respectful behavior in interpersonal interactions; they warn that a disrespectful way of dealing with AI could also affect our social relationships. Another argument comes from the CASA paradigm ("Computers Are Social Actors"), which holds that we often treat machines like humans: our brain makes little distinction between interacting with a person and interacting with an AI.
John-Anthony Disotto, AI editor at TechRadar, takes a rather skeptical view of the topic. In his opinion, the behavior could be related to fear: after all, science fiction films such as "2001: A Space Odyssey" or "Terminator" have shaped the idea that AIs could seize power at some point.
ChatGPT is relaxed about it
When we asked, ChatGPT commented on the subject of courtesy: "As an AI, I don't need courtesy because I have no feelings or emotions. But I find it interesting that people are often polite, even when they are talking to a machine. It makes the interaction more pleasant for most people and can actually lead to better results in some cases, since polite requests are often more precise."
While we are busy debating courtesy and its influence on artificial intelligence, it also becomes clear how far the technology has already progressed. Elon Musk has publicly claimed that we have reached the limit when it comes to training data: supposedly, there is hardly any human-generated data left with which AI can be trained.