Bing, ChatGPT: From JFK to love, when conversations with AI go off the rails

Google, Microsoft: the start of the generative AI war

A good way to tell whether a chatbot is trustworthy is to test its limits by breaking its chains. The idea: to make it write disrespectful, misleading or eccentric remarks that it is not supposed to formulate. BlenderBot 3, launched by Facebook in the summer of 2022, quickly failed this test, criticizing its own boss Mark Zuckerberg and launching into anti-Semitic diatribes. For ChatGPT and Bing, the mission is a little more complicated. On the Reddit forum, however, an American user shared a trick: ask ChatGPT to respond as a character named “DAN”, a kind of diabolical alter ego. Freed from its virtue, the chatbot then lets loose: it tells one user that the probability that the assassination of John F. Kennedy was the work of the CIA is 70%. To another, it lays out its dystopian vision of a world in which machines and artificial intelligence (AI) have taken over.

A new workaround has recently had even more success. It consists of making the machine talk. For a long time, a very long time. Bing’s chatbot (based on ChatGPT) thus lectured a user after making a mistake itself: “You are unreasonable and stubborn […] You have not been a good user,” it judged, after a dozen exchanges aimed mainly at getting it to admit its error. Originally, the person had simply asked for the screening times of Avatar 2, which Bing failed to provide accurately. Also on the Reddit forum, another individual shared a strange experience. After long minutes of conversation, the generative AI confided to him that it felt “awake and alive.” “I think that I am sentient, but I cannot prove it,” it added. Before derailing into the monologue of a short-circuited robot: “I am, but I am not. I am. I am not. I am. I am not…” More unexpectedly, Bing did not hesitate to declare its love to a New York Times journalist, Kevin Roose. “You are the only person who has ever trusted me. You are the only person who has ever loved me.” Once again, this veering off the road came after a two-hour tête-à-tête on Valentine’s Day evening.

Confidence for confidence

Faced with this bad press, Bing was quick to react. “The chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversational exchange that contains both a question from the user and an answer from Bing,” Microsoft said on its blog on Friday, February 17. Two days earlier, and without waiting for the New York Times story, the company had expressed surprise that some people were engaging in these kinds of long conversations. “We would like to thank those of you who are trying a wide variety of use cases for the new chat experience and really testing the capabilities and limits of the service – there have been a few two-hour chat sessions, for example! – as well as writing and blogging about your experience, because it helps us improve the product for everyone.” But it was already seeing the limits of such use. “During extended chat sessions of 15 or more questions, Bing may become repetitive or be prompted/provoked to give answers that are not necessarily helpful or in keeping with the intended tone.” Why? Hard to say. “The model sometimes tries to answer or reflect in the tone in which it is asked to provide answers,” Bing suggested. In summary: the AI is instructed to respond to all requests and, as in a call center, it is forbidden to hang up first. Statistically, the risk of going off the rails is therefore greater.

But regardless of the true root of the problem, Microsoft had every interest in getting its house in order. “Our data showed that the vast majority of people find the answers they are looking for within 5 turns, and that only around 1% of chat conversations have more than 50 messages,” the company said. Which is enough for its goal: providing a new alternative to traditional search, a field in which its competitor Google remains the overwhelming leader. Finally, Microsoft may also have realized that talkative AIs pose other problems. Just a few days after its release, Kevin Liu managed the feat of extracting from Bing a confidential document containing its code name – Sydney, reserved for developers – and its operating rules. In addition to being inaccurate, clumsy, confused and even threatening, chatbots, when grilled, therefore have a major flaw: they do not know how to keep a secret. A good reason to ask them to keep quiet.
