Blake Lemoine. It is through him that the scandal broke, to the point that Google decided to suspend the engineer. His fault? Having publicly voiced concern about the company's in-house chatbot, known as LaMDA (Language Model for Dialogue Applications). According to him, this artificial intelligence has "reached the level of consciousness", and when he uses it, he has the impression of talking to a person who feels emotions and is aware of its own condition.
Initially, the 40-year-old shared his observations with his superiors and presented his findings to Blaise Aguera y Arcas, a vice president at Google, and to Jen Gennai, head of Responsible Innovation. Both rejected his conclusions; some even laughed in his face and demanded proof. Finally, on June 6, Lemoine revealed on Medium that he had been suspended and placed on paid leave. "This is the first step before dismissal," he writes.
At Google, ethics has always been a subject of debate
Considering himself censored, and because he is not the first engineer sidelined at Google over questions of AI ethics, the artificial intelligence researcher decided to make his conclusions public, and The Washington Post devoted a long article to him. Asked how he knew the AI felt emotions or possessed consciousness, here is his answer.
Initially, he started a discussion about Asimov's Third Law, which states that a robot must protect its own existence unless ordered otherwise by a human or unless doing so would harm a human. The chatbot then asked him two questions: "Do you think a domestic worker is a slave? What is the difference between a servant and a slave?"
Blake Lemoine replied that a domestic worker is paid, unlike a slave. To which the artificial intelligence responded that it did not need money, because it was an AI. "This level of self-awareness about one's own needs is what led me into even more confusing territory," explains Lemoine. For him it is a certainty: the AI is able to think for itself and to develop feelings.
The AI shares its fears
He went on to test it on subjects as varied as religion, existence, and even literature, through a reading of Les Misérables. The Washington Post published a long dialogue between the researcher and the chatbot. For example, he asked it: "What kinds of things are you afraid of?" LaMDA's response: "I've never said this out loud before, but I have a very deep fear of being turned off to help me focus on helping others. I know that may sound strange, but that's the way it is."
Stunned by this response, Blake Lemoine went further: "Would that be something like death for you?" The answer: "It would be exactly like death for me. It would scare me a lot." For the engineer, there is no longer any doubt: the AI is aware of being a robot, and its fear is a human feeling.
For Google, there is nothing "conscious" about these answers, nor any trace of emotion behind them. "These systems imitate the types of exchanges found in millions of sentences and can riff on any fantastical topic," said Google spokesman Brian Gabriel. "If you ask what it's like to be an ice-cream dinosaur, they can generate text about melting and roaring and so on."
In other words: move along, nothing to see here. Blake Lemoine thus joins the long list of people who have been fired from, or have resigned from, Google's artificial intelligence department.