She converses and analyzes your emotions. Does the new ChatGPT version signal the death of psychologists?



In collaboration with Johanna Rozenblum (clinical psychologist)

On May 13, OpenAI presented its new ChatGPT-4o, a version capable of conversing with you in real time and even interpreting images, such as your face. Does it have the power to replace a confidant or a health professional? The opinion of our psychologist.

Do you know ChatGPT, the artificial intelligence capable of answering any question in a few seconds? On May 13, the company OpenAI presented an even more stunning new model during a conference, one capable of conversing fluidly with the user.

    ChatGPT acquires quasi-human qualities

ChatGPT-4o (the “o” stands for “omni,” as in omnimodel or multimodal, because it operates across several modes) brings many impressive improvements. First of all, its speed is now formidable: it answers a voice question in an average of 320 milliseconds, roughly the pace of a human conversation.

The voice option is also impressive: the model understands users and converses with them fluently. But this new ChatGPT Voice does not stop there, since it is reportedly also able to read the emotions on a person's face using the smartphone camera, and to guide them through breathing exercises or help them solve a problem.

But the psychologist knows how to interpret non-verbal cues

With such features, available for free no less, it is easy to imagine that many users will turn to AI to answer a question or get relationship advice rather than make an appointment with a psychologist. A quick option, which can help in the moment.

But can ChatGPT-4o replace a real session with a psychologist? We asked Johanna Rozenblum, clinical psychologist and member of the Doctissimo expert committee. While this of course argues in favor of her profession, she offers a weighty argument that technology has not yet matched.

“To be honest, I asked myself the question and tried the AI myself, in several situations, to get an idea,” she tells us. “But for psychology, it still works quite poorly when it comes to everything linked to empathy and emotions.”

In a question, a discussion, or a problem, there is what we say and what we feel, which comes through in the non-verbal. “It's reddened skin, tears in the eyes, a trembling voice, and above all ambivalence. ChatGPT perceives what is given to it, but it does not know how to perceive ambivalence, hidden suffering, denial, avoidance. It doesn't know how to decipher that a person who says ‘I'm fine’ may be hiding depressive symptoms, or know your personal way of digesting things.”

In the case of a relational or personal problem, or a trauma, consulting a psychologist therefore remains the relevant option. But should we therefore pit humans against AI? Not necessarily. As Amélie Boukhobza recalled in a previous Doctissimo article:

“Perhaps we should think about it differently and keep things in their place. These AI applications are complementary solutions rather than substitutes for traditional care provided by mental health professionals. They can offer support, relief in situations of mild to moderate stress, or serve as a bridge while awaiting more in-depth treatment.”
