Can the “virtual psychologist” replace a human to hear (and treat) our distress?



    In collaboration with Amélie Boukhobza, clinical psychologist

    Available to listen 24 hours a day, new AI-based chatbots claim to help you work through your problems, or simply to offer you support. But are these 100% virtual solutions one more tool for getting better, or a worrying stopgap that offers only a partial answer?

    Have you been feeling down for several weeks? By the magic of AI, as in almost every other area, a solution is available to you right from your smartphone or computer screen. Because yes, even in the field of mental health, applications are flourishing and promising, thanks to artificial intelligence, to provide you with a personalized response.

    The simplicity of a psychologist’s assistant in your pocket

    Hundreds of therapy applications, such as Wysa or Youper, are now available on the App Store or Google Play, and they are being downloaded by the millions, as the magazine l’ADN notes. The solution seems within reach, just a click away: simply download the app of your choice, then select the reason for your discomfort (depression, work, love, etc.) to access a chat with a “bot” whose job is to listen to you and cheer you up. Fast, discreet support, available whenever you want.

    A use that makes sense in today’s context

    Although confiding in a chatbot may seem incongruous, the emergence of these tools is hardly surprising in the current context. While the mental health of young people continues to deteriorate, according to the latest mental health barometer published in February, it can be difficult to get a quick appointment and regular follow-up with a therapist. AI tools, accessible 24/7, aim to fill this gap. Not to mention that they are free, compared with the 70 euros spent on average for a session with a psychiatrist, even accounting for the schemes that allow reimbursement.

    This is not their only advantage: for many people, opening up to a bot is also less intimidating than doing so in front of a therapist, so speech flows more freely. One user confirms this in the pages of The Guardian: “With AI, I feel like I’m speaking in a true non-judgmental zone. I can cry without feeling the stigma of crying in front of someone.” Moreover, according to one study, 90% of patients lie at least once to their practitioner, particularly about the presence of suicidal thoughts.

    But the risk of insufficient or inappropriate advice exists

    It may be easier and more discreet to confide in a bot. But what about the advice received? On this point, opinions differ and the examples are not always convincing. The magazine l’ADN relates the experience of Christa, 32, who suffers from anxiety and created a tailor-made therapist on character.ai, which she hoped would be “caring, supportive and intelligent”. While the bot initially whispered encouraging responses like “you’ll get there”, it then became more critical and suggested that her boyfriend didn’t love her enough, based on the SMS exchanges it had compiled. Hurt, the young woman decided to delete the app. But what benefit did she get in the end?

    A solution that does not offer the empathy or understanding of a psychologist

    For Amélie Boukhobza, clinical psychologist and member of our committee of experts, to whom we submitted these examples, the arrival of these AI tools in the field of mental well-being is not a bad sign in itself. But they cannot replace the support offered by an experienced psychologist.

    “The use of AI in mental health is growing significantly. AI therapy applications, such as therapeutic chatbots or virtual assistants, offer a series of benefits that may explain their growing popularity (accessibility, anonymity, immediacy of response, etc.).

    However, this trend also raises important questions. The quality of the interaction and support offered by AI is not the same as that of therapy conducted by human professionals, who can offer an empathy, understanding and responsiveness that AI cannot fully reproduce.”

    According to her, this option also carries a risk of drift, which must be underlined when what is at stake is baring one’s soul: “What about data privacy and security? How well can AI applications handle complex cases or acute crises? Is there not a limit, and a danger, there? And what about the transference onto the therapist, on which the entire foundation of therapy rests?”…

    An option for situations of mild stress

    However, the point is not to demonize a tool that is gaining momentum in medicine today, but to look at it with a lucid eye, for what it is.

    “Let us perhaps think about it differently and keep things in their place. These AI applications are complementary solutions rather than substitutes for traditional care provided by mental health professionals. They can offer support and relief in situations of mild to moderate stress, or serve as a bridge while awaiting more in-depth treatment,” concludes our expert.
