In mental health, AI may have gender and ethnicity biases


    The saying goes that appearances can be deceptive, and yet judging by them is exactly what artificial intelligence seems to be doing. According to an American study from the University of Colorado, some AI-based tools used in mental health care may rely on prejudices when assessing patients.

    Stereotypes die hard, and according to researchers at the University of Colorado Boulder, algorithms have adopted them too. The study, led by Theodora Chaspari, associate professor of computer science, reveals a worrying reality: artificial intelligence (AI) tools used to assess mental health problems can be biased by patients’ gender and ethnicity. This finding raises crucial questions about the fairness and effectiveness of mental health technologies at a time when “Mediterranean syndrome” (the prejudice that patients of Mediterranean origin exaggerate their pain) is still claiming victims.

    Published in the journal “Frontiers in Digital Health”, the study, titled “Deconstructing demographic bias in speech-based machine learning models for digital health”, demonstrates that algorithms intended to screen for mental health issues such as anxiety and depression can make assumptions based on patients’ gender and ethnicity. “If AI is not well trained or does not include enough representative data, it can propagate these human or societal biases,” said Professor Chaspari.

    After running audio samples of people through a set of machine learning algorithms, the researchers identified several flaws that could be dangerous for patients, and for female patients in particular. According to the results, the machines underdiagnosed depression risk in women more often than in men.
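    To make that kind of gender gap concrete, here is a minimal sketch of the subgroup audit the study describes: comparing how often a screening model misses genuinely at-risk cases (false negatives, i.e. underdiagnosis) for women versus men. The records and rates below are invented purely for illustration; the study’s actual dataset, model, and figures are not reproduced here.

```python
from collections import defaultdict

# Each record: (gender, true_label, predicted_label); 1 = at risk of depression.
# These values are hypothetical, chosen only to illustrate the audit.
records = [
    ("F", 1, 0), ("F", 1, 0), ("F", 1, 1), ("F", 0, 0),
    ("M", 1, 1), ("M", 1, 1), ("M", 1, 0), ("M", 0, 0),
]

missed = defaultdict(int)   # at-risk cases the model failed to flag
at_risk = defaultdict(int)  # all genuinely at-risk cases per group

for gender, truth, predicted in records:
    if truth == 1:
        at_risk[gender] += 1
        if predicted == 0:
            missed[gender] += 1

# An underdiagnosis rate that is clearly higher for one group is the
# kind of demographic bias the researchers warn about.
for gender in sorted(at_risk):
    rate = missed[gender] / at_risk[gender]
    print(f"{gender}: underdiagnosis (false-negative) rate = {rate:.0%}")
```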

    In addition to gender discrimination, AI can also misjudge patients’ speech. The researchers note that people with anxiety tend to speak in a higher-pitched, more agitated tone and show signs of shortness of breath, whereas people with depression tend to speak softly and in a monotone.
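    The acoustic cues mentioned above, pitch level and monotony, can be measured from a recording. Below is a minimal sketch using the open-source librosa library; the file name is hypothetical, and this only illustrates the kind of speech features such models rely on, not the study’s actual pipeline.

```python
import numpy as np
import librosa

# Load a (hypothetical) speech recording.
y, sr = librosa.load("speech_sample.wav")

# Estimate the fundamental frequency (pitch) frame by frame.
f0, voiced_flag, _ = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
)

# Keep only voiced frames (unvoiced frames come back as NaN).
voiced_f0 = f0[~np.isnan(f0)]

mean_pitch = float(np.mean(voiced_f0))   # anxious speech tends toward higher pitch
pitch_spread = float(np.std(voiced_f0))  # low variability suggests monotone speech

print(f"mean pitch: {mean_pitch:.1f} Hz, variability: {pitch_spread:.1f} Hz")
```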

    To test these hypotheses, the researchers had one group of participants speak in front of a panel of people they did not know, while another group of men and women talked with each other.

    While in the first group people of Latin American origin reported feeling more nervous than white and Black participants, the AI did not detect this. In the second, the algorithms assigned the same level of depression risk to men and women, even though the women actually showed more symptoms. “If we believe an algorithm underestimates depression for a specific group, we should inform clinicians,” stressed Theodora Chaspari.
