Eating disorders: a chatbot deactivated after giving harmful advice



The National Eating Disorders Association (NEDA) helps Americans with eating disorders. After 24 years of service, it announced the end of its hotline and its replacement with a chatbot to respond to patients. The chatbot was quickly deactivated because of the inappropriate advice it gave.

It’s a story that shows artificial intelligence cannot replace humans in every area. The National Eating Disorders Association, an American organization helping people with eating disorders, has announced the shutdown of the chatbot it deployed to replace the hotline team, all of whom were dismissed. The reason? The chatbot’s inappropriate advice…

A chatbot to replace a helpline

After the crisis line was shut down, people who relied on the service were left with two options: consult the online resources on the NEDA website, or converse with a chatbot.

The latter, named Tessa, was tasked with running Body Positive, an interactive program for the prevention of eating disorders.

    A chatbot that gave “harmful” information

Shortly after its launch, the chatbot was widely used, with the association seeing a 600% increase in traffic. Unfortunately, the answers the bot gave did not live up to expectations.

Sharon Maxwell shared her experience with Tessa on Instagram. According to her, Tessa encouraged her to lose weight, recommending that she lose between 500 g and a kilo per week. Tessa also reportedly advised her to count her calories, cut out 500 to 1,000 calories a day, and weigh herself and take her body measurements weekly. “Every single thing Tessa suggested were things that led to the development of my eating disorder. This robot is harmful,” the young woman wrote in her Instagram post.

    “A robot does not replace human empathy”

An employee of the association, who had described the dismissals as union busting, had warned: “A chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community.”

This is indeed what happened. Last Tuesday, Tessa was taken offline by the organization following a viral post showing that the chatbot encouraged unhealthy eating habits. “We are concerned about the weight loss and calorie restriction comments posted in the chat yesterday, and are working with the Tech and Research team to investigate this further. The comments go against our policies and our fundamental beliefs as an organization specializing in eating disorders,” NEDA president Liz Thompson said in a statement. Will the association eventually reinstate the dismissed employees? A story to follow.
