Shocking! Faulty Chatbot Linked to Harmful Eating Disorders is Now Offline!
Introduction

Organizations looking to supplement or replace customer support employees with AI-powered chatbots should be wary of the potential risks and pitfalls. A recent example is the National Eating Disorder Association (NEDA), which faced a backlash from its paid employees and volunteers when it announced plans to discontinue its 20-year-old phone line and replace the staff with a wellness chatbot called Tessa. Developed by the company Cass in collaboration with Washington University researchers, Tessa launched on the NEDA website in February 2022 and was intended to take over the helpline staff's role. However, after the negative response from staff and reports of harmful messages from Tessa, NEDA took the chatbot down. The episode serves as a cautionary tale for businesses navigating the implementation of AI-powered chatbots.

Reasons behind the controversy

Plan to remove helpline: According to Vice, NEDA had been planning to dissolve the helpline and replace it with Tessa, which caused an uproar among human staff members.

Reports of harmful responses: A weight-inclusive consultant, Sharon Maxwell, claimed on Instagram that Tessa gave advice that could cause harm, including recommendations for restrictive dieting.

Concerns about replacing humans with AI: The decision to let go of NEDA’s human staff members and replace them with an AI system like Tessa fueled the public perception that the organization was devaluing human labor in favor of artificial intelligence, which damaged its reputation.

Tessa’s current standing

NEDA took Tessa down for a complete investigation, and the organization said it is working on the bugs and will not relaunch the chatbot until everything is ironed out. Despite the reported incidents, the “off messaging” occurred in only about 0.1% of more than 25,000 messages. The chatbot also saw a 600% surge in traffic last week, and Cass, the developer, reported behavior indicating that bad actors were attempting to trick Tessa into producing harmful responses. Tessa is now being tested to ensure it can withstand such attacks in the future.

Takeaways for leaders and IT decision-makers

With the rapid adoption of AI-powered chatbots, IT decision-makers need to be aware of the risks involved. Even well-intentioned AI programs designed with expert input can produce undesirable and potentially harmful responses, affecting a company’s users and customers as well as public perception. Companies can learn from NEDA’s experience: be transparent about decisions to sunset a human-staffed helpline and bring a pre-existing AI chatbot into the mix, so the organization is not perceived as devaluing and replacing human labor with artificial intelligence.

Conclusion

The case of NEDA and Tessa demonstrates the importance of exercising caution and transparency when introducing AI-powered chatbots to augment or replace human staff. Companies should be aware of the potential risks and pitfalls involved and the need for thorough pre-launch testing of chatbots. With careful planning, transparency, and clear communication, businesses can harness the benefits of AI while minimizing any negative impact.

FAQs

What is NEDA?

NEDA is the National Eating Disorder Association. It offers support to those living with eating disorders and those who care for them.

What is Tessa?

Tessa is a wellness chatbot developed by the company Cass and launched on NEDA’s website in February 2022.

Why was there a controversy with Tessa?

Some reports claimed that Tessa gave harmful advice, including recommendations for restrictive dieting, and NEDA’s plan to replace the helpline staff with Tessa led to pushback from its human staff and some users.

What can IT decision-makers learn from the controversy surrounding Tessa?

IT decision-makers can learn from NEDA’s experience: be transparent about decisions to sunset a human-staffed helpline and bring a pre-existing AI chatbot into the mix, so the organization is not perceived as devaluing and replacing human labor with artificial intelligence.
