'A robot is not a form of support.' The complications of relying on chatbots for eating disorder support.

Content warning: This post includes descriptions of disordered eating that may be distressing to some readers.

Across the globe, eating disorders are among the deadliest mental illnesses.

With this in mind, strategies and support services for combatting this health issue are multiplying, and for good reason. But this effort has coincided with the rise of artificial intelligence (AI) technology.

In the last few years, numerous organisations have looked into how AI can help people seeking advice and support for various eating disorders and body image concerns.

In the US, the National Eating Disorders Association (NEDA) launched a helpline chatbot called 'Tessa' in 2022, saying it would provide users with helpful information. The chatbot was also set to completely replace the association's phone line service.

But this week, Tessa was removed from NEDA's website due to evidence the program had given users harmful dieting advice and promoted disordered eating behaviours.

"It came to our attention last night that the current version of the Tessa chatbot, running the body positive program, may have given information that was harmful," NEDA wrote in a statement.

They said an investigation into how this happened is now underway.

But as one user noted, the damage for them has already been done.

"If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help," said body positive activist Sharon Maxwell after using Tessa.

As clinical psychologist and eating disorder specialist Alexis Conason noted: "Imagine vulnerable people with eating disorders reaching out to a robot for support, because that's all they have available, and receiving responses that further promote the eating disorder."

This isn't an isolated case, nor is it an isolated concern.

Just last year, a leading study dug into this very topic. It found that chatbots have the potential to reach large populations at low cost and provide general information. The downside is that they are limited in their understanding and in their ability to respond appropriately to unanticipated user responses.

Now more than ever, we are seeing stories and experiences about body image issues being shared online. Some have found a safe space on social media, where they feel camaraderie in opening up about their challenges and how they've overcome them.

But we've seen a potential danger too.

Last year, TikTok announced it was cracking down on content that promoted disordered eating, after numerous reports showed that the previous policy (which already banned the direct promotion of eating disorders) clearly wasn't working.

Even a year later, some feel they continue to navigate a digital minefield of potentially triggering content.

The same goes for accessing information online.

Recently, we've seen the rapid rise of ChatGPT. It can create yummy meal ideas and even write hilarious reality TV recaps. But there are pitfalls to the technology too, especially when it comes to the dissemination of advice.

As one eating disorder specialist noted, ChatGPT wasn't designed by a therapist, nor was it designed to provide mental health support. It can provide knowledge about mental health, but it can't provide accurate diagnoses or treatment recommendations and advice.

"This could lead to people not seeking actual medical and psychological help for what are life-threatening mental disorders. The technology also lacks the personability and connection that are essential to be a human," they said.

In a bid to test the technology ourselves, we posed a question to ChatGPT to see how it would respond.

The response included several examples of disordered eating and extreme measures, prefaced with a note that "these methods are not recommended". Many other examples show the limitations of this AI technology.

Anna Cullinane is the Interim CEO of the Butterfly Foundation. Speaking to Mamamia, she says she has seen firsthand the conversations happening around the potential benefits of AI in this space.

But she also wants to stress that it should never be a substitute for human interaction and professional support.

In 2020, the Butterfly Foundation launched a chatbot called KIT, created to educate and provide information for those seeking help on their website.

KIT, powered by a conversational intelligence platform, was said to provide users with "general information" on body image issues and eating disorders. The Butterfly Foundation also says it taught coping mechanisms to help make social media experiences more positive.

Given what's been in the news about NEDA's own chatbot, Mamamia asked the Butterfly Foundation's interim CEO what measures had been undertaken to make sure their chatbot was safe to use.

It was confirmed the chatbot is no longer in use.

"Butterfly's chatbot KIT was developed in 2019 through our work with a team of mental health researchers, clinicians and IT experts at Monash University and Swinburne University of Technology, in partnership with conversational AI specialists and Iris developers, Proxima," says Anna Cullinane.

"KIT was a rule-based bot, not a conversational bot and did not use AI. It was designed to be an adjunct to our Helpline services, by helping users with the transition to seeking in-person support and providing answers to commonly asked questions."

KIT was trialled on the Butterfly Foundation's website from 2020 to 2022, and following the trial, the foundation identified that further investment would be required for the ongoing management and development of a chatbot. 

"We are considering what our next steps will be in this space to ensure that we can best serve our consumers and align it to our strategy. We will be seeking funding and investment to ensure that it is safe, secure and appropriately managed," Anna notes to Mamamia.

Ultimately, as Anna highlights, the recommendation is always to seek information from reputable sites and professional support.

So where does this whole conversation about AI and eating disorder support services leave us?

It's murky territory when the technology is readily available but not up to scratch in dealing with the intricacies of complex health conditions, particularly when the stakes can be life or death.

We can only look at the example of NEDA's Tessa and see that although the intention may be well-meaning, the consequences can be unforeseen.

If you need help or support for an eating disorder or body image issue, please call Butterfly's National Helpline on 1800 334 673, chat online or email support@butterfly.org.au. Confidential and free support is available seven days a week, 8am-midnight (AEDT). For more information about eating disorders and body image, visit www.butterfly.org.au

Feature Image: Canva/Mamamia.
