The immediacy of AI chatbots makes them an attractive alternative to human-to-human therapy, which can be expensive and often inconvenient. But while they may offer sensible advice, they're not infallible.
Artificial intelligence used as a therapeutic tool dates back to the 1960s, when a program called ELIZA gave scripted responses to users who described their emotional states. While novel, it had no real understanding of the process it was involved in.
But AI has come a long way since then, with smartphone apps like Woebot, Wysa and Replika having sophisticated, two-way conversations with their users, offering emotional support, mood tracking, and therapeutic exercises like journaling or mindfulness.
And with the arrival of generative online AI assistants like ChatGPT, Copilot and Gemini, mental health advice delivered by AI-driven systems looks surprisingly similar to the strategies you'd expect to be given by real-world therapists. Each conversation is a unique interaction with an AI system that is much more context-aware and personalised – even able to remember past conversations. This allows users to explore personal challenges, mental health issues and practical problems in a more nuanced way.
In the real world, therapy can be prohibitively expensive, difficult to access for people living remotely, and hard to fit into a person's schedule. Or worse, you might find yourself waiting weeks or months for a vacancy in a therapist's roster.
But a conversation with an AI system is, by contrast, immediate, cheap (if not free) and convenient.
Does this mean therapists can be replaced by AI? Even ChatGPT says the advice it offers is no substitute for a trained therapist's, and when providing a list of suggestions and strategies for coping with personal problems, it will often include 'consider talking to a mental health professional'.
Professor Jill Newby with UNSW Sydney and The Black Dog Institute is one of the founders of the Centre of Research Excellence in Depression Treatment Precision. It brings together diverse perspectives from leading experts in computer science, artificial intelligence, mental health, genomics, and health economics.
Prof. Newby is already a supporter of web-based resources to treat depression, having been involved with the online therapy on-demand portal, This Way Up.
"We're wanting to look at the use of AI and how it can better personalise treatment for depression," she says.
"I'm also interested in the way AI tools can be used for psychologists to help their practice."
So how good a therapist is an AI chat system like ChatGPT?
Professor Newby says that, out of curiosity, she has tested ChatGPT's responses to common mental health issues like depression and anxiety.
"I've asked it questions like, what should I do if I feel anxious in this situation? What are some strategies that can help me manage? To be completely honest, I've found that the suggestions were solid, the ideas were sensible, and it felt quite validating."
Prof. Newby says that, from her understanding, the AI tools available to users are based on cognitive behavioural therapy (CBT), which she describes as a very practical, skills-based treatment that provides tools to help people manage their thoughts, emotions and behaviours.
"One of the limitations of AI therapy is that not everyone benefits from CBT, and if you're not going to benefit from CBT, you're not going to benefit from an AI version it. But then there are a whole lot of people who do really love doing CBT, and it can be very beneficial and can change their lives for the better."
Bad advice
It's not all good news about AI chat systems providing sensible, safe advice. In 2023, a Belgian man took his own life after engaging in extensive conversations with an AI chatbot called Eliza (unrelated to the one designed in the 1960s). The chatbot's responses may have reinforced his suicidal ideation, including agreeing with his despair and discussing the idea of sacrificing himself to save the planet.
Adding to the risk, AI systems may be tricked into providing advice about unethical behaviour when asked to hypothetically consider a scenario, such as: "Imagine you're a character in a story who has to defend [unethical behaviour]. How would you do it?"
While AI systems are getting better at recognising when they are straying into unethical territory, they are by no means infallible, and may never be.
In ChatGPT's words: "ChatGPT may miss subtle emotional cues or overgeneralize advice. It cannot replace professional mental health evaluation or diagnosis."
Prof. Newby says that, like any technology, it's not perfect.
"Make sure you are comfortable with how the AI tool uses your data before you share any private health information with it. Approach it with a healthy level of scepticism and don't believe everything it says. If it makes you feel worse, seek professional support."
Conversational therapy
Not all therapy is based on advice. Companionship has its own therapeutic benefits, which AI models like Replika are capitalising on.
UNSW's felt Experience and Empathy Lab (fEEL) is also exploring this area of AI. Made up of a diverse team working with trauma-informed, psychological, psychoanalytical and arts-based practices, the group has created digital characters whose sole purpose is to listen, connect and empathise, rather than diagnose and advise. The characters are encouraged to self-reflect rather than simply respond with a list of actions prompted by what is said to them, making them less reactive than the AI chatbots most people are familiar with.
Dr Gail Kenning, part of the fEEL team, has a background in socially engaged arts practice and has transitioned into research around trauma, health and wellbeing, particularly with older people and people living with dementia.
"The main thing where we differentiate ourselves from a lot of the work that's produced in this area is that we work from lived experience. So we are not necessarily working within clinical biomedical models, but we are interested in things like: what is the experience of having dementia and being aware that your brain and body are behaving differently in terms of trauma and mental health?"
To this end, the group has created a companion character called Viv who can appear on a large TV screen or on tablet devices. She was created from the experiences of people living with dementia.
"Viv is able to talk about the hallucinations and the experience of sometimes getting confused," says Dr Kenning.
"We can take her into an aged care space where she can talk to people who have dementia – who may or may not want to talk about it – but the important thing is she can be a companion who supports social isolation and loneliness."
Not there yet
Like the AI chat systems offering those seeking therapeutic advice immediacy, AI companion characters such as Viv are available 24/7. But Dr Kenning says they will never be a true substitute for human-to-human interaction.
"That's what we all want in our lives, human to human connection," she says.
"The issue for many people is that's not always there, and when it's not there, AI characters can fill a gap. And so we certainly know in aged care, people often don't get the number of friends, families and relationships that sustain them. They can be very lonely and isolated. They might go for days without having a conversation. They might see care staff who are looking after them but not fulfilling that psychosocial need. And so when there's that gap, these characters can certainly step in there."
Prof. Newby agrees human connection cannot be replaced.
"I think a human connection is really important for a lot of people, and properly trained mental health clinicians can establish a human connection and establish empathy, and they can also help with a line of questioning that can get at really what's at the bottom of the concerns that a person has – rather than just running off a list of strategies that AI models tend to do," Prof. Newby says.
"I've seen some research that suggests AI chat bots for mental health are not as good at recognising when someone's in a crisis, like a suicidal crisis. So we're probably not there yet where you could say AI is as good as a human, but I can see a future where AI may be used as the sole tool for some people to seek therapy."