By definition, robots can't feel empathy — it requires being able to relate to another person's human experience, to put yourself in their shoes.
But according to new U of T Scarborough research, artificial intelligence (AI) can create empathetic responses more reliably and consistently than humans, even when compared to professionals whose job relies on empathizing with those in need.
"AI doesn't get tired," says Dariya Ovsyannikova (HBSc 2023 UTSC), lab manager in Professor Michael Inzlicht's lab at U of T Scarborough and lead author of the study.
"It can offer consistent, high-quality empathetic responses without the emotional strain that humans experience."
The research, published in the journal Communications Psychology, looked at how people evaluated empathetic responses generated by ChatGPT compared to human responses.
Across four separate experiments, participants were asked to judge the level of compassion (an important facet of empathy) in written responses to a series of positive and negative scenarios, with the responses written by AI as well as by everyday people and expert crisis responders. In every scenario, the AI-generated responses were preferred and rated as more compassionate and responsive, conveying greater care, validation and understanding than the human responses.
So how could a general chatbot like ChatGPT outperform professionals trained in responding with empathy? Ovsyannikova points to AI's ability to pick up on fine details and stay objective, making it particularly adept at crafting attentive communication that appears empathetic.
Empathy is an important trait not only in fostering social unity, but in helping people feel validated, understood and connected to others who empathize with them. In clinical settings, it plays a critical role in helping people regulate emotions and feel less isolated.
But constantly expressing empathy has its costs.
"Caregivers can experience compassion fatigue," says Ovsyannikova, who herself has professional experience volunteering as a crisis line responder.
She adds that professional caregivers, especially in mental health settings, may need to rein in their capacity to empathize to avoid burnout, or to balance their emotional engagement across each of their clients.
Humans also come with their own biases and can be emotionally affected by a particularly distressing or complex case, which further limits their ability to be empathetic. The researchers say that, coupled with shortages of accessible health-care services and qualified workers and a widespread rise in mental health disorders, this leaves empathy in short supply.
That doesn't mean we should cede empathy-derived care to AI overnight, says Inzlicht, who was a co-author of the study along with PhD student Victoria Oldemburgo de Mello. "AI can be a valuable tool to supplement human empathy, but it does come with its own dangers," he says.
Inzlicht adds that while AI might be effective at delivering surface-level compassion that people find immediately useful, a tool like ChatGPT cannot provide the deeper, more meaningful care that gets to the root of a mental health disorder.
He notes that over-reliance on AI also poses ethical concerns, namely the power it could give tech companies to manipulate those in need of care. For example, someone feeling lonely or isolated may become reliant on talking to an AI chatbot that is constantly doling out empathy, instead of fostering meaningful connections with another human being.
"If AI becomes the preferred source of empathy, people might retreat from human interactions, exacerbating the very problems we're trying to solve, like loneliness and social isolation," says Inzlicht, whose research looks at the nature of empathy and compassion.
Another issue is a phenomenon known as "AI aversion," a prevailing skepticism about AI's ability to truly understand human emotion. While participants in the study initially rated AI-generated responses highly when they didn't know who had written them, that preference shifted slightly when they were told the response came from AI. However, this bias may fade with time and exposure, with Inzlicht noting that younger people who grew up interacting with AI are likely to trust it more.
Despite the critical need for empathy, Inzlicht calls for a transparent and balanced approach to deployment, one in which AI supplements human empathy rather than replacing it.
"AI can fill gaps, but it should never replace the human touch entirely," he says.