Using a pioneering artificial intelligence platform, Flinders University researchers have assessed whether a cardiac AI tool recently trialled in South Australian hospitals has the potential to help doctors and nurses rapidly diagnose heart issues in emergency departments.
"AI is becoming more common in healthcare, but it doesn't always fit in smoothly with the vital work of our doctors and nurses," says Flinders University's Dr Maria Alejandra Pinero de Plaza, who led the research.
"We need to confirm these systems are trustworthy and work consistently for everyone, ensuring they are able to support medical teams rather than slowing them down."
Developed by Dr Pinero de Plaza and her team, PROLIFERATE_AI is a human-centred evaluation tool that uses artificial intelligence alongside researcher analysis to assess how well AI tools work in hospitals.
"In order to understand if the AI systems are viable, we look at how easy they are to use, how well doctors and nurses adopt them, and how they impact patient care," says Dr Pinero de Plaza, a research fellow in Flinders' Caring Futures Institute.
"It's not just about making AI accurate; it's about making sure it's easy to understand, adaptable, and genuinely helpful for doctors and patients when it matters most."
Published in the International Journal of Medical Informatics, the study used PROLIFERATE_AI to assess the RAPIDx AI tool, which is designed to help emergency doctors quickly and accurately diagnose cardiac conditions by rapidly analysing clinical and biochemical data.
With chest pain one of the most common reasons for emergency department visits, the South Australian health system has been part of an NHMRC-funded trial of the tool run across 12 metropolitan and rural hospitals; its 12-month patient outcomes are currently being analysed.
Before and during the trial, the PROLIFERATE researchers evaluated the tool, inviting medical and nursing staff at the participating hospitals to share their insights from interacting with RAPIDx AI.
The results showed that while experienced clinicians, such as ED consultants and registrars, demonstrated high comprehension and engagement with the RAPIDx AI tool, less experienced users, including residents and interns, faced usability challenges.