Despite the remarkable progress in artificial intelligence (AI), several studies show that AI systems have not improved radiologists' diagnostic performance. Meanwhile, diagnostic errors contribute to an estimated 40,000 to 80,000 deaths annually in U.S. hospitals. This gap creates a pressing need to build next-generation computer-aided diagnosis algorithms that are more interactive, so the benefits of AI in improving medical diagnosis can be fully realized.
That's just what Hien Van Nguyen, University of Houston associate professor of electrical and computer engineering, is doing with a new $933,812 grant from the National Cancer Institute. He will focus on lung cancer diagnostics.
"Current AI systems focus on improving stand-alone performances while neglecting team interaction with radiologists," said Van Nguyen. "This project aims to develop a computational framework for AI to collaborate with human radiologists on medical diagnosis tasks."
That framework uses a unique combination of eye-gaze tracking, intention reverse engineering and reinforcement learning to decide when and how an AI system should interact with radiologists.
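To make the idea concrete, here is a minimal Python sketch, not the project's actual system, of how a reinforcement learning agent could use gaze signals to decide when and what to show a radiologist. The state features, thresholds, action set and reward values are all illustrative assumptions.

```python
import random
from collections import defaultdict

# Toy gaze-assisted interaction policy (illustrative assumptions throughout).
# State: (reader has dwelled on a suspicious region, AI is confident)
# Actions: 0 = stay silent, 1 = show a subtle visual cue, 2 = show the full AI finding
ACTIONS = [0, 1, 2]

def discretize(dwell_ms: float, ai_confidence: float) -> tuple:
    """Bucket continuous gaze and confidence signals into a tiny state space."""
    return (dwell_ms > 500.0, ai_confidence > 0.8)  # thresholds are assumptions

def simulated_feedback(state, action) -> float:
    """Stand-in reward: prefer prompts that help without distracting."""
    long_dwell, confident_ai = state
    if action == 2 and confident_ai and not long_dwell:
        return 1.0    # AI surfaces a region the reader has not yet examined
    if action == 0 and long_dwell:
        return 0.5    # reader is already attending; silence avoids interruption
    if action != 0 and not confident_ai:
        return -1.0   # low-confidence interruptions carry a cost
    return 0.0

def train_policy(episodes: int = 5000, alpha: float = 0.1, epsilon: float = 0.1):
    """One-step tabular Q-learning over the discretized states."""
    q = defaultdict(float)
    for _ in range(episodes):
        state = discretize(random.uniform(0.0, 1000.0), random.random())
        if random.random() < epsilon:
            action = random.choice(ACTIONS)                      # explore
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])   # exploit
        reward = simulated_feedback(state, action)
        q[(state, action)] += alpha * (reward - q[(state, action)])
    return q

if __name__ == "__main__":
    q = train_policy()
    for state in [(False, True), (True, True), (False, False), (True, False)]:
        best = max(ACTIONS, key=lambda a: q[(state, a)])
        print(f"state={state} -> best action {best}")
```

In a real system the state would be far richer (full scanpaths, image features, stage of the reading session) and the reward would reflect actual diagnostic outcomes and radiologist workload, but the decision structure, when to speak and what to show, is the same.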
To maximize time efficiency and minimize distraction from clinical work, Van Nguyen is designing a user-friendly, minimally intrusive interface for radiologist-AI interaction.
The project will evaluate these approaches on two clinically important applications: lung nodule detection and pulmonary embolism detection. Lung cancer is the second most common cancer, and pulmonary embolism is the third most common cause of cardiovascular death.
"Studying how AI can help radiologists reduce these diseases' diagnostic errors will have significant clinical impacts," said Van Nguyen. "This project will significantly advance the knowledge of the field by addressing important, but largely under-explored questions."
The questions include when and how AI systems should interact with radiologists and how to model radiologists' visual scanning processes.
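As a rough illustration of what modeling a visual scanning process might involve, again not the project's actual method, the sketch below fits a simple first-order transition model over gaze fixation sequences. The region labels and example scanpaths are invented for the example.

```python
from collections import Counter, defaultdict

# Invented example scanpaths: each is a sequence of fixated image regions.
scanpaths = [
    ["right_upper_lobe", "right_lower_lobe", "left_lower_lobe", "hilum"],
    ["hilum", "right_upper_lobe", "right_lower_lobe", "left_lower_lobe"],
    ["right_upper_lobe", "hilum", "right_lower_lobe", "left_lower_lobe"],
]

def fit_transitions(paths):
    """Estimate P(next region | current region) from fixation sequences."""
    counts = defaultdict(Counter)
    for path in paths:
        for current, nxt in zip(path, path[1:]):
            counts[current][nxt] += 1
    return {
        region: {nxt: n / sum(nxts.values()) for nxt, n in nxts.items()}
        for region, nxts in counts.items()
    }

if __name__ == "__main__":
    for region, nxts in fit_transitions(scanpaths).items():
        print(region, "->", nxts)
```

A model of this kind could, for example, estimate which regions a reader is unlikely to revisit, which is one way gaze data might inform when an AI prompt is worth showing.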
"Our approaches are creative and original because they represent a substantive departure from the existing algorithms. Instead of continuously providing AI predictions, our system uses a gaze-assisted reinforcement learning agent to determine the optimal time and type of information to present to radiologists," said Van Nguyen.
"Our project will advance the strategies for designing user interfaces for doctor-AI interaction by combining gaze-sensing and novel AI methodologies."