AI Algorithms Enhance Medical Image Analysis

Karlsruhe Institute of Technology
Automated methods enable the analysis of PET/CT scans (left) to accurately predict tumor location and size (right). (Illustration: Gatidis S, Kuestner T.; for an extensive caption and reference, see the end of text)

Artificial intelligence has the potential to improve the analysis of medical image data. For example, algorithms based on deep learning can determine the location and size of tumors. This is the result of autoPET, an international competition in medical image analysis in which researchers from the Karlsruhe Institute of Technology (KIT) ranked fifth. The seven best autoPET teams report in the journal Nature Machine Intelligence on how algorithms can detect tumor lesions in positron emission tomography (PET) and computed tomography (CT). (DOI: 10.1038/s42256-024-00912-9)

Imaging techniques play a key role in the diagnosis of cancer. Precisely determining the location, size, and type of tumors is essential for choosing the right therapy. The most important imaging techniques include positron emission tomography (PET) and computed tomography (CT). PET uses radionuclides to visualize metabolic processes in the body: the metabolic rate of malignant tumors is considerably higher than that of benign tissue, and radioactively labeled glucose, usually fluorine-18 fluorodeoxyglucose (FDG), is used as the tracer. In CT, the body is scanned layer by layer with X-rays to visualize the anatomy and localize tumors.

Automation Can Save Time and Improve Evaluation

Cancer patients sometimes have hundreds of lesions, i.e., pathological changes caused by tumor growth. To obtain a uniform picture, all lesions must be captured. Doctors determine the size of tumor lesions by manually marking them on 2D slice images, an extremely time-consuming task. "Automated evaluation using an algorithm would save an enormous amount of time and improve the results," explains Professor Rainer Stiefelhagen, Head of the Computer Vision for Human-Computer Interaction Lab (cv:hci) at KIT.

Rainer Stiefelhagen and Zdravko Marinov, a doctoral student at cv:hci, took part in the international autoPET competition in 2022 and came in fifth out of 27 teams with 359 participants from all over the world. The Karlsruhe researchers formed a team with Professor Jens Kleesiek and Lars Heiliger from the Essen-based Institute for Artificial Intelligence in Medicine (IKIM). Organized by Tübingen University Hospital and LMU University Hospital Munich, autoPET combined imaging and machine learning. The task was to automatically segment metabolically active tumor lesions visualized on whole-body PET/CT scans. For algorithm training, the participating teams had access to a large annotated PET/CT dataset. All algorithms submitted for the final phase of the competition are based on deep learning, a variant of machine learning that uses multi-layered artificial neural networks to recognize complex patterns and correlations in large amounts of data. The seven best teams from the autoPET competition have now reported on the possibilities of automated analysis of medical image data in the journal Nature Machine Intelligence.
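The output of such a segmentation algorithm is a binary mask over the scan's 3D voxel grid, from which quantities such as total lesion volume follow directly. A minimal sketch of that last step (the function name, voxel spacing, and toy mask below are invented for illustration and are not taken from the challenge):

```python
import numpy as np

def lesion_volume_ml(mask: np.ndarray, voxel_spacing_mm=(2.0, 2.0, 3.0)) -> float:
    """Total lesion volume in milliliters from a binary segmentation mask.

    Each voxel covers spacing_x * spacing_y * spacing_z cubic millimeters;
    1000 mm^3 = 1 mL.
    """
    voxel_ml = float(np.prod(voxel_spacing_mm)) / 1000.0
    return float(mask.sum()) * voxel_ml

# Toy example: a 10x10x10-voxel "lesion" block inside a 64x64x64 volume
mask = np.zeros((64, 64, 64), dtype=bool)
mask[20:30, 20:30, 20:30] = True
print(lesion_volume_ml(mask))  # 1000 voxels * 12 mm^3 = 12.0 mL
```

In practice the voxel spacing is read from the scan's metadata rather than hard-coded; the point is only that an automatically predicted mask immediately yields the measurements doctors currently derive by manual marking.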

Algorithm Ensemble Excels in the Detection of Tumor Lesions

As the researchers explain in their publication, an ensemble of the top-rated algorithms proved superior to any individual algorithm and was able to detect tumor lesions efficiently and precisely. "While the performance of the algorithms in image data evaluation does depend in part on the quantity and quality of the data, algorithm design is another crucial factor, for example the decisions made when post-processing the predicted segmentation," explains Stiefelhagen. Further research is needed to improve the algorithms and make them more robust to external influences so that they can be used in everyday clinical practice. The aim is to fully automate the analysis of medical PET and CT image data in the near future.
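One simple way to combine the binary lesion masks of several models into an ensemble prediction is per-voxel majority voting, sketched below (the function and toy data are illustrative assumptions; the publication's actual fusion strategy may differ):

```python
import numpy as np

def ensemble_majority_vote(masks: list) -> np.ndarray:
    """Fuse binary segmentation masks from several models.

    A voxel is labeled as lesion if more than half of the models mark it.
    All masks must share the same shape.
    """
    stacked = np.stack([np.asarray(m, dtype=np.uint8) for m in masks])
    votes = stacked.sum(axis=0)          # per-voxel count of positive votes
    return votes > (len(masks) / 2.0)    # strict majority

# Three toy 1D "predictions" over five voxels
a = np.array([1, 1, 0, 0, 1], dtype=bool)
b = np.array([1, 0, 0, 1, 1], dtype=bool)
c = np.array([0, 1, 0, 1, 1], dtype=bool)
print(ensemble_majority_vote([a, b, c]).astype(int))  # [1 1 0 1 1]
```

Voting tends to suppress spurious detections made by a single model, which is one intuition for why the ensemble outperformed the individual algorithms; averaging per-voxel probabilities before thresholding is a common alternative.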

Original publication

Sergios Gatidis, Marcel Früh, Matthias P. Fabritius, Sijing Gu, Konstantin Nikolaou, Christian La Fougère, Jin Ye, Junjun He, Yige Peng, Lei Bi, Jun Ma, Bo Wang, Jia Zhang, Yukun Huang, Lars Heiliger, Zdravko Marinov, Rainer Stiefelhagen, Jan Egger, Jens Kleesiek, Ludovic Sibille, Lei Xiang, Simone Bendazzoli, Mehdi Astaraki, Michael Ingrisch, Clemens C. Cyran & Thomas Küstner: Results from the autoPET challenge on fully automated lesion segmentation in oncologic PET/CT imaging. Nature Machine Intelligence, 2024. DOI: 10.1038/s42256-024-00912-9

More about the cv:hci of KIT

Extensive caption: Automated methods enable the analysis of PET/CT scans (left) to precisely predict tumor location and size (right) for improved diagnosis and therapy planning. (Illustration: Gatidis S, Kuestner T. (2022) A whole-body FDG-PET/CT dataset with manually annotated tumor lesions (FDG-PET-CT-Lesions) [Dataset]. The Cancer Imaging Archive. DOI: 10.7937/gkr0-xv29)

Being "The Research University in the Helmholtz Association", KIT creates and imparts knowledge for society and the environment. Its objective is to make significant contributions to the global challenges in the fields of energy, mobility, and information. To this end, about 10,000 employees cooperate in a broad range of disciplines in the natural sciences, engineering sciences, economics, and the humanities and social sciences. KIT prepares its 22,800 students for responsible tasks in society, industry, and science by offering research-based study programs. Innovation efforts at KIT build a bridge between important scientific findings and their application for the benefit of society, economic prosperity, and the preservation of our natural basis of life. KIT is one of the German universities of excellence.
