Study finds generative AI can produce high-quality chest x-ray interpretations
Chest x-rays are one of the fundamental diagnostic tools in modern medicine, particularly in emergency room (ER) settings. These radiographic images provide a rapid, non-invasive view of the thoracic cavity, enabling physicians to assess the lungs, heart, and surrounding structures. In the fast-paced environment of the ER, where timely and accurate diagnoses can be a matter of life and death, chest x-rays can quickly detect a range of critical conditions, such as pneumothorax (collapsed lung), pulmonary edema, rib fractures, lung infections, and heart-related ailments. The immediate visualization these x-rays offer is invaluable in guiding treatment decisions, prioritizing care, and determining the need for further specialized investigations.
Moreover, in the chaotic backdrop of the ER, where patients often present with a myriad of overlapping symptoms, the clarity provided by chest x-rays is crucial. They serve as a frontline tool in distinguishing between cardiac and respiratory causes of acute symptoms like chest pain or shortness of breath. By ruling out or confirming specific diagnoses, these x-rays assist in streamlining patient management, ensuring appropriate resource allocation, and reducing the risk of treatment errors. The ability of chest x-rays to deliver such pivotal information in real-time underscores their indispensable role in emergency medicine and patient care.
In findings announced October 5, 2023, researchers described a generative AI model developed to interpret chest x-rays and generate radiology reports. They tested it on 500 randomly selected chest x-rays from a hospital emergency department (ED), comparing the AI's reports with those of on-site radiologists and teleradiologists. Six ED doctors rated each report's accuracy and quality. The AI produced reports of similar quality to the radiologists' and of higher quality than the teleradiologists'.
The AI tool is an encoder-decoder model that takes x-ray images as input and outputs text in the style of a radiology report. The test set comprised 500 chest x-rays from January 2022-2023, excluding any patients who appeared in the training set. The AI generated a report for each image, and the corresponding radiologist and teleradiologist reports were collected, anonymized, and truncated to their findings and conclusions sections.
Six ED doctors rated all reports on a 1-5 Likert scale for accuracy and quality. The AI and radiologist reports received significantly higher scores than the teleradiology reports, with no significant difference between the AI and the radiologists. An analysis of clinical significance likewise showed no difference between report types, and this held when examining specific findings such as cardiomegaly or pneumothorax.
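The article does not say which statistical test the researchers used to compare rating scores. One simple, assumption-light way to test whether two sets of Likert ratings differ in mean is a permutation test, sketched below with purely hypothetical ratings; the function name, score values, and iteration count are illustrative, not taken from the study.

```python
import random

def permutation_test(a, b, n_iter=10000, seed=0):
    """Two-sided permutation test for a difference in mean scores.

    a, b: lists of Likert ratings (1-5) for two report types.
    Returns an approximate p-value: the fraction of random relabelings
    whose mean difference is at least as large as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    n_a = len(a)
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # randomly reassign ratings to the two groups
        diff = abs(sum(pooled[:n_a]) / n_a
                   - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            count += 1
    return count / n_iter

# Hypothetical ratings for ten x-rays: AI vs. teleradiology reports
ai_scores = [4, 5, 4, 4, 5, 3, 4, 5, 4, 4]
tele_scores = [3, 3, 4, 2, 3, 3, 4, 3, 2, 3]
p = permutation_test(ai_scores, tele_scores)
```

A small p-value would indicate that a mean difference this large is unlikely under random relabeling. In practice, ordinal Likert data is often compared with rank-based tests instead; the permutation test is used here only because it is easy to show self-contained.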
The most common discrepancy in the AI reports was a missed finding; missed findings were also the most common issue in the radiologist and teleradiologist reports. Using the radiologists as the benchmark, the AI tool had 85% sensitivity and 99% specificity for detecting abnormalities. In some cases, examples showed the AI improving on the radiologist's report.
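The sensitivity and specificity figures above treat the radiologist reports as the reference standard. A minimal sketch of the arithmetic, using hypothetical confusion-matrix counts chosen only to reproduce the reported 85%/99% figures (the study's actual counts are not given in the article):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Compute sensitivity (recall on abnormal cases) and specificity
    (recall on normal cases) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # abnormal x-rays the AI also flagged
    specificity = tn / (tn + fp)  # normal x-rays the AI also called normal
    return sensitivity, specificity

# Hypothetical counts: of 200 x-rays the radiologists called abnormal,
# the AI flags 170; of 300 called normal, the AI agrees on 297.
sens, spec = sensitivity_specificity(tp=170, fn=30, tn=297, fp=3)
# sens = 0.85, spec = 0.99
```

High specificity means the AI rarely flags a normal x-ray as abnormal; the 85% sensitivity means some radiologist-identified abnormalities were missed, consistent with missed findings being the most common discrepancy.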
In conclusion, the AI model produced chest x-ray reports of similar accuracy and quality to the radiologists' reports, and of higher quality than the teleradiologists'. Integrating the AI into the workflow could enable timely alerts for life-threatening findings and accelerate documentation, though prospective evaluation of its clinical impact is still needed.
Webdesk AI News: AI Reads Chest X-Rays as Well as Radiologists, October 5, 2023