Could artificial intelligence (AI) assessment match clinician assessment in diagnostic accuracy for fracture detection? In a recently published meta-analysis of 42 studies, the authors reported 92 percent sensitivity and 91 percent specificity for AI versus 91 percent sensitivity and 92 percent specificity for clinicians on internal validation test sets. On external validation test sets, clinicians achieved 94 percent sensitivity and specificity, compared with 91 percent sensitivity and specificity for AI, according to the study. In essence, the authors found no statistically significant differences between AI and clinician diagnosis of fractures. "The results from this meta-analysis cautiously suggest that AI is noninferior to clinicians in terms of diagnostic performance in fracture detection, showing promise as a useful diagnostic tool," wrote Dominic Furniss, DM, MA, MBBCh, FRCS(Plast), a professor of plastic and reconstructive surgery in the Nuffield Department of Orthopedics, Rheumatology and Musculoskeletal Sciences at the Botnar Research Centre in Oxford, United Kingdom, and colleagues.
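To make the reported metrics concrete: sensitivity is the fraction of true fractures a reader detects, and specificity is the fraction of fracture-free studies a reader correctly clears. A minimal sketch, using hypothetical confusion-matrix counts (not data from the meta-analysis itself):

```python
# Sensitivity and specificity from confusion-matrix counts.
# The counts below are illustrative only, chosen to mirror the
# internal-validation figures quoted in the article (92%/91%).

def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: fraction of actual fractures detected."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: fraction of normal studies correctly cleared."""
    return tn / (tn + fp)

# Hypothetical test set: 100 fracture cases, 100 normal cases
print(sensitivity(tp=92, fn=8))   # 0.92
print(specificity(tn=91, fp=9))   # 0.91
```

Noninferiority claims in such studies compare confidence intervals around these proportions, not the point estimates alone.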
AI algorithms can quickly detect and localize wrist fractures in X-ray images, which can augment the work of harried emergency physicians and radiologists. Missing a fracture on an emergency department radiograph is one of the most common causes of diagnostic error and subsequent litigation. Such errors stem from clinical inexperience, distraction, fatigue, poor viewing conditions, and time pressure. The study authors, from the National University of Singapore, hypothesized that automated AI analysis would be "invaluable" in reducing these misreadings and that an object detection convolutional neural network (CNN) would outperform other CNNs. Object detection CNNs extend image classification models: they not only recognize and classify objects in images but also localize the position of each object.
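The distinction matters for output format and evaluation: a classifier returns one label per image, while an object detection CNN returns a list of (label, score, bounding box) findings, and localization quality is commonly scored with intersection-over-union (IoU). A minimal sketch with made-up box coordinates (not the study's model or data):

```python
def iou(a, b):
    """Intersection over union of two boxes given as (x_min, y_min, x_max, y_max)."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])   # intersection corners
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)

    def area(r):
        return (r[2] - r[0]) * (r[3] - r[1])

    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

# Hypothetical predicted vs. ground-truth fracture boxes on a wrist X-ray
pred = (210, 140, 270, 190)
truth = (200, 150, 260, 200)
print(iou(pred, truth))  # 0.5
```

A prediction is typically counted as a correct detection when its IoU with a ground-truth box exceeds a threshold such as 0.5.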
Artificial intelligence, the mimicking of human cognition by computers, was once the stuff of science fiction but is becoming reality in medicine. The combination of big data and artificial intelligence, referred to by some as the fourth industrial revolution,1 will change radiology and pathology along with other medical specialties. Although reports of radiologists and pathologists being replaced by computers seem exaggerated,2 these specialties must plan strategically for a future in which artificial intelligence is part of the health care workforce. Radiologists have always embraced machines and technology. In 1960, Lusted predicted "an electronic scanner-computer to examine chest photofluorograms, to separate the clearly normal chest films from the abnormal chest films."3
A new deep learning model could help radiologists in any facility interpret chest X-rays. In a new study published in The Lancet Digital Health, investigators from Australia outlined their new tool. It is designed to alleviate heavy workloads, make it easier for providers without specialty thoracic training to read these scans, and reduce errors. Chest X-rays are already the most common imaging study worldwide, and their volume is growing, said the team from annalise.ai, the company that created the AI model. Developing a tool to help shoulder that workload will be critical. "The ability of the AI model to identify findings on chest X-rays is very encouraging," said Catherine Jones, MBBS, a thoracic radiologist and chest lead at annalise.ai.
Emergency rooms and urgent care clinics are typically busy, and patients often wait many hours to be seen, evaluated, and treated. Waiting for X-rays to be interpreted by radiologists can contribute to this long wait time because radiologists often read X-rays for a large number of patients. A new study has found that artificial intelligence (AI) can help physicians interpret X-rays after an injury and suspected fracture. "Our AI algorithm can quickly and automatically detect x-rays that are positive for fractures and flag those studies in the system so that radiologists can prioritize reading x-rays with positive fractures. The system also highlights regions of interest with bounding boxes around areas where fractures are suspected. This can potentially contribute to less waiting time at the time of hospital or clinic visit before patients can get a positive diagnosis of fracture," explained corresponding author Ali Guermazi, MD, PhD, chief of radiology at VA Boston Healthcare System and Professor of Radiology & Medicine at BUSM.
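The triage behavior Dr. Guermazi describes, flagging AI-positive studies so they rise to the top of the reading queue while preserving arrival order otherwise, can be sketched as follows. This is an illustration under assumed data structures, not the deployed system; patient IDs and box coordinates are invented:

```python
# Hypothetical reading-worklist triage: studies the AI flags as
# fracture-positive are moved ahead of unflagged studies, keeping
# first-come-first-served order within each group (stable sort).
from dataclasses import dataclass, field

@dataclass
class Study:
    patient_id: str
    ai_fracture_flag: bool                      # model's positive/negative call
    boxes: list = field(default_factory=list)   # suspected-fracture bounding boxes

worklist = [
    Study("P001", False),
    Study("P002", True, boxes=[(120, 88, 160, 130)]),
    Study("P003", False),
    Study("P004", True, boxes=[(40, 200, 95, 260)]),
]

# False sorts before True, so key is the negated flag; Python's sort is stable
triaged = sorted(worklist, key=lambda s: not s.ai_fracture_flag)
print([s.patient_id for s in triaged])  # ['P002', 'P004', 'P001', 'P003']
```

In practice such a queue would also weigh clinical urgency and study age, but the core idea is a stable priority ordering keyed on the model's flag.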