This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Artificial Intelligence in Fracture Diagnosis on Radiographs: Evidence, Pitfalls, and Pathways for Clinical Integration (2020–2025)
Citations: 0
Authors: 4
Year: 2025
Abstract
Missed fractures remain one of the most frequent sources of diagnostic error in emergency departments, often leading to delayed treatment, morbidity, and increased healthcare costs. Artificial intelligence (AI), particularly deep learning systems, has been increasingly investigated as an adjunct for musculoskeletal imaging. Over the past five years, multiple studies have evaluated the diagnostic performance, clinical utility, and limitations of AI in fracture detection. This article is a narrative synthesis of literature published from 2020 to 2025, focusing on systematic reviews, meta-analyses, and high-quality prospective studies addressing AI-assisted fracture diagnosis on radiographs and other imaging modalities. Key themes examined include diagnostic accuracy, anatomical and modality-specific performance, real-world deployment, regulatory approvals, and remaining challenges to integration. The results of the current review showed that recent meta-analyses demonstrated pooled sensitivity and specificity above 90%, indicating AI performance comparable to that of radiologists. The strongest results were observed in extremity fractures (wrist, ankle, shoulder), while performance was more variable in ribs and spine. Reader studies confirmed that AI assistance improved radiologists' sensitivity by 6–8% without loss of specificity. Real-world deployments in emergency departments showed modest reductions in reporting discrepancies and patient length of stay. Commercial platforms, such as OsteoDetect and BoneView, received U.S. Food and Drug Administration (FDA) and Conformité Européenne (CE) approvals, and draft guidance from the National Institute for Health and Care Excellence (NICE) in 2024 recommended AI fracture detection in urgent care. However, challenges persist, including dataset bias, limited generalizability, interpretability, and uncertain patient-centered outcomes.
Several studies have reported that AI may serve as a reliable adjunct in fracture diagnosis, with diagnostic accuracy approaching that of radiologists and potential improvements in clinical workflows when used as an assistive tool. Nevertheless, safe integration is best supported by regulatory approval (FDA, CE, NICE) and, where appropriate, supplemented by external validation, robust datasets, and transparent reporting to ensure performance across different clinical environments. Future directions include multi-modal AI, adaptive learning, pediatric-focused applications, and strengthened real-world evidence from retrospective outcome evaluations and post-market surveillance.
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,231 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,084 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,444 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,423 citations