This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Automated Bone Fracture Detection and Radiology Report Generation from X-Rays
Citations: 0
Authors: 6
Year: 2025
Abstract
Detecting bone fractures in medical images is essential for supporting radiologists and improving diagnostic accuracy. Manual interpretation of X-rays is time-consuming and error-prone, particularly for complex fractures. We propose a framework for automated bone fracture detection, localization, and report generation that combines deep learning with a large language model (LLM). Our system takes X-ray images as input, classifies them as fractured or non-fractured, and localizes fracture regions with bounding boxes. The predicted class information is then passed to a Groq-powered LLM, which generates a structured, radiology-style report. We evaluate our methodology on the publicly available XR-bones dataset, which covers multiple anatomical regions (elbow, finger, forearm, hand, and shoulder) with both positive (fracture) and negative (no fracture) labels. Experimental results show that the system accurately detects fractures, localizes them, and generates context-aware reports, making it a useful tool for computer-aided diagnosis (CAD) in healthcare.
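The reporting step described in the abstract (detector output passed to an LLM that produces a structured radiology-style report) can be sketched as follows. All names here, including `build_report_prompt` and the fields of the detection dictionary, are illustrative assumptions rather than the authors' actual code, and the LLM call itself is omitted.

```python
def build_report_prompt(detection: dict) -> str:
    """Format detector output (class label + bounding boxes) into a
    prompt asking an LLM for a structured radiology-style report.
    The field names below are hypothetical, not from the paper."""
    region = detection["region"]        # e.g. "forearm"
    label = detection["label"]          # "fractured" or "non-fractured"
    boxes = detection.get("boxes", [])  # list of (x, y, w, h) tuples

    lines = [
        "You are a radiology assistant. Write a structured report with",
        "sections: Findings, Impression, Recommendation.",
        f"Anatomical region: {region}",
        f"Classification: {label}",
    ]
    # Include each localized fracture region so the report can reference it.
    for i, (x, y, w, h) in enumerate(boxes, start=1):
        lines.append(f"Fracture region {i}: x={x}, y={y}, w={w}, h={h}")
    return "\n".join(lines)


prompt = build_report_prompt({
    "region": "forearm",
    "label": "fractured",
    "boxes": [(120, 84, 60, 45)],
})
# The resulting prompt would then be sent to the Groq-hosted LLM
# (e.g. via its chat-completions API); that call is not shown here.
```

This keeps the detector and the language model loosely coupled: the LLM only sees a textual summary of the predictions, not the image itself.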
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,380 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,243 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,671 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,496 citations