OpenAlex · Updated hourly · Last updated: 17.03.2026, 12:14

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Explainable Deep Learning approach for Shoulder Abnormality Detection in X-Rays Dataset

2022 · 8 citations · INTERNATIONAL JOURNAL OF NEXT-GENERATION COMPUTING · Open Access
Open full text at the publisher

8 citations · 2 authors · Year: 2022

Abstract

Computer vision researchers and decision-makers have struggled to understand how deep neural networks (DNNs) accomplish image classification tasks and to interpret their results. Because their internal functioning is poorly understood, these models are commonly referred to as "black boxes," yet explainability can be built into DNNs as part of the development process. In this research work, we introduce an explainable technique for shoulder abnormality detection. The motivation behind this study is to enhance patients' and medical professionals' trust in DNN technology, which is frequently deployed in the medical domain. The suggested abnormality detector, based on IGrad-CAM++, is capable of detecting abnormalities in shoulder X-rays. Grad-CAM is a common approach that combines the activation maps obtained from the model to create such a visualization. The average gradient-based terms used in that technique, however, understate the contribution of the model's identified representations to its predictions. To address this issue, we offer a technique that uses Grad-CAM++ to compute the path integral of the gradient-based terms. By assessing different techniques, we find that the recommended procedure performs effectively and efficiently on X-ray images and provides more visual explanation than existing techniques.
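The abstract describes replacing Grad-CAM's plain average gradient weights with a path integral of the gradient terms. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the path integral is approximated by a Riemann sum of gradients along a straight line from a zero baseline to the activations, and `grad_fn` stands in for the gradient of the class score with respect to the feature maps, which a real framework (e.g. PyTorch hooks) would supply.

```python
import numpy as np

def integrated_grad_cam(activations, grad_fn, steps=16):
    """Sketch of a Grad-CAM-style heatmap with path-integrated weights.

    activations: (K, H, W) feature maps from the last conv layer.
    grad_fn:     maps an (K, H, W) activation tensor to the (K, H, W)
                 gradient of the target class score (hypothetical stand-in).
    steps:       number of points in the Riemann-sum approximation.
    """
    total = np.zeros_like(activations)
    for i in range(1, steps + 1):
        scaled = activations * (i / steps)   # point on the baseline-to-input path
        total += grad_fn(scaled)
    path_grads = total / steps               # approximated path integral of gradients
    weights = path_grads.mean(axis=(1, 2))   # one importance weight per channel
    # Weighted sum of feature maps, then ReLU, as in Grad-CAM
    cam = np.maximum((weights[:, None, None] * activations).sum(axis=0), 0.0)
    if cam.max() > 0:
        cam /= cam.max()                     # normalize to [0, 1] for overlay
    return cam
```

With a constant `grad_fn`, the weights reduce to ordinary Grad-CAM averaging; the path integral only changes the result when the gradients vary along the path, which is exactly the case the abstract argues plain averaging handles poorly.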


Topics

Artificial Intelligence in Healthcare and Education · COVID-19 diagnosis using AI · Clinical Reasoning and Diagnostic Skills