This is an overview page with metadata for this scientific publication. The full article is available from the publisher.
XAIMed: A Diagnostic Support Tool for Explaining AI Decisions on Medical Images
2
Citations
6
Authors
2024
Year
Abstract
Convolutional Neural Networks have demonstrated high accuracy in medical image analysis, but the opaque nature of such deep learning models hinders their widespread acceptance and clinical adoption. To address this issue, we present XAIMed, a diagnostic support tool specifically designed to be easy to use for physicians. XAIMed supports diagnostic processes involving the analysis of medical images through Convolutional Neural Networks. Besides the model prediction, XAIMed also provides visual explanations using four state-of-the-art eXplainable AI methods: LIME, RISE, Grad-CAM, and Grad-CAM++. These methods produce saliency maps which highlight the image regions that are most influential for a model decision. We also introduce a simple strategy for aggregating the different saliency maps into a unified view which reveals a coarse-grained level of agreement among the explanations. The application features an intuitive graphical user interface and is designed in a modular fashion, thus facilitating the integration of new tasks, new models, and new explanation methods.
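The abstract does not detail the aggregation strategy, so the following is a minimal sketch under stated assumptions: each method's saliency map is min-max normalized to a common [0, 1] scale, the maps are averaged pixel-wise, and a threshold on the mean yields a coarse agreement mask. The function name, threshold, and normalization choice are illustrative, not taken from the paper.

```python
import numpy as np

def aggregate_saliency_maps(maps, threshold=0.5):
    """Combine saliency maps from different XAI methods into a unified view.

    Each map is min-max normalized to [0, 1] so that methods with
    different output scales become comparable, then the normalized maps
    are averaged pixel-wise. A binary mask marks regions where the mean
    saliency exceeds `threshold`, giving a coarse-grained picture of
    where the explanation methods agree.
    """
    normalized = []
    for m in maps:
        m = np.asarray(m, dtype=float)
        lo, hi = m.min(), m.max()
        # Guard against constant maps to avoid division by zero.
        normalized.append((m - lo) / (hi - lo) if hi > lo else np.zeros_like(m))
    mean_map = np.mean(normalized, axis=0)
    agreement = mean_map >= threshold
    return mean_map, agreement
```

With four input maps (LIME, RISE, Grad-CAM, Grad-CAM++), the returned mask highlights only regions that several methods jointly rank as influential.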
Related Works
Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization
2017 · 20,311 citations
Generative Adversarial Nets
2023 · 19,841 citations
Visualizing and Understanding Convolutional Networks
2014 · 15,238 citations
"Why Should I Trust You?"
2016 · 14,210 citations
On a Method to Measure Supervised Multiclass Model’s Interpretability: Application to Degradation Diagnosis (Short Paper)
2024 · 13,104 citations