This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Designing, Implementing, and Evaluating AI Explanations: A Scoping Review of Explainable AI Frameworks
Citations: 1 · Authors: 3 · Year: 2025
Abstract
As AI systems become increasingly integrated into our lives, the need to support appropriate human understanding of AI continues to grow. With new AI capabilities being deployed in different contexts, human-centered explainability is crucial to ensure people can interact with novel AI systems safely and effectively. To address evolving explainability needs, the field of Explainable AI (XAI) has produced numerous frameworks. But what do these frameworks entail, and how can they be used in practice? What drives their development? As AI systems continue to grow in complexity, it is important to understand and reflect upon the value of these frameworks and their potential to address upcoming human-centered needs for XAI. Towards this, we performed a scoping review following the PRISMA-ScR procedure, gathering and analyzing a corpus of 73 papers to understand how XAI frameworks can support different stages of human-centered XAI design. We present a unified model and a set of guiding questions to help identify, compare, and select relevant XAI frameworks across various design stages, making it easier for designers and researchers to apply human-centered approaches in real-world XAI contexts. We also analyze how frameworks are developed and evaluated, highlighting gaps and opportunities to improve both methodological and existing HCXAI practices.
Related Works
Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization
2017 · 20,561 citations
Generative Adversarial Nets
2023 · 19,893 citations
Visualizing and Understanding Convolutional Networks
2014 · 15,297 citations
"Why Should I Trust You?"
2016 · 14,383 citations
On a Method to Measure Supervised Multiclass Model’s Interpretability: Application to Degradation Diagnosis (Short Paper)
2024 · 13,163 citations