This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
iSee: Intelligent Sharing of Explanation Experience by Users for Users
Citations: 6 · Authors: 6 · Year: 2023
Abstract
The right to obtain an explanation of a decision reached by an Artificial Intelligence (AI) model is now an EU regulation. Different stakeholders of an AI system (e.g. managers, developers, auditors) may have different background knowledge, competencies and goals, and thus require different kinds of interpretations and explanations. Fortunately, there is a growing armoury of tools to interpret machine learning (ML) models and explain their predictions, recommendations and diagnoses, which we refer to collectively as explanation strategies. As these explanation strategies mature, practitioners will gain experience that helps them know which strategies to deploy in different circumstances. What is lacking, and is addressed by iSee, is capturing, sharing and re-using explanation strategies based on past positive experiences. The goal of the iSee platform is to improve every user's experience of AI by harnessing experiences and best practices in Explainable AI.
Related Works
Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization
2017 · 20,336 citations
Generative Adversarial Nets
2014 · 19,841 citations
Visualizing and Understanding Convolutional Networks
2014 · 15,241 citations
"Why Should I Trust You?": Explaining the Predictions of Any Classifier
2016 · 14,227 citations
On a Method to Measure Supervised Multiclass Model’s Interpretability: Application to Degradation Diagnosis (Short Paper)
2024 · 13,114 citations