OpenAlex · Updated hourly · Last updated: 08.04.2026, 08:52

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Disentangling Stakeholder Role and Expertise in User-Centered Explainable AI

2025 · 0 citations · Open Access
Open full text at publisher

0 citations

3 authors

Year: 2025

Abstract

Identifying explanation needs based on user characteristics has been the focus of human-centred research within XAI for some time. In Ribera et al.'s proposal of user-centred XAI, expertise was used as a proxy for characterising the user and, in turn, guiding explanation design. Since then, the research landscape has evolved to include a broader notion of stakeholders, ranging from AI developers to external regulators to the affected users of AI decisions. With this broadening of stakeholder roles, however, a pattern of conflating expertise and role emerged: the term "end user", for example, is used interchangeably for domain experts using (X)AI for decision-making and for lay users impacted by AI decisions, even though the two have vastly different explanatory needs. In this work, we revisit previous surveys with the aim of identifying and classifying stakeholders in the XAI ecosystem. We propose to consistently categorise stakeholders along separate expertise and role dimensions. By disentangling the two, we present a framework that highlights the diversity of stakeholder goals and the challenges of aligning explanation design with varied user requirements. Our analysis maps stakeholders onto these dimensions and discusses how using both expertise and role can inform the development of more tailored and effective XAI solutions.

Similar works

Authors

Institutions

Topics

Explainable Artificial Intelligence (XAI) · Artificial Intelligence in Healthcare and Education · Ethics and Social Impacts of AI