This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Personalizing explanations of AI-driven hints to users' characteristics: an empirical evaluation
Citations: 2
Authors: 3
Year: 2024
Abstract
The paper extends an existing Intelligent Tutoring System (ITS) that supports students' learning via AI-driven personalized hints and can generate explanations to justify why and how the hints were generated. In this work, we investigate personalizing these hint explanations for students with low levels of two traits, Need for Cognition and Conscientiousness, in order to enhance their engagement with the explanations, based on prior findings that such students generally do not ask for the explanations although they would benefit from them. We evaluate the effectiveness of the personalized hint explanations with a formal user study. Our results show that the personalization increases our target users' interaction with the hint explanations, their understanding of the hints, and their learning. Hence, this work contributes to existing initial evidence on the value of Personalized Explainable AI (PXAI) in education.
Similar works
Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization
2017 · 20,408 citations
Generative Adversarial Nets
2014 · 19,841 citations
Visualizing and Understanding Convolutional Networks
2014 · 15,253 citations
"Why Should I Trust You?"
2016 · 14,286 citations
On a Method to Measure Supervised Multiclass Model’s Interpretability: Application to Degradation Diagnosis (Short Paper)
2024 · 13,132 citations