This is an overview page with metadata for this scientific publication. The full article is available from the publisher.
Interpretable Dropout Prediction: Towards XAI-Based Personalized Intervention
Citations: 85
Authors: 2
Year: 2023
Abstract
Student dropout is one of the most pressing issues in STEM higher education, inducing considerable social and economic costs. Using machine learning tools for the early identification of students at risk of dropping out has gained a lot of interest recently. However, there has been little discussion of dropout prediction using interpretable machine learning (IML) and explainable artificial intelligence (XAI) tools. In this work, using the data of a large public Hungarian university, we demonstrate how IML and XAI tools can support educational stakeholders in dropout prediction. We show that complex machine learning models – such as the CatBoost classifier – can efficiently identify at-risk students relying solely on pre-enrollment achievement measures; however, they lack interpretability. Applying IML tools, such as permutation importance (PI), partial dependence plots (PDP), LIME, and SHAP values, we demonstrate how the predictions can be explained both globally and locally. Explaining individual predictions opens up great opportunities for personalized intervention, for example by offering the right remedial courses or tutoring sessions. Finally, we present the results of a user study that evaluates whether higher education stakeholders find these tools interpretable and useful.
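The workflow the abstract describes – train a gradient-boosted classifier on pre-enrollment features, then explain it with an IML tool such as permutation importance – can be sketched as follows. This is a minimal illustration, not the paper's code: scikit-learn's `GradientBoostingClassifier` stands in for CatBoost, and the feature names and synthetic data are hypothetical.

```python
# Hedged sketch of the abstract's workflow: a boosted-tree dropout classifier
# explained globally via permutation importance (PI). scikit-learn's
# GradientBoostingClassifier substitutes for CatBoost; the pre-enrollment
# feature names and data below are invented for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
feature_names = ["admission_score", "math_grade", "language_grade"]  # hypothetical
X = rng.normal(size=(n, 3))
# Synthetic ground truth: dropout risk driven mainly by the first two features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) < -0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Global explanation: how much does shuffling each feature hurt test accuracy?
pi = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for name, imp in zip(feature_names, pi.importances_mean):
    print(f"{name}: {imp:.3f}")
```

With this synthetic data, the dominant feature should receive the largest importance score; local explanation tools such as LIME or SHAP would follow the same pattern, attributing an individual student's predicted risk to specific features.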
Similar Works
Determining Sample Size for Research Activities
1970 · 17,669 citations
Scale Development: Theory and Applications
1991 · 14,736 citations
Online Learning: A Panacea in the Time of COVID-19 Crisis
2020 · 4,919 citations
Systematic review of research on artificial intelligence applications in higher education – where are the educators?
2019 · 4,447 citations
Blended learning: Uncovering its transformative potential in higher education
2004 · 4,406 citations