This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Development and translation of human-AI interaction models into working prototypes for clinical decision-making
Citations: 9
Authors: 11
Year: 2024
Abstract
In the standard interaction model of clinical decision support systems, the system makes a recommendation, and the clinician decides whether to act on it. However, this model can compromise the patient-centeredness of care and the level of clinician involvement. There is scope to develop alternative interaction models, but we need methods for exploring and comparing these to assess how they may impact clinical decision-making. Through collaborating with clinical, AI safety, and HCI experts, and patient representatives, we co-designed a number of alternative human-AI interaction models for clinical decision-making. We then translated these models into ‘Wizard of Oz’ prototypes, where we created clinical scenarios and designed user interfaces with different types of AI output. In this paper, we present alternative models of human-AI interaction and illustrate how we used a co-design approach to translate them into functional prototypes that can be tested with users to explore potential impacts on clinical decision-making.
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,260 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,116 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,493 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,438 citations