This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Radiology staff perspectives after decommissioning an artificial intelligence (AI) tool for expedited lung cancer triage
Citations: 0
Authors: 5
Year: 2026
Abstract
Objectives: Adoption of artificial intelligence (AI) in radiology is well described, but little is known about staff attitudes when tools are withdrawn. This study examined how AI decommissioning influenced radiology staff perceptions of workflow, patient care, and future digital innovation.

Methods: An anonymous electronic survey was distributed to all radiology staff at a multi-site NHS Trust following decommissioning of a commercially available chest X-ray (CXR) AI triage tool used to expedite patients for same-day CT chest. Surveys conducted during AI adoption at three time points (pre-implementation, early post-implementation, and late post-implementation) have previously been published. This fourth survey repeated key items from the earlier phases, with additional questions specific to AI decommissioning. Responses were compared across phases.

Results: The response rate was 21.4% (40/187), comparable to prior rounds. Post-decommissioning, belief that AI had improved patient care remained high (70.0%, versus 71.1%, 65.5%, and 67.9% in the earlier phases). Simultaneously, 35% recalled that AI had caused logistical issues, higher than the 26% at late post-implementation. Personal comfort with AI being used on one's own healthcare imaging was low post-withdrawal (25.6%, versus 31.1%, 48.3%, and 47.2% previously). Among reporting staff, 40% (2/5) were disappointed to no longer use the tool, while 20% (1/5) reported reliance on it. Free-text responses described relief at reduced workflow disruption and emotional burden (from frontline radiographers), alongside regret over lost patient benefits.

Conclusions: Decommissioning generated mixed responses, combining operational relief with a perceived loss of clinical benefit.
This highlights the importance of proactively managing AI withdrawal through clear communication and attention to staff expectations.
Similar works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,200 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,051 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,416 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,410 citations