This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Understanding and modeling human-AI interaction of artificial intelligence tool in radiation oncology clinic using deep neural network: a feasibility study using three year prospective data
Citations: 1
Authors: 10
Year: 2024
Abstract
<i>Objective.</i> Artificial intelligence (AI)-based treatment planning tools are being implemented in the clinic. However, human interactions with such AI tools are rarely analyzed. This study aims to understand the human planner's interaction with the AI planning tool and to use that analysis to improve the existing AI tool. <i>Approach.</i> An in-house AI tool for whole-breast radiation therapy planning has been deployed at our institution since 2019; 522 patients planned with it were included in this study. The AI tool automatically generates fluence maps of the tangential beams to create an <i>AI plan</i>. The human planner makes whatever fluence edits are deemed necessary, and after the attending physician approves the plan for treatment, it is recorded as the <i>final plan</i>. Manual modification value maps, defined as the difference between the <i>AI plan</i> and the <i>final plan</i>, were collected. A human-AI interaction (HAI) model based on a full-scale connected U-Net was then trained to learn these interactions and perform plan enhancements. The trained HAI model automatically modifies the <i>AI plan</i> to generate an AI-modified plan (<i>AI-m plan</i>), simulating human editing. Its performance was evaluated against the original <i>AI plan</i> and the <i>final plan</i>. <i>Main results.</i> The <i>AI-m plan</i> showed statistically significant improvement in hotspot control over the <i>AI plan</i>, with an average 25.2 cc volume reduction in breast V105% (<i>p</i> = 0.011) and a 0.805% decrease in Dmax (<i>p</i> &lt; 0.001). It also maintained the same planning target volume (PTV) coverage as the <i>final plan</i>, demonstrating that the model captured the clinical focus of improving PTV hot spots without degrading coverage. <i>Significance.</i> The proposed HAI model can further enhance the <i>AI plan</i> by modeling human-AI tool interactions. This study shows that analyzing human interaction with the AI planning tool is a significant step toward improving the AI tool.
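The core data construction the abstract describes can be sketched in a few lines: the training target is the manual modification value map (final plan minus AI plan), and at inference a predicted modification map is added back onto the AI fluence to form the AI-m plan. This is a minimal illustration with toy 2x2 fluence arrays; the array values and the stand-in for the trained HAI model are assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical fluence maps for one tangential beam (illustrative values only).
ai_fluence = np.array([[1.00, 1.10],
                       [1.20, 1.05]])
final_fluence = np.array([[1.00, 1.02],
                          [1.08, 1.05]])

# Manual modification value map: the difference between the human-edited
# final plan and the AI plan -- the training target for the HAI model.
modification_map = final_fluence - ai_fluence

# At inference, the trained U-Net would predict such a map; here a perfect
# prediction stands in for the model. Applying it to the AI plan yields the
# AI-modified (AI-m) plan.
predicted_modification = modification_map
ai_m_fluence = ai_fluence + predicted_modification
```

In practice the prediction comes from the U-Net and only approximates the human edits, so the AI-m plan lands between the raw AI plan and the final plan rather than reproducing the latter exactly.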
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,402 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,270 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,702 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,507 citations