This is an overview page with metadata for this scientific article. The full article is available from the publisher.
The Role of Artificial Intelligence (ChatGPT-4o) in Supporting Tumor Board Decisions
7
Citations
9
Authors
2025
Year
Abstract
<b>Background/Objectives:</b> Artificial intelligence (AI) has emerged as a promising field in the era of personalized oncology due to its potential to save time and reduce workload while serving as a supportive tool in patient management decisions. Although several studies in the literature have explored the integration of AI into oncology practice across different tumor types, available data remain limited. In our study, we aimed to evaluate the role of AI in the management of complex cancer cases by comparing the decisions of an in-house tumor board and ChatGPT-4o for patients with various tumor types. <b>Methods:</b> A total of 102 patients with diverse cancer types were included. Treatment and follow-up decisions proposed by both the tumor board and ChatGPT-4o were independently evaluated by two medical oncologists using a 5-point Likert scale. <b>Results:</b> Analysis of agreement levels showed high inter-rater reliability (κ = 0.722, <i>p</i> < 0.001 for tumor board decisions; κ = 0.794, <i>p</i> < 0.001 for ChatGPT decisions). However, concordance between the tumor board and ChatGPT was low, as reflected in the assessments of both raters (Rater 1: κ = 0.211, <i>p</i> = 0.003; Rater 2: κ = 0.376, <i>p</i> < 0.001). Both raters more frequently agreed with the tumor board decisions, and a statistically significant difference between tumor board and AI decisions was observed for both (Rater 1: Z = +4.548, <i>p</i> < 0.001; Rater 2: Z = +3.990, <i>p</i> < 0.001). <b>Conclusions:</b> These findings suggest that AI, in its current form, is not yet capable of functioning as a standalone decision-maker in the management of challenging oncology cases. Clinical experience and expert judgment remain the most critical factors in guiding patient care.
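The κ values reported above are Cohen's kappa, which corrects raw agreement between two raters for agreement expected by chance: κ = (p_o − p_e) / (1 − p_e). As an illustration only (not the authors' analysis code, and the toy ratings below are invented), a minimal pure-Python sketch:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(ratings_a)
    # Observed agreement: fraction of items rated identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: from each rater's marginal label frequencies,
    # assuming the raters score independently.
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(count_a[label] * count_b[label] for label in count_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical Likert-style ratings for five cases:
rater1 = [1, 1, 2, 2, 3]
rater2 = [1, 1, 2, 3, 3]
print(round(cohens_kappa(rater1, rater2), 3))  # → 0.706
```

Here p_o = 0.8 and p_e = 0.32, giving κ ≈ 0.706; values near 0.7, as in the study's inter-rater results, are conventionally read as substantial agreement.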
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,239 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,095 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,463 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,428 citations