This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Integrating generative artificial intelligence, predictive analytics and human intelligence in peri‐operative care
0
Citations
2
Authors
2025
Year
Abstract
Ke et al. highlight the potential of generative artificial intelligence to streamline peri-operative care and underscore its significant advances in safety, consistency and clinician support [1]. Their evaluation of a system with accuracy exceeding 97% reaffirms the growing role of artificial intelligence in modern clinical workflows. Generative artificial intelligence reinforces evidence-based, consistent care whilst reducing the likelihood of deviations and human error. Although the authors discuss the operational and safety benefits of their chatbot in the peri-operative setting, we believe the role of artificial intelligence in addressing individualised and population-level risks warrants further exploration.

Predictive analytics complements generative artificial intelligence models such as those described. The integration of real-world data, analysed through machine and deep learning, could further enhance risk stratification and provide a more comprehensive understanding of individual patient needs and population health trends [2]. Strengthened predictive algorithms would allow clinicians to: assess individual patient risks dynamically [3]; prevent clinical variation proactively; and support the goals of precision and personalised medicine by integrating population-wide data to inform tailored interventions for individuals.

The relationship between generative artificial intelligence and predictive analytics underscores how individual patient data and dynamic risk prediction can enhance care quality. An example is the 'tri-hybrid model', which integrates generative artificial intelligence (guideline adherence and decision making), predictive analytics (risk identification) and human intelligence (controllers and interpreters) [4]. This model would promote personalised care and ensure that clinicians remain in active control, providing an optimal balance of technology and human expertise.
The potential to contextualise artificial intelligence systems to specific patient populations and healthcare systems cannot be overstated. By leveraging real-time feedback loops and iterative updates, such systems could be adapted to handle diverse clinical scenarios, ensuring they remain relevant as practice evolves. Moreover, this proactive application of artificial intelligence enhances safety, reduces variability and offers clinicians a robust framework for managing complex decisions.
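The 'tri-hybrid model' described above can be pictured as a simple pipeline: predictive analytics produces a risk estimate, a generative component drafts a guideline-style recommendation, and the clinician retains final control. The sketch below is purely illustrative, not the authors' implementation; the logistic coefficients, feature choices (age, ASA class) and recommendation text are hypothetical placeholders with no clinical validity.

```python
import math

def predictive_risk(age: int, asa_class: int) -> float:
    """Toy logistic risk score from two peri-operative features.
    Coefficients are illustrative only, not clinically validated."""
    z = -6.0 + 0.04 * age + 1.2 * asa_class
    return 1.0 / (1.0 + math.exp(-z))

def draft_recommendation(risk: float) -> str:
    """Stand-in for a generative model's guideline-based draft."""
    if risk >= 0.5:
        return "Flag for pre-operative optimisation and senior review."
    return "Proceed with standard peri-operative pathway."

def tri_hybrid_decision(age: int, asa_class: int, clinician_approves) -> dict:
    """Human intelligence stays in the loop: nothing is enacted
    until the clinician callback explicitly approves the draft."""
    risk = predictive_risk(age, asa_class)
    draft = draft_recommendation(risk)
    return {
        "risk": round(risk, 3),
        "recommendation": draft,
        "enacted": clinician_approves(risk, draft),
    }
```

For example, `tri_hybrid_decision(70, 3, lambda risk, draft: risk < 0.9)` returns a risk score, a drafted recommendation and whether the clinician enacted it; the key design point is that the approval callback, representing human oversight, gates every automated suggestion.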
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,303 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,155 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,555 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,453 citations