This is an overview page with metadata for this scientific article. The full article is available from the publisher.
A Scoping Review of Artificial Intelligence Algorithms in Clinical Decision Support Systems for Internal Medicine Subspecialties
Citations: 7
Authors: 3
Year: 2021
Abstract
Objectives: Artificial intelligence (AI)-based clinical decision support systems (CDSS) have been developed to solve medical problems and enhance health care management. We aimed to review the literature to identify trends and applications of AI algorithms in CDSS for internal medicine subspecialties.

Methods: A scoping review was conducted in PubMed, IEEE Xplore, and Scopus to identify articles on CDSS that use AI algorithms based on deep learning, machine learning, or pattern recognition. This review synthesized the main purposes of CDSS, the types of AI algorithms used, and the overall accuracy of those algorithms. We searched for original research published in English between 2009 and 2019.

Results: Given the volume of articles meeting the inclusion criteria, 218 of the 3,467 articles were analyzed and presented in this review. These 218 articles concerned AI-based CDSS for internal medicine subspecialties: neurocritical care (n = 89), cardiovascular disease (n = 79), and medical oncology (n = 50). We found that the main purposes of CDSS were prediction (48.4%) and diagnosis (47.1%). The five most common algorithms were support vector machine (20.9%), neural network (14.6%), random forest (10.5%), deep learning (9.2%), and decision tree (8.8%). Algorithm accuracy ranged from 61.8 to 100% in neurocritical care, 61.6 to 100% in cardiovascular disease, and 54 to 100% in medical oncology. Only 20.1% of these algorithms incorporated AI explainability, that is, the ability to present results in a form humans can understand.

Conclusion: AI algorithms are increasingly applied in CDSS and are important for improving clinical practice. Supervised learning still accounts for the majority of AI applications in internal medicine. This study identified four potential gaps: the need for AI explainability, the lack of ubiquity of CDSS, the narrow scope of target users of CDSS, and the need for reporting standards for AI in health care.
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,250 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,109 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,482 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,434 citations