This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Misconceptions in the health technology industry that are delaying the translation of artificial intelligence technology into relevant clinical applications
Citations: 2
Authors: 1
Year: 2021
Abstract
There is great optimism that artificial intelligence (AI), as it disrupts the medical world, will provide considerable improvements in all areas of health care, from diagnosis to treatment. In addition, there is considerable evidence that AI algorithms have surpassed human performance in various tasks, such as analyzing medical images and correlating symptoms and biomarkers with the diagnosis and prognosis of diseases. However, the mismatch between the performance of AI-based software and its clinical usefulness remains a major obstacle to its widespread acceptance and use by the medical community. In this article, three fundamental misconceptions observed in the health technology industry are highlighted as possible causes of this gap; they may serve as a starting point for further evaluation of the structure of AI companies and of the status quo.
Similar works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,245 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,100 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,466 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,429 citations