This is an overview page with metadata for this scholarly article. The full article is available from the publisher.
Potential for Algorithmic Bias in Clinical Decision Instrument Development
Citations: 1
Authors: 6
Year: 2025
Abstract
Clinical decision instruments (CDIs) face an equity dilemma. They reduce disparities in patient care through data-driven standardization of best practices; however, this standardization may also perpetuate bias and inequality within healthcare systems. We perform a quantitative, systematic review to characterize four potential sources of bias in the development of 690 CDIs. We find evidence for potential algorithmic bias in CDI development across several analyses: self-reported participant demographics are skewed (e.g., 73% of participants are White and 55% are male); investigator teams are geographically skewed (e.g., 52% in North America and 31% in Europe); CDIs use predictor variables that may be prone to bias (e.g., 1.9% (13/690) of CDIs use Race and Ethnicity); and outcome definitions may introduce bias (e.g., 26% (177/690) of CDIs involve follow-up, which may skew representation based on socioeconomic status). As CDIs become increasingly prominent in medicine, we recommend that these factors be considered during development and clearly conveyed to clinicians.
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,200 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,051 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,416 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,410 citations