This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Fairness in Cardiac Magnetic Resonance Imaging: Assessing Sex and Racial Bias in Deep Learning-based Segmentation
Citations: 13
Authors: 9
Year: 2021
Abstract
Background: Artificial intelligence (AI) techniques have been proposed to automate cine CMR segmentation for functional quantification. However, in other applications, AI models have been shown to exhibit sex and/or racial bias.

Objectives: To perform the first analysis of sex and racial bias in AI-based cine CMR segmentation using a large-scale database.

Methods: A state-of-the-art deep learning (DL) model was used for automatic segmentation of both ventricles and the myocardium from short-axis cine CMR. The dataset consisted of end-diastole and end-systole short-axis cine CMR images of 5,903 subjects from the UK Biobank database (61.5±7.1 years, 52% male, 81% white). To assess sex and racial bias, we compared Dice scores and errors in measurements of biventricular volumes and function between patients grouped by race and sex. To investigate whether segmentation bias could be explained by potential confounders, multivariate linear regression and ANCOVA were performed.

Results: We found statistically significant differences in Dice scores (white ∼94% vs. minority ethnic groups 86–89%) as well as in absolute/relative errors in volumetric and functional measures, showing that the AI model was biased against minority racial groups, even after correction for possible confounders.

Conclusions: We have shown that racial bias can exist in DL-based cine CMR segmentation models. We believe this bias stems from the imbalanced nature of the training data, combined with physiological differences. This is supported by the finding of racial bias but no sex bias when the model is trained on the UK Biobank database, which is sex-balanced but not race-balanced.

Condensed Abstract: AI algorithms have the potential to reflect or exacerbate racial and sex disparities in healthcare. We aimed to determine the impact of sex and race on the performance of an AI segmentation model for automatic CMR quantification in a cohort of 5,903 subjects from the UK Biobank database, which is sex-balanced but not race-balanced. We tested the model's bias in performance using Dice scores and absolute/relative errors in measurements of biventricular volumes and function. Our study demonstrates that the model had a racial bias but no sex bias, and that subject characteristics and co-morbidities could not explain this bias.
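The abstract's primary performance metric is the Dice score, which measures the overlap between a predicted segmentation mask and the ground-truth mask (1.0 = perfect agreement, 0.0 = no overlap). A minimal sketch of how this is computed for binary masks; the function name and toy masks below are illustrative, not taken from the paper:

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: conventionally perfect agreement
    return 2.0 * intersection / total

# Toy example: two 2x2 squares of 4 pixels each, overlapping in 2 pixels
a = np.zeros((4, 4), dtype=bool); a[1:3, 1:3] = True
b = np.zeros((4, 4), dtype=bool); b[1:3, 2:4] = True
print(dice_score(a, b))  # 2*2 / (4+4) = 0.5
```

In the study, per-subject Dice scores for the ventricular and myocardial masks were aggregated within each race/sex group and compared across groups to quantify bias.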
Related Works
Recommendations for Cardiac Chamber Quantification by Echocardiography in Adults: An Update from the American Society of Echocardiography and the European Association of Cardiovascular Imaging
2015 · 17,614 citations
Recommendations for Chamber Quantification: A Report from the American Society of Echocardiography’s Guidelines and Standards Committee and the Chamber Quantification Writing Group, Developed in Conjunction with the European Association of Echocardiography, a Branch of the European Society of Cardiology
2005 · 11,292 citations
2020 ESC Guidelines for the diagnosis and management of atrial fibrillation developed in collaboration with the European Association for Cardio-Thoracic Surgery (EACTS)
2020 · 9,669 citations
2017 ESC Guidelines for the management of acute myocardial infarction in patients presenting with ST-segment elevation
2017 · 9,581 citations
ACC/AHA Guidelines for the Management of Patients With ST-Elevation Myocardial Infarction—Executive Summary
2004 · 8,364 citations
Authors
Institutions
- King's College London (GB)
- Guy's and St Thomas' NHS Foundation Trust (GB)
- Netherlands Heart Institute (NL)
- University Medical Center Utrecht (NL)
- University of Oxford (GB)
- Health Data Research UK (GB)
- William Harvey Research Institute (GB)
- The Alan Turing Institute (GB)
- Barts Health NHS Trust (GB)
- St Bartholomew's Hospital (GB)
- British Heart Foundation (GB)