OpenAlex · Updated hourly · Last updated: 17.03.2026, 05:49

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

LP020 Unveiling gender bias in medical AI: underrepresentation of women in regional anesthesia depictions

2024 · 0 citations · Open Access

0 citations · 8 authors · Year: 2024

Abstract

Ethics committee approval: Yes; the Ethics Committee Approval is uploaded as a PDF file with this abstract submission.

Background and Aims
Artificial Intelligence (AI) is being integrated into anaesthesiology to enhance patient safety, improve efficiency, and streamline various aspects of practice. This study evaluates whether AI-generated images reflect the demographic, racial, and ethnic diversity observed in the anaesthesia workforce and identifies inherent social biases in these images. Role models are essential for inspiring leadership ambitions and empowering younger generations, and the medical field struggles with the representation of women and minorities.

Methods
This post-hoc study compares real-world ESRA gender membership data with AI-generated images of regional anaesthesiologists. The initial cross-sectional analysis was conducted from January to February 2024; three independent reviewers assessed and categorized each image by gender (m/f).

Results
According to 2023 ESRA gender membership data, 50% of members identified as male, while the other 50% identified as another gender or chose not to disclose their gender. However, images generated by ChatGPT DALL-E 2 and Midjourney depicted regional anaesthesiologists as male in 97% and 99% of cases, respectively, indicating a significant discrepancy (P<0.001).

Conclusions
Current AI text-to-image models exhibit a gender bias in the depiction of regional anaesthesia (RA), misrepresenting the actual gender distribution in the field. The findings emphasize the necessity for changes in AI training datasets and greater support for minority RA role models. This bias could perpetuate skewed perceptions of gender roles in RA.
More broadly, fostering inclusive mentorship and leadership, reducing barriers for institutional representation, and implementing gender equality policies can help recognize and nurture talent regardless of gender.
