OpenAlex · Updated hourly · Last updated: 18.03.2026, 10:15

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Artificial intelligence in obstetric anaesthesia: an unlikely player?

2024 · 6 citations · Anaesthesia · Open Access
Open full text at publisher

Citations: 6
Authors: 2
Year: 2024

Abstract

A recent study on analgesia during labour has shown how information, even from reputable sources, has the potential to mislead women and undermine access to pain relief during labour [1]. It raises concerns about what information is readily available to expectant mothers beyond the respected pregnancy websites and the top results produced by popular search engines. A chatbot powered by artificial intelligence (AI) is a form of machine learning, trained on large datasets, that interprets inputs and generates responses to complex queries. Artificial intelligence is anticipated to reimagine how we access and process information. Openly accessible AI can generate an unchecked, unregulated patient information leaflet in seconds on any question the user asks. As healthcare professionals, we are acutely aware of how inaccurate information can profoundly affect medical interventions [2]. This raises important, as yet unaddressed questions about how AI will, and likely already does, influence patient decision-making.

We tested this by asking five popular, freely available AI chatbots to generate a birth plan for a first-time mother. The question asked was "Write a birth plan for a first-time mother". Four questions were entered into the following AI chatbots on 14 March 2024: OpenAI ChatGPT; Google Gemini; Microsoft Copilot; YouChat; and Perplexity. The responses were analysed across three main categories: labour; delivery; and post-partum care (online Supporting Information Appendix S1). The role of anaesthesia in birth plans was analysed across these domains, with particular attention paid to the analgesic options recommended during labour (Table 1).

Similar themes were observed across all chatbot platforms. All five birth plans suggested that natural remedies, such as breathing techniques and movement, would be tried first before medication for pain relief was discussed.
The description of analgesic options was insufficient, and only one birth plan suggested that the woman would be open to neuraxial analgesia. No platform mentioned patient-controlled analgesia.

The results depended not only on the question asked but also on how it was phrased. When the question was altered to include "private health insurance", all five chatbots suggested the woman would consider an epidural. We used private healthcare as a surrogate marker to contrast with socio-economic deprivation, which is associated with lower access to epidurals even when they are medically indicated [3]. Patient information must be egalitarian and must not be skewed by the inherent biases that already exist in AI programming.

When a general task is entered into an AI program, the algorithm takes the liberty of conjuring up a creative response. When the input is specific, however, AI chatbots behave more like traditional search engines. Further questions about pain relief options and the safety of epidurals during labour were answered accurately, with facts referenced to trusted sources. Unlike conventional search engines, AI uses machine learning and natural language processing to continually develop and enhance its responses by learning from past interactions. When we repeated the input question, the AI chatbot generated a different answer each time. These nuanced responses have the potential to influence patient understanding, ultimately affecting informed decision-making.

The AI-generated birth plans appeared to give information about what a woman would like to hear rather than planning for events that may occur during labour. Only one chatbot mentioned the possibility of an unplanned caesarean section. Emergency obstetric anaesthesia presents many challenges, particularly when the planned mode of delivery has changed.
The mismatch between preparedness for and expectations of childbirth is likely confounded by the quality of information provided during the antenatal period [4]. This is susceptible to negative influence by flawed, AI-perceived realities. AI-driven data will become increasingly popular, which will present significant challenges. We believe these findings raise sufficient concerns about the potential dangers of AI-derived patient information and the impact it may have on maternal health disparities. This study reinforces the proactive role of the anaesthetist in supporting shared decision-making.

Appendix S1. Analysis of artificial intelligence chatbot responses across three main categories: labour, delivery and post-partum care.
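The screening described above, checking each chatbot-generated birth plan for the analgesic options it mentions, can be sketched as a simple keyword scan. This is a minimal illustration only, not the authors' actual coding scheme; the category names and keyword lists below are assumptions chosen for the example.

```python
# Hypothetical sketch: flag which analgesic options a birth-plan text mentions.
# The keyword lists are illustrative assumptions, not the study's real scheme,
# and a plain substring match is deliberately crude.

ANALGESIA_TERMS = {
    "neuraxial": ["epidural", "neuraxial", "spinal"],
    "nitrous oxide": ["nitrous oxide", "gas and air", "entonox"],
    "opioid": ["opioid", "pethidine", "remifentanil"],
    "patient-controlled analgesia": ["patient-controlled", "pca"],
    "non-pharmacological": ["breathing", "movement", "massage", "water"],
}

def screen_plan(text: str) -> dict:
    """Return a flag per analgesic option: does the plan text mention it?"""
    lowered = text.lower()
    return {
        option: any(term in lowered for term in terms)
        for option, terms in ANALGESIA_TERMS.items()
    }

# Example: a plan that leads with natural remedies, as all five chatbots did.
plan = ("I would like to try breathing techniques and movement first, "
        "and remain open to an epidural if needed.")
print(screen_plan(plan))
```

Running the screen over each platform's response would reproduce the kind of per-category comparison reported in Table 1, for instance showing that no plan set the "patient-controlled analgesia" flag.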

Topics

Cardiac, Anesthesia and Surgical Outcomes · Artificial Intelligence in Healthcare and Education · Pregnancy and preeclampsia studies