OpenAlex · Updated hourly · Last updated: 13.05.2026, 11:32

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

ChatGPT for advice on common GI endoscopic procedures: the promise and the peril

2023 · 9 citations · iGIE · Open Access
Open full text at the publisher

Citations: 9
Authors: 5
Year: 2023

Abstract

Background and Aims: Artificial intelligence (AI) chatbots may be used by patients to obtain information. However, no studies have examined whether factual inaccuracies or limitations occur with such information. Our study aimed to determine whether ChatGPT, an AI chatbot, can provide correct responses to patient questions on common endoscopic procedures.

Methods: Study team members posed standard questions on EGD, colonoscopy, ERCP, and EUS to ChatGPT. The responses were recorded and systematically appraised for factual correctness and potential safety issues.

Results: ChatGPT provided clear, plain-English advice on a range of common questions related to endoscopy. It provided generally accurate information on periprocedural care. EGD and colonoscopy were accurately described, with indications, alternatives, risks, and follow-up correctly described with minor errors. However, ChatGPT provided multiple factually incorrect responses on indications, alternatives, and risks of ERCP. We postulate that these may be because of underlying AI biases, such as representational, learning, and historical bias.

Conclusions: ChatGPT provided generally correct and safe advice for EGD and colonoscopy but had major factual errors when providing advice on ERCP. Use of ChatGPT and other AI chatbots for patient counseling must take into account errors that can arise from AI biases.


Topics

Artificial Intelligence in Healthcare and Education · Colorectal Cancer Screening and Detection · Clinical Reasoning and Diagnostic Skills