OpenAlex · Updated hourly · Last updated: 17.03.2026, 22:20

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Editorial Comment

2023 · 0 citations · Urology Practice
Open full text at the publisher

0

Citations

1

Authors

2023

Year

Abstract

Editorial Comment · Urology Practice · Health Policy · 1 Jan 2024

This article comments on: Comparison of ChatGPT and Traditional Patient Education Materials for Men's Health.

Jason Jameson (https://orcid.org/0000-0002-0487-3445), Phoenix VA Healthcare System, Phoenix, Arizona; Editorial Committee, Urology Practice. https://doi.org/10.1097/UPJ.0000000000000490.01

The study evaluated the readability and accuracy of ChatGPT vs provider-created material from the Urology Care Foundation™ (UCF) on 6 men's health topics (erectile dysfunction, premature ejaculation, low testosterone, sperm retrieval, penile augmentation, and male infertility).1 The National Institutes of Health and the American Medical Association recommend that educational health material be written at the sixth- to eighth-grade level. Overall, readability was low for both ChatGPT and UCF material. Prompting ChatGPT to "Explain it to me like I am a sixth grader" did improve readability, but most patients will not have the insight to do so. The accuracy of ChatGPT was comparable to UCF in this study but has been inconsistent in other studies. A recent study showed 96.9% accuracy of cancer information on ChatGPT compared to the National Cancer Institute when addressing common cancer myths and misconceptions.2 However, ChatGPT scored poorly compared to human reviewers using the DISCERN tool in identifying poor-quality information regarding shock wave therapy for erectile dysfunction.3 Communication in health care is critical, and there is always room for improvement at all levels: provider-created materials, AI and ChatGPT, telehealth, and in-person visits.

Certainly ChatGPT is not a substitute for the important patient-doctor discussions with mutually agreed-upon shared medical decisions. And provider-created material will be more effective when more patients are able to read and understand it. ChatGPT may be helpful to patients as one of many resources for medical information. As the clinical input to AI and ChatGPT improves, so too will the accuracy.

References

1. Comparison of ChatGPT and traditional patient education materials for men's health. Urol Pract. 2024;11(1):86-95.
2. Using ChatGPT to evaluate cancer myths and misconceptions: artificial intelligence and cancer information. JNCI Cancer Spectr. 2023;7(2):pkad015.
3. ChatGPT's ability to assess quality and readability of online medical information: evidence from a cross-sectional study. Cureus. 2023;15(7):e42214.

Volume 11, Issue 1, January 2024, Page 94. © 2023 by American Urological Association Education and Research, Inc.

Related works

Authors

Institutions

Topics

Artificial Intelligence in Healthcare and Education · Digital Mental Health Interventions · Social Media in Health Education