This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Artificial Intelligence Chatbots as Information Sources on Testicular Cancer: Quality, Readability and Actionability
Citations: 0
Authors: 6
Year: 2026
Abstract
Background/Objectives: Testicular cancer is one of the most common malignancies affecting young adult males. With the rise of artificial intelligence (AI) platforms, many patients seek health information online, yet chatbot responses specific to testicular cancer remain unassessed. This study aims to evaluate the role of AI chatbots in providing patient information about testicular cancer in terms of quality, readability and actionability. Methods: Fourteen frequently asked questions about testicular cancer were identified using Google Trends and the Cancer Council Australia website. Questions were then entered into four different publicly accessible AI platforms: ChatSonic, Bing AI, ChatGPT 4.0 and Perplexity. Chatbot responses were recorded and evaluated using three validated instruments: DISCERN (1–5), the Patient Education Materials Assessment Tool (PEMAT)-Understandability and Actionability (0–100%) and Flesch-Kincaid readability scores. Results: All platforms scored low on DISCERN, with a median of 1 (interquartile range [IQR] 1–4). The median readability score was 34.1 (IQR 26.0–52.2), indicating a reading level suitable for college students. The median word count was 61.5 (IQR 41.3–91.3). Overall PEMAT-Understandability was moderate (median 58.3, IQR 50.0–66.7), whilst PEMAT-Actionability was very poor (median 0, IQR 0–25). Conclusions: AI chatbots deliver moderately understandable information on testicular cancer, but this information is typically not actionable and is delivered at an above-average reading level. Despite this, patients may continue to use AI chatbots (AICs) to access health information. It is important that clinicians counsel patients on the benefits and pitfalls of this strategy, advocating for the use of AICs as an adjunct to, rather than a replacement for, clinician-led education.
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,551 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,443 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,942 citations
BioBERT: a pre-trained biomedical language representation model for biomedical text mining
2019 · 6,792 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations