OpenAlex · Updated hourly · Last updated: 23.03.2026, 12:13

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Liver Cysts and Artificial Intelligence: Is AI Really a Patient-Friendly Support?

2025 · 0 citations · Surgeries · Open Access

0 citations · 7 authors · Year: 2025

Abstract

Background: With the advancement of AI-powered online tools, patients are increasingly turning to AI for guidance on healthcare-related issues. Methods: Acting as patients, we posed eight direct questions concerning a common clinical condition—liver cysts—to four AI chatbots: ChatGPT, Perplexity, Copilot, and Gemini. The responses were collected and compared both among the chatbots and with the current literature, including the most recent guidelines. Results: Overall, the responses from the four chatbots were generally consistent with the literature, with only a few inaccuracies noted. For questions addressing “grey areas” in clinical research, all chatbots provided generalized answers. ChatGPT, Copilot, and Gemini highlighted the lack of conclusive evidence in the literature, while Perplexity offered speculative correlations not supported by data. Importantly, all chatbots recommended consulting a healthcare professional. While Perplexity, Copilot, and Gemini included references in their responses, not all cited sources were academic or of medium/high evidence quality. An analysis of Flesch Reading Ease scores and estimated reading grade levels indicated that ChatGPT and Gemini provided the most readable and comprehensible responses. Conclusions: The integration of chatbots into real-world healthcare scenarios requires thorough testing to prevent potentially serious consequences from misuse. While undeniably innovative, this technology presents significant risks if implemented improperly.
