This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Hallucination in artificial-intelligence-assisted diagnosis of arthritis: a case report
1
Citations
1
Authors
2025
Year
Abstract
Artificial intelligence (AI) technologies are increasingly being integrated into clinical practice, offering potential gains in diagnostic accuracy and clinical efficiency. This case report describes a diagnostic attempt assisted by ChatGPT-4o in a 51-year-old female patient presenting with hand arthralgia. The AI-generated interpretation demonstrated hallucination, namely the fabrication of unsupported or inaccurate information, in the analysis of radiologic and laboratory findings as well as in treatment recommendations. This case underscores the importance of exercising caution when applying AI tools in clinical contexts. To ensure diagnostic accuracy, patient safety, and ethical responsibility, expert oversight and multi-step verification are essential when deploying AI-generated clinical outputs.
Similar works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,260 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,116 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,493 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,438 citations