This is an overview page with metadata for this scholarly article. The full article is available from the publisher.
Large language models and their impact in medical imaging education
Citations: 0
Authors: 2
Year: 2025
Abstract
The recent emergence of large language models (LLMs), which have unprecedented capabilities to process, analyze, and synthesize complex medical information with remarkable proficiency, is poised to have a disruptive impact on health care. In the field of medical imaging, LLMs can be applied with promise in generating radiology reports along with detecting and correcting errors, explaining medical imaging findings, indicating differential diagnoses based on imaging patterns, and providing recommendations on imaging modality and protocol selection. In parallel, LLMs could offer innovative solutions for individualized learning, intelligent tutoring, content generation, and clinical decision support in medical education. However, challenges such as incorrect responses, negative influence on critical thinking, academic integrity concerns, bias, and privacy issues must be addressed to ensure safe and effective implementation of LLMs. This review summarizes the current applications, potential benefits, inherent limitations along with appropriate mitigation strategies, and future directions of LLMs in medical imaging education, emphasizing the need for responsible integration to maximize their utilities while mitigating risks.
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,214 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,071 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,429 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,418 citations