This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
AI-Based Medical Education Through Computer-Interpretable Clinical Guidelines: Project and First Advances
Citations: 0 · Authors: 8 · Year: 2025
Abstract
Taking advantage of our 25-year experience with Computer-Interpretable Guideline (CIG) decision support systems, we have started to analyze the adoption of computer-interpretable clinical guidelines and AI techniques to complement medical education. Moving from decision support to the educational task involves significant research challenges that we are addressing in a two-year project, AI-LEAP, started in May 2023. In this paper, we discuss one of the main technical advances that we have already achieved in the project: the design of an ontology-based approach to support students' self-verification. On the basis of a "shadow" of (part of) a computerized guideline and of medical ontologies, students are challenged to "reconstruct" the original guideline. At each reconstruction step, the students' solutions are compared with the "gold standard" guideline, and explanations for differences are automatically provided.
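The comparison step described above can be illustrated with a minimal sketch. It assumes a guideline fragment is modeled as a set of directed transitions between named steps; a student's reconstruction is diffed against the gold standard, and a textual explanation is produced for each difference. All names here (`explain_differences`, the example step labels) are hypothetical illustrations, not the AI-LEAP project's actual representation or API.

```python
def explain_differences(gold: set[tuple[str, str]],
                        student: set[tuple[str, str]]) -> list[str]:
    """Compare a student's reconstruction with the gold-standard guideline,
    returning one explanation per missing or unsupported transition."""
    explanations = []
    # Transitions in the gold standard that the student omitted
    for src, dst in sorted(gold - student):
        explanations.append(f"Missing step transition: {src} -> {dst}")
    # Transitions the student added that the guideline does not support
    for src, dst in sorted(student - gold):
        explanations.append(f"Unsupported transition: {src} -> {dst}")
    return explanations


# Toy example: the student skipped the assessment step entirely.
gold = {("triage", "assess"), ("assess", "treat")}
student = {("triage", "treat")}
for message in explain_differences(gold, student):
    print(message)
```

A real system would of course ground the comparison in the medical ontologies mentioned in the abstract (e.g. recognizing that a student's step is a valid specialization of a gold-standard step), rather than requiring exact string matches.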
Similar Works
"Why Should I Trust You?"
2016 · 14,210 citations
A Comprehensive Survey on Graph Neural Networks
2020 · 8,586 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,100 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,466 citations
Artificial intelligence in healthcare: past, present and future
2017 · 4,382 citations