This is an overview page with metadata about this scientific paper. The full article is available from the publisher.
Ontology-based student testing through clinical guidelines: An AI approach
Citations: 1
Authors: 6
Year: 2025
Abstract
On the basis of our 25-year experience with the GLARE (Guideline Acquisition, Representation and Execution) clinical decision support system, we have begun to analyze the adoption of computer-interpretable clinical guidelines (CIGs) and AI techniques to train and test medical students on how to act on patients. Moving from decision support to the educational task involves significant research challenges. In this paper, we propose a new facility that supports teachers in the definition of tests by selecting and hiding from students specific parts of the CIG, and asking students how they would act on the given case study (patient) in the selected parts. Students are provided with a medical ontology to identify appropriate actions/decisions, and students' proposals are then automatically compared, through knowledge representation and reasoning techniques, with what the CIG (considered as a "gold standard") would suggest doing for the patient. Our basic explanation mechanism exploits the medical ontology to show students the differences (if any) between their proposals and those of the CIG.
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,239 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,095 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,463 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,428 citations