This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Operationalizing Osteoporosis Guidelines via Deterministic Logic: A Rule-Based Decision Support System that Neutralizes the Expertise Gap
Citations: 0 · Authors: 5 · Year: 2026
Abstract
Background: The complexity of osteoporosis risk stratification often leads to substantial knowledge-to-practice gaps and diagnostic variability. This study evaluates the efficacy of a deterministic rule-based clinical decision support system (CDSS) in operationalizing the 2021 Thai Osteoporosis Foundation (TOPF) guidelines to standardize clinical decision-making.

Methods: A within-subject randomized experimental study was conducted with 65 physicians (22 junior, 43 senior) who evaluated 20 clinical vignettes. Participants were randomized to assess cases under manual and CDSS-supported conditions. Primary outcomes included decision accuracy against an expert-consensus gold standard, decision time, and system usability, analyzed using paired statistical tests.

Results: The CDSS significantly improved overall risk-classification accuracy from 70.3% to 84.8% (p < 0.001) and treatment-selection accuracy from 78.5% to 85.4% (p = 0.003). Crucially, the intervention demonstrated a robust "equalizing effect": junior physicians achieved a 22.3% accuracy gain (p < 0.001), effectively eliminating the baseline expertise gap relative to senior physicians. Decision efficiency was maintained, with no statistically significant increase in processing time (p = 0.322). The system usability score was 69.1, indicating good potential for workflow integration.

Conclusion: Implementing a transparent, rule-based logic engine mitigates cognitive biases and neutralizes experience-based performance variability. This study confirms that operationalizing clinical guidelines through digital analytics is a scalable strategy to bridge the expertise gap and ensure guideline-concordant management in primary care.
Similar Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,231 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,084 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,444 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,423 citations