OpenAlex · Updated hourly · Last updated: 15.05.2026, 22:13

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Explainable AI for Personalized Diagnosis and Treatment Planning in Multi-Modal Healthcare Data

2026 · 0 citations

0

Citations

6

Authors

2026

Year

Abstract

Clinical decision-making increasingly relies on heterogeneous data from electronic health records, medical imaging, and clinical text. Yet many existing healthcare AI models fail to integrate these modalities in a way that remains explainable and clinically reliable. This limitation reduces trust, impairs calibration, and restricts the translation of predictive accuracy into actionable clinical benefit. This study develops and evaluates an explainable multimodal AI system for personalized diagnosis and treatment planning that jointly optimizes predictive performance, calibration, interpretability, and clinical utility. We conducted an experimental study based on publicly available multimodal healthcare data comprising structured EHR data, medical images, and clinical text. A novel multimodal medical foundation model (MMFM-X), built on Perceiver-style fusion, biomedical knowledge-graph grounding, and intrinsic concept bottlenecks, was developed and compared against an EHR-only XGBoost baseline and a hybrid cross-attention fusion framework. Models were assessed on discrimination, calibration, treatment ranking, explainability, decision-curve analysis, and computational efficiency. MMFM-X achieved stronger discrimination (AUROC 0.913, AUPRC 0.524) than the hybrid (0.864 / 0.391) and EHR-only (0.812 / 0.284) models. Calibration also improved markedly, with Expected Calibration Errors of 3.7%, 6.1%, and 9.6%, respectively. MMFM-X delivered the best personalized treatment-planning performance (NDCG@5 0.712, Top-3 accuracy 0.84), and decision-curve analysis showed higher net clinical benefit (≈ 0.16) at the relevant thresholds.
These results suggest that explainable multimodal foundation models can provide more accurate, reliable, and clinically useful decision support than the compared approaches. The proposed model offers a viable pilot for AI-assisted personalized diagnosis and treatment planning in practical healthcare settings.
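The abstract reports Expected Calibration Error (ECE) and NDCG@5 as headline metrics. As a minimal sketch of how these two quantities are conventionally computed (the paper's exact binning scheme and relevance grading are not specified here, so this is an illustrative assumption, not the authors' implementation):

```python
import math

def expected_calibration_error(y_true, y_prob, n_bins=10):
    """ECE with equal-width bins: weighted average of |confidence - accuracy| per bin."""
    bins = [[] for _ in range(n_bins)]
    for t, p in zip(y_true, y_prob):
        idx = min(int(p * n_bins), n_bins - 1)  # clamp p == 1.0 into the last bin
        bins[idx].append((t, p))
    n = len(y_true)
    ece = 0.0
    for b in bins:
        if b:
            acc = sum(t for t, _ in b) / len(b)   # empirical accuracy in the bin
            conf = sum(p for _, p in b) / len(b)  # mean predicted confidence
            ece += (len(b) / n) * abs(conf - acc)
    return ece

def ndcg_at_k(relevance, k=5):
    """NDCG@k: DCG of the predicted ranking divided by the DCG of the ideal ranking."""
    def dcg(scores):
        return sum(r / math.log2(i + 2) for i, r in enumerate(scores[:k]))
    ideal = dcg(sorted(relevance, reverse=True))
    return dcg(list(relevance)) / ideal if ideal > 0 else 0.0
```

For example, a treatment ranking whose relevance grades are already in descending order scores NDCG@5 = 1.0, while misordering penalizes relevant items placed lower; lower ECE (e.g. the reported 3.7% vs. 9.6%) means predicted probabilities track observed event rates more closely.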
