OpenAlex · Updated hourly · Last updated: 15 March 2026, 18:04

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Explainable AI for Maternal Health Risk Prediction in Bangladesh: A Hybrid Fuzzy-XGBoost Framework with Clinician Validation

2026 · 0 citations · Open Access
Open full text at the publisher

Citations: 0

Authors: 3

Year: 2026

Abstract

Bangladesh faces a maternal mortality ratio of 156 per 100,000 live births, with 2,459 maternal deaths reported in 2022. While machine learning shows promise in risk prediction, black-box models limit clinical adoption in resource-constrained settings where explainability is crucial. This study develops a hybrid fuzzy-XGBoost framework combining ante-hoc fuzzy logic interpretability with post-hoc SHAP explanations, validated through clinician feedback. We trained the model on 1,014 maternal health records with clinical parameters (age, blood pressure, blood sugar) augmented with synthetic regional features based on Bangladesh health data. The hybrid model achieved 88.67% accuracy with ROC-AUC of 0.9703, outperforming the best baseline (Gradient Boosting: 86.21%) by 2.46 percentage points. SHAP analysis identified healthcare access score (most important), blood sugar, and fuzzy risk score as primary predictors. Clinician validation (N=14) showed strong preference for hybrid explanations (71.4% across cases) with 54.8% expressing trust in clinical practice. Fairness analysis revealed equitable performance across regions (σ=0.0766), with better accuracy in underserved areas (r=-0.876 correlation with healthcare access), highlighting potential to address disparities.
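The abstract's ante-hoc fuzzy component, which produces the "fuzzy risk score" that SHAP later ranks among the top predictors, could look roughly like the sketch below. The triangular membership shapes, the cut-off values, and the equal weighting of the three clinical inputs are all illustrative assumptions, not the paper's actual rules.

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzy_risk_score(age, systolic_bp, blood_sugar):
    """Combine 'high-risk' memberships into one score in [0, 1].

    All cut-offs below are hypothetical placeholders; a real system would
    derive them from clinical guidelines and validate with clinicians.
    """
    high_age = tri(age, 30, 45, 60)              # years
    high_bp = tri(systolic_bp, 120, 150, 180)    # mmHg
    high_sugar = tri(blood_sugar, 7.0, 11.0, 15.0)  # mmol/L
    return round((high_age + high_bp + high_sugar) / 3, 3)

# In the hybrid framework this score would be appended as an extra feature
# column before training the XGBoost classifier.
print(fuzzy_risk_score(25, 110, 6.0))   # low-risk profile  -> 0.0
print(fuzzy_risk_score(45, 150, 11.0))  # high-risk profile -> 1.0
```

Because the fuzzy score is computed from human-readable membership rules, it stays interpretable on its own (ante-hoc), while SHAP then explains how the boosted model uses it alongside the raw features (post-hoc).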

Topics

Explainable Artificial Intelligence (XAI) · Machine Learning in Healthcare · Artificial Intelligence in Healthcare and Education