This is an overview page with metadata for this scientific work. The full article is available from the publisher.
DALA-XGBoost: An AI-ML Framework for Cross-Domain Maternal-Fetal Risk Assessment
0
Citations
3
Authors
2025
Year
Abstract
In the delicate balance of life before birth, even small maternal health issues can pose serious risks to the fetus. Recent reports indicate that over 260,000 women lost their lives in 2023 due to preventable pregnancy complications, underscoring the urgent need for proactive and automated methods to detect pregnancy risks early. The lack of clinically annotated maternal datasets, along with the difficulty of integrating maternal and fetal health records, limits the effectiveness of existing prediction tools. To address this, we introduce a new cross-domain classification framework that combines Dual Autoencoder Latent Alignment (DALA) with an Extreme Gradient Boosting (XGBoost) classifier. In this approach, separate autoencoders are trained independently on the maternal and fetal datasets to learn domain-specific latent representations, which are then aligned in a common space through unsupervised learning. Labeled fetal embeddings are used to pseudo-label unlabeled maternal data through distance clustering, thereby effectively transferring risk knowledge across domains. These maternal embeddings are then used to train an XGBoost classifier to predict the fetal risk categories Normal, Suspect, and Pathological based exclusively on maternal indicators. The model obtained an overall accuracy of 95.8% and F1 scores of 98%, 89%, and 91% for the three risk classes. SHapley Additive exPlanations (SHAP) are used to highlight the key maternal features that influence predictions, Principal Component Analysis (PCA) validates the latent alignment, and Local Interpretable Model-agnostic Explanations (LIME) explain individual pathological predictions for clinical transparency. In summary, this research offers a scalable and effective solution for detecting fetal risks using only maternal data, thus enabling early intervention in low-resource environments where fetal monitoring is unavailable.
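The knowledge-transfer step described in the abstract (pseudo-labeling unlabeled maternal embeddings from labeled fetal embeddings via distance clustering) can be illustrated with a small sketch. This is a minimal toy example, not the authors' implementation: it assumes both domains have already been mapped into a shared DALA latent space, uses synthetic 2-D embeddings, and assigns each maternal point the label of the nearest per-class fetal centroid.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy embeddings standing in for the aligned latent space (assumption:
# alignment has already been done). Classes: 0=Normal, 1=Suspect, 2=Pathological.
centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])

# Labeled fetal embeddings: 30 points per risk class.
fetal_z = np.vstack([c + rng.normal(scale=0.5, size=(30, 2)) for c in centers])
fetal_y = np.repeat([0, 1, 2], 30)

# Unlabeled maternal embeddings drawn near the same class regions.
maternal_z = np.vstack([c + rng.normal(scale=0.5, size=(20, 2)) for c in centers])
true_y = np.repeat([0, 1, 2], 20)  # held out, only used to check agreement

# Per-class centroids of the labeled fetal embeddings.
centroids = np.stack([fetal_z[fetal_y == k].mean(axis=0) for k in range(3)])

# Pseudo-label each maternal point by its nearest centroid (Euclidean distance).
dists = np.linalg.norm(maternal_z[:, None, :] - centroids[None, :, :], axis=2)
pseudo_y = dists.argmin(axis=1)

agreement = (pseudo_y == true_y).mean()
print(f"pseudo-label agreement with held-out labels: {agreement:.2f}")
```

In the paper's pipeline, the resulting `pseudo_y` labels would then supervise an XGBoost classifier trained on the maternal embeddings; the sketch stops at the pseudo-labeling step to stay dependency-free.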
Related Works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,493 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,377 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,835 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,781 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,555 citations