This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Balancing Privacy and Accuracy in Healthcare AI: Federated Learning with AutoML for Blood Pressure Prediction
Citations: 1
Authors: 4
Year: 2025
Abstract
The widening gap between life expectancy and healthy life years underscores the need for scalable, adaptive, and privacy-conscious healthcare solutions. In this study, we integrate the AMPER (Aim–Measure–Predict–Evaluate–Recommend) framework with Bidirectional Encoder Representations from Transformers (BERT), Automated Machine Learning (AutoML), and privacy-preserving Federated Learning (FL) to deliver personalized hypertension management. Building on sequential data modeling and privacy-preserving AI, we apply this framework to the MIMIC-III dataset, using key variables—gender, age, systolic blood pressure (SBP), and body mass index (BMI)—to forecast future SBP values. Experimental results show that combining BERT with Moving Average (MA) or AutoRegressive Integrated Moving Average (ARIMA) models improves predictive accuracy, and that personalized FL (Per-FedAvg) significantly outperforms local models while maintaining data confidentiality. However, FL performance remains lower than direct data sharing, revealing a trade-off between accuracy and privacy. These findings demonstrate the feasibility of integrating AutoML, advanced sequence modeling, and FL within a structured health management framework. We conclude by discussing theoretical, clinical, and ethical implications, and outline directions for enhancing personalization, multimodal integration, and cross-institutional scalability.
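The abstract reports that personalized federated learning (Per-FedAvg) outperforms purely local models while keeping patient data on-device. As a rough illustration of the mechanism involved, the sketch below shows plain FedAvg aggregation on a toy linear regression task, followed by a local fine-tuning step in the spirit of Per-FedAvg's personalization. All function names, the linear model, and the hyperparameters are illustrative assumptions, not the paper's actual implementation (which uses BERT-based sequence models on MIMIC-III).

```python
import numpy as np

def local_sgd(w, X, y, lr=0.01, steps=50):
    # plain gradient descent on a client's local least-squares objective;
    # only the resulting weights leave the client, never the raw data
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(w_global, clients, lr=0.01, steps=50):
    # one FedAvg round: each client trains locally from the shared model,
    # then the server averages updates weighted by client sample counts
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_sgd(w_global.copy(), X, y, lr, steps))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# toy setup: two clients whose data share a common linear trend
rng = np.random.default_rng(0)
true_w = np.array([0.5, 2.0])
clients = []
for _ in range(2):
    X = np.column_stack([rng.normal(size=30), np.ones(30)])
    y = X @ true_w + rng.normal(scale=0.1, size=30)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = fedavg_round(w, clients)

# personalization step in the spirit of Per-FedAvg: each client
# fine-tunes the shared model on its own data before deployment
personalized = [local_sgd(w.copy(), X, y, steps=10) for X, y in clients]
print(np.round(w, 2))
```

The accuracy-versus-privacy trade-off the abstract describes shows up even here: the shared model must reconcile heterogeneous client data, whereas pooling all records centrally would fit a single dataset directly.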
Related Works
k-ANONYMITY: A MODEL FOR PROTECTING PRIVACY
2002 · 8,397 citations
Calibrating Noise to Sensitivity in Private Data Analysis
2006 · 6,878 citations
Deep Learning with Differential Privacy
2016 · 5,604 citations
Communication-Efficient Learning of Deep Networks from Decentralized Data
2016 · 5,592 citations
Large-Scale Machine Learning with Stochastic Gradient Descent
2010 · 5,569 citations