This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Between Privacy and Utility: Navigating Inference Risks in De-Identified Health Data
Citations: 1
Authors: 5
Year: 2025
Abstract
Protecting healthcare data from inference attacks, where adversaries deduce sensitive information from de-identified data, is critical. This study examines the vulnerability of such datasets, focusing on Tennessee facilities serving predominantly African American populations, while also incorporating analyses based on the MIMIC-III dataset representing Massachusetts. We apply differential privacy with varying ϵ values to assess its impact on statistical integrity and predictive model accuracy. Results show a clear trade-off: lower ϵ enhances privacy but degrades performance, while higher ϵ preserves utility at the cost of increased leakage risk. These findings underscore the importance of carefully balancing privacy and utility when allocating the privacy budget in clinical prediction tasks.
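The paper's exact mechanism is not shown on this page; as a minimal sketch of the privacy–utility trade-off the abstract describes, the Laplace mechanism below releases an ε-differentially-private mean of a bounded clinical attribute. All names, bounds, and data are illustrative, not taken from the study:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random() - 0.5          # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_mean(values, epsilon: float, lower: float, upper: float) -> float:
    """Epsilon-DP mean of values clipped to [lower, upper].

    The L1 sensitivity of the mean over n records bounded in
    [lower, upper] is (upper - lower) / n, so the Laplace scale
    is sensitivity / epsilon: smaller epsilon -> more noise.
    """
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / n
    sensitivity = (upper - lower) / n
    return true_mean + laplace_noise(sensitivity / epsilon)

# Illustrative: noise grows as the privacy budget epsilon shrinks.
ages = [34, 41, 29, 55, 62, 47, 38]
for eps in (0.1, 1.0, 10.0):
    noisy = private_mean(ages, eps, lower=0, upper=100)
```

With a small ε (strong privacy) the released mean can deviate substantially from the true value, degrading any downstream statistic or model trained on it, whereas a large ε reproduces the true mean almost exactly but offers little protection against inference, which is the trade-off the abstract reports.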
Related Works
k-ANONYMITY: A MODEL FOR PROTECTING PRIVACY
2002 · 8,395 cit.
Calibrating Noise to Sensitivity in Private Data Analysis
2006 · 6,867 cit.
Communication-Efficient Learning of Deep Networks from Decentralized Data
2016 · 5,591 cit.
Deep Learning with Differential Privacy
2016 · 5,587 cit.
Large-Scale Machine Learning with Stochastic Gradient Descent
2010 · 5,559 cit.