This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
Empirical Comparison of Post-processing Debiasing Methods for Machine Learning Classifiers in Healthcare
Citations: 4
Authors: 4
Year: 2025
Abstract
Machine learning classifiers in healthcare tend to reproduce or exacerbate existing health disparities due to inherent biases in training data. This issue has drawn the attention of researchers in healthcare and other domains, who have proposed techniques that address it at different stages of the machine learning pipeline. Post-processing methods adjust model predictions to ensure fairness without interfering in the learning process or requiring access to the original training data, thereby preserving privacy and enabling application to any trained model. This study rigorously compares state-of-the-art debiasing methods within the family of post-processing techniques across a wide range of synthetic and real-world (healthcare) datasets, using different performance and fairness metrics. Our experiments reveal the strengths and weaknesses of each method, examining the trade-offs between group fairness and predictive performance, as well as among different notions of group fairness. Additionally, we analyze the impact on untreated attributes to ensure overall bias mitigation. Our comprehensive evaluation provides insights into how these debiasing methods can be optimally implemented in healthcare settings to balance accuracy and fairness. Supplementary Information: The online version contains supplementary material available at 10.1007/s41666-025-00196-7.
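To illustrate the post-processing family the abstract describes, the following is a minimal sketch of one well-known approach: group-specific decision thresholds chosen to equalize positive-prediction rates (a demographic-parity repair). This is an illustrative example only, not one of the specific methods compared in the paper; the function names and the target-rate parameter are assumptions for this sketch.

```python
import numpy as np

def group_thresholds(scores, groups, target_rate=0.5):
    """For each group, pick the score threshold that yields roughly
    target_rate positive predictions (illustrative demographic-parity repair).

    scores  : array of model scores in [0, 1]
    groups  : array of protected-attribute values, aligned with scores
    """
    thresholds = {}
    for g in np.unique(groups):
        group_scores = scores[groups == g]
        # The (1 - target_rate) quantile of the group's scores gives
        # approximately target_rate predictions at or above the threshold.
        thresholds[g] = np.quantile(group_scores, 1.0 - target_rate)
    return thresholds

def debias_predict(scores, groups, thresholds):
    """Binarize the raw scores using the per-group thresholds."""
    return np.array([int(scores[i] >= thresholds[g])
                     for i, g in enumerate(groups)])
```

Because it only rescores existing predictions, a method like this needs no access to the training data or the model internals, which is the privacy-preserving, model-agnostic property the abstract highlights.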
Related Works
The global landscape of AI ethics guidelines
2019 · 4,806 citations
The Limitations of Deep Learning in Adversarial Settings
2016 · 3,895 citations
Trust in Automation: Designing for Appropriate Reliance
2004 · 3,552 citations
Fairness through awareness
2012 · 3,317 citations
AI4People—An Ethical Framework for a Good AI Society: Opportunities, Risks, Principles, and Recommendations
2018 · 3,289 citations