This is an overview page with metadata for this scientific work. An external link to the full text is currently not available.
Mitigating Algorithmic Bias Through Sampling: The Role of Group Size and Sample Selection
Citations: 0
Authors: 3
Year: 2026
Abstract
This study proposes a structured framework for mitigating algorithmic bias through sampling-based preprocessing techniques, with particular attention to the roles of group size adjustment and sample selection strategies. We focus on SMOTE-based methods and introduce a 3 × 3 matrix to categorize bias mitigation techniques. This matrix combines three group size strategies (Equalized Representation, UP-Focused Equalized Representation, and Balanced group sizes) with three sample selection strategies. The framework enables systematic evaluation of each technique's impact on fairness metrics, including Demographic Parity and Equalized Odds, as well as on predictive performance. Evaluations across ten diverse datasets show that methods focusing on the unprivileged positive group and leveraging decision-boundary-aware sampling yield significant fairness improvements without substantial accuracy loss. These results highlight the efficacy of targeted oversampling strategies in achieving equitable outcomes in machine learning applications. State-of-the-art methods like preferential sampling continue to excel in optimizing Demographic Parity, while uniform sampling remains superior for achieving Equalized Odds.
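The oversampling idea described in the abstract can be illustrated with a minimal sketch of the basic SMOTE interpolation step, applied to the unprivileged positive group. This is not the paper's implementation; the function name, the toy data, and the target group size are hypothetical, and the sketch assumes simple Euclidean nearest neighbours.

```python
import numpy as np

def smote_oversample(X, n_new, k=3, rng=None):
    """Generate n_new synthetic points by interpolating each randomly
    chosen point toward one of its k nearest neighbours (basic SMOTE idea)."""
    rng = np.random.default_rng(rng)
    X = np.asarray(X, dtype=float)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        # Euclidean distances from X[i] to every other point.
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf  # exclude the point itself
        neighbours = np.argsort(d)[:k]
        j = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(X[i] + gap * (X[j] - X[i]))
    return np.vstack(synthetic)

# Hypothetical example: grow the unprivileged positive group until it
# matches an assumed privileged positive group size of 8.
unpriv_pos = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
target_size = 8
new_points = smote_oversample(unpriv_pos, target_size - len(unpriv_pos),
                              k=2, rng=0)
augmented = np.vstack([unpriv_pos, new_points])
```

Because each synthetic point lies on a segment between two real points, the augmented group stays inside the original group's convex hull; the group-size strategies in the paper's 3 × 3 matrix would differ only in how `target_size` is chosen per group.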
Similar Works
The global landscape of AI ethics guidelines
2019 · 4,708 citations
The Limitations of Deep Learning in Adversarial Settings
2016 · 3,884 citations
Trust in Automation: Designing for Appropriate Reliance
2004 · 3,501 citations
Fairness through awareness
2012 · 3,300 citations
AI4People—An Ethical Framework for a Good AI Society: Opportunities, Risks, Principles, and Recommendations
2018 · 3,191 citations