This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Cognitive Biases:
Citations: 0
Authors: 2
Year: 2025
Abstract
Artificial Intelligence (AI) systems, while advancing software development, are often susceptible to cognitive biases that lead to unfair outcomes. This study explores the roles of confirmation bias, anchoring bias, and automation bias in influencing AI decision-making. These biases commonly emerge from unrepresentative datasets, algorithmic design flaws, and subjective human decisions. Through a qualitative methodology involving literature review and case analysis, the research identifies the origins and manifestations of cognitive bias in AI, particularly within domains like criminal justice, healthcare, and recruitment. The study proposes several mitigation strategies: incorporating diverse and representative data, adopting fairness-aware algorithm designs, and conducting routine bias audits. Evaluation criteria include each strategy’s effectiveness, feasibility, transparency, and scalability. Findings indicate that while these techniques significantly improve fairness in AI outputs, they also present practical challenges such as reduced model precision and resource constraints. The study emphasizes that eliminating cognitive bias requires not only technical adjustments but also interdisciplinary collaboration and ethical considerations. The findings serve as a guide for developers, stakeholders, and policymakers aiming to design responsible AI systems that uphold transparency, accountability, and social equity across software development environments.
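One of the mitigation strategies named above, routine bias audits, can be illustrated with a minimal sketch. The example below computes the demographic parity difference, a common audit metric: the gap in positive-prediction rates between two groups. The function name, toy data, and binary group encoding are illustrative assumptions, not taken from the study.

```python
# Hypothetical bias-audit sketch: demographic parity difference.
# Assumes binary predictions (0/1) and a binary protected attribute;
# all names and data here are illustrative, not from the study.

def demographic_parity_difference(preds, groups):
    """Absolute gap in positive-prediction rates between groups 0 and 1."""
    rate = {}
    for g in (0, 1):
        selected = [p for p, grp in zip(preds, groups) if grp == g]
        rate[g] = sum(selected) / len(selected)
    return abs(rate[0] - rate[1])

# Toy audit: group 1 receives positive outcomes far more often than group 0.
preds  = [1, 0, 0, 0, 1, 1, 1, 0]
groups = [0, 0, 0, 0, 1, 1, 1, 1]
print(demographic_parity_difference(preds, groups))  # → 0.5
```

A value near zero indicates similar treatment across groups; a large gap, as in this toy data, would flag the model for closer review in a recruitment or lending setting.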
Related Works
The global landscape of AI ethics guidelines
2019 · 4,582 citations
The Limitations of Deep Learning in Adversarial Settings
2016 · 3,868 citations
Trust in Automation: Designing for Appropriate Reliance
2004 · 3,417 citations
Fairness through awareness
2012 · 3,279 citations
Mind over Machine: The Power of Human Intuition and Expertise in the Era of the Computer
1987 · 3,183 citations