This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Co-Design for Gender Equality in an AI-Based Virtual Assistant for Intensive Care Units
Citations: 2
Authors: 5
Year: 2022
Abstract
Artificial-intelligence-based products increasingly influence decision-making in professional settings. If bias becomes ingrained in the design and use of these products, they can reify existing discriminatory structures. This has sparked a discussion regarding the fairness of AI-based products. We add to this discussion by exploring how members of development teams in a medical startup company see their possibilities to influence the gender fairness of the AI-based product they are developing. We conducted workshops to co-design interventions for gender fairness with the development team of an intelligent assistance system for hospital decision makers. While the concept of gendered products was largely unfamiliar, learning about the potential reification of societal stereotypes through the product they were developing elicited concern. Reflecting on the co-construction of technology and gender, the team was sensitized to the ease with which discrimination creeps into AI-based products, identified empathy and concern as prerequisites for the development of gender-fair products, saw how prior commitment sets the stage for interventions, and was confronted with the fact that facing bias can be taxing and that a lack of clarity regarding the scope of responsibility for preventing bias in AI might lead to a diffusion of responsibility.
Related Works
The global landscape of AI ethics guidelines
2019 · 4,504 citations
The Limitations of Deep Learning in Adversarial Settings
2016 · 3,856 citations
Trust in Automation: Designing for Appropriate Reliance
2004 · 3,378 citations
Fairness through awareness
2012 · 3,267 citations
Mind over Machine: The Power of Human Intuition and Expertise in the Era of the Computer
1987 · 3,182 citations