OpenAlex · Updated hourly · Last updated: 15.03.2026, 07:58

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Justice, trust, and moral judgements when personnel selection is supported by algorithms

2023 · 26 citations · European Journal of Work and Organizational Psychology · Open Access
Open full text at publisher

Citations: 26 · Authors: 4 · Year: 2023

Abstract

Although algorithm-based systems are increasingly used as decision support for managers, there is still a lack of research on the effects of algorithm use, and more specifically of potential algorithmic bias, on decision-makers. To investigate how potential social bias in a recommendation outcome influences trust, fairness perceptions, and moral judgement, we used a moral dilemma scenario. Participants (N = 215) imagined being human resource managers responsible for personnel selection and receiving decision support from either human colleagues or an algorithm-based system. They received an applicant preselection that was either gender-balanced or predominantly male. Although participants perceived algorithm-based support as less biased, they also perceived it as generally less fair and had less trust in it. This could be related to the finding that participants perceived algorithm-based systems as more consistent but also as less likely to uphold moral standards. Moreover, participants tended to reject algorithm-based preselection more often than human-based preselection and were more likely to use utilitarian judgements when accepting it, which may indicate different underlying moral judgement processes.


Topics

Ethics and Social Impacts of AI · Artificial Intelligence in Healthcare and Education · Psychology of Moral and Emotional Judgment