This is an overview page with metadata for this scientific work. The full article is available from the publisher.
Hidden Algorithms of Culture: A Review and Critical Analysis of Cultural Bias in General-Purpose Generative AI Chatbots
Citations: 0
Authors: 1
Year: 2025
Abstract
The aim of this article is to review and systematise the results of recent empirical studies on manifestations of cultural bias in content produced by general-purpose generative AI chatbots such as ChatGPT, Copilot, Gemini, Claude and DeepSeek, and to identify their potential social consequences. The following research questions were formulated: What types of cultural bias occur in generative AI chatbots, and at what scale? What are the social consequences of their occurrence, and what are possible ways and directions of counteraction? The review is based on a critical analysis of 17 recent empirical studies published in 2024-2025. The analysis reveals the complex nature of the presence and consequences of cultural bias in current AI models, clearly demonstrating that they reflect and reinforce Western cultural patterns. Four types of cultural bias were identified: axiological-civilisational, racial-ethnic, national, and religious-ideological. The analysis also showed that cultural bias is not merely a technical problem of algorithms but a deeply rooted social phenomenon resulting from the contexts of training data and the design decisions made by technology developers.
Related Works
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
An Experiment in Linguistic Synthesis with a Fuzzy Logic Controller
1999 · 5,632 citations
An experiment in linguistic synthesis with a fuzzy logic controller
1975 · 5,559 citations
A Framework for Representing Knowledge
1988 · 4,548 citations
Opinion Paper: “So what if ChatGPT wrote it?” Multidisciplinary perspectives on opportunities, challenges and implications of generative conversational AI for research, practice and policy
2023 · 3,342 citations