This is an overview page with metadata for this scientific article. The full article is available from the publisher.
Problematic ChatGPT Use Scale: AI-Human Collaboration or Unraveling the Dark Side of ChatGPT
6 citations · 5 authors · 2025
Abstract
Artificial intelligence (AI)-driven tools like ChatGPT have rapidly integrated into daily life, raising concerns about problematic use and psychological impacts. However, validated tools to assess maladaptive interaction patterns are limited. This study evaluated the psychometric properties of the Problematic ChatGPT Use Scale (PCGUS) within a Turkish sample. Study I included 391 participants (61.4% female, mean age = 30.89), while Study II involved 473 participants (74.4% male, mean age = 28.25). In Study I, confirmatory factor analysis (CFA) confirmed the unidimensional structure of the 9-item scale, and measurement invariance (MI) was established at the configural, metric, and scalar levels. The scale demonstrated acceptable reliability, and Item Response Theory (IRT) analysis showed strong item discrimination. Problematic ChatGPT use (PCGU) correlated positively with AI addiction, internet gaming disorder, and internet addiction, and negatively with conscientiousness. Study II revealed that psychological distress and self-control mediated the relationship between PCGU and well-being, highlighting the importance of these factors in predicting healthy AI engagement.
Related works
Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI
2019 · 8,245 citations
Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead
2019 · 8,102 citations
High-performance medicine: the convergence of human and artificial intelligence
2018 · 7,468 citations
Proceedings of the 19th International Joint Conference on Artificial Intelligence
2005 · 5,776 citations
Peeking Inside the Black-Box: A Survey on Explainable Artificial Intelligence (XAI)
2018 · 5,429 citations