OpenAlex · Updated hourly · Last updated: 16.03.2026, 13:19

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Exploring Cognitive and Ethical Predictors of University Students’ Use of Generative Artificial Intelligence in Academic Writing: An Extended UTAUT Approach

2025 · 0 citations · Preprints.org · Open Access
Open full text at publisher

Citations: 0
Authors: 1
Year: 2025

Abstract

With the growing integration of generative artificial intelligence (GAI) tools such as ChatGPT into higher education, understanding the factors influencing students’ use of GAI for academic writing tasks has become increasingly urgent. This study investigates the key factors influencing students’ behavior in using GAI tools to complete academic writing tasks. In addition to the core constructs of the Unified Theory of Acceptance and Use of Technology (performance expectancy, effort expectancy, social influence, and facilitating conditions), the study incorporates three new constructs relevant to ethical aspects of academic writing (trust in GAI, ethical AI literacy, and academic integrity assurance). Data were collected from 1400 undergraduate students at three Beijing universities using a structured questionnaire. Structural equation modeling revealed that all eight hypothesized paths were supported. Among these predictors, academic integrity assurance (β = 0.194, p < 0.001) and ethical artificial intelligence literacy (β = 0.177, p < 0.001) emerged as the strongest predictors of behavioral intention, while behavioral intention (β = 0.429, p < 0.001) had a strong effect on actual use behavior. These findings highlight the role of ethical and cognitive factors in shaping students’ adoption of GAI for academic writing and offer valuable implications for AI-informed teaching and governance.

Related works

Authors

Institutions

Topics

Artificial Intelligence in Healthcare and Education