OpenAlex · Updated hourly · Last updated: 14.04.2026, 07:55

This is an overview page with metadata about this scholarly work. The full article is available from the publisher.

The Global Use of Generative Artificial Intelligence for Emotional and Mental Health Support (Preprint)

2025 · 1 citation · 6 authors

Open full text at publisher

Abstract

BACKGROUND

Generative artificial intelligence (GenAI) models have emerged as a promising yet controversial tool for mental health.

OBJECTIVE

The purpose of this study was to understand the experiences of individuals who used ChatGPT for emotional and mental health support (EMS).

METHODS

We prescreened 4,387 individuals online and recruited 270 adult participants across 29 countries who had used ChatGPT recurrently for EMS. Participants responded to quantitative survey questions on the frequency and helpfulness of using ChatGPT for EMS, as well as qualitative items regarding its therapeutic purposes, users' emotional experiences, and rationales for helpfulness evaluations, which were analyzed using thematic analysis.

RESULTS

Most participants used ChatGPT for EMS at least 1-2 times per month, for purposes spanning traditional mental health needs (diagnosis, treatment, psychoeducation) and everyday personal needs (companionship, relational guidance, well-being improvement, decision-making). Users reported a range of emotional experiences during interactions (e.g., feeling connected, relieved, curious, awkward, or disappointed). Most users found ChatGPT at least somewhat helpful for EMS, citing perceived changes, emotional support, professionalism, information quality, and free expression, while some found it less helpful because of superficial emotional engagement, limited information quality, and lack of professionalism.

CONCLUSIONS

Despite the absence of ethical regulations governing its use for EMS, GenAI has become a widely used self-help tool for mental health globally, with heterogeneity in emotional experiences and in perceived rationales for helpfulness or unhelpfulness. These results highlight the positive and sometimes idealized view of GenAI among the general public for mental health use, and underscore the urgent need to promote AI literacy and ethical awareness among community users and healthcare providers, to examine its effectiveness experimentally, and to identify who may benefit or be harmed.


Topics

Artificial Intelligence in Healthcare and Education · Digital Mental Health Interventions · Machine Learning in Healthcare