OpenAlex · Updated hourly · Last updated: 16.03.2026, 04:09

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

A chatbot for the soul: mental health care, privacy, and intimacy in AI-based conversational agents

2025 · 1 citation · Communication and Change · Open Access
Open full text at publisher

Citations: 1 · Authors: 3 · Year: 2025

Abstract

Artificial intelligence-based conversational agents—chatbots—are increasingly integrated into telehealth platforms, employee wellness programs, and mobile applications to address structural gaps in mental health care. While these chatbots promise accessibility, they are often deployed without sufficient impact assessment or even basic user testing. This paper presents a case study using community red-teaming exercises to evaluate a chatbot designed for wellness and spirituality. Unlike traditional red-teaming, which is conducted by engineers to assess vulnerabilities, community red-teaming treats impacted users as experts, uncovering concerns related to privacy, ethics, and functionality. Our fieldwork, conducted with undergraduate beta testers (n = 28), revealed that participants were often more comfortable sharing private information with the chatbot than with a stranger. Prior experience with commercial AI systems, such as ChatGPT, contributed to this ease. However, participants also raised concerns about misinformation, inadequate guardrails for sensitive topics, data security, and dependency. Despite these risks, users remained open to the chatbot’s potential as a spiritual wellness guide. We further examine how algorithmic impact assessments (AIAs) both capture and overlook key aspects of the spiritual chatbot user experience. These chatbots offer hyper-personalized, AI-mediated divination and wellness interactions, blurring the boundaries between astrology, mental health support, and spiritual guidance. The chameleon-like nature of these technologies challenges conventional assessment frameworks, necessitating a nuanced approach that considers specific use cases, potential user bases, and indirect impacts on broader communities. We argue that AIAs for care and wellness chatbots must account for these complexities, ensuring ethical deployment and mitigating harm.

Topics

Digital Mental Health Interventions · AI in Service Interactions · Artificial Intelligence in Healthcare and Education