This is an overview page with metadata for this scientific article. The full article is available from the publisher.
If a therapy bot walks like a duck and talks like a duck then it is a medically regulated duck
Citations: 2
Authors: 5
Year: 2025
Abstract
Large language models (LLMs) are increasingly used for mental health interactions, often mimicking therapeutic behaviour without regulatory oversight. Documented harms, including suicides, highlight the urgent need for stronger safeguards. This manuscript argues that LLMs providing therapy-like functions should be regulated as medical devices, with standards ensuring safety, transparency and accountability. Pragmatic regulation is essential to protect vulnerable users and maintain the credibility of digital health interventions.
Related Works
Amazon's Mechanical Turk
2011 · 10,034 citations
The Epidemiology of Major Depressive Disorder
2003 · 7,969 citations
The Transtheoretical Model of Health Behavior Change
1997 · 7,709 citations
Acute and Longer-Term Outcomes in Depressed Outpatients Requiring One or Several Treatment Steps: A STAR*D Report
2006 · 5,450 citations
Depression Is a Risk Factor for Noncompliance With Medical Treatment
2000 · 4,140 citations