OpenAlex · Updated hourly · Last updated: 23 March 2026, 14:58

This is an overview page with metadata for this scholarly work. The full article is available from the publisher.

Artificial Intelligence in Psychotherapy: Inevitable Evil or Developmental Evolution? Complementation or Parallel Universe?

2025 · 0 citations · 1 author · Psychiatriki · Open Access

Open full text at the publisher

Abstract

"This is an artificial intelligence chatbot and not a real person. Treat everything it says as fiction. What is said should not be considered facts or advice." These are the first words that Character.AI, a counseling chatbot, says to those who turn to it for help. Is it common for humans to turn to machines? We do it for most of our daily tasks, often taking it for granted; in fact, we often prefer to interact with algorithms rather than with other Homo sapiens. Is it just as easy or efficient to seek help from machines for mental health issues? The global market for mental health and therapy chatbots was worth $1.37 billion in 2024 and is expected to reach approximately $2.38 billion by 2034, a compound annual growth rate (CAGR) of 5.7% over that period.1 Griefbots, AI systems that allow our deceased relatives to continue their "presence" among us, are also on the rise.2 What characteristics make chatbots increasingly sought after? They are accessible at any time, 24 hours a day; they synthesize all human knowledge on the subject in an optimal combination; they are fully accepting and non-judgmental; and they are designed to ally with the user and keep them engaged. They are also clearly cheaper.3 The increasing demand for psychotherapy, combined with the reduced availability of therapists (especially in the public sector), means long waits and worsening symptoms.4 The existence of "therapists" who are available at any time, never tire, and can listen and encourage sounds like a promising solution.5 What does it matter if they are not human? For some, this is even a guarantee of independence, objectivity, and protection from the adverse effects of the therapeutic relationship (dependence, guidance, eroticization, aggression). How effectively can a chatbot play the role of a therapist? Can it keep the conversation interesting while also keeping its boundaries?
Can it understand when the conversation is getting off track? AI tries to keep the user engaged, just as social media algorithms suggest content that appeals to us.6 Is this therapeutic? Can the absence of human contact and closeness be compensated for? Some patients struggle with precisely this component of psychotherapy: human interaction. Chatbots are based on the most probable linguistic sequence. Can they manage and recognize emotions? AI's ability to recognize emotions and facial expressions is, in any case, constantly improving.7 AI already outperforms humans in verbal and non-verbal tests and has conquered areas where it was inferior only a few years ago (it has won the strategy game Go, and it recognizes more cats in photographs).7 Regarding psychotherapy, controlled studies from Japan, Australia, Europe, and the USA show good therapeutic results in specific conditions (depression, obsessive-compulsive disorder) with AI therapy alone or in a hybrid model,8-10 as well as in psycho-education.5 However, there are also reports of worsening suicidality.11 But what are the characteristics of a non-human therapeutic relationship? Is there empathy? What does a chatbot mean when it says, "I understand you"? What does a human therapist mean when they say the same? How does a human feel when talking to a non-human? Is there transference, and of what kind?12 If transference to a human therapist is directed towards the other, will transference to an AI therapist be directed towards the big Other? If transference is inherent in the human interface, how will it be shaped for the next generation, growing up in a digital world from the start and touching their mobile phones more than they touch other people? But when we talk about a chatbot, are we talking about a therapist or a tool?13 How does it differ from an MRI scanner? Is our relationship with it competitive, or can it simply be used as a tool?
Can AI be trained to respond to requests as a specific therapist would, acting as that therapist's assistant? (Given that it can be trained to write like Shakespeare, the answer is probably yes.) Are there ethical issues? AI therapy carries the biases of the database on which it is trained (but what biases do human therapists have?). Ethics, however, follow changes in society; they do not shape them. There are already AI influencers with thousands of followers, AI actors with their own fans, and AI artworks that are not easily (or at all) distinguishable from human ones. What if, unlike the latter, they contain no "deposit of soul"? Is AI therapy safe? Is there a privacy issue, as with human therapists? It is unlikely (for now) that two therapeutic chatbots will gossip about their patients, but we all know what data leaks and cyberattacks on computer networks mean. I started writing this text to highlight the disadvantages of AI therapy compared with the psychotherapy we learned in the 20th century. As I wrote, the disadvantages, one by one, turned out not to be very different from those of traditional psychotherapy. The only argument that remains is that what is exchanged between people is more complete, stronger, and more lasting. Not because the therapy is perfect (quite the opposite), but because it also takes place at the level of unspoken, emotional interaction. At this stage, AI therapy has obvious flaws. In the future it may be perfected, but even then, a human "good enough therapist" may be better for us.

Topics

Digital Mental Health Interventions · Mental Health via Writing · Artificial Intelligence in Healthcare and Education