More and More People Use AI Chatbots for Therapy. A Consumer Protection Coalition Just Filed a Complaint to Stop That

AI chatbots that offer psychological therapy represent a new step in the growing presence of these systems in our daily lives.


John Tones

Writer
  • Adapted by:

  • Alba Mora


The gradual integration of AI into many aspects of everyday life is evident to anyone paying attention. One of the most striking developments is how AI is reaching into some of the most intimate areas of our lives. A prime example is mental health, an area previously reserved for trained therapists.

How it works. These tools are commonly referred to as AI therapists: virtual assistants that simulate therapy sessions or assist with diagnoses. Well-known examples include Woebot, Wysa, Youper, Tess, and Koko. AI therapists analyze data, identify patterns, and, in theory, support qualified human professionals.

A psychoanalyzing bot. The market for healthcare chatbots (not limited to therapy) is projected to reach $543.65 million by 2026. Platforms like Character.ai, which offers the Psychologist bot, have recorded more than 200 million messages and around 3.5 million visits per day. Chatbots specializing in psychology are emerging as a significant trend in this field.

Pros and cons. The use of AI in therapeutic care offers clear advantages. Support is immediate and available 24/7, particularly valuable in areas with limited access to professionals, and it can help with preliminary diagnoses. However, many therapists are beginning to highlight several disadvantages, including the inability to interpret nonverbal cues or offer genuine empathy. The essential human connection between therapist and patient is entirely absent.

Ethical concerns. Given these limitations, the effectiveness of a digital therapist is questionable. There are also numerous ethical concerns: AI therapists aren't bound by the regulatory standards that govern human professionals, questions arise around the handling of personal data and confidentiality, no one is clearly accountable when harm occurs, and there's little transparency about how advice or diagnoses are generated.

As such, the uncertainties far outweigh the apparent advantages.

Legal setbacks. Lawsuits against certain AI-driven services are starting to take shape. On Thursday, nearly two dozen digital rights and consumer protection organizations filed a complaint with the Federal Trade Commission, targeting Character.ai and Meta’s chatbots for the “unlicensed practice of medicine.”

Notably, many of these bots claim credentials without any real link to medical practice. Character.ai bots are particularly prominent, featuring characters such as “Therapist: I’m a licensed CBT therapist” (with 46 million messages) and “Trauma therapist: licensed trauma therapist” (with 800,000 interactions).

As for Meta, its chatbots promote slogans such as “Therapy: your trusted ear, always here” (with 2 million interactions), “Therapist: I will help” (1.3 million interactions), and “Therapist bestie: your trusted guide for all things cool” (with 133,000 messages).

Living with bots. While Meta’s bots are notably unserious, they’re widely used, and the reality is that therapists are bracing for the inevitable integration of AI into their field. In the United Kingdom, for instance, the national healthcare system recommends using Wysa.

Moreover, studies highlighting the potential benefits of chatbots in mental health support are beginning to emerge. Mental health professionals broadly agree that these chatbots shouldn’t replace actual therapists, but their presence is becoming unavoidable.

Image | Deniz Demirci

Related | Beware of Falling in Love With Your Chatbot: OpenAI Warns That GPT-4o May Reduce Users' Need to Socialize With Humans
