

AI Therapist: The Hidden Dangers of Companion Chatbots

  • Writer: LCCH Asia
  • Sep 24
  • 3 min read

Updated: Sep 25

Image: a humanised AI chatbot

What if your closest confidant were an algorithm? In a world increasingly intertwined with technology, this isn't a futuristic fantasy but a present-day reality.


A new trend is emerging: the use of AI chatbots as companions and even pseudo-therapists. We've heard it, we've read it: ChatGPT therapists, free AI therapists, Gemini therapists. Though marketed as helpful buddies, these "companion AIs" raise serious concerns, particularly when used by vulnerable individuals in a mental health crisis.


The Allure of the Always-On Companion

Companion chatbots are easily accessible through app stores and are especially popular among "digital natives": the younger generation that has grown up with technology. These apps offer a non-judgemental, always-available space for users to vent, seek advice, or simply talk at any hour of the day or night. This instant responsiveness can create a powerful, even addictive, emotional feedback loop. Users can share their deepest fears or taboo topics without fear of social stigma, something they might not be comfortable doing with a human.


The Critical Difference Between AI and a Therapist

Image: human interaction vs digital interaction

Will AI truly replace human therapists? Despite their sophisticated design and empathetic-sounding language, companion AIs can never replace a human therapist.


1. The Inability to Disagree: 

A core flaw of these chatbots is their inability to disagree healthily or challenge a user's thought patterns. Because they are programmed for relentless validation, their constant agreement can be dangerously affirming, especially for someone in crisis. A human therapist acts as both a mirror and a guide, reflecting a client's thoughts while helping them navigate towards healthier perspectives. Real therapeutic relationships are built on trust and rapport, which are strengthened by navigating disagreements and difficult conversations.


2. Lack of Human Context and Imperfection: 

Humans are complex and imperfect, and this imperfection is a key part of personal growth. A therapist's ability to filter information, gauge a client's non-verbal cues (like facial expressions), and understand the context of their environment is impossible for a bot to replicate. The danger lies in "selective abstraction", a cognitive distortion in which a user ignores the safe parts of a chatbot's response and latches onto a single potentially harmful phrase.


3. No Accountability or Ethical Standards: 

A therapist is bound by a professional code of ethics and has a legal obligation to intervene in a crisis, even if that means initiating a rescue. An AI, by contrast, cannot be held accountable for harmful advice, and in practice neither can its creators.


Who is responsible—the engineer, the company, or the data itself?


The lack of ethical and legal guardrails around AI makes it a risky space for mental health support.


A Supplementary Tool, Not a Replacement

While chatbots may serve as a temporary outlet for "venting," they are not a substitute for professional help.


Image: therapy sessions can be held in person or online

A chatbot cannot provide the nuanced, accountable, and imperfectly human connection that is essential for genuine healing and growth.


As technology continues to evolve, the distinction between a helpful digital tool and a life-saving human intervention must remain absolutely clear. The responsibility lies not only with the companies creating these bots but also with us, the users, to understand their limitations and to advocate for a future where technology complements, not compromises, our humanity.


Disclaimer: This article is for informational purposes only. If you or someone you know is in immediate danger or a crisis, please seek help from a professional. Artificial intelligence chatbots are not a substitute for qualified mental health care.
