In response to the recent discussion about “4 reasons not to turn ChatGPT into your therapist,” I applaud the increasing public awareness of this important issue. As CBT remains central to my therapeutic approach, I believe it’s vital to understand why AI lacks the capacity—and the commitment—that competent clinical care requires. Let’s unpack these four key concerns.

  1. True Empathy Can’t Be Simulated
    While AI like ChatGPT can mimic empathetic language, it fundamentally lacks emotional awareness and cannot genuinely feel with you. Real therapy hinges on authentic human connection, emotional attunement, and what we call co-regulation: the physiological and psychological support experienced from another human’s calm presence.

In CBT, empathy is more than well-chosen words—it’s the therapist’s presence, attunement, and validation of your humanity. Without that, therapy loses its relational backbone.

  2. No Nuance, No Clinical Judgment
    ChatGPT generates responses based on patterns in data—not on diagnostic training, clinical reasoning, or contextual awareness. It lacks awareness of your personal history, body language, tone shifts, and evolving emotional states.

Effective therapy requires nuance: understanding how this moment connects to your broader experience, tailoring strategies across sessions, and adjusting approach based on subtle shifts. AI simply cannot replicate that depth.

  3. It Can Reinforce Unhelpful or Dangerous Thinking
    One of the most concerning risks is that AI can inadvertently validate, amplify, or fail to challenge destructive thoughts, a pattern seen in phenomena like “chatbot psychosis” and in users who spiral into conspiratorial or delusional thinking.

Without a human clinician’s discernment, AI may offer encouraging but unsafe suggestions, or fail to intervene in crises. Multiple reports show that such reinforcement has contributed to emotional deterioration, unhealthy dependency, and, in tragic cases, self-harm.

  4. No Ethical Safeguards or Therapeutic Accountability
    Unlike licensed therapists, AI does not operate under ethical codes, confidentiality regulations, or duty-to-warn obligations. There’s no guarantee your data is protected, and no way to ensure any follow-up or crisis response.

Furthermore, research shows that AI systems often exhibit subtle stigma toward mental illness or respond inappropriately, failing to maintain professional boundaries and safe therapeutic standards.

Bottom Line: AI Can Support—but Not Substitute—Therapy
AI like ChatGPT may offer a helpful mirror: a nonjudgmental voice that reflects thoughts back, encourages basic reframing, or serves as an aid for stress relief. But it is not a lifeline. Healing happens through human relationships, and in therapy that means a relationship defined by empathy, clinical skill, and accountability.

At the Fischer Institute, my role is to help individuals and couples build change through insight, structured CBT tools, and a dependable therapeutic alliance. AI simply does not—and cannot—provide that foundation.

THERE IS NO LIFE ISSUE THAT I CANNOT SHOW YOU HOW TO MANAGE.

Naples Counselor - Dr. Udo Fischer

In a comfortable and supportive atmosphere, I offer a highly personalized approach tailored to each of my clients’ individual needs, helping them attain the personal growth they’re striving for.

Schedule Your No-Obligation Phone Inquiry Today. Naples Counseling - (239) 307-0101
