Your Therapist Might Be Using AI. Should You Be Worried?

Let’s talk about the quiet integration of AI in therapy and what it really means for your sessions.

You’re sitting across from your therapist, in that comfortable chair you’ve come to know so well. You’re sharing something personal, something you’ve maybe never said out loud before. It’s a moment built entirely on trust. But what if there were a third party in the room, one you couldn’t see? This is the new, complicated reality of AI in therapy. More and more, therapists are quietly using artificial intelligence tools like ChatGPT to help with their work, and it’s opening up a conversation we need to have.

It’s not as sci-fi as it sounds. Most therapists aren’t asking an AI chatbot for direct advice on what to tell you. Instead, they’re using it for the mountain of administrative work that comes with the job. Think of it as a super-smart assistant. But where do you draw the line between efficiency and ethics?

Why Your Therapist Might Be Curious About AI in Therapy

Let’s be honest: therapists are overworked. Between sessions, they’re writing clinical notes, creating treatment plans, and handling billing. It’s a lot of paperwork. So, the appeal of using AI is pretty clear: efficiency.

Some of the common uses include:

  • Summarizing Sessions: A therapist could dictate their notes, and an AI could transcribe and organize them into a structured format. This saves a ton of time.
  • Drafting Communications: Writing emails or letters to insurance companies or other providers can be tedious. AI can help generate a first draft.
  • Brainstorming Ideas: A therapist might use an AI to explore different therapeutic approaches for a complex issue, almost like consulting a digital textbook.

On the surface, this sounds great. More efficiency means the therapist has more time and mental energy to focus on what truly matters: you. But the methods and the secrecy are where things get tricky.

The Big Problem: Trust, Privacy, and AI

The entire foundation of therapy is built on confidentiality. You trust that what you say in that room stays in that room. But what happens when your deeply personal stories are fed into a large language model owned by a massive tech company?

This is the central ethical dilemma of using AI in therapy. Standard versions of tools like ChatGPT are not HIPAA-compliant. The Health Insurance Portability and Accountability Act (HIPAA) is a US law that sets a national standard for protecting sensitive patient health information. Pasting identifiable client notes into a public AI tool is a serious violation of that standard.

Even if a therapist is using a more secure, “private” version of an AI, the problem of trust remains. The fact that this is often happening without the client’s knowledge is what feels so jarring. It changes the dynamic of the relationship. Suddenly, you might find yourself wondering if the thoughtful question your therapist just asked was their own, or if it was suggested by an algorithm.

Beyond Data: The Human Element of AI in Therapy

Let’s set aside the privacy concerns for a moment and ask another question: Can an AI truly understand the human experience? An AI model doesn’t have empathy. It hasn’t experienced joy, heartbreak, or a tough childhood. It’s a remarkably sophisticated pattern-matching machine, trained on vast amounts of text from the internet.

It can mimic the language of empathy, but it can’t feel it. As the American Psychological Association notes, the technology is evolving fast, and the guidelines for its use are still being written. There’s a risk that relying on it could flatten the nuanced, intuitive, and deeply human art of therapy. A therapist’s job isn’t just to offer solutions, but to sit with you in your discomfort, to build a relationship, and to provide a connection that makes you feel seen. An AI can’t do that.

So, what’s the answer? Banning AI from mental health entirely feels like a step backward. The potential for good is there, but it has to be handled with extreme care. The future will likely involve secure, purpose-built AI tools for mental health professionals, used with the full knowledge and consent of their clients.

For now, it’s a conversation worth having. The relationship you have with your therapist is yours. And in that space, there should be no secrets.