Is AI Therapy Safe? What to Know Before Using a Chatbot


Reviewed by Kathryn Vercillo, MA Psychology | Last Updated: November 2025


Maybe you downloaded Woebot after seeing an ad. Perhaps you found yourself typing your worries into ChatGPT at 2 a.m. when you could not sleep. Or maybe a friend mentioned Wysa helped them through a rough patch, and you are curious whether AI therapy could work for you too.

You are not alone in wondering. Millions of people are now turning to artificial intelligence for mental health support, and the options seem to multiply every month. The appeal is obvious: these tools are available instantly, they do not require appointments, and many are free or low cost. But as AI therapy becomes increasingly mainstream, one question keeps coming up: is it actually safe?

The answer is more nuanced than a simple yes or no. Before you share your deepest struggles with an algorithm, here is what you need to understand about the benefits, risks, and real limitations of AI mental health tools.

What counts as AI therapy?

The term “AI therapy” covers a wide range of tools, and understanding the differences matters for your safety.

  • Dedicated mental health chatbots like Woebot, Wysa, and Replika are specifically designed to provide emotional support. They use techniques drawn from cognitive behavioral therapy (CBT), mindfulness practices, and other evidence-based approaches. These apps are typically built with input from mental health professionals and have some research backing their effectiveness for mild symptoms.
  • General AI assistants like ChatGPT, Claude, and Gemini were not designed for therapy, but millions of people use them that way. They can hold conversations, offer perspectives, and even role-play therapeutic techniques. However, they lack the specialized safeguards of purpose-built mental health tools.
  • Hybrid platforms combine AI with human oversight. Some therapy apps use AI for check-ins between sessions with a human therapist, or employ chatbots for initial screening before connecting users with professionals.


The rapid growth of AI mental health tools

The explosion of these tools reflects a genuine need. Traditional therapy remains inaccessible for many people due to cost, location, or a shortage of providers. In places like the San Francisco Bay Area, even those who can afford therapy often face weeks-long waitlists. AI tools promise immediate access, and for some people, that promise is genuinely helpful.

But “helpful” and “safe” are not always the same thing.

The honest answer about AI therapy safety

Is AI therapy safe? It depends on what you mean by safe, what you are using it for, and which tool you choose.

  • AI therapy can be reasonably safe for processing everyday stress, practicing coping skills you have already learned, or supplementing ongoing work with a human therapist. Some research suggests chatbots like Woebot can reduce mild to moderate symptoms of depression and anxiety.
  • AI therapy carries real risks when used as a replacement for professional care, when you are in crisis, or when the platform has questionable privacy practices. The technology cannot assess risk the way a trained clinician can, and it may miss warning signs that would be obvious to a human.

What the research actually shows

Studies on AI therapy tools show mixed but generally cautious results. A July 2025 study from researchers at Stanford, Carnegie Mellon, and the University of Minnesota compared AI chatbots to clinical standards for the first time and found a stark gap: licensed therapists responded appropriately 93% of the time, while AI therapy bots responded appropriately less than 60% of the time. The study also found that popular chatbots gave dangerous responses to crisis situations, including providing detailed information that could facilitate self-harm when users made indirect suicide inquiries. As one of the researchers noted, these systems are not just inadequate but can actually be harmful.

Purpose-built apps like Woebot have demonstrated some effectiveness in reducing mild depression and anxiety symptoms in clinical trials. However, most research has been conducted by the companies themselves, and long-term outcomes remain unclear.

What we know for certain is that AI cannot replace the therapeutic relationship, which decades of research consistently identifies as the most important factor in successful therapy outcomes. The connection between therapist and client accounts for more of the healing than any specific technique.

Privacy concerns you cannot ignore

One of the most significant safety issues with AI therapy is not psychological but digital. When you share your mental health struggles with an AI tool, where does that information go?

  • Data collection practices vary wildly. Some mental health apps have been caught sharing user data with advertisers, including details about mental health conditions. The Federal Trade Commission has taken action against companies for deceptive privacy practices in this space.
  • Most AI tools are not covered by HIPAA. The health privacy law that protects your therapy records typically does not apply to apps and chatbots. Your conversations may be stored, analyzed, and potentially used to train future AI models.


Questions to ask before downloading

Before sharing anything sensitive with an AI mental health tool, investigate the following:

  • Does the app clearly explain how your data is stored and used?
  • Can you delete your conversation history?
  • Is the company transparent about whether your conversations are used to train AI models?
  • Has the platform faced any regulatory action or data breaches?

If you cannot find clear answers to these questions, that silence itself is a red flag.

What AI therapy simply cannot do

Understanding the limitations of AI therapy is essential for using these tools safely. No matter how sophisticated the technology becomes, certain things remain beyond its reach.

  • AI cannot read your body. A human therapist notices when your breathing changes, when you look away at certain topics, when your posture shifts. These nonverbal cues often reveal more than words, and they guide skilled therapists toward what matters most.
  • AI cannot provide a genuine relationship. Therapeutic healing often happens in the experience of being truly seen and understood by another person. This felt sense of connection cannot be replicated by software, regardless of how empathetic the responses sound.
  • AI cannot handle a crisis safely. If you are having thoughts of suicide or self-harm, AI tools are not equipped to assess your actual risk level or provide appropriate intervention. Most have disclaimers acknowledging this limitation, but in a crisis moment, you may not remember to seek human help.

The aforementioned Stanford-led research team also found that AI therapy chatbots displayed significant stigma toward certain mental health conditions. Across different chatbots tested, AI showed increased bias against people with alcohol dependence and schizophrenia compared to conditions like depression. This kind of stigmatizing response from a supposed therapist can be deeply harmful and may lead people to abandon mental health care altogether. As lead researcher Jared Moore noted, newer and more advanced AI models showed just as much stigma as older ones, suggesting that simply improving the technology will not solve these fundamental problems.

The diagnostic question

AI mental health tools cannot diagnose you, and they should not try. Accurate diagnosis requires clinical training, comprehensive assessment, and often multiple sessions to understand the full picture. An algorithm working from text alone lacks the context needed to distinguish between conditions that may present similarly but require very different approaches.

We have explored these limitations in depth in our previous article on the best and worst possible outcomes for using AI in therapy.


When AI might genuinely help

Despite the limitations, AI therapy tools do have legitimate uses. Understanding where they fit can help you use them more safely.

  • Between-session support. If you are already working with a therapist, AI tools can provide a way to practice skills, track your mood, or process everyday stressors between appointments. Think of them as a supplement, not a substitute.
  • First steps toward care. For someone who has never been to therapy and feels nervous about starting, a chatbot can provide a low-stakes introduction to therapeutic concepts. It can help you build vocabulary for your experiences and reduce the intimidation of eventually seeing a human.
  • Accessibility bridge. When therapy is genuinely inaccessible due to cost, location, or availability, AI tools may provide some support where none would otherwise exist. This is particularly relevant in underserved areas or during times when traditional services are overwhelmed.

The supplement mindset

The key is approaching AI therapy as a supplement rather than a solution. Used this way, these tools can extend the benefits of human therapy or help you maintain progress during gaps in care. Used as a replacement, they may provide false reassurance while genuine needs go unmet.

For more context on how AI integrates with different therapeutic approaches, see our article on AI technology used in conjunction with different types of therapy.


Red flags that mean you need human support

Certain situations call for human care, regardless of how advanced AI tools become. If any of the following apply to you, please reach out to a real therapist:

  • You are experiencing thoughts of suicide or self-harm.
  • Your symptoms significantly interfere with daily functioning.
  • You are dealing with trauma, abuse, or complex relationship issues.
  • Your symptoms have persisted for more than a few weeks.
  • You have a history of mental health conditions that required professional treatment.
  • You are using substances to cope with emotional pain and want to stop.

These situations require the clinical judgment, crisis assessment skills, and genuine human connection that only a trained professional can provide.

Recognizing when a chatbot is not enough

Sometimes the sign that you need more support is subtle. If you find yourself returning to an AI tool repeatedly without feeling better, or if you are using it to avoid seeking human help, those patterns deserve attention. A chatbot that provides temporary comfort without lasting change may actually delay getting care that could truly help.

Finding the right balance

Technology and human connection do not have to be in opposition. The most thoughtful approach to mental health often includes both.

At Center for Mindful Therapy, we believe in meeting people where they are. If you have been using AI tools and found them helpful, that awareness of what supports you is valuable information for a human therapist. If you have been using them and still feel stuck, that tells you something important too.

Our collective includes over 125 therapists across the San Francisco Bay Area and throughout California via telehealth. Unlike an algorithm, our intake team knows each therapist personally and can help match you with someone whose approach, personality, and expertise fit your specific situation.

Looking for a therapist who can provide the human connection, clinical skill, and personalized care you deserve?

Browse our Therapist Directory



The human difference

What makes therapy with a real person different is not just expertise. It is the experience of being in relationship with someone who genuinely cares about your wellbeing, who notices what you cannot see yourself, and who walks alongside you through difficulty. That relationship itself is healing in ways that no technology can replicate.

Making an informed choice

AI therapy tools are neither saviors nor villains. They are instruments with specific uses and clear limitations. Your job is to understand what they can and cannot offer, protect your privacy, and recognize when you need something more.

If you are curious about AI mental health tools, approach them with open eyes. Read privacy policies. Notice whether they actually help you feel better over time. And stay alert to the difference between genuine progress and temporary distraction.

Most importantly, remember that seeking support, whether from an app or a person, reflects strength. The fact that you are researching this topic suggests you take your mental health seriously. That awareness is the foundation for whatever kind of care you ultimately choose.

Ready to talk to a human? Contact us to learn more about working with a therapist at Center for Mindful Therapy. We offer both in person sessions throughout the Bay Area and telehealth for anyone in California.


Have some questions first? You can always reach out here.