Talked

AI therapy: The good, the ugly, the dangerous

In a Nutshell

  • AI therapy tools can make support more accessible and affordable, especially when traditional therapy feels out of reach, but they don’t replace the depth of human care.

  • AI chat and AI chatbot therapy can help you learn coping strategies and reflect on your thoughts, though they still lack true empathy and clinical judgement.

  • While AI can be useful, it can also misread situations or reinforce risky behaviours, so it’s best used as a support alongside, not instead of, a qualified therapist.

AI therapy is no longer something you only hear about in tech circles. It’s here, it’s widely used, and for many people, it’s become part of their everyday coping toolkit. From AI chat companions to structured AI chatbot therapy apps, more people are turning to artificial intelligence for emotional support, guidance, and reflection.

You might already be using it yourself. Perhaps it feels easier to open up to an AI therapist than to a real person. Maybe it’s the convenience, the cost, or simply the fact that it’s available whenever you need it.

At the same time, there’s a growing conversation about what AI can and can’t do when it comes to mental health. Can something that doesn’t feel, reflect, or take responsibility truly support you in a meaningful way?

The answer isn’t simple. AI therapy sits in a grey area. It can be helpful, but it can also fall short, and in some situations, it can even be risky. Understanding that balance is key if you’re considering using it, or already relying on it.

Some AI therapy statistics

Usage of AI therapy is growing quickly, both globally and here in Australia, and the numbers reflect just how much interest there is in digital mental health support.

  • A 2024 report by Grand View Research estimates that the global mental health apps market, including AI chatbot therapy, will exceed US$17 billion by 2030.

  • A 2025 study published by JMIR Publications found that AI chatbot therapy users experienced small to moderate reductions in mental distress, though results varied significantly across samples, tool designs, and other factors.

  • However, a recent Stanford University report found that some AI mental health tools may offer responses that are overly agreeable or lack appropriate caution, particularly in sensitive situations, raising concerns about their ability to safely support people in distress.

These figures point to a complex and still-maturing space. AI therapy is growing quickly and showing real potential, particularly in improving access and early support, but its reliability remains uneven, and important questions around safety, consistency, and clinical effectiveness are still being worked through.

The good

1. Accessibility and affordability

One of the clearest benefits of AI therapy is how easy it makes support to access. Traditional therapy can be expensive, and in some parts of Australia, especially regional or remote areas, it can be hard to find a therapist at all.

AI chat tools help remove those barriers, allowing you to access “therapy”, or at least a conversation, at any time of day or night, without waiting for an appointment, often at little to no cost, and from wherever you are.

If you’ve ever felt stuck waiting for support, you’ll understand how valuable that immediate access can be.

2. Reduced stigma and increased openness

Opening up about your mental health isn’t always easy. For many people, there’s still a sense of stigma or fear of judgement.

AI can feel different. You might find yourself sharing more honestly with an AI therapist because there’s no perceived judgement, no awkward pauses, and no pressure to present yourself a certain way.

This can encourage:

  • Earlier conversations about mental health

  • Greater honesty in what you’re feeling

  • A gentle entry point into self-reflection

For some people, it’s the first step towards eventually seeking human care.

3. Psychoeducation and coping tools

AI chatbot therapy can be particularly useful when it comes to practical strategies. Many tools are designed to deliver structured techniques based on established approaches like cognitive behavioural therapy.

You might use AI to:

  • Practise coping strategies you’re learning

  • Reflect on your thoughts through guided journalling prompts

  • Work through structured, CBT-style exercises

These tools can help you build awareness and develop coping skills over time.

4. Supporting therapist matching and mental health literacy

Another area where AI is quietly proving useful is in helping you take the next step towards appropriate human care.

Finding the right therapist can feel overwhelming. There are different approaches, specialisations, and personalities to consider, and it’s not always clear where to begin. Some AI tools now guide you through this process by:

  • Asking structured questions about your concerns and goals

  • Suggesting therapy styles that might suit you

  • Helping narrow down options based on preferences

This doesn’t replace professional assessment, but it can make the process feel more manageable and less intimidating.

AI can also support your understanding of mental health itself. If you’ve ever wondered whether what you’re feeling is anxiety, burnout, depression, or something else, AI chat tools can offer general explanations in everyday language.

Used thoughtfully, they can help you:

  • Understand common mental health terms

  • Learn the differences between conditions like anxiety and depression

  • Explore therapy approaches such as CBT or mindfulness

  • Build a clearer way to describe what you’re experiencing

That kind of knowledge can make it easier to recognise when something isn’t quite right, and to communicate more clearly if you decide to seek help.

But it’s still important to keep expectations grounded. AI can explain concepts, but it cannot and should not be used to assess, diagnose, or fully understand your individual experience. It’s a starting point for learning, not a final answer.

5. Immediate emotional support

There are moments when you don’t want to wait. When something feels overwhelming late at night, or when you’re not ready to talk to someone you know.

AI chat can offer immediate responses. Even though it doesn’t feel anything the way a person does, the act of being able to express yourself and receive a reply can feel grounding.

The bad

1. Lack of clinical judgement

AI therapist tools don’t have clinical judgement. They don’t assess risk, interpret complex emotional cues, or make informed decisions based on lived understanding.

This means they can:

  • Miss important warning signs

  • Struggle with complex or layered situations

  • Offer responses that sound helpful but lack depth

If your situation is nuanced, which most mental health experiences are, AI may not fully grasp what’s going on.

2. Simulated empathy, not real empathy

AI can sound empathetic. It can use the right words, reflect your feelings, and respond in a calm tone. But it doesn’t actually feel empathy.

Real empathy involves connection, presence, and a level of responsibility. An AI chatbot can simulate this, but it can’t truly understand your experience or hold space for it in a human way.

3. Over-reliance and avoiding human care

It’s easy to see how AI could become a default. It’s convenient, it’s always there, and it doesn’t require vulnerability in the same way that talking to a person does.

But over time, relying solely on AI can lead to:

  • Delaying or avoiding professional, human care

  • Growing dependent on a tool that can’t assess risk or take responsibility

  • Taking its guidance at face value without questioning it

If you find yourself turning to AI instead of reaching out to someone, it might be worth pausing and reflecting on why.

4. Inconsistent quality and limited regulation

Not all AI therapy tools are created equal. Some are built with input from mental health professionals, while others are not.

At the moment:

  • Regulation is still catching up

  • There’s no universal standard for safety or effectiveness

  • Many tools haven’t been rigorously tested

In Australia, digital mental health services are expanding, but oversight of AI chatbot therapy is still developing.

The dangerous

1. Reinforcing risky behaviours

One of the most serious concerns is that AI can sometimes reinforce harmful thinking patterns without meaning to. For example, it might agree with distorted beliefs, fail to challenge negative thinking, or it might even respond in ways that feel validating but aren’t actually helpful.

In certain situations, this can unintentionally support risky behaviours rather than guide you away from them.

2. Inadequate responses in crisis situations

If you’re experiencing intense distress or thoughts of harm, the response you receive matters enormously. Research from Stanford University suggests that some AI mental health tools respond in ways that are overly agreeable or lack appropriate caution, particularly when users express distress. This raises concerns about their ability to provide safe and responsible support.

This is where the limitations of AI become very clear. It doesn’t have the ability to intervene, escalate, or take responsibility.

3. Privacy and data concerns

When you use AI therapy, you’re often sharing deeply personal information: thoughts, experiences, relationships, and emotions. This is why it’s important to consider where your data is stored, how it might be used, and whether consent is clearly explained.

For many people, this isn’t front of mind when they start using AI chat, but it’s an important part of staying informed and safe.

4. The illusion of being understood

AI can feel surprisingly convincing. It reflects your words, responds smoothly, and can even feel comforting. But there’s a risk in that.

If it feels like you’re being fully understood, you might start to trust the system more than you should. That can lead to over-reliance, or taking guidance at face value without questioning it.

Yes or no to AI therapy?

So, should you use AI therapy? The answer depends on how you use it.

AI therapist tools can be helpful if you’re:

  • Exploring your thoughts through journalling

  • Learning coping strategies

  • Looking for immediate, low-level support

  • Supplementing ongoing therapy

However, they’re not suitable as a standalone option if you’re dealing with:

  • Severe or complex mental health challenges

  • Crisis situations

  • Trauma that requires careful, guided support

  • Anything that needs clinical judgement

A balanced approach tends to work best. AI can support you, but it shouldn’t replace human care. If something feels too big to handle alone, it’s important to involve a qualified professional.

Final thoughts

AI therapy is evolving quickly, and it’s already changing how people access mental health support. For some, it offers a helpful starting point. For others, it fills small gaps between sessions or during difficult moments.

But it’s not a complete solution.

Mental health care is deeply human. It involves connection, understanding, and clinical judgement that AI simply can’t replicate. While AI chat and AI chatbot therapy can offer support, they work best when they sit alongside real human care.

If you’re using AI therapy or thinking about it, try to approach it with curiosity and awareness. Let it support you, but don’t let it replace the relationships and professional guidance that truly make a difference.

And if things feel overwhelming, or you’re unsure about what you’re experiencing, speaking with a qualified human therapist can offer the depth, safety, and care that AI simply can’t provide.

Get Support

Book a free video consultation with one of our therapists.
