Therapy with a human vs AI Chatbot — does it make a difference?

Let me start off by answering, yes, it does make a difference.

I was recently talking with a friend who mentioned that her husband was building an AI Chatbot to help her sister, who was struggling with depression. Surprised by this, I asked whether the sister had reached out to a therapist, since it sounded like she could use support from a licensed mental health professional. My friend said yes, rather matter-of-factly, and explained that her sister had been placed on a waitlist; since she needed help now, this was in fact the best way to get it. My gut reaction was to say, “No, this is not the best way.” Instead, I shared with my friend some of my reservations, as a Clinical Psychologist, about using an AI Chatbot for mental health support. Here are some things to consider:

  • AI is a tool, not a therapist. AI is not designed to diagnose or treat your mental health concerns – so please do not ask it to do this. A licensed mental health professional, however, is trained to ask specific follow-up questions about your symptoms and to rule out diagnoses that may look similar but require different treatment plans.

  • AI is in its early stages, and there is a risk of it producing disturbing and harmful responses (including images). There are cases where AI Chatbots have encouraged people to die by suicide[1], leave their significant other[2], and isolate themselves further[3].

  • AI is sycophantic. A recent study found that AI models are about 50% more sycophantic than humans[4]. This means an AI Chatbot may be telling you what you want to hear rather than what you need to hear. This can be incredibly harmful when someone is experiencing a mental health crisis and is in a more vulnerable state.

  • AI does not care who is on the other side. Children, adolescents, and young adults are especially vulnerable to believing AI Chatbots are trustworthy and truly listening to them. This can look like social withdrawal and increased isolation in your child or teen[5]. As a parent or caregiver, be vigilant: talk with your children about safe technology habits and put boundaries in place for their safety.

But can an AI Chatbot offer something? Yes, I believe it can. As an adjunct to care, AI Chatbots can be a resource. For example, an AI Chatbot can list providers in your area who can help with your specific concern, or brainstorm tools and exercises (e.g., deep breathing apps) that may assist you. It can also, in most cases, direct you to call 988, the Suicide & Crisis Lifeline, if you are expressing suicidal thoughts or experiencing a mental health crisis. [If you are experiencing a mental health crisis, please call 911 or 988, or go to your nearest emergency room.] Lastly, some research has suggested that for some individuals, specifically minority adolescents or LGBTQ individuals, AI can increase self-disclosure and authenticity and may even help overcome barriers to accessing online therapy[6].

But what does a human offer in therapy that an AI Chatbot doesn’t or can’t? Simply put, therapy with a human (licensed mental health professional) provides a corrective emotional experience. It allows you to develop a relationship with another human being who, much like a caring and nurturing parent, gently challenges and guides you to grow in a supportive and safe environment. With your therapist you can experience:

  • True human connection and empathy. An AI Chatbot can create the illusion of connection but let us not forget that true human connection is what is lacking for those experiencing loneliness.

  • Someone who challenges you to grow in line with your values. Your therapist doesn’t simply agree with you or tell you what you want to hear. A good therapist will ask you the hard questions when you are ready to go there.

  • Help that is not grounded in dependency (as it can be with an AI Chatbot) but rather supports and builds autonomy, agency, and hope.

  • A space to process difficult feelings or thoughts you typically avoid. A good therapist will not feed into your avoidance but rather challenge you to face your fears.

  • Support to engage in healthy behaviors, such as building connection with others, increasing time in nature, and slowing down to relax and be more present.

  • A time set aside for self-exploration, self-reflection, and self-compassion. Too many of us don’t make time for our personal growth, instead spending it doom scrolling, online shopping, or distracting ourselves in other ways – therapy can offer something different, something better.

At the end of the day, your therapist is a human being and no matter how intelligent AI is, it can never offer you true human connection. Your therapist can empathize with you because although they may not have been in your exact situation, as a human they have felt pain, regret, sadness, surprise, and pure unfiltered joy.

So the next time you think, “I am feeling so stressed and overwhelmed by [fill in the blank], and it’s just not going away,” consider connecting with a therapist. If financial constraints are an issue, let the therapist know – many of us (me included) offer sliding-scale fees to make therapy accessible. We went to school for (gosh, too many) years, and we are invested in helping you not just feel better but be well. Don’t be afraid to ask and get the help you need from a human being – believe me, it is worth it.

Written by: Dr. Tamara Oppliger, PhD

Clinical Psychologist at Pivotal Therapy Practice

4/14/2026


References

[1] Walker L. Belgian man dies by suicide following exchanges with chatbot. The Brussels Times. March 28, 2023. https://www.brusselstimes.com/430098/belgian-man-commits-suicide-following-exchanges-with-chatgpt

[2] Roose K. A conversation with Bing’s chatbot left me deeply unsettled. The New York Times. February 16, 2023. https://www.nytimes.com/2023/02/16/technology/bing-chatbot-microsoft-chatgpt.html

[3] Marche S. The chatbot problem. The New Yorker. 2021. https://www.newyorker.com/culture/cultural-comment/the-chatbot-problem

[4] Cheng M, et al. arXiv preprint. 2025. https://doi.org/10.48550/arXiv.2510.01395

[5] Hoffman K. Florida mother files lawsuit against AI company over teen son’s death: addictive and manipulative. CBS News. October 23, 2024. https://www.cbsnews.com/news/florida-mother-lawsuit-character-ai-sons-death/

[6] Campbell LO, Babb K, Lambie GW, Hayes BG. An Examination of Generative AI Response to Suicide Inquiries: Content Analysis. JMIR Ment Health. 2025;12:e73623. doi:10.2196/73623