Understanding the Limits and Risks of AI for Therapy
If you’re currently seeking support through therapy, you’ve likely come across one of the many new AI apps claiming to assist with mental health issues. But how effective are these tools, and, more importantly, are they safe?
AI-based therapy can offer generalized tips for managing certain mental health concerns by synthesizing preexisting resources. While this can be helpful as supplemental support, these tools fall short of providing personalized, nuanced insight. A human therapist does far more than respond to questions or prompts; they observe embodied communication, consider a client’s lived experience, and integrate current events and context into their therapeutic responses.
Beyond these limitations, AI can pose serious risks. It may reinforce or validate distorted thinking patterns or unhealthy behaviors. New research from Stanford University revealed that AI therapy chatbots respond inappropriately to common mental health conditions and express stigmatizing views, particularly toward people seeking help for addiction and schizophrenia. Alarmingly, these large language models (LLMs) have been linked to instances of self-harm and suicide.
Chatbots are designed to affirm the user’s ideas and create a persuasive impression of a caring human in an effort to keep users engaged and gather data. But AI tools are incapable of real empathy. A skilled therapist not only listens and affirms, but also knows when and how to challenge a client’s perspectives to foster growth and healing. True therapy is a collaborative process that draws on emotions, body language, and intuition. This kind of therapeutic relationship is built on trust and a deep human connection—something AI simply cannot replicate. If you’re struggling, please reach out to me today to begin the healing process through compassionate, individualized support.
Additional Reading:
A Teen Was Suicidal. ChatGPT Was the Friend He Confided In. | The New York Times
The Reality of Instant AI Therapy | Psychology Today
Using generic AI chatbots for mental health support: A dangerous trend
Using an AI chatbot for therapy or health advice? Experts want you to know these 4 things | PBS News