AI & Mental Health: Promise, Peril and Where We Go From Here

To start, I want to acknowledge the elephant in the room: for some, this is going to be a touchy subject. Comfort with AI, and how deeply people immerse themselves in it, spans a fairly wide spectrum, especially clinically. On one end, we see therapists using AI for succinct note taking and treatment planning; I even have a colleague who has used it to generate EMDR protocols mid-session. For some therapists, this has breathed fresh air into their practice, and they report streamlined workdays. Even on the client side, there have been reports of people finding relief from emotional distress through AI. On the other end, there are therapists who are not just mortified but disgusted by the integration of artificial intelligence, which they see as watering down the importance and integrity of clinical work. The main critiques I have heard involve privacy concerns, the lack of regulation, and the looming fear that “AI is going to take our jobs!” (I doubt it, but we will table that perspective for now). This blog post is not meant to encourage increased AI usage, nor to argue for its cessation. Simply put, I hope you will explore alternative perspectives and engage in these discussions with an open curiosity, even if your mind remains unchanged.

The Peril

I think it is fair to say that a public overreliance on artificial intelligence threatens to undermine the heart of counseling. At least for myself and others in my grad school cohort, the first therapeutic modality we were taught was Person-Centered Therapy, also known as Rogerian Therapy. This humanistic approach places the relationship between therapist and client at the core of the therapeutic work. At the end of the day, it is eyebrow-raising, to say the least, to expect something artificial to healthily replace organic human connection. In fact, researchers have observed quite worrisome psychological behaviors associated with an overuse of AI for psychiatric symptoms: delusional thought, grandiosity, and an overall exacerbation of symptoms (Head, 2025). Yikes. Additionally, it is harmful for therapists to place blind trust in AI chatbots for clinical care. Researchers at Brown University have found that AI chatbots “routinely violate core mental health ethics standards,” with failures ranging from over-agreement to explicitly dangerous advice (Brown University, 2025).

All that to say, an overreliance on a relatively unregulated platform warrants sound caution from clients and clinicians alike.

The Promise

It can’t be all bad, right? Despite the aforementioned risks and dangers, it would be unfair of me to deny that, at minimum, there is potential for good. AI offers relatively low-cost options for clinicians seeking administrative support and can even be a helpful tool for clients looking for nearby care. Who knows? Perhaps one day, AI algorithms could even assist with diagnostic support through analysis of the DSM.

Where We Go From Here

In fairness, that is up to each individual to decide. Whatever stance you arrive at, I believe it is worth reaching with a cautious curiosity, regardless of where you lie on the spectrum of AI trust.

As for me and my practice, I am personally committed to providing quality care without the use of AI in clinical sessions. My clients have the assurance that, no matter the complexity of their symptoms, I do not use AI note takers or AI-generated mid-session trauma therapy protocols. In my sessions, I craft my therapeutic approach based on my clients’ verbal and nonverbal cues, evidence-based theory, and genuine human connection.

Resources

Head, K. (2025, September 5). Minds in crisis: How the AI revolution is impacting mental health. Journal of Mental Health & Clinical Psychology. https://www.mentalhealthjournal.org/articles/minds-in-crisis-how-the-ai-revolution-is-impacting-mental-health.html 

Brown University. (2025, October 21). New study: AI chatbots systematically violate mental health ethics standards. https://www.brown.edu/news/2025-10-21/ai-mental-health-ethics
