AI Therapy?
- Nicole Tuohey
- Apr 1
- 2 min read
Most people at this point are familiar with generative AI; it is the form of AI most commonly talked about today. In fact, we've all been using AI since long before ChatGPT, every time we typed something into a search engine like Google or looked up a recipe online.
What is different today is generative AI, which is being used across many types of activities and industries. Generative AI doesn't just send someone a link to a website; it can help generate ideas or content and engage with a person in something more like a conversation. One industry where generative AI is being used is mental health. Can something like ChatGPT replace traditional therapeutic models?
Certainly AI can produce answers based on assumed knowledge faster than most people can come up with one (even Jeopardy contestants), but is it helpful or potentially harmful? Generative AI builds its knowledge base from the top answers it collects, then spews those back out to people seeking answers to their questions. Sometimes it's right; sometimes it's very wrong.
What is important to remember is that therapy is part diagnostic and part relationship. Without human interaction, therapy is missing an important element of the healing process.
For example, let's say someone was experiencing a "cluster" of symptoms. They had heard references on social media to Borderline Personality Disorder and were starting to wonder whether their history of relationships ending disastrously, along with their daily emotional interactions with the world, pointed to BPD. They ask ChatGPT (or whatever their preference is) if they have this diagnosis. Perhaps it says yes: based on what you are describing, you do have BPD. What is next for this person? Does having an answer solve their lingering questions? Does it even answer their question?
No.
People are not just clinical terms or diagnosis codes. People are not clusters of symptoms. They are living, breathing, ever-changing, dynamic, and constantly processing information from the world around them. What makes good therapy good is the opportunity to interact with another living, breathing person, processing together and experiencing a healthier relationship dynamic than perhaps someone has ever experienced before.
Many people who experience mental health struggles are struggling deeply in their relationships, but others either distance themselves from the person struggling or avoid the tough conversations about what is happening in the dynamic. Therapy is so helpful because there is another person who doesn't avoid what is happening in the room; they directly engage with the person struggling to help them understand how things are breaking down.
So, will AI replace therapy or mental health treatment? I believe the answer is a resounding no. Can it help people begin to explore questions they may have only asked themselves internally? Yes. And hopefully, through those explorations, AI also leads people to mental health practitioners who can support another human being in ways AI can only "dream" of.