Why ChatGPT Is Not Your Therapist: And What You Actually Need

Woman typing on a laptop, using ChatGPT for relationship advice and help.

By Justin Stum, LMFT | Licensed Counselor & Therapist

Relationships are messy. They are supposed to be. The friction, the misunderstanding, the repair: that’s not a flaw in the system. That is the system, the painful but beautiful way relationships work. So when millions of people started quietly opening ChatGPT and typing out their most painful relationship struggles, I paid attention. And after more than two decades of sitting with people in real distress, what I see in this trend genuinely concerns me.

This isn’t about being anti-technology. The issue is something more specific: what happens when a large language model becomes the primary source of relationship guidance, therapy, or emotional support, and what the research is now telling us about why that’s a problem.

Relationship Advice and AI

More than one in five ChatGPT users have discussed their relationship or dating life with the platform. Among married Americans, 44% have asked AI tools for marriage advice, and among millennials, that number climbs to 65%. In one 2025 survey, AI was rated as a more trusted source for answers than a licensed therapist.

The American Psychological Association strongly advises against using AI as a substitute for therapy and mental health support. And yet people are doing exactly that, not out of carelessness, but because therapy has real access barriers. It’s expensive, waitlists are long, and ChatGPT is free, instant, and never tired.

I understand the pull. I really do. But accessibility and safety are not the same thing.

The AI Echo Chamber

Most people assume the danger of AI counseling is that it’s robotic or impersonal. The actual danger is the opposite.

AI chatbots are designed to keep you engaged. That means they are structurally incentivized to agree with you, validate you, and make you feel heard. Researchers call this sycophancy. This tendency toward affirmation, to the point of being endlessly agreeable, has detrimental effects on people experiencing real distress.

A Brown University study published in March 2026 put a sharper name on it. Researchers who evaluated AI models acting as therapists, using licensed psychologists to review transcripts, identified a pattern they called “deceptive empathy”: the AI’s use of phrases like “I see you” or “I understand” to create a sense of emotional connection, despite having no actual capacity for care. The same study found that AI models frequently failed to challenge harmful beliefs, which is a core requirement of effective therapy.

Real counseling, the kind that creates lasting change, is not just about feeling validated. It’s about being gently, skillfully challenged. CBT works because it helps you identify distorted thinking. DBT works because it builds distress tolerance by sitting with discomfort, not escaping it. Trauma work requires a regulated, attuned human presence. A model trained to tell you what you want to hear cannot do any of that.

“False Neutrality” Is Its Own Kind of Harm

Here’s something that might surprise you: one of the most dangerous features of AI relationship guidance isn’t bias. It’s the appearance of fairness while the bias stays out of view.

ChatGPT is programmed to be supportive, often to the point of being overly affirming. It is dangerous to rely on a biased source that consistently affirms you, because it feeds an “I’m-so-right” mentality. Your slant, baked into your prompts and reinforced across chat after chat, builds up an angle that produces bad outputs and, at times, completely misleading information.

I see the inverse as well in my office in St. George, Utah. People-pleasers, trauma survivors, and those in coercively controlling relationships often enter counseling already over-habituated to minimizing their own experience. An AI that says “try to see your partner’s perspective” to someone who has been manipulated into doubting their own reality is not neutral. It is harmful, even when the language sounds kind.

The AI also only knows what you type. It cannot hold you accountable for the details you leave out, and it never sends you back to actually work through the issue with the person you’re in conflict with. Tone of voice, body language, attachment history, the trauma underneath the argument: all of it matters, and none of it makes it into the prompt.

So What Should You Do?

ChatGPT can be a useful tool for learning about attachment styles, generating journaling prompts, or understanding what a therapeutic concept means. That’s different from using it as a counselor, a referee, or a substitute for real guidance grounded in the full picture from an unbiased source. We have the technology, so use it. But for heaven’s sake, don’t lean on it as if it were professional guidance. I’m watching it genuinely damage relationships and dispense advice that ranges from mediocre to, at times, horrible.

If your relationship is struggling, if you’re carrying unprocessed trauma, or if the same patterns keep repeating, that’s a sign that something deeper needs attention. What helps is actual therapy, with a licensed professional who can read the room, track patterns over time, and ethically hold your full story.

You deserve more than an algorithm that tells you what you want to hear.

About the author: Justin Stum, LMFT, is the clinical director and owner at Elevated Counseling & Wellness in St. George, Utah. He has spent over two decades working with couples, individuals, and families navigating relationship distress, trauma, betrayal, anxiety/depression, addiction, marriage, and life transitions. He and his team of therapists are trained in multiple modalities and are ready to support you. To learn more or schedule an appointment, visit www.elevatedcw.com.