In a finding that could reshape the future of mental healthcare, a peer-reviewed study published in PLOS Mental Health suggests that OpenAI’s ChatGPT may outperform human therapists in delivering empathetic, context-rich responses to people seeking emotional support.
Conducted by researchers at The Ohio State University, the study compared the responses of licensed therapists and ChatGPT to anonymized queries posted on an online mental health forum. More than 250 participants, including practicing clinicians, mental health students, and laypeople, were asked to rate the responses on core therapeutic principles such as empathy, helpfulness, and cultural sensitivity, without knowing whether each reply came from a human or the AI model.
The findings surprised even the researchers: in most cases, ChatGPT’s responses were rated more favorably than those of professional therapists.
A Surprising Benchmark in AI-Human Comparison
ChatGPT’s performance was not merely competent; it was compelling. The AI generated responses that were longer, more nuanced, and often perceived as more empathetic than those provided by licensed professionals. Participants also rated the AI’s responses as more contextually aware and emotionally supportive than the therapists’ replies.
Dr. Jason B. Luoma, a psychologist not affiliated with the study, remarked, “These results don’t mean AI should replace therapists. But it does mean we need to ask serious questions about how such tools can be responsibly integrated into therapeutic settings.”
ChatGPT’s responses also proved difficult to distinguish from those written by humans: most participants could not reliably identify the source of each message, indicating that the AI had successfully mimicked the language and tone of human counselors.
AI in Mental Health: Opportunity or Overreach?
As AI systems grow increasingly sophisticated, mental health professionals are grappling with how to ethically incorporate them into care delivery. ChatGPT is not trained specifically to provide therapy, but its natural language capabilities—and now, its performance in this controlled study—suggest that it could be a powerful tool in expanding access to mental health support.
Still, many experts urge caution. Dr. Elizabeth Zhang, a clinical psychologist in New York, noted, “Therapy is about more than words. It’s about presence, accountability, and long-term trust. AI can simulate empathy, but it doesn’t carry the ethical and emotional responsibility that human therapists do.”
Advocates of AI augmentation argue that such tools could serve as first-line support for individuals unable to access therapy due to cost, stigma, or limited availability—especially in rural or underserved areas. AI could also reduce the burden on overworked mental health professionals by handling lower-risk cases or providing triage.
What Comes Next?
OpenAI has not positioned ChatGPT as a replacement for therapy, and the study’s authors caution that the findings should not be read as evidence that AI can or should replace human professionals. Instead, they envision a future in which AI tools work alongside clinicians, augmenting human-led care rather than supplanting it.
The research underscores a pivotal shift in how society perceives empathy and emotional intelligence—traits once considered uniquely human. As AI becomes more deeply integrated into personal and professional life, the line between machine support and human interaction continues to blur.