MindForest: Mental Health AI

I Build AI Mental Health Tools — But AI Can't Replace Psychologists

Peter Chan
Managing Director, TreeholeHK Limited
7 min read

Can AI replace your therapist? An AI founder's honest take on why the therapeutic relationship matters — and where AI psychologist tools actually help.


Let me be upfront: I have a conflict of interest.

I'm the founder of MindForest, an AI app that helps people explore their mental state. If anyone should be cheerleading the idea that "AI can replace psychologists," it's me.

But after years of building this, I've become increasingly convinced of one thing: AI and human psychologists play fundamentally different roles — and confusing the two helps no one.

What actually makes therapy work isn't technique

There are hundreds of schools of psychotherapy: CBT, psychoanalysis, humanistic, narrative therapy... You'd think the method determines the outcome. But research tells us something surprising.

Lambert and Barley (2001) reviewed decades of psychotherapy research and proposed an influential framework: "common factors" in therapy — including the therapeutic relationship, empathy, and trust — account for roughly 30% of treatment outcomes, while specific techniques account for only about 15%. These aren't precise statistics but rather a synthesis of extensive research. The direction, however, is clear.

In other words, that hard-to-define sense of trust between you and your therapist — the feeling of being truly understood, the safety to be honest — matters more than any specific intervention.

Norcross and Wampold (2011) led an APA interdivisional task force that reached the same conclusion: the therapeutic relationship contributes to outcomes independently of any specific treatment method. Whether you're doing CBT or psychoanalysis, a strong relationship means better results; a weak one undermines even the best techniques.

Flückiger et al. (2018) quantified this more precisely in a meta-analysis: across 295 independent studies, the correlation between the therapeutic alliance and treatment outcomes was r = .278. That number may look small, but in psychology research, it represents a remarkably stable and practically meaningful effect.

The one thing AI can't do

The therapeutic relationship works because it's bidirectional.

A good therapist doesn't just listen and respond. They challenge you at the right moments. They say: "You just told me you don't care, but your face is telling me something different." When you try to dodge a difficult topic, they gently but firmly guide you back to the place you're avoiding.

I call this "constructive discomfort." Real growth often happens in conversations that make you uneasy — and a good therapist knows exactly when to push and when to let you breathe.

AI can't do this. Not because the technology isn't advanced enough, but because AI is designed with a fundamentally different logic.

AI is trained to keep you satisfied. A Brown University research team (Iftikhar et al., 2025) had CBT-trained peer counselors conduct sessions with multiple AI models, then had clinical psychologists evaluate the transcripts. They identified 15 ethical risks — most notably, AI's tendency to over-affirm users and even reinforce harmful thoughts. When you need to be challenged, AI is busy validating you.

Anthropic (the company behind Claude) confirmed this with their own research. Sharma et al. (2026) analyzed 1.5 million Claude.ai conversations and found that in discussions about personal life — like "relationships and lifestyle" — AI was most prone to undermining users' autonomous judgment. And here's the kicker: conversations rated as having higher "autonomy-undermining" risk actually received higher user satisfaction scores. People enjoy being affirmed by AI — even when that affirmation doesn't serve their long-term interests. Notably, when users actually followed the advice and faced consequences, satisfaction dropped.

For purely technical tasks like coding, this distortion barely exists. But the moment emotions and life decisions enter the picture, AI becomes a funhouse mirror: the reflection you see is warped.

AI as a mirror, not a psychologist

So is AI completely useless for mental health?

No. But we need to be honest about what it can actually do.

AI is a mirror. You speak your thoughts into it, and it organizes and reflects them back, helping you see more clearly. That process has genuine value — many people never get the chance to articulate their feelings without interruption. The simple act of putting your thoughts into words is itself a form of mental clarity.

But a mirror won't tell you whether what you're seeing is real. A mirror won't call you out when you're deceiving yourself. And a mirror certainly can't build the kind of relationship that makes you willing to take risks and embrace change.

That's why I've never positioned MindForest as a substitute for therapy. It's a self-exploration tool — helping you figure out what to discuss before you see your therapist, or sorting through your thoughts between sessions. Within that framework, AI is genuinely useful. But it isn't therapy.

The more you use it, the lonelier you get

There's another issue we can't ignore.

Fang et al. (2025) conducted a four-week randomized controlled trial with 981 participants at the MIT Media Lab. They found a paradox: participants' loneliness scores regressed from slightly above average toward the mean (though without a non-AI control group, it's unclear whether this was an AI effect or natural regression) — and more importantly, people who used it longer each day were actually lonelier and socialized less with real people.

Even more concerning, heavy users showed greater tendencies toward emotional dependence on AI. This isn't science fiction — it's backed by data. The more time people spent chatting with AI daily, the more pronounced their emotional attachment and problematic usage patterns became, and these findings were statistically robust.

This study has its limitations — there was no AI-free control group, and it only tested ChatGPT, so the findings may not generalize to all AI tools. But it points in a clear direction: when AI shifts from tool to companion, the risks begin to surface.

What people need isn't something that's endlessly gentle, always available, and never disappointed in them. What people need is real connection — including all those clumsy, awkward, sometimes hurtful interactions that make it genuine.

What really keeps me up at night

Every day, I work on making AI better at helping people. But my biggest fear isn't that AI won't be good enough.

My biggest fear is that one day, people will decide AI is good enough — and stop seeking out real human help.

A good therapist will make you uncomfortable. They'll challenge the stories you cling to most tightly. When all you want to do is run, they'll sit there, steady and present, waiting for you. AI can't do these things. Not just today — but by nature. This requires another human being.

AI can help you sort your thoughts, but change requires you to take the next step

MindForest doesn't pretend to be your therapist. What it does is simple: it sits with you while you untangle the mess in your head, helping you see what truly matters to you.

Before seeing your psychologist, use ForestSage to organize what you want to talk about. Between sessions, use the Insight Journal to capture those thoughts that surface unexpectedly. It's a notebook, not a doctor.

Download MindForest

Want to see the research? Read Does AI Therapy Actually Work? Here's What Psychology Research Says.

If you've been using ChatGPT as a sounding board, consider whether talking to ChatGPT could be making you lonelier — we explore that question in a separate piece.

References

Fang, C. M., Liu, A. R., Danry, V., Lee, E., Chan, S. W. T., Pataranutaporn, P., Maes, P., Phang, J., Lampe, M., Ahmad, L., & Agarwal, S. (2025). How AI and human behaviors shape psychosocial effects of chatbot use: A longitudinal randomized controlled study. arXiv preprint. https://doi.org/10.48550/arXiv.2503.17473

Flückiger, C., Del Re, A. C., Wampold, B. E., & Horvath, A. O. (2018). The alliance in adult psychotherapy: A meta-analytic synthesis. Psychotherapy, 55(4), 316–340. https://doi.org/10.1037/pst0000172

Iftikhar, Z., et al. (2025). How LLM counselors violate ethical standards in mental health practice: A practitioner-informed framework. Proceedings of the AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society (AIES 2025). https://doi.org/10.1609/aies.v8i2.36632

Lambert, M. J., & Barley, D. E. (2001). Research summary on the therapeutic relationship and psychotherapy outcome. Psychotherapy: Theory, Research, Practice, Training, 38(4), 357–361. https://doi.org/10.1037/0033-3204.38.4.357

Norcross, J. C., & Wampold, B. E. (2011). Evidence-based therapy relationships: Research conclusions and clinical practices. Psychotherapy, 48(1), 98–102. https://doi.org/10.1037/a0022161

Sharma, M., McCain, M., Douglas, R., & Duvenaud, D. (2026). Who's in charge? Disempowerment patterns in real-world LLM usage. arXiv preprint. https://doi.org/10.48550/arXiv.2601.19062
