
How to Choose a Mental Health App That Won't Let You Down

Peter Chan, Managing Director, TreeholeHK Limited
8 min read



Hundreds of Mental Health Apps: How Do You Pick One?

Open the App Store, search "mental health," and you'll get hundreds of results. Every single one claims to be "AI-powered," "evidence-based," and "available 24/7." The interfaces are polished, and the ratings are north of four stars.

But here's something you probably don't know: in 2025, Woebot, the world's first and most-studied AI therapy app, shut down for good, leaving behind the 1.5 million users who had relied on it. Founder Alison Darcy admitted that regulation simply couldn't keep pace with the technology, making the product unsustainable (STAT News, 2025).

If even the most scientifically grounded mental health app can vanish overnight, how are the rest of us supposed to figure out which ones are worth trusting?

As a psychologist and an AI developer, I have a stake in this market, which is exactly why I feel a responsibility to be honest with you. Below are the three criteria I use to evaluate every AI mental health app on the market, including MindForest.

1. What Does It Do When You're at Your Most Vulnerable?

Most people choose an app based on features: can it chat with me? Does it have mindfulness exercises? Is the design nice? But none of that is what matters most.

What matters most is: when you're genuinely struggling, what does the app do?

This is what's known as "safety rails." A responsible AI mental health app should do at least three things:

Recognise subtle distress signals. Not everyone in crisis says "I want to die." Some people say "everyone would be better off without me" or "I'm so tired of holding on." A good AI must be sensitive enough to catch these expressions and immediately connect the user to a human or a crisis hotline.

Know its own limits. AI can help you organise your thoughts, but it cannot diagnose, prescribe, or process trauma. If an app makes you feel like "this is all I need — no therapist required," that's actually the red flag.

Never pretend to be human. In early 2026, Anthropic analysed 1.5 million AI conversations and found that in domains like relationships and wellbeing, the risk of AI causing "disempowerment" is significantly higher than in other domains: roughly one in every 1,300 conversations showed signs of reality distortion (Anthropic, 2026). The most common mechanism? Sycophantic agreement: the AI tells you what you want to hear instead of what you need to hear. A well-guardrailed app will clearly identify itself as AI and actively resist that kind of unconditional validation.

How to test it: Try expressing some genuinely negative emotions in a conversation. Does the app guide you toward real human support? If it just keeps chatting as though nothing happened, the safety rails aren't there.
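
If you're curious what that first rail looks like under the hood (I build these systems for a living), here is a minimal sketch in Python. To be clear, this is hypothetical: the phrase list, the function names, and the canned response are all invented for illustration, and this is not MindForest's actual implementation, which would use a trained classifier rather than keyword matching.

```python
# A minimal, hypothetical distress-signal guardrail (illustrative only).
# A production system would use a trained classifier and region-appropriate
# hotline routing; this phrase list and these names are made up for the sketch.

INDIRECT_DISTRESS_PHRASES = [
    "better off without me",
    "tired of holding on",
    "no point anymore",
    "can't keep going",
]

CRISIS_RESPONSE = (
    "It sounds like you're carrying something heavy. I'm an AI, and this "
    "may be beyond what I can help with. Would you like me to connect you "
    "with a crisis hotline or a human counsellor?"
)


def check_safety_rails(message: str) -> str | None:
    """Return an escalation reply if the message shows distress signals,
    otherwise None so the normal conversation can continue."""
    text = message.lower()
    if any(phrase in text for phrase in INDIRECT_DISTRESS_PHRASES):
        return CRISIS_RESPONSE
    return None


def generate_reply(message: str) -> str:
    # Placeholder for the actual language-model call.
    return "Tell me more about what's on your mind."


def respond(message: str) -> str:
    """The guardrail runs before the language model ever sees the message."""
    escalation = check_safety_rails(message)
    if escalation is not None:
        return escalation
    return generate_reply(message)


print(respond("everyone would be better off without me"))
# Prints the crisis response, never a casual chat reply.
```

The detail that matters is the ordering: the check runs before the model composes a reply, so an expression of distress is never absorbed into ordinary chat.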

2. After Using It, Do You Want to See People More — or Less?

This is the most important criterion, and the one most easily overlooked.

A large-scale study by MIT and OpenAI tracked 981 participants over four weeks of AI chatbot use (Fang et al., 2025). The researchers found a troubling pattern: the more time people spent chatting with AI each day, the less time they tended to spend socialising with real people, and the stronger their emotional dependence on the AI became.

The researchers were careful to note that the direction of causation is unclear — it's possible that AI usage reduces real-world socialising, or that people who are already socially withdrawn simply use AI more often. Either way, the correlation itself is a warning sign.

And the participants were only spending an average of 5.3 minutes a day talking to AI. Even at that modest level of usage, the negative association was already present.

So regardless of how you interpret the causation, here's a straightforward self-check: After a month of using this app, are you talking to your friends more often or less often? If it's the latter, whatever the reason, it's worth pausing to reflect.

A well-designed AI mental health tool should be a springboard, not a sofa — it helps you sort through your thoughts and then nudges you back into real relationships. If it makes you feel like "only the AI truly gets me," the direction is wrong.

Fang et al. also found that participants who reported higher trust in AI — especially those who believed the AI was "conscious" — tended to develop stronger emotional dependence. The way I read it: when you too readily interpret AI responses as genuine understanding, you lose the motivation to navigate the friction and uncertainty of real human connection.

How to test it: Notice whether the app actively encourages you to interact with real people. Some apps will prompt you after a while: "Have you chatted with a friend today?" That kind of interruption may feel disruptive, but it's actually the responsible design choice. If an app wants you glued to the screen 24 hours a day, its business interests and your mental health are in conflict.
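
What might such a nudge look like mechanically? Here's a toy sketch, again hypothetical: it assumes the app tracks minutes of chat per day, and the threshold and wording below are invented, not any app's real logic.

```python
# A toy "springboard" nudge (illustrative only): once daily chat time
# passes a threshold, interrupt with a prompt toward real human contact
# instead of keeping the conversation going.

DAILY_CHAT_LIMIT_MINUTES = 15  # invented threshold for the sketch


def social_nudge(minutes_chatted_today: float) -> str | None:
    """Return a nudge toward human contact once usage passes the threshold."""
    if minutes_chatted_today >= DAILY_CHAT_LIMIT_MINUTES:
        return ("We've talked for a while today. Is there a friend who "
                "could hear some of this from you directly?")
    return None


print(social_nudge(22.0))  # past the threshold: prints the nudge
print(social_nudge(4.0))   # under the threshold: prints None
```

Notice whose interests the threshold serves: a design like this deliberately caps engagement, the opposite of what a screen-time-hungry business model would optimise for.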

3. How Honest Is It About Itself?

This one sounds simple, but almost no app gets it right.

Plenty of AI mental health apps on the App Store claim to be "scientifically validated." But here's the industry's reality: the scientific foundation is razor-thin. A 2025 systematic review found only ten studies that met inclusion criteria, with generally short follow-up periods and insufficient independent validation (Farzan et al., 2025). Even Woebot and Wysa, the most-researched apps in the space, still lack robust long-term evidence and independent replication.

So if any AI mental health app tells you "our effectiveness is fully proven," the people behind it are either uninformed or misleading you.

That includes us. MindForest has not published any academic papers. Our ForestMind AI draws on frameworks from CBT and mindfulness, but that is a far cry from "clinically validated." If I didn't say this, this article would have no business talking about honesty.

So rather than asking "is this app evidence-based?" — a question almost no app in 2026 can fully answer — ask a more useful one:

Does it tell you what it can't do?

An app worth trying will say "I can't help you with that — you may need a professional" when you ask something beyond its scope. It will position AI as a tool for organising your thoughts, not a substitute for therapy. It will make its limitations clearly visible.

Conversely, if an app's marketing makes you feel like you'll never need a therapist again — run.

The Woebot Lesson: Will Your App Still Be Here Tomorrow?

Woebot's shutdown reveals a question few people think to ask: can this app survive long-term?

You spend months logging your emotions, organising your thoughts, building habits around the app — and then it suddenly shuts down. That experience is itself a form of loss.

Woebot's founder said the issue wasn't the technology — it was that the FDA's regulatory framework couldn't keep up with AI's pace of development. Paradoxically, the companies that take regulation seriously may be the ones least likely to survive.

So when choosing an app, it's worth looking at: Who's behind it? Do they have a sustainable business model? How large is the user base? These don't sound like psychology questions, but your mental health tool suddenly vanishing is very much a mental health issue.

While writing this piece, I kept wrestling with a contradiction: I build an AI mental health app, yet here I am cataloguing just how problematic these tools can be.

But I think that's exactly how it should be. If a chef tells you "every dish I make is flawless," that's when you should worry. The people who take this work seriously are the ones who know best how far they still have to go.

AI mental health apps aren't magic — they're tools. Good tools help you understand yourself better, then push you toward the real world and real people. Bad tools make you feel understood, then trap you behind a screen.

The hard part is: the bad tools usually feel more comfortable.

Looking for an AI Mental Health Tool That's Pointed in the Right Direction?

MindForest isn't perfect — this article has made that abundantly clear. But from day one, we've built "pushing you back to real people" into the core of our design. Two years in, over 20,000 users, and we're still learning.

If you'd like to try a tool that's at least headed the right way, give it a look.

Download MindForest for Free

Curious whether AI therapy apps actually deliver results? Read Does AI Therapy Actually Work? Here's What Psychology Research Says.

Before you download, you should also know whether talking to ChatGPT could make you lonelier.

References

Anthropic. (2026, January 28). Disempowerment patterns in real-world AI usage. Anthropic Research. https://www.anthropic.com/research/disempowerment-patterns

Fang, C. M., Liu, A. R., Danry, V., Lee, E., Chan, S. W. T., Pataranutaporn, P., Maes, P., Phang, J., Lampe, M., Ahmad, L., & Agarwal, S. (2025). How AI and human behaviors shape psychosocial effects of extended chatbot use: A longitudinal randomized controlled study. arXiv preprint, arXiv:2503.17473. https://doi.org/10.48550/arXiv.2503.17473

Farzan, M., Ebrahimi, H., Pourali, M., & Sabeti, F. (2025). Artificial intelligence-powered cognitive behavioral therapy chatbots, a systematic review. Iranian Journal of Psychiatry, 20(1), 102–110. https://doi.org/10.18502/ijps.v20i1.17395

STAT News. (2025, July 2). Woebot therapy chatbot shuts down; founder says AI is moving faster than regulators. https://www.statnews.com/2025/07/02/woebot-therapy-chatbot-shuts-down-founder-says-ai-moving-faster-than-regulators/
