August 15, 2025

Can AI Chatbots Help with Mental Illnesses?

Rainy Cheung
Summer Intern

As the field of artificial intelligence continues to evolve, so does the promise of large language models (LLMs) (Kathpalia, 2023). They can now analyse user inputs, assess psychological states, and generate tailored advice, prompting the public to wonder whether these AI systems can truly provide meaningful psychotherapy, given the complex nature of mental illnesses and human emotions.

As we delve deeper into the advantages and limitations of AI chatbots, the answers may hold the key to unlocking new frontiers in delivering personalised psychological support to people struggling with mental illnesses.

What Is an AI Chatbot?

An AI chatbot is a computer program designed to simulate conversation with a human user. It uses natural language processing and machine learning techniques to understand the user's input, formulate an appropriate response, and engage in a back-and-forth dialogue (Codecademy, n.d.). This conversational ability is what allows AI chatbots to help users with mental illnesses (Inkster et al., 2018).
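
For readers curious about the mechanics, here is a minimal sketch of such a dialogue loop in Python. It assumes the OpenAI Python SDK; the model name and system prompt are illustrative placeholders, not recommendations.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# The running message history is what lets the chatbot hold a
# back-and-forth dialogue rather than answer one-off questions.
messages = [{"role": "system", "content": "You are a supportive, non-judgmental listener."}]

while True:
    user_input = input("You: ")
    messages.append({"role": "user", "content": user_input})
    response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    print("Bot:", reply)
```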

How May AI Chatbots Help Users with Mental Illnesses?

1. Provision of Around-the-Clock Service

Friends and loved ones may not always be available, and they may struggle to understand your feelings even when you explain them. AI chatbots, by contrast, are available 24/7, providing around-the-clock service for individuals who need personalised mental health support. Users with mental illnesses can comfortably convey their feelings to an AI chatbot, which can offer non-judgmental responses and suggestions during the conversation.

2. Relieving Discomfort When Opening Up

People may feel incredibly tense when raising sensitive topics such as mental health concerns, fearing they will be judged or misunderstood. Given the impersonal, non-threatening nature of AI chatbots, users may feel more comfortable disclosing information and exploring their inner turmoil in conversation. This lowered psychological barrier could encourage more people with mental illnesses to take the crucial first step toward addressing their emotional needs and accessing professional support.

3. The Potential of Self-Awareness Tools in Mental Health Apps

Users with mental illnesses may seek deeper insight into their condition by taking psychometric assessments such as the Patient Health Questionnaire-9 (PHQ-9). These tests help users take an active role in understanding and monitoring their emotional well-being, and they have the potential to identify specific needs and areas where users may benefit from additional support or intervention.
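
To illustrate, PHQ-9 scoring is simple arithmetic: nine items are each rated from 0 to 3, and the 0–27 total maps to standard severity bands. The helper below is a hypothetical sketch of that mapping, intended only to show how such a self-awareness tool works, not to serve as a diagnostic instrument.

```python
def phq9_severity(item_scores: list[int]) -> str:
    # Each of the nine PHQ-9 items is rated 0-3 ("not at all" to "nearly every day").
    if len(item_scores) != 9 or any(s not in range(4) for s in item_scores):
        raise ValueError("PHQ-9 requires nine item scores, each between 0 and 3")
    total = sum(item_scores)
    # Standard severity bands for the 0-27 total score.
    if total <= 4:
        band = "minimal"
    elif total <= 9:
        band = "mild"
    elif total <= 14:
        band = "moderate"
    elif total <= 19:
        band = "moderately severe"
    else:
        band = "severe"
    return f"Total {total}/27: {band} depressive symptoms"

print(phq9_severity([1, 2, 1, 0, 1, 2, 1, 0, 1]))  # Total 9/27: mild depressive symptoms
```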

Despite these potential advantages, AI chatbots are not flawless. Distel (2023) outlines several potential disadvantages of utilising AI chatbots to assist users with mental illnesses.

Potential Problems of AI Chatbots in Addressing Mental Illnesses

1. Risks of AI-Generated Advice

Unlike human beings, AI chatbots lack the critical-thinking skills needed to detect and filter out misleading information. They may extract and summarise material directly from online sources when formulating recommendations and responses, even when those sources contain inaccurate or incomplete information. The advice an AI chatbot offers may therefore be partly wrong, or even harmful to users with mental illnesses, for example by triggering mood swings.

2. AI Chatbots' Lack of Human Empathy

With the advancement of technology, AI chatbots can now generate personalised advice regarding users' conditions. Yet they are still far from imitating human emotions and understanding human experience, a capacity that is particularly vital when supporting users with mental illnesses. Without genuine emotional attunement, AI chatbots cannot provide the human-like encouragement that helps users navigate their distress and challenges.

3. Limitations of AI Chatbots in Comprehending Contextual Meanings

AI chatbots generate their responses based solely on the inputs and prompts provided by users. This poses a significant challenge, as users may sometimes communicate in ambiguous, biased, or contextually incomplete ways. Without a human's ability to grasp subtext, AI systems may resort to generic, off-topic, or irrelevant responses. In the worst cases, this could lead a chatbot to give misleading, inappropriate, or even destructive advice to vulnerable users with mental illnesses. The following case study shows in more detail how AI chatbots can fail to understand contextual meaning.

Case Study: An AI Chatbot (Tessa) Helping People with Eating Disorders

The National Eating Disorders Association (NEDA) developed an AI chatbot, Tessa, to support people with eating disorders (Arnold, 2023). Tessa is a guided conversational tool that uses therapeutic methods and a fixed set of responses to address body image issues.

Tessa's debut was decently received, with 375 of 700 users rating it helpful, leading researchers to conclude that Tessa demonstrated the potential of AI chatbots as a cost-effective, easily accessible, and non-stigmatising option for eating disorder prevention and intervention (Xiang, 2023).

However, despite its advantages, Tessa was not flawless: a problem surfaced involving it giving inappropriate guidance and harmful advice. When users with anorexia asked Tessa how to lose weight, instead of challenging that mindset, it suggested diet plans that were potentially harmful to them. To contain the controversy, NEDA took Tessa offline within hours of the news breaking (Xiang, 2023).

Flaws clearly exist in Tessa's programming, and adjustments are needed. However, the case study also serves as a reminder that how users prompt AI chatbots matters. Codecademy (n.d.) suggests the following strategies for using AI chatbots more safely and effectively.

Strategies for AI Prompting

1. Right Context and Background

In Tessa's case, a more appropriate way of framing the question would be for the user to first state that she has an eating disorder, so the AI chatbot acknowledges and prioritises the complexity of the situation rather than focusing solely on weight loss methods. The chatbot can then suggest gentle exercise and mindfulness practices to build a healthy relationship with food and the body, and point to support resources and communities for individuals recovering from eating disorders.
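
As a minimal sketch of this strategy (again assuming the OpenAI Python SDK, with purely illustrative message text), the background can be supplied as an explicit first message so every later reply is conditioned on it:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Illustrative prompts only: state the background first, then ask the question,
# so the reply is conditioned on the user's actual situation.
messages = [
    {"role": "user", "content": (
        "For context: I am recovering from an eating disorder, so please do "
        "not suggest restrictive dieting or calorie counting."
    )},
    {"role": "user", "content": "How can I build a healthier relationship with food?"},
]

response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(response.choices[0].message.content)
```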

2. Sequential Prompting

Users can also try sequential prompting, which builds a conversation with the AI in which each prompt builds on the previous responses. For example, a user could first ask for general weight-management tips, then add specific prompts about prioritising recovery from an eating disorder and regaining self-confidence. This allows the AI chatbot to refine and expand its responses to meet the user's expectations and safeguard their health.
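
The sketch below shows one way this could look, using a hypothetical ask() helper that keeps the running history so each new prompt builds on the replies before it (same SDK assumption as above, and the prompts are illustrative):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
history = []

def ask(prompt: str) -> str:
    # Append every prompt and reply to one shared history, so each
    # new prompt builds on everything said before (sequential prompting).
    history.append({"role": "user", "content": prompt})
    response = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(ask("I am recovering from an eating disorder. What daily habits support recovery?"))
print(ask("Building on that, how can I regain self-confidence without focusing on my weight?"))
```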

3. Precise and Concise Phrasing

Users can keep their prompts clear and concise to avoid overloading the AI chatbot with information. For example, instead of asking, "Is there a way to engage in safe and healthy weight loss without engaging my eating disorder?", users can extract the keywords and split the question into two separate prompts, one about safe and healthy weight management and a follow-up about protecting their recovery from the eating disorder, to reduce the chance of confusing the chatbot.

The above strategies may help users with mental illnesses harness AI chatbots as a self-help tool. When employing them as a supplementary self-help tool, however, users must exercise caution, maintain a critical eye, and be willing to seek professional support when necessary, given the limitations of technology in addressing the nuanced complexities of human nature.

As the capabilities of AI chatbots continue to evolve, their potential to assist individuals with mental illnesses holds both promise and limitations. They can serve as valuable self-help tools, providing a non-judgmental outlet for users to explore their thoughts and emotions, which may encourage more people to take that crucial first step in navigating their mental illnesses.

However, there has been controversy over whether AI and its derivatives, such as AI chatbots, may sooner or later replace human beings. In reality, it is essential to recognise that AI chatbots, no matter how sophisticated, cannot fully replace the depth and nuance of human-to-human therapeutic relationships. They should therefore be viewed as complementary tools rather than substitutes for traditional psychotherapy. By striking the right balance between technological innovation and human expertise, the promise of AI-powered mental health support can be realised to the benefit of individuals and communities worldwide.

References

Arnold, C. (2023, August 1). Can AI chatbots help with eating disorders? Upworthy Science. https://upworthyscience.com/eating-disorders/particle-1

Codecademy. (n.d.). AI prompting best practices. https://www.codecademy.com/article/ai-prompting-best-practices

Distel, A. (2023). The benefits and disadvantages of AI chatbots [Infographic]. AdvertiseMint. https://www.advertisemint.com/the-benefits-and-disadvantages-of-ai-chatbots-infographic/

Inkster, B., Sarda, S., & Subramanian, V. (2018). An empathy-driven, conversational artificial intelligence agent (WYSA) for digital mental well-being: Real-world data evaluation mixed-methods study. JMIR mHealth and uHealth, 6(11), e12106. https://doi.org/10.2196/12106

Kathpalia, B. (2023, August 8). Introduction to large language models (LLMs). Leena AI Blog. https://leena.ai/blog/large-language-models-llms-guide/

Xiang, C. (2023, May 25). Eating disorder helpline fires staff, transitions to chatbot after unionization. Vice. https://www.vice.com/en/article/n7ezkm/eating-disorder-helpline-fires-staff-transitions-to-chatbot-after-unionization
