Communicating with computers has become quite common in modern American society.
If you have used an Amazon device, asked Siri for information on your iPhone, or called a consumer helpline, you’re likely familiar with this phenomenon:
- A welcoming, reassuring voice asks what you’re trying to accomplish.
- You provide the requested details.
- The system processes your request.
- You get what you need without ever interacting with an actual human.
Human-computer conversations aren’t limited to customer service calls, getting directions, or launching a playlist. For example, when you use one of those small instant-messaging chat boxes that pop up in the corner of certain websites, you may be communicating with a computer program without even realizing it.
Many sites use “chatbots” that harness the power of artificial intelligence to answer customer questions or resolve basic problems. In these circumstances, chatbots serve the same purpose as computer-managed phone systems: They provide basic information and resolve common problems in a manner that allows staff members to focus on more complex concerns.
As the capabilities of artificial intelligence have increased in recent years, chatbots and other AI-supported services have been able to address more complicated challenges, including mental health treatment. Chatbots are currently being used to conduct initial screenings, schedule appointments, and provide guidance for people who may be struggling with mental illness.
As they do in the customer service field, chatbots can provide basic mental health treatment services, which can free up healthcare professionals to address more sensitive matters. They can also expand access to care in underserved areas, and they are available 24 hours a day, seven days a week. But questions remain about the effectiveness of this technology.
How Do Mental Health Chatbots Work?
In general, here’s how a chatbot works:
- You visit a website.
- At some point, a small chat window opens, often in the lower right-hand corner of the screen.
- You may hear a sound to alert you to the new window.
- The chat window will typically display an opening message like, “Hi there! I’m Bob (or some other name). How can I help you today?”
- You can ask questions, and the chatbot will respond (ideally with appropriate answers).
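The generic flow above can be sketched as a minimal rule-based chat loop. This is a deliberate simplification with hypothetical canned responses; real mental health chatbots such as Woebot rely on natural language processing rather than simple keyword matching:

```python
# Minimal rule-based chatbot sketch (hypothetical responses; real mental
# health chatbots use natural language processing, not keyword matching).

RESPONSES = {
    "hello": "Hi there! I'm Bot. How can I help you today?",
    "sad": "I'm sorry to hear that. Would you like to try a short breathing exercise?",
    "anxious": "Anxiety is common. Try naming three things you can see right now.",
}

def reply(message: str) -> str:
    """Return the canned response for the first keyword found in the message."""
    text = message.lower()
    for keyword, response in RESPONSES.items():
        if keyword in text:
            return response
    # Fallback prompt when no keyword matches, keeping the conversation open.
    return "Tell me more about how you're feeling."

if __name__ == "__main__":
    print(reply("Hello!"))
    print(reply("I feel anxious today"))
```

Real services layer far more on top of this loop, such as intent classification, mood tracking, and escalation paths to human clinicians, but the basic prompt-and-respond structure is the same.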
In the case of a mental health chatbot, you may either visit a website or download an app. Depending on which type of chatbot you are using, you may then need to log in with a username and password. Some chatbots provide general information, while others focus on personal details. In the latter case, having a password-protected account is important for preserving your privacy.
Once you’ve accessed the chatbot and logged in (if necessary), the bot will prompt you with a series of questions.
The types of questions the mental health chatbot asks, and how it responds to your answers, can vary depending on which service you are using. Here are a few examples:
- Woebot is a mental health chatbot that uses natural language processing to carry on casual conversations with a therapeutic focus. Each day when you log into the app, Woebot may ask you how you’re feeling or what you’ve been doing. Based on your reply, the chatbot might send you a suggestion or a video. Woebot’s feedback is grounded in the principles of cognitive behavioral therapy (CBT).
- If you use Moodnotes, you will be prompted to enter and rate your mood for the day. You will also be asked to describe your emotions and journal about topics that are important to you. This chatbot also uses the principles of CBT. It can identify evidence of self-defeating thought patterns and help you adopt a more positive outlook.
- The Wysa website describes its chatbot as an “AI coach.” Wysa will ask you about your emotions and then make suggestions that can help you develop greater resilience and improved mood. Wysa’s advice is based on several methodologies, including CBT, dialectical behavior therapy (DBT), and motivational interviewing.
Are Mental Health Chatbots Popular & Effective?
Now that you understand a bit more about the purpose and functions of mental health chatbots, it’s time to address two important questions: Will people use chatbots for mental health treatment, and are these chatbots effective?
The answer to the first question appears to be a solid yes.
An April 2022 article on the Psychiatric News website included the following statistics about the popularity of mental health chatbots:
- 22% of adults have used a mental health chatbot.
- 47% of adults said they would consider using a mental health chatbot if they needed to.
- 60% of adults who had used a chatbot said they first did so during the COVID-19 pandemic.
- 44% of chatbot users said they used the services exclusively and were not currently seeing a human mental health professional.
Respondents to the survey that collected this information said their reasons for using chatbots for mental health treatment included price, 24/7 access, and ease of use.
The second question is a bit more difficult to definitively answer. Assessing the effectiveness of any mental health technique or service can be a complex endeavor.
The Psychiatric News article notes that Woebot claims its service can help reduce symptoms of depression. Here’s what a few other sources have to say on this topic:
A 2019 meta-analysis in the Canadian Journal of Psychiatry concluded that there is the “potential for effective, enjoyable mental health care using chatbots.” This conclusion was based on a review of 12 separate studies. The team that conducted this meta-analysis also reported the following:
- The studies showed that mental health chatbots can identify people who have depressive disorders and reduce symptoms among those who have major depressive disorder.
- Chatbots pose minimal risk for harm. Among 759 recruited participants, only one person had a negative mental health response. This individual began to experience paranoia and subsequently withdrew from the study.
- Two commonly cited benefits of mental health chatbots are psychoeducation and adherence.
- Mental health chatbots appear to promote use among people who would not otherwise seek professional treatment.
The analysis team noted two concerns about using mental health chatbots:
- Smartphone-based chat apps do not always respond to threats of suicide in the most appropriate manner.
- Some users of mental health chatbots may develop distorted parasocial relationships with the apps.
Chatbot Effectiveness for Depression, Anxiety, & Stress
In November 2022, Frontiers in Digital Health published a Brazilian study that assessed the effectiveness of mental health chatbots for people who were struggling with depression, anxiety, and stress. Details from this study included the following:
- The research team reviewed anonymized data from 3,629 adult users of a mental health app called Vitalk. The data was collected over a one-month period.
- Increased engagement with the app was associated with improved outcomes in terms of depression and anxiety symptoms.
- The average anxiety score of the Vitalk users improved, moving from moderate to mild.
- The average depression score of the study’s subjects improved, moving from moderately severe to moderate.
- The average stress score improved, moving from severe to mild.
- Daily use of the app ranged from 0.29 to 40.25 responses, with an average of 8.17 responses per user each day.
“Large numbers of users showed clinically significant, reliable change,” the study team reported, “moving from a clinical to non-clinical range for anxiety and depression as measured by the PHQ-9 [Patient Health Questionnaire] and GAD-7 [Generalized Anxiety Disorder assessment].”
Concerns About Mental Health Chatbots
As the April 2022 Psychiatric News report demonstrated, the news about chatbots isn’t all positive.
A July 2020 report in the Journal of Medical Internet Research, which reviewed 12 studies from 11 countries, raised the following additional concerns about mental health chatbots:
- Most of the reviewed studies had a small sample size, high risk of bias, and low quality of evidence.
- Some studies had contradictory results for certain outcomes.
- Studies did not reveal a clinically significant difference between chatbots and other types of mental health treatment.
- There is a lack of studies that have assessed chatbots’ effectiveness with many types of mental health disorders.
- There is minimal evidence to support the effectiveness and safety of chatbots.
“Given the weak and conflicting evidence found in this review, users should not use chatbots as a replacement for mental health professionals,” the study’s authors wrote.
“Instead, health professionals should consider offering chatbots as an adjunct to already available interventions to encourage individuals to seek medical advice where appropriate and as a signpost to available support and treatment,” the team concluded.