AI is a Game Changer for Mental Health

Mental health care in America needs help. AI will come to the rescue.

Dr. John Grohol
Dec 19, 2023 · 6 min read
[Image: A woman talking to her AI assistant. AI may help with the mental health care crisis in the U.S.]

It’s not clear how mental healthcare in the United States has gone downhill so quickly.

People open up to me every week about how challenging it is to access care covered under their health insurance plan. Even well-funded plans seem to suffer from a shortage of in-network providers. The shortage is so bad that trying to see a specialist, such as a child psychologist or a psychiatrist, can land you on a 6–12 month waiting list.

We know that about 19% of Americans receive some form of mental health treatment, with younger adults more likely than older adults to seek out psychotherapy. But nearly twice as many people take medication as receive psychotherapy, according to the U.S. Centers for Disease Control and Prevention.

Research has shown that the problem of accessing and receiving adequate treatment in the U.S. for a mental health condition is just the tip of the iceberg. About half of the people with mental health concerns simply never seek out treatment in the first place. Reasons include the lingering stigma around treatment, its cost, not knowing where to begin, feeling put off by the search for a therapist, or a past bad experience with a therapist, among others. It’s sad that despite all the progress we’ve made over the past three decades in education to help de-stigmatize mental health concerns, 47% of Americans still believe seeking out treatment is a “sign of weakness,” according to a 2021 poll of 2,000 adults.

Artificial intelligence (AI) has the potential to help with some of these issues.

AI Can Make Treatment Providers More Effective

When demand for therapists outstrips the supply of their services, one solution might be to help therapists make better use of their time in general, but especially the time spent in psychotherapy sessions.

Psychotherapists often feel overwhelmed by their caseloads, the hassle of working with insurance companies to receive payment for their services, the denial of needed treatment by those same insurers looking to keep their costs low, and the isolation of the work itself. For many therapists, psychotherapy can be a never-ending grind, punctuated by occasional moments of hope and growth.

What if a psychotherapist could better utilize the time they do have, especially the time spent being in the moment with a patient? David Brooks writes in How to Know a Person about the value of truly being seen and heard by another. This is exactly what therapists are trained to do with their patients.

But too often, therapists spend a good portion of their sessions going over the past week’s events in the patient’s everyday life. While such catchups are important to review, especially with regard to trying out new thoughts and feelings in different situations, they can sometimes drift away from the focus of the person’s therapy.

An AI assistant could help. Imagine such an assistant or companion that the patient could talk to on a regular, ongoing basis, whenever they felt like it, and that offered empathy and support in a non-judgmental manner. Some people might enjoy such an ongoing relationship, a sort of interactive journal or diary where they could share their thoughts and feelings openly.

Then imagine that the AI assistant could summarize those discussions in a very simple format, noting conversation highlights, and with the patient’s permission, share those highlights with their therapist. Not only could it provide a therapist with greater insight into the patient’s actual week (versus the curated version they share with the therapist), it could help them better utilize the time spent with the patient. Instead of wandering through the first 10 or 15 minutes of session with a catchup, the therapist can focus on some of the opportunities and challenges the patient spoke of that directly relate to the person’s issues.

An AI assistant could also help with encouraging a patient to complete any homework assigned by the therapist, as a lack of motivation to complete such assignments is commonplace (Stephenson et al., 2023).

AI Can Help Alleviate Treatment Provider Shortages

A similar AI assistant could also be used to help alleviate the lack of providers in so many areas. While it’s far too early to suggest AI can replace actual psychotherapy, AI assistants can be trained in many psychotherapeutically derived techniques and strategies proven to help people. For nearly three decades, hundreds of thousands of people have used interactive, online self-help psychoeducational modules built on cognitive-behavioral therapy (CBT) principles, with a lot of empirical support (see, for example: Wang et al., 2023; Twomey et al., 2017; Twomey & O’Reilly, 2016; Guille et al., 2015; Twomey et al., 2014; Powell et al., 2013; Farrer et al., 2012). There’s no reason these same techniques can’t be baked into an AI to help walk people through scientific psychological strategies for their mental health.

AI assistants may offer an on-ramp into regular psychotherapy too. Many people aren’t ready for psychotherapy with an actual therapist, but might be ready to talk about their problems with an AI that is supportive, empathetic, and offers advice when asked. After some time spent with the AI, a person may feel more ready to address their thoughts, feelings, and concerns with a human therapist. Meanwhile, the AI can help them learn how to have open, honest conversations with someone else, priming the person for a more fruitful therapeutic relationship in the future.

I see similar things happening every day in our online support groups, My Support Forums.org, that allow people to interact with one another in a self-help environment. People find comfort and support in sharing their mental health challenges with each other, even anonymously. The act of listening to another person is a very powerful intervention in and of itself, one that’s often minimized outside of psychotherapy. An AI assistant can help a person realize the value of being listened to.

AI can also be an extremely private, secure way to share your innermost thoughts and feelings without fear of recrimination or judgment. It’s a common misconception that AI conversations are automatically sent back to a server to help improve the AI; whether that happens is up to the specific developer and how they train and deploy their model. An AI can be as private as the company makes it. For mental health AI assistants, we would want them to be extremely secure and private.

AI Only Works if Properly Trained

These scenarios only work with a properly trained and vetted AI. Without such training and carefully crafted, expert-reviewed prompts, AI can often be problem-oriented and seem less than empathetic. It can be argumentative, or offer advice when none was asked for.

A well-trained AI, on the other hand, can learn to be more empathetic and supportive. It can stop itself from saying things that may be misunderstood or even hurtful. Developers can add guardrails so that the AI understands what makes a good response and what makes a poor one, and it can learn from its interactions along the way. It all comes down to the special sauce of AI: proper training and prompts. And it all needs to be reviewed by actual mental health professionals to ensure that everything being done is ethical and offers genuine potential benefit.

In 2024, we’ll see more and more of such AI assistants and companions become available. It’s an exciting time to see how exactly these AI assistants will be used and how much they’ll be accepted by people looking for help with a mental health issue.

References

Farrer, L., et al. (2012). Web-Based Cognitive Behavior Therapy for Depression With and Without Telephone Tracking in a National Helpline: Secondary Outcomes From a Randomized Controlled Trial. JMIR.

Guille, C., et al. (2015). Web-based cognitive behavioral therapy intervention for the prevention of suicidal ideation in medical interns: A randomized clinical trial. JAMA Psychiatry.

Powell, J., et al. (2013). Effectiveness of a web-based cognitive-behavioral tool to improve mental well-being in the general population: randomized controlled trial. JMIR.

Stephenson, C. et al. (2023). User experiences of an online therapist-guided psychotherapy platform, OPTT: A cross-sectional study. Internet Interventions.

Twomey, C. et al. (2017). Effectiveness of an individually-tailored computerised CBT programme (Deprexis) for depression: A meta-analysis. Psychiatry Research.

Twomey, C. & O’Reilly, G. (2016). Effectiveness of a freely available computerised cognitive behavioural therapy programme (MoodGYM) for depression: Meta-analysis. Australian and New Zealand Journal of Psychiatry.

Twomey, C., et al. (2014). A randomized controlled trial of the computerized CBT programme, MoodGYM, for public mental health service users waiting for interventions. Br J Clin Psychol.

Wang, Q. et al. (2023). A systematic review and meta-analysis of Internet-based self-help interventions for mental health among adolescents and college students. Internet Interventions.


Dr. John Grohol

Founder, Psych Central (7M users/mo before 2020 sale); Co-Founder, Society for Participatory Medicine; Publisher & Contributor, New England Psychologist