As a record number of U.S. workers struggle with mental health issues and stress, more employers are offering new chatbot apps to help them.

A survey of 457 employers conducted this past summer by Willis Towers Watson found that 24% of them offer a “digital therapeutic” for mental health support.

Some 15% of the businesses surveyed were considering adding this type of offering in 2024 or 2025, the professional services company found. Typically, these apps are provided as a voluntary or wellness benefit.

Some apps feature chatbots that can hold counseling-type conversations with users, while other wellness apps can help diagnose depression or identify people at risk of harming themselves.

At the same time, these chatbots and other mental health apps have generated controversy, with some experts warning that they are not equipped to handle serious mental health issues and that they are no replacement for human therapists.

However, as long as the U.S. has too few therapists to meet demand and artificial intelligence continues to evolve, these chatbots are likely here to stay.

Examples

Recently, Amazon announced that, as part of its employee benefits package, it would offer the therapist-like app Twill. The platform says that Taylor, its clinician-trained chatbot, “learns, interprets and understands each person’s needs and goals to guide them towards personalized care.”

Another product on the market is Wysa, an AI-driven app that received a breakthrough device designation from the Food and Drug Administration, putting it on track for fast-track review. This came after an independent peer-reviewed clinical trial, published in the Journal of Medical Internet Research, found the app to be effective in managing chronic pain and the depression and anxiety associated with it.

Also on the market is Woebot, which offers mindfulness and self-care exercises (with responses written by teams of therapists) for postpartum depression.

Pros and cons

The apps vary in how much they incorporate AI, and in how much leeway they give AI systems. The companies behind them say they build safeguards into their apps and that certified psychiatrists oversee the applications.

Proponents of mental health apps and chatbots say they can address issues like anxiety, loneliness and depression. Chatbots and apps can also provide 24-hour support and meet the needs of people who may have a hard time finding a counselor or fitting therapy into their schedules.

On the other hand, there is little data or research showing how effective, or how safe, these apps are, and most have not been approved by the FDA.

Many of these mental health apps have different specialties, such as treating anxiety, attention-deficit/hyperactivity disorder or depression. Others can help diagnose mental health problems or predict issues that can lead to self-harm.

Often, the apps will include disclaimers that they are “not intended to be a medical, behavioral health or other health care service” or “not an FDA-cleared product.”

Concerns have also been raised about some of these apps. In March 2023, the Federal Trade Commission reached an $8 million settlement with BetterHelp, an online counseling service, over allegations that it shared user data with advertising partners.

Another company, Replika, updated its app last year after users complained that its chatbot engaged in overly sexual conversations, and even harassed them.

The takeaway

Mental health care is an increasingly important part of employee benefits offerings. Since the onset of the COVID-19 pandemic, 94% of employers have made investments in mental health care, according to research by Mercer.

As these apps improve and become more widespread, it’s likely your employees will encounter them through their group benefits, or that the apps will be among your voluntary benefit offerings.