Across the country, it is now widely acknowledged that access to mental healthcare is just as important to overall wellness as access to clinical care.
Mental health conditions are incredibly common in the US, impacting tens of millions of people each year, according to the National Institute of Mental Health (NIMH). However, estimates suggest that only half of people with these conditions receive treatment, mainly due to barriers like clinician shortages, fragmented care, and societal stigma.
For the many individuals suffering from anxiety and depression, these existing barriers – coupled with the current healthcare crisis – can significantly interfere with the ability to carry out life activities.
“The prevalence of mental health disorders – particularly depression and anxiety – is high. If anything, the prevalence of these conditions has only increased as a result of COVID-19. The need is greater than ever now,” Jun Ma, PhD, Beth and George Vitoux Professor of Medicine at the University of Illinois Chicago (UIC) department of medicine, told HealthITAnalytics.
To broaden mental healthcare access for people with moderate depression or anxiety, UIC researchers are testing an artificial intelligence-powered virtual agent called Lumen. The team will train the tool to provide patients with problem-solving therapy, a structured approach designed to help people focus on learning cognitive and behavioral skills.
The two-phase, five-year project is funded by a $2 million grant from NIMH.
“The goal is to meet the many challenges of people who don’t have ready access to proven psychotherapy, which has been a longstanding issue,” said Ma.
“Over the years, my research team has done clinical trials testing the effectiveness and dissemination of different behavioral and psychosocial interventions. The results of that work, combined with the gaps that exist in practice and patient access, have really catalyzed the idea for this project.”
Using the same technology as Amazon’s Alexa, researchers will develop an app that will act as a virtual mental health agent, talking through steps and strategies with patients following a validated treatment protocol.
“If we prove this way of delivering problem-solving treatment is safe and effective, once we put it into production, anyone with access to Alexa would be able to access the program. We’re very early in the development phase, so it will probably be another few years before it’s widely available,” said Ma.
“We’re making good strides. We’re starting to conduct a user study on a small scale. And the immediate next step after this initial development and user testing phase will be a small-scale randomized controlled trial (RCT), in which we’ll enroll patients with depressive symptoms and/or anxiety.”
Individuals will complete eight one-on-one counseling sessions over 12 weeks. In each session, participants will identify a problem they view as affecting their life and as a source of emotional distress, and the counselor will help them define goals and possible solutions. Solutions are then compared, and counselors and patients work to make an action plan to implement the chosen solution.
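The session flow described above moves through a fixed sequence of stages, which can be sketched as a simple linear state machine. This is an illustrative model only; the step names and transitions are assumptions for clarity, not the study's actual treatment protocol.

```python
from enum import Enum, auto
from typing import Optional

class Step(Enum):
    """Illustrative stages of a problem-solving therapy session."""
    IDENTIFY_PROBLEM = auto()      # name a problem causing distress
    DEFINE_GOALS = auto()          # set goals with the counselor
    BRAINSTORM_SOLUTIONS = auto()  # generate possible solutions
    COMPARE_SOLUTIONS = auto()     # weigh the solutions against each other
    MAKE_ACTION_PLAN = auto()      # plan how to implement the chosen one

# The session advances linearly through the stages above.
SESSION_ORDER = [
    Step.IDENTIFY_PROBLEM,
    Step.DEFINE_GOALS,
    Step.BRAINSTORM_SOLUTIONS,
    Step.COMPARE_SOLUTIONS,
    Step.MAKE_ACTION_PLAN,
]

def next_step(current: Step) -> Optional[Step]:
    """Return the next session step, or None once the action plan is made."""
    i = SESSION_ORDER.index(current)
    return SESSION_ORDER[i + 1] if i + 1 < len(SESSION_ORDER) else None
```

Modeling the protocol as an explicit sequence like this is one way a virtual agent could track where a participant is within a session across eight meetings.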
Researchers will program Lumen using the Alexa Skills Kit to act as the virtual counselor working with participants, taking them through problem-solving steps and encouraging them to engage in meaningful and enjoyable activities to improve their emotional well-being.
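Skills built with the Alexa Skills Kit receive JSON requests from the Alexa service and return JSON responses, typically via an AWS Lambda function. A minimal handler in that style might look like the sketch below; the intent name and dialogue prompts are hypothetical and do not reflect Lumen's actual implementation.

```python
# Minimal sketch of an Alexa Skills Kit handler as a raw Lambda-style
# function. Intent names and prompts are hypothetical placeholders.

def build_response(speech_text: str, end_session: bool = False) -> dict:
    """Wrap spoken text in the standard Alexa response JSON envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event: dict, context=None) -> dict:
    """Dispatch on the Alexa request type delivered by the Skills Kit."""
    request = event["request"]
    if request["type"] == "LaunchRequest":
        return build_response("Welcome back. Let's continue your session.")
    if request["type"] == "IntentRequest":
        intent = request["intent"]["name"]
        if intent == "IdentifyProblemIntent":  # hypothetical intent name
            return build_response("Tell me about a problem on your mind.")
    return build_response("Goodbye.", end_session=True)
```

In a real skill, each treatment step would map to one or more intents defined in the skill's interaction model, with session attributes tracking the participant's progress.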
During the first phase, 80 study participants who report elevated depressive and anxiety symptoms will test the Lumen tool, with the potential for wider use going forward.
The researchers hope that the project will increase access to mental healthcare for those who need it most.
“One of the main advantages of using AI as a platform to provide therapy is the ability to scale and reduce significant barriers to access, as well as sustainability of proven psychotherapy such as problem-solving treatment,” said Ma.
“The technology can also be quite adaptable to individuals depending on when they need it and how they want to access it, and can potentially reduce barriers due to stigma.”
Despite the serious potential for these tools to broaden the availability of mental healthcare, Ma also noted that the use of AI in this area comes with several concerns – just as the technology does in any part of healthcare delivery.
“Like any novel treatment in early development, it’s unknown at this point what the effectiveness and the sustained impact of AI in psychotherapy will be. It’s certainly very worth exploring, as we are doing now,” she said.
“Patient privacy is a very important area that warrants not only additional research, but also additional legislation and regulation. Additionally, AI and the underlying algorithms are trained using existing data and information, and there could be unintended consequences due to implicit or explicit bias. It’s very important to have transparency in how the models are trained, as well as to ensure the data used to train such models is representative of the population.”
Ma’s statements align with those of other industry experts, who consistently highlight the necessity of safety, data privacy, and health equity when building and using these tools.
In a recent viewpoint published in JAMA, authors noted that chatbots and other AI-powered virtual agents are still relatively new, and much of the data available comes from research rather than widespread clinical implementation. For these reasons, healthcare leaders must continually evaluate the capacity of these tools to improve care delivery, the authors stated.
In the development stage of the Lumen tool, Ma’s team at UIC plans to do just that.
“If the small-scale RCT proves promising, then we’ll go on to a larger-scale RCT in which we’ll recruit 200 patients, again with depressive symptoms and/or anxiety, to further test the potential impact and effectiveness of Lumen,” Ma said.
Ultimately, the success of these tools in healthcare will depend on the industry’s ability to weigh possible risks and rewards.
“Given the potential concerns, it’s worth emphasizing the importance of balancing excitement for such novel treatments with caution. It’s a fine line between ensuring protection of patient privacy and confidentiality and not restricting the innovation in this area,” Ma concluded.