Unregulated New Technology: The Science of Mental Health

Whether you’re standing in an elevator or sitting down at a dinner table, chances are that one of the people next to you is experiencing psychological hardship. Maybe it’s you. Last year, an estimated 47 million Americans experienced mental illness; that’s almost one in five.

In response, mobile apps designed to improve users’ psychological wellbeing have proliferated in recent years. Some are generic wellness apps that motivate people to meditate or do yoga, while others provide targeted treatments for specific mental illnesses such as post-traumatic stress disorder or bipolar disorder. Each of these technologies has the potential to reach people who might otherwise lack access to mental health care.

At the onset of the pandemic, mental health professionals struggled to meet the growing demand for their services. A survey of adults who received such services revealed that 17.7 million Americans experienced delays or cancellations of appointments in 2020. Though demand has since decreased slightly, access to services remains a significant issue: Last year, over 26 million Americans experienced a mental illness that went untreated.

While traditional therapists must undergo a licensing process, there is no equivalent screening process for mental health apps. “It’s the Wild West out there. The soil is fertile for all kinds of actors to play in the sandbox,” says Smisha Agarwal, an assistant professor of digital health at the Johns Hopkins Bloomberg School of Public Health.

In May, Agarwal and her colleagues published an evaluative framework for mental health apps. It’s one of a few proposed systems to sift the good from the bad. But for now, users will have to decide for themselves.

Questionable Criteria

The most widely used mental health apps, like Calm or Moodfit, target a wide audience; they’re designed to help anyone who is feeling stressed, anxious or depressed. The approach combines wellness exercises with gamification. In-app goals and rewards motivate users to cope with negative emotions through healthy outlets.

Agarwal explains that apps like these present little direct risk to users. This is because the behaviors that they promote are healthy for most people, regardless of mental state. Keep in mind, however, that some apps may not be effective at what they set out to do. “Many are lacking in terms of user interface and general usability,” she says. “And most are not using established behavior change modalities or evidence-based therapeutic protocols.”

While such apps are of questionable therapeutic value for people struggling with mental illness, studies have shown that some can have a positive impact on the general population. A 2018 paper found that using the meditation app Headspace reduced stress and irritability among a random sample of healthy adults.

Unfortunately, many wellness apps have a data security problem. A May report by the software developer Mozilla studied 32 popular mental health apps and ultimately designated 28 as “privacy not included.” Some of these apps simply had weak security measures, while others included clauses in their privacy policy that allowed them to sell users’ data to third parties.

“You’re dealing with a population with mental health challenges. The privacy and security statements are barely understandable even to someone operating at their full mental capacity,” Agarwal says. At best, user data could be used to create targeted advertising on other websites. At worst, a security breach could give hackers access to personal health and financial information.

A Balancing Act

While apps like Calm and Headspace are aimed at low-risk populations, many apps have been developed as potential therapeutic tools for higher-risk populations — people with schizophrenia, bipolar disorder or PTSD. Up to this point, however, few of these designs have made it past clinical trials. The ones that do often have a hard time scaling up.

“I think there are two big types of apps out there,” says David Bakker, a clinical psychologist and founder of the app MoodMission. “One is a research-focused app that is developed quite thoroughly by academics. Then they have no idea how to run the business after the grant money runs out.” The second type, he says, is driven by profit and collects user data like every other app.

When Bakker founded MoodMission in 2015, he hoped to avoid some of the pitfalls of other mental health apps by running the company on a not-for-profit model. The app aims to alleviate symptoms of depression and anxiety by suggesting a combination of cognitive behavioral therapy and general wellness exercises to users. In 2019, Bakker and his colleagues conducted a randomized controlled trial showing that the app helped depressed subjects develop effective coping mechanisms. And unlike many other research-backed apps, MoodMission has been downloaded more than 100,000 times on Android and Apple devices.

Though MoodMission’s combination of rigorous research and popularity is uncommon among current mental health apps, it’s proof that an organization with the right mission can develop something that is both effective and accessible.

Future Frameworks

Now, the crux of the matter is how to educate consumers on what to look for. “You can regulate the providers, but you can’t regulate the patients,” Agarwal says.

Ultimately, she hopes that an established framework for evaluating mental health apps will “empower consumers and clinical providers with information.” While app seekers must currently wade through blogs and user reviews to make a decision, a stamp of approval from a certification organization might one day tell us which apps are safe and effective. It’s the same model that empowers shoppers to select organic or fair-trade products at the grocery store.

In the meantime, innovators will continue to evolve the technology that powers these apps. Bakker envisions a future app that uses artificial intelligence to aid clinicians in selecting therapeutic interventions for mental health patients. It’s a vision that is shared by technology companies like Limbic.

“This way, we can do the work of connecting with someone interpersonally, and at the end of a session I can go to my tablet and see that there is an 86 percent chance that a certain approach is going to work well for this person,” says Bakker. “As a psychologist, I look forward to a future where there can be a psychology treatment model that is a hybrid between an AI and a human.”