His solution turned out to be simpler than he expected: a Slack channel, a private online community where people in life-or-death struggles reach out in real time to save one another.
Schleien, who works in business technology in New Jersey, called the channel 18percent for the 18 percent of the U.S. population living with mental illness. Launched this year by a nonprofit group that Schleien founded with David Markovich, an Internet marketing consultant, the channel has about 150 members who have sent more than 11,000 messages.
Someone who might “feel alone and isolated can reach out at 2 a.m. and find someone on the other side of the world,” Schleien said. “And that connection, finding that other peer who has walked your walk, could potentially save a life.”
Mental-health care is no longer limited to psychiatric drugs and face-to-face counseling. Half of all U.S. counties have no psychiatrist, psychologist or social worker, and that lack of access, plus cost, has put traditional treatment beyond the reach of many. The gap is now being filled with “digiceuticals,” part of a new field of mental-health technology that includes smartphone chatbots that text self-help advice to those dealing with depression, as well as virtual-reality exposure therapy for individuals battling anxiety disorders.
And there is more on the electronic horizon, for both the worried well who want to relieve everyday stresses and the more seriously mentally ill. Pepper, a humanoid robot that moves naturally and is capable of “reading” a person's emotions, has conducted meditation and mindfulness classes for British college students. For people with schizophrenia, avatars that embody a person's auditory hallucinations also show promise.
Some critics call this technology scene a kind of “Wild West.” Of the more than 165,000 health-related apps worldwide, 30,000 are dedicated to mental health, according to the IMS Institute for Healthcare Informatics. But few have been rigorously tested, and a lack of regulatory oversight has prompted concerns about “whether these apps have sufficient safeguards, and what are the indications for who should be using them and in what way,” said Nicole Martinez-Martin, a bioethicist and research fellow at the Stanford Center for Biomedical Ethics.
The National Institute of Mental Health acknowledges both the pros and the cons of the “new technology frontier,” with the risk to health privacy among the additional negatives. The advantages, it says, include anonymity, low cost, convenience and 24-hour service.
“Technology may be a good first step for those who have avoided mental health care in the past,” NIMH notes.
Young people seem especially receptive given their constant use of electronic devices, say technology designers and researchers. For them, turning to a smartphone or tablet when they have a problem is second nature. The ease and anonymity of apps also lessens the perceived stigma of seeking help.
“Personally, I always have my phone,” said Kayleigh, a 16-year-old in the Midwest with anxiety that is sometimes debilitating. The high school student, who spoke on the condition that her last name not be used, began using the 18percent Slack channel just a few weeks ago. As someone who has been bullied, she said, she likes that the conversations are confidential and that what's shared can't be searched on the Internet. Not having to set up an account, as most online forums require, was reassuring for privacy, and knowing that others on the channel have similar issues made her feel comfortable reaching out.
“If I walk into a room, and I’m anxious or alone in my house,” Kayleigh said, “I will jump on [Slack] and say, 'Hey, guys, I need some help now.' Usually someone will say, 'Yeah, what’s up?' pretty quickly.”
The Slack channel is free. Other apps charge, although their price is usually far less than the cost of seeing a psychotherapist. Ginger.io, for instance, allows a user to text a mental-health specialist and receive a reply within a minute, no matter the time of day. It costs anywhere from $129 a month for 24/7 access to a mental-health coach to $349 a month for the same service, plus two video sessions with a licensed psychiatrist.
But a simpler option like notOK, a kind of digital panic button developed this year by siblings Hannah and Charlie Lucas in Georgia, will set you back a mere $1.99 a month. When someone taps the notOK app, a 13-word text — “Hey, I'm not OK. Please call me, text me, or come find me” — is sent along with a GPS notification of the person's location to a select number of “trusted contacts” chosen by the user. These might be friends, family members or peers, all of whom must live close enough to be able to drive to the person if necessary. (Users are instructed to first call 911 if they are in danger of hurting themselves.)
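The alert flow the Lucases describe can be sketched in a few lines. This is a hypothetical illustration of the idea (function names, phone numbers and the location-link format are assumptions, not notOK's actual code):

```python
# Illustrative sketch of a notOK-style panic alert: the fixed distress
# text plus a location link is prepared for each trusted contact.
# Hypothetical names; not the app's actual implementation.

def build_alerts(contacts, lat, lon):
    """Compose the distress message and location for every trusted contact."""
    message = "Hey, I'm not OK. Please call me, text me, or come find me."
    location = f"https://maps.google.com/?q={lat},{lon}"
    return [(number, f"{message} {location}") for number in contacts]

# One tap fans the same message out to all nearby trusted contacts.
alerts = build_alerts(["+15550100", "+15550101"], 33.7490, -84.3880)
```

In a real app the resulting (contact, message) pairs would be handed to an SMS service rather than returned to the caller.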
Inspiration for the app came from Hannah after she developed a disorder that causes frequent fainting. For the depression and anxiety that followed, she knew she needed a way to get assistance immediately.
“Depression inhibits a person's ability to ask for help,” 16-year-old Hannah explained last week. “Depression tells you you're alone and no one wants to listen to you. This makes it easier to say you need help, without trying to put it into words.”
Both she and her brother, who is 13, have heard from users about how much notOK has meant to them — for a variety of issues, including panic attacks and recovery from drug addiction. The app has been downloaded about 12,000 times since it was made available in late January, with approximately 3,000 hits of the button every month, according to Charlie.
Elsewhere on social media, the constant beeps, buzzes and flags that signal a new posting or notification have prompted complaints. But mental-health apps are different, proponents say. Some include “those same tools — notifications — but in a productive way” that can support users, said Roxy Fata, a content strategist at Digital Third Coast, which evaluates apps.
The instructions that accompany most of these apps often point out that they are not equivalent to traditional therapy and should be used only in conjunction with such treatment.
“It’s a supplement to therapy, for sure,” said Kerrin McLaughlin, a web designer who is one of a handful of moderators on 18percent. Her Slack channel job is to connect users with resources, be supportive and respond to alerts when the program detects a participant using such words as “suicide” or “I want to die.” That's when she will ask a series of questions, provided by professional advisers, to determine the seriousness of the person's self-destructive messages and, if warranted, share the number of a suicide hotline.
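The kind of keyword alert McLaughlin responds to can be sketched as a simple phrase match. The channel's real detection logic is not public; the phrase list and function below are illustrative assumptions:

```python
# Hypothetical sketch of a crisis-keyword alert like the one 18percent's
# moderators describe. The real phrase list is curated by the channel.

CRISIS_PHRASES = ["suicide", "i want to die"]

def needs_moderator(message):
    """Flag a message for moderator follow-up if it contains a crisis phrase."""
    text = message.lower()
    return any(phrase in text for phrase in CRISIS_PHRASES)
```

A flagged message would then route to a human moderator, who asks the adviser-provided screening questions rather than relying on the match alone.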
Founders Schleien and Markovich are also moderators, and they like to say they “nurture” the Slack community.
“If someone joins, and we notice them, and they’re talking a lot and then just disappear, we’ll send them a message” checking on them, Markovich said. “We actually monitor who is active and who goes inactive.”
The Food and Drug Administration's process for evaluating mobile medical apps is still being sorted out. Last September, the FDA issued its first and still only approval for the marketing of a mobile medical device aimed at treating substance abuse. The smartphone app reSET is meant to be used solely for that purpose and as an adjunct to traditional outpatient treatment. Its digital program is based on the principles of cognitive behavioral therapy: The app checks in daily and asks a series of questions, such as “Have you used today?” and “How strong is your craving right now?” and “What triggers are affecting this craving?”
IntelliCare, developed at Northwestern University's Center for Behavioral Intervention Technologies, is based on a similar methodology aimed at depression and anxiety and includes a suite of programs. Some of these mini-apps give advice; others offer checklists.
“When you start, the app gives you a brief screening to find out the severity” of the user's problems, said psychologist Stephen Schueller, one of IntelliCare's developers and an assistant professor of preventive medicine at Northwestern.
Individuals who are depressed and likely to stay in bed all day might be prompted with “goals” to get out of bed, brush their teeth and eat something. “So you check them off as you do them,” Schueller said, “and as you check them off, you're given harder things to accomplish. People really like being challenged.”
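The escalating-goals idea Schueller describes can be sketched as tiers of tasks, where finishing the easier tier unlocks the harder one. The goal text and tier structure below are invented for illustration, not IntelliCare's actual content:

```python
# Minimal sketch of escalating goals: each tier must be completed
# before the next, harder tier is offered. Goals here are invented.

GOAL_TIERS = [
    ["Get out of bed", "Brush your teeth", "Eat something"],
    ["Take a short walk", "Call a friend"],
    ["Spend an hour on a hobby"],
]

def current_goals(completed):
    """Return the first tier with unfinished goals; harder tiers unlock in order."""
    for tier in GOAL_TIERS:
        remaining = [goal for goal in tier if goal not in completed]
        if remaining:
            return remaining
    return []
```

Checking off every goal in the first tier makes `current_goals` start returning the walk-and-call tier, mirroring the "harder things to accomplish" progression Schueller describes.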
The app also collects information, such as phone use — how often, what time of day — and location, which can help indicate how often the person leaves home and whether he or she is regularly going to work or keeping to a routine. The data can be accessed by the user's therapist, making it easier to assess and even predict the individual's state of mental health, Schueller said.
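One passive signal of the kind Schueller mentions, whether the person left home that day, can be inferred from location samples. The distance threshold and data shapes below are assumptions for illustration, not IntelliCare's method:

```python
# Hedged sketch of passive sensing: did any location sample that day
# fall more than a threshold distance from home? Threshold is assumed.
import math

def left_home(samples, home, threshold_km=0.5):
    """Return True if any (lat, lon) sample is farther than threshold_km from home."""
    def distance_km(a, b):
        # Equirectangular approximation; adequate for sub-city distances.
        lat1, lon1 = map(math.radians, a)
        lat2, lon2 = map(math.radians, b)
        x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
        y = lat2 - lat1
        return 6371 * math.hypot(x, y)
    return any(distance_km(sample, home) > threshold_km for sample in samples)
```

Aggregated over days, a signal like this is what would let a therapist see whether someone is keeping to a routine.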
IntelliCare is one of the few apps that have been tested in a clinical trial, albeit a small one. Most mental-health professionals agree that the lack of serious study has opened the field to criticism. That's slowly changing.
Rigorous examination of these apps is what motivates psychologist Dror Ben-Zeev, director of mobile mental-health services at the University of Washington.
His intervention, still in the testing stage, is a kind of pocket therapist. Called FOCUS, it gives the most seriously mentally ill a way to manage their symptoms no matter where they are. For example, if a person with schizophrenia is trying to deal with auditory hallucinations, he or she can call up the app and get gentle advice for coping. “Listening to music on headphones or the radio can really help” is one suggestion. “Sounds simple, right? Give it a try and see how you feel.”
In a recent study looking at the app's use for a range of serious mental disorders, Ben-Zeev and his team found that subjects were willing to try FOCUS and that it proved effective. Some people even called it empowering.
“The most salient and important aspect,” Ben-Zeev said, “is an intervention can only do well if people are engaged in it.”