As director of the University of Florida’s Counseling and Wellness Center, Sherry Benton could never keep up with the student demand for services. Adding three new positions bought the center only two waitlist-free weeks. Knowing the school could never hire its way out of the resource shortage, she and Bob Clark, a seasoned software developer and veteran health-care executive, created a wellness and mental health app for students.

TAO Connect is just one of dozens of mental health apps permeating college campuses in recent years. In addition to increasing the bandwidth of college counseling centers, the apps offer information and resources on mental health issues and wellness. But as student demand for mental health services grows, and more colleges turn to digital platforms, experts say universities must begin to consider their role as stewards of sensitive student information and the consequences of encouraging or mandating these technologies.

The rise of student wellness applications comes as mental health problems among college students have increased dramatically. Three in 5 U.S. college students experienced overwhelming anxiety, and 2 in 5 reported debilitating depression, according to a 2018 survey from the American College Health Association.

Even so, only about 15 percent of undergraduates seek help at a university counseling center. These apps have begun to meet students’ needs by providing ongoing access to traditional mental health services without barriers such as limited counselor availability or stigma.

Universities license the TAO Connect software and customize the experience by selecting from a curated store of options: hundreds of videos, several hundred interactive exercises, a mindfulness library, self-assessments and logs for practicing new skills.

The app is now on more than 150 college campuses, where incoming freshmen are encouraged to download it. At many schools, first-year students are also required to sign up for various online services. Some university clients incorporate modules — like a seven-part resilience course — into their core curriculum, and others have opted to use the platform’s units on anger management, communication skills and substance abuse in student discipline and conflict resolution.

To many, the growing prevalence of mental health apps for young students, a generation for whom digital technology is the norm, makes sense.

“If someone wants help, they don’t care how they get that help,” said Lynn E. Linde, chief knowledge and learning officer for the American Counseling Association. “They aren’t looking at whether this person is adequately credentialed and are they protecting my rights. They just want help immediately.”

Yet she worried that students may be giving up more information than they realize, and that schools can coerce students into accepting terms of service they otherwise wouldn’t agree to.

“Millennials understand that with the use of their apps they’re giving up privacy rights. They don’t think to question it,” Linde said.

YOU at College, another student wellness application, is also advertised as specifically benefiting freshmen, who are making a transition away from home for the first time.

The software, developed in 2014 by Joe Conrad, chief executive of Grit Digital Health, addresses the growing demand for services by acknowledging mental health issues as a common part of the college experience. “We’ve hid the vegetable around those issues and surrounded them with other content on academics and success, purpose and meaning and social connections,” Conrad said.

Now on 55 private and public college campuses, with 40,000 accounts, the platform is introduced during new-student orientation, where freshmen can create a profile that acts as a personalized well-being website throughout their four years of school.

Mental health apps thrive on data; the more an app learns about a user, the more it can customize the experience.

Often, these apps measure student progress with routine assessments that prompt students to track their thoughts, physical activity, diet and symptoms. Some evaluations ask about dating life, alcohol consumption and illegal drug use. Others, like YOU at College’s 18-question “reality check,” cover topics including stress and anxiety, friend networks and sleep patterns. Once the reality checks are completed, students receive report cards and suggested content in areas where they could improve.

Last year, the Institute for Science, Law and Technology analyzed the privacy policies and permissions of hundreds of mobile medical apps. It found that only 38 percent made privacy policies available before download, so for most apps consumers couldn’t tell what would happen to their information before installing. The policies that did exist were often difficult to locate and hard to understand.

Many terms of service stated the policy could change at any time without notice to the user, or included a catchall provision saying the company would make every attempt to comply with the Health Insurance Portability and Accountability Act (HIPAA) but did not guarantee information privacy.

“By agreeing to use those platforms, you were essentially relinquishing privacy rights,” said Lori Andrews, director of the Institute for Science, Law and Technology at Chicago-Kent College of Law, Illinois Institute of Technology, and an internationally recognized expert on emerging technologies.

In addition to their privacy policies, many apps request permissions when they are downloaded to a cellphone. These permissions can authorize access to sensitive information, such as location, audio and phone contacts, which can be shared with data aggregators or sold to third parties.

Many apps also failed to fulfill their promises: Of the apps that were supposed to warn about lethal drug interactions, 67 percent failed to flag a fatal combination when researchers entered one, according to the study. One app asked for access to contacts and the phone’s microphone so it could call a loved one if the user hadn’t left his or her room for an extended period.

“We left the app running for a month. It took all the information for marketing purposes but didn’t call any loved one,” Andrews said.

In traditional medical settings, there are robust privacy protections for personal health information. Universities that receive federal funding are also subject to laws that protect the privacy of some educational records.

Mental health apps recommended or required by colleges ought to sit at the intersection of HIPAA and the Family Educational Rights and Privacy Act (FERPA), though often they don’t. Even where developers claim to be compliant, the law applies only when information is in the hands of a health-care provider, medical institution or covered university.

HIPAA “does not apply to user-generated data from the platforms,” including reality checks, self-assessments and quizzes, according to Andrews.

If another app picks up intimate information from the mental health or well-being app, details entered by the student aren’t covered and can be sent elsewhere. And, Andrews said, because HIPAA protections apply only to medical information, data such as location, sleep cycles and daily step counts is not legally considered “health information,” even though it may reflect a change in health or could be used to predict health status.

When data from a mental health app is shared or sold to other parties, a wealth of information can be used for purposes beyond the health needs of students. Insurers can use it to calculate premiums, employers can use it to assess risk, advertisers can use it to tailor ads to consumer preferences or conditions, and all can exploit students’ weaknesses.

A student suffering from an eating disorder may be shown an ad for laxatives; a student flagged as a suicide risk or likely to suffer from severe depression may be denied a job or security clearance.

“There can be significant real-life consequences,” Andrews said. “The health advantages just don’t outweigh the privacy risks.”
