Health, wellness apps pose risks to consumer privacy

While there are federal regulations governing how personal health data is shared, only a fraction of the increasingly popular consumer apps on the market are actually regulated, leaving enormous amounts of information largely unprotected.

As a result, Americans should be concerned about how these apps collect and share their personal health data. That’s the warning being made by Rice University medical media expert Kirsten Ostherr in a presentation at today’s “Data Privacy in the Digital Age” symposium hosted in Washington by the Department of Health and Human Services.

“There’s a lot of concern around issues of data privacy—especially the more that we have these huge breaches like Equifax and everything that’s come before—and little understanding of how that might spill over or play out in health data domains, such as consumer-facing apps and internal EHR systems,” says Ostherr, director of Rice University’s Medical Futures Lab.

“Part of my research is looking at ways the boundaries between medical and non-medical environments are dissolving through the proliferation of apps that allow people to manage their own care outside of clinical settings,” she adds. “In some ways, those boundaries are breaking down because a lot of things that used to only happen inside of hospitals can happen outside of them now.”

Currently, there are more than 165,000 health and wellness apps available through the Apple App Store and about the same number of apps in the Google Play Store. Ostherr contends that millions of consumers are using these apps and wearable technology such as Fitbits with little concern about sharing the data collected on these devices—often agreeing to companies’ terms of use agreements without reading the fine print.

“They are allowing the corporations that manufacture these devices to share all of their data and sell it to third parties, usually for the purpose of digital profiling,” cautions Ostherr. “All of those digital footprints are being used to understand in a more complex way the social, environmental and behavioral determinants of health. They’re being used to profile people by health insurance companies and by other kinds of corporate interests. People are not worried about what’s being done with those data, but they should be.

“There are very few protections of those data, whereas in clinical settings—where we’re talking about actual medical data—there’s a huge amount of protection both around EHRs and around clinical research protocols,” she adds.

While apps that make medical or therapeutic claims are considered medical devices and must be reviewed and approved by the Food and Drug Administration, Ostherr says the FDA regulates only a fraction of the overwhelming number of apps that are commercially available. “As long as these companies carefully sidestep those medical or therapeutic claims, then they do not have to be regulated by the FDA, and they are not—and, just because something is FDA-regulated doesn’t mean that the data is covered by HIPAA,” she observes.

Either way, Ostherr says all of these apps are “capturing tons of personal data, some of which would be classified as personal health information if it were subject to oversight by HIPAA.” At the same time, she believes that the chances of data from unregulated health apps being utilized in a healthcare setting by a physician who might review the information to benefit patients are “almost nil.”

Among Ostherr’s recommendations: there needs to be more transparency in what consumers are agreeing to when it comes to terms of use for health and wellness apps, and these apps “should not be so categorically exempted from regulation around data sharing in particular.”

She also advocates for “gradations” of data-sharing agreements. “For many of these companies, their whole business strategy relies on their ability to sell the data—the data is the real commodity,” charges Ostherr. “Consumers should have the option to opt in to different levels of data sharing, for different periods of time or under different circumstances—and, to know that they’re doing that, so they are forced to actually make a choice and therefore raise a bit more awareness about what they’re doing.”

Most consumer-facing health and wellness apps currently take an “all or nothing” approach to data sharing, she says, and that must change. In addition, Ostherr would like to see the terms of use for consumer and clinical apps brought into better alignment.

“There’s a paradoxical situation where people freely consent to share their data in the consumer realm but are very unwilling to do so in the clinical realm, where it is very difficult to get participants to consent to share their data for medical research,” she concludes. “There needs to be a better public understanding of what’s being done with their data and how it might benefit or harm them in both of these settings.”