Health, wellness apps pose risks to consumer privacy
While there are federal regulations governing how personal health data is shared, only a fraction of the increasingly popular consumer apps on the market are actually regulated, leaving enormous amounts of information largely unprotected.
As a result, Americans should be concerned about how these apps collect and share their personal health data. That’s the warning being made by Rice University medical media expert Kirsten Ostherr in a presentation at today’s “Data Privacy in the Digital Age” symposium hosted in Washington by the Department of Health and Human Services.
“There’s a lot of concern around issues of data privacy—especially the more that we have these huge breaches like Equifax and everything that’s come before—and little understanding of how that might spill over or play out in health data domains, such as consumer-facing apps and internal EHR systems,” says Ostherr, director of Rice University’s Medical Futures Lab.
“Part of my research is looking at ways the boundaries between medical and non-medical environments are dissolving through the proliferation of apps that allow people to manage their own care outside of clinical settings,” she adds. “In some ways, those boundaries are breaking down because a lot of things that used to only happen inside of hospitals can happen outside of them now.”
“[Consumers] are allowing the corporations that manufacture these devices to share all of their data and sell it to third parties, usually for the purpose of digital profiling,” cautions Ostherr. “All of those digital footprints are being used to understand in a more complex way the social, environmental and behavioral determinants of health. They’re being used to profile people by health insurance companies and by other kinds of corporate interests. People are not worried about what’s being done with those data, but they should be.
“There are very few protections of those data, whereas in clinical settings—where we’re talking about actual medical data—there’s a huge amount of protection both around EHRs and around clinical research protocols,” she adds.
While apps that make medical or therapeutic claims are considered medical devices and must be reviewed and approved by the Food and Drug Administration, Ostherr says the FDA regulates only a fraction of the overwhelming number of apps that are commercially available. “As long as these companies carefully sidestep those medical or therapeutic claims, then they do not have to be regulated by the FDA, and they are not—and, just because something is FDA-regulated doesn’t mean that the data is covered by HIPAA,” she observes.
Either way, Ostherr says all of these apps are “capturing tons of personal data, some of which would be classified as personal health information if it were subject to oversight by HIPAA.” At the same time, she believes that the chances of data from unregulated health apps being utilized in a healthcare setting by a physician who might review the information to benefit patients are “almost nil.”
She also advocates for “gradations” of consent to data sharing. “For many of these companies, their whole business strategy relies on their ability to sell the data—the data is the real commodity,” charges Ostherr. “Consumers should have the option to opt in to different levels of data sharing, for different periods of time or under different circumstances—and, to know that they’re doing that, so they are forced to actually make a choice and therefore raise a bit more awareness about what they’re doing.”
“There’s a paradoxical situation where people freely consent to share their data in the consumer realm but are very unwilling to do so in the clinical realm, where it is very difficult to get participants to consent to share their data for medical research,” she concludes. “There needs to be a better public understanding of what’s being done with their data and how it might benefit or harm them in both of these settings.”