Why Facebook privacy concerns should impact HIT’s future
The recent exposure of Cambridge Analytica’s deceptive and invasive use of Americans’ Facebook information during the 2016 election is causing widespread outrage, and more concern is emerging as the company’s top executive has testified before Congress the past two days.
However, government regulations for healthcare information also could be called into question, and many wonder if a similar incident could happen with data from Americans’ medical records. The Facebook situation provides important lessons around needed safeguards in healthcare IT regulation.
One lesson is that individuals’ consent is not granted forever; people may come to regret allowing data access down the line, as happened with Facebook. Current government regulations would still require patients to grant access to their health information, but the Facebook controversy shows the inadequacy of that protection alone.
Another lesson from the current debacle is that government and industry will ultimately be held accountable for making it too easy for apps to abuse people’s trust through unenforced requirements, even though it was consumers’ choice to permit access to their information.
Government regulation tilts toward approaches that weaken patient information safeguards by making it possible for anyone to create an app that can access patient information.
The Office of the National Coordinator (ONC) issued regulations for electronic health records that take effect in 2019. Part of those regulations require that “[Patient] applications should not be required to pre-register (or be approved in advance) before being allowed to access the API.” Consequently, apps can access millions of Americans’ medical records without anyone verifying that the developers aren’t being reckless or malicious.
Recently, the ONC released its Draft Trusted Exchange Framework. The framework proposes that any app can register itself to gain access to patient information, with no human involved. It’s an easy requirement, and it’s crucial to understand that almost anyone with a website could make such an app.
These regulations are modeled on the systems used by social media to create a rich app ecosystem. But they make it too easy for patients to confuse “verified” or “secure” with “trustworthy,” a mistake even technically savvy programmers make. That distinction has become painfully apparent in the Facebook situation.
In healthcare IT, security isn’t about protecting your Candy Crush score, who your friends are or what you “like.” The industry works with people’s most personal information: your son’s sensitive history of conditions, your drug abuse treatments, your spouse’s address and Social Security number, for example. It’s clear that new security questions are arising—for example, in the current environment, it may be possible for a close relative to authorize an app to access your information without your knowledge or consent.
The level of protection for this information should be higher than that used by social media. However, just like Facebook, the ONC’s proposal requires apps to be responsible (many will probably not be covered by HIPAA) but doesn’t verify that they actually are before they are given access to patient data. After-the-fact regulatory punishment, even if possible, will be little comfort to those whose privacy has been violated.
Industry and government need a more sophisticated dialogue that recognizes the tension between sharing information easily and sharing it responsibly. The current rulemaking cycle should consider clearer rules about what happens when an app violates the requirements, along with the creation of an independent body to certify that apps have appropriate safeguards in place before being given access to patient data.
If government regulations continue pursuing technical standards facilitating the ease of information sharing without giving more consideration to responsibly sharing that information, healthcare IT will be courting its own Cambridge Analytica moment and the deep violation of trust that goes with it.