By DEVEN McGRAW and VINCE KURAITIS
This piece is part of the series “The Health Data Goldilocks Dilemma: Sharing? Privacy? Both?” which explores whether it’s possible to advance interoperability while maintaining privacy. Check out other pieces in the series here.
Early in 2019 the Office of the National Coordinator for Health IT (ONC) and the Centers for Medicare and Medicaid Services (CMS) proposed rules intended to achieve “interoperability” of health information.
Among other things, these proposed rules would put more data in the hands of patients – in most cases, acting through apps or other online platforms or services the patients hire to collect and manage data on their behalf. Apps engaged by patients are not likely covered by federal privacy and security protections under the Health Insurance Portability and Accountability Act (HIPAA) — consequently, some have called on policymakers to extend HIPAA to cover these apps, a step that would require action from Congress.
In this post we point out why extending HIPAA is not a viable solution and would potentially undermine the purpose of enhancing patients’ ability to access their data more seamlessly: to give them agency over their health information, thereby empowering them to use it and share it to meet their needs. In brief:

- HIPAA’s rules were not designed to address privacy risks introduced by widespread personal information collection and use in the modern digital ecosystem.
- HIPAA’s rules were designed to support information flows within the health care system and allow for broad uses and disclosures of data by both covered entities and business associates without the need to obtain patient consent.
- HIPAA is “leaky” — it expressly allows covered entities and business associates to share data outside of HIPAA, including selling de-identified data, without patient consent.
- HIPAA’s rules protect data but also protect incumbents’ interests in controlling health data.
- Congressional action is needed to establish meaningful privacy protections for personal data outside of HIPAA.
Why Is This Issue Coming Up Now?
The HIPAA Privacy Rule has provided patients with the right
to copies of their health information, from both health care providers and
health plans, since its inception (original rule finalized in 2000) — and the proposed rules double down on HIPAA’s promise. They require
certified EHRs to include functionality that affirmatively makes information
available to patients through open standard application programming interfaces
(APIs) and impose a separate penalty structure for “blocking” information
sought by patients, either when they act on their own or seek to access
information through selected personal health record apps or platforms.
Stakeholders have expressed concerns that patients will be taken advantage of by apps, which are not covered by HIPAA and which will use, share, and monetize sensitive health information without the patient realizing — or meaningfully consenting to — what is happening. This is a legitimate concern. HHS and NCVHS have issued recent reports on this issue, and recent news reports about how technology companies handle personal information have undermined the public and health industry trust needed to expand the health data ecosystem.
This dilemma — how to make more data available to improve health and wellness (including by providing it to patients) while addressing privacy risks — has been explored in several prior posts in this series, including “Health Data Outside HIPAA: Will the Protecting Health Data Act Tame the Wild West” and “Patient Controlled Health Data: Balancing Regulated Protections with Patient Autonomy.”
Addressing Privacy Outside of HIPAA
The Federal Trade Commission (FTC) has authority to require
commercial apps to adopt reasonable security safeguards and be transparent with
customers and the public about their data practices, and FTC can hold companies
accountable when they are not living up to promises in their privacy policies
and terms of service. The FTC is essentially the consumer privacy watchdog in the
U.S., and it has brought cases
involving companies’ use and disclosure of health information. However, the
FTC’s authorities are unlikely to sufficiently protect health data outside of
HIPAA, and the FTC’s recent track record with respect to abuses of consumer
trust, particularly with respect to large tech companies, has been the subject of criticism.
States are getting more active in enacting strong consumer
data privacy laws, but these laws may not effectively fill the gaps. For
example, the California Consumer Privacy Act does not apply to personal data in
consumer-facing apps and services that collect information on California
consumers from provider medical records (because these apps are already covered
by the state’s health privacy law) —and yet these are the very types of apps
that will have greater access to information in certified EHRs under the ONC proposed rules.
Congress is considering legislation that would protect
personal data — including health data —outside of HIPAA. While it is likely that
Congress will take some action on personal data privacy, the scope of that
legislation — and timing of enactment — is unclear, particularly in a presidential election year.
Should HIPAA Be Extended?
What about extending HIPAA to cover these apps? Couldn’t Congress get that passed relatively quickly? We had this debate back in 2008 during consideration of HITECH, and Congress rightly rejected it. Consumers should have control of information in apps designed and marketed for their use. HIPAA’s rules were not designed to address privacy risks introduced by widespread personal information collection and use in the modern digital ecosystem. Instead, HIPAA’s rules were designed to support information flows within the health care system and allow for broad uses and disclosures of data without the need to obtain patient consent — and, except in the case of disclosures for payment purposes when the patient has paid out of pocket for care, even over the patient’s objections.
The HIPAA Privacy Rule allows providers and health plans to
use and disclose identifiable health information for treatment, payment, and
“health care operations” — commonly known as TPO. TPO disclosures are the most common, but they
are not the only disclosures permitted without patient consent. The disclosures in Exhibit A are all
expressly permitted by the Privacy Rule.
Problems With Extending HIPAA to Consumer Apps
HIPAA Expressly Allows Covered Entities and Business Associates to Share Data Outside of HIPAA
Access by consumer apps to health information is not the
only threat to information moving outside of HIPAA’s coverage. HIPAA’s
“protections” have always facilitated the disclosure of patient data outside of
the health care system without a patient’s authorization. Each time
identifiable health information is disclosed pursuant to one of these permitted
purposes (see Exhibit A), the information potentially moves outside of HIPAA
coverage unless it is disclosed to another entity that is already covered by
the Rule (for example, to another covered entity (like doctors sharing
information with one another for treatment purposes) or to a business associate). Consequently, identifiable data is legally moved
by providers and plans outside of HIPAA every day and has been since the
HIPAA Privacy Rule first went into effect.
The recipient of HIPAA data may be required to protect that data
pursuant to another law (for example, state privacy laws governing state public
health departments), but this is not guaranteed.
Vendors to health care providers and health plans — known as
business associates — also can take advantage of the HIPAA Privacy Rule
permitted uses and disclosures, as long as their contracts — their business
associate agreements (BAA) — allow this. In many respects, making
consumer-facing apps business associates under HIPAA would be doubly
problematic: such apps could then share
data permissively without the consumer’s authorization per the Privacy Rule and
the health care provider or plan also would control (through the BAA) how the
app’s data could be used and shared. (After all, by definition HIPAA business
associates work “on behalf of” covered entities.) This hardly serves the goals of democratizing
health care, empowering patients with their data so they can use it — and share
it — as they see fit.
De-Identified Data Can be Shared and Sold
HIPAA also permits the disclosure — and even the sale — of
“de-identified” patient data, as long as the data are de-identified to HIPAA
standards. Business associates may de-identify data they receive from a covered
entity and use it, share it, and sell it as they please, as long as their BAA
permits this. (In the experience of one of the authors, this is a fairly common
provision in BAAs.)
De-identified data is not as risky as fully identifiable data, but the data are not at zero risk of re-identification. Monetization of de-identified data is fairly common in the health care industry. Adam Tanner’s book, Our Bodies, Our Data: How Companies Make Billions Selling Our Medical Records, describes how HIPAA de-identified data are commonly linked across data sets to compile detailed profiles of particular patients, even though these patients are not “identified” by name. And although there are many beneficial uses of de-identified data (such as for research purposes), Tanner’s book details the purchase and sale of data for what he refers to as “mundane commercial purposes.” In another example, Practice Fusion, a certified electronic medical record product, was initially free for doctors if they agreed to be advertised to based on patient data; its agreements with physicians also gave it the right to de-identify and monetize the data. More recently, Omny Health, a health tech start-up that enables providers to “sell” their data, was voted by the audience the most promising new technology at the 2019 Health 2.0 conference. Since HIPAA does not require disclosure of the recipients of de-identified data, it is difficult to understand the full scope of this activity in the health care system (an issue also discussed in Tanner’s book).
Consumers know very little about actual data practices of
entities covered by HIPAA, as the HIPAA Notice of Privacy Practices is only
required to include information about what uses and disclosures are permitted
by HIPAA (not information on which uses and disclosures are actually occurring)
and is only required to be provided to patients by covered entities (not
business associates). And as noted
earlier, patients are not asked for consent to these information flows (and for
the most part, cannot stop them).
In summary, HIPAA’s rules both protect data and protect
incumbents’ interests in controlling health data, which gives rise to some
skepticism about the motives behind opposition to the proposed rules on “privacy”
grounds. In our experience, entities covered by HIPAA rarely criticize it for being
too lenient in its permitted uses and disclosures of health data.
U.S. policymakers are poised to take meaningful steps to
make patients’ access to all their digital health data a more seamless process. Some
stakeholders have even gone so far as to suggest that ONC/CMS should delay
implementation of the rules. But asking to
hit the pause button in the name of privacy seems particularly ironic,
since giving individuals the right to copies of their information is a hallmark
of fair information practices, the foundation for all privacy law. At the same
time, the rosy vision of a patient empowered with her digital health data,
using it and sharing it as she pleases, is threatened by an app ecosystem that
is not sufficiently transparent about data practices, that does not provide
users with meaningful choices, and that is not held sufficiently accountable for
harmful uses and disclosures and for failures to be a responsible steward of health data.
Congressional action to establish meaningful privacy protections for personal data – including health data – is needed but extending HIPAA to consumer apps is not the answer. In the interim, greater transparency of consumer health app data sharing practices can at least help consumers make better choices about apps that fit their needs and values – and doesn’t necessarily require further Congressional action if such transparency is voluntary.

Demand exists for more information on which apps have the best policies and track records with respect to protecting data, so it is not hard to envision a market developing for app rating services. Voluntary codes of conduct for apps have already been developed by the CARIN Alliance, the Consumer Technology Association, and the AMA’s Xcertia initiative. (Full disclosure: one of the authors (McGraw) contributed to the CARIN Alliance Code of Conduct.) Providers and consumer advocacy groups can point people to resources that will help them make data sharing choices that are right for them. These transparency measures can help address privacy concerns while we await more meaningful protections from Congress.