AHLA's Speaking of Health Law

The Patchwork That Is U.S. Health Care Data Privacy

May 14, 2024 AHLA Podcasts

Omenka Nwachukwu, Principal Consultant, Clearwater, speaks with Robert Kantrowitz, Partner, Kirkland & Ellis, about the key issues surrounding health care data privacy and the various points of intersection from a regulatory standpoint. They discuss challenges related to inferential data and de-identified data, federal and state privacy laws, the recently proposed American Privacy Rights Act and its implications for the health care sector, risks for health care organizations that share and store data with internationally based partners, and how the Federal Trade Commission’s purview has expanded the regulatory risks associated with health privacy matters. Sponsored by Clearwater.

To learn more about AHLA and the educational resources available to the health law community, visit americanhealthlaw.org.


Speaker 1:

Support for AHLA comes from Clearwater. As the healthcare industry's largest pure-play provider of cybersecurity and compliance solutions, Clearwater helps organizations across the healthcare ecosystem move to a more secure, compliant, and resilient state so they can achieve their mission. The company provides a deep pool of experts across a broad range of cybersecurity, privacy, and compliance domains; purpose-built software that enables efficient identification and management of cybersecurity and compliance risks; and a tech-enabled, 24/7/365 security operations center with managed threat detection and response capabilities. For more information, visit clearwatersecurity.com.

Speaker 2:

Good day, everyone. This is Omenka Nwachukwu, Principal Consultant with Clearwater's Privacy and Compliance Consulting team, and today we are going to be talking about the patchwork that is U.S. data privacy. While the recently announced American Privacy Rights Act indicates that the United States may finally be moving toward federal legislation that governs the use of consumer data, similar to the European Union's General Data Protection Regulation, data privacy regulation in the United States remains sectoral and provincial in many respects. This is especially true in the healthcare industry, where HIPAA looms large but state and other industry-specific regulations may complicate how an organization determines which law to follow. And the expanding use of inferential and de-identified data in the industry further clouds the situation by raising questions of what qualifies as healthcare data. To help us navigate the patchwork that is U.S. data privacy, I am very pleased to be joined by Robert Kantrowitz, a corporate healthcare partner in the New York office of Kirkland & Ellis. Rob and I will discuss several key issues surrounding healthcare data privacy and the various points of intersection from a regulatory standpoint. Hey, Rob, it is great to speak with you.

Speaker 3:

Great to be here. Thanks for having me. It's always a pleasure to be on these things; they're always a lot of fun.

Speaker 2:

Yeah. So why don't we start by talking about the industry's current thinking on what constitutes healthcare data. Over the last couple of years, as the use of large language models and machine learning has seen increased interest and adoption in healthcare, we've seen questions emerge about inferential data and de-identified data and their relevance to data privacy regulation. What are your thoughts on the growing risks related to the expansion of what is considered health data covered under current and proposed data protection laws, including how inferential and de-identified data is viewed under such laws?

Speaker 3:

Great question. One thing I need to start out with: anything I say is not the opinion of my firm. Now that we've gotten that out of the way for everybody, we can get to the good part. So this question is in line with a recent trend I think we've been seeing in privacy legislation, regulations, and guidance, which is that data that was not traditionally seen as directly health-related, or as another type of sensitive data, is now being interpreted to be health data or sensitive data. The landscape has shifted significantly over the past few years, or even months, and this is something I've seen clients and others grapple with, because this broadened scope impacts not just the U.S. lawyers who advise on privacy matters but also the business folks having to pivot and find new avenues when data they thought was not regulated or restricted now potentially is. Much of this trend seems to have followed the growth of, or the bright spotlight on, digital advertising and the use of tracking technologies such as pixels, and the hype around and adoption of certain AI tools. On the marketing side, there's a growing awareness that advertisers could track a website or app user's activity or location and make inferences about that person's health. For example, did the user turn on certain app functions about diet, and because of that, could the app or advertiser determine whether that person is diabetic, has food allergies, or struggles with obesity? Or if someone's location is tracked and they repeatedly go to, say, a cancer treatment center, could it be inferred that the person has cancer, and could marketers then use that data for targeted advertisements for cancer drugs, and so on?
And then on the generative AI front, in many respects generative AI is designed to make inferences based on data inputs, and some cases have shown it's very good at it. There's growth in AI-enabled tech and its use in healthcare, or businesses aspiring to use it, which has expanded the types of inferences businesses can assemble from seemingly benign data, or data not traditionally considered health data. Perhaps purchasing certain toiletry products, or something like that, could be viewed by others as something from which health data can be inferred. In some instances, there appears to be a reaction to that from stakeholders such as consumers, patients, lawmakers, and regulators, and I'll throw in the plaintiffs' bar, who are not shy about their views on some of this. So there's a risk of enforcement in using that data without the proper disclosures or permissions. Some of the recent FTC enforcement actions reference the use of some of this inferential data, for instance, and we've seen some state privacy laws include inferential data in their definitions of covered health information or sensitive data. With all that, the risk in using this inferential data without proper consent, authorizations, and so on has surely increased. Then I think the other part of your question was on the de-identified piece. This one is particularly interesting to some, and perhaps frustrating or concerning to those who've built organizations or services based on being able to use de-identified or anonymized data, particularly those using data for, say, research. As many listening in probably know, de-identified or anonymized data, if done properly, has generally been exempt from the purview of major privacy regimes.
But there have been recent lawsuits and studies alleging that data quote-unquote sufficiently de-identified under applicable privacy laws is potentially more re-identifiable than the laws' original intent, or than was previously known. And there's a growing view that AI and other technologies have made that possible, or more so than in the past. So there's this idea that if the data is not actually de-identified, there's a risk it could be covered by these laws and subject to their requirements, such as obtaining consent for sharing or giving individuals rights to the data, like access or deletion. This issue has come into play at the federal level. The February executive order on access to bulk sensitive data specifically notes that even data that is seemingly non-identifiable, such as anonymized or de-identified data, is increasingly at risk of being identifiable through the use of advanced technology; I think they listed out big data analytics, AI, and high-performance computing. And the DOJ, in its notice of proposed rulemaking regarding the executive order, has even requested comments with respect to whether and how transfers of such data should be restricted. When I read it, I thought this was particularly interesting considering anonymized or de-identified data is exempt from the scope of many privacy regimes, as I mentioned earlier.

Speaker 2:

Wow. So many great considerations there.

Speaker 3:

I'm sorry, that was a mouthful. <laugh>

Speaker 2:

No, no, that was perfect. Just listening to the new considerations, especially regarding research and previously de-identified data, it really makes you think. I just completed our monthly security refresher training, and one of the modules mentioned that it is not advisable to take medical records and put them into ChatGPT to help you transcribe patient notes or do patient charting or anything like that. And that is a new consideration, because when you put that information into that AI tool, number one, the data is no longer secure, and number two, you're putting that patient's data into a situation where it could possibly be used. So inferential data is a huge consideration.

Speaker 3:

And was that with respect to data that was identifiable, or was it just certain identifiers that you think were completely anonymized?

Speaker 2:

The specific example was identifiable data, literally typing patient notes into ChatGPT to help you complete your reporting or charting at the end of the night. But I think the risk is still significant even if the data is de-identified, like you were saying.

Speaker 3:

Yeah, we've seen a lot of that too, because those types of algorithms can then trace it backwards; at least that's some of the view that others have been having, or that we've been seeing. So definitely an interesting anecdote. I think everyone got that ChatGPT notice: don't use company information in ChatGPT. Although there was that one lawyer, I think, who used it for a brief and it cited the wrong cases, but that's a conversation for another time.

Speaker 2:

<laugh> Yes, I saw that too, and I was like, yeah, not going to do that. All right, so let's shift gears a little bit and talk about federal and state privacy laws and how they're intersecting right now. As of this recording, 15 states in the United States have passed state privacy laws that are either already in effect or will go into effect within the next two years, and several other states have laws that are in committee. Meanwhile, we just had finalization of the HIPAA Privacy Rule to Support Reproductive Health Care Privacy. So what are some key considerations about where HIPAA ends and state laws pick up?

Speaker 3:

So yeah, this is a very timely question. These newer general and health-specific state privacy laws, some of which, like the HIPAA rules, are coming out of the Dobbs decision and subsequent state abortion bans and things like that, generally cover similar information and often pick up where HIPAA drops off, either with respect to the entities or the regulated data. I should mention that these newer laws, the new general privacy laws and the health-specific laws, are a bit different from the medical records or information confidentiality laws that have been on the books for years, even predating HIPAA in many instances, and that, like HIPAA, apply to healthcare providers and other entities. Some of those carry more limitations with respect to data related to mental health, STDs, and things like that, and the more stringent requirements of those medical records confidentiality laws are not necessarily preempted by HIPAA. But the newer health-specific privacy laws more or less regulate data and entities not already covered by HIPAA or these medical record laws, such as consumer-facing health apps and some e-commerce websites. They also generally prohibit geofencing around facilities that provide healthcare services. For those who don't know what a geofence is, it's essentially a virtual perimeter around a certain geographic place that is set or tracked via the phone you have in your pocket. My Health My Data, for example, is one of the prime examples: it applies to a wide range of data categories that are linked or reasonably linked to a consumer's health status, and, related to your first question on this podcast, it appears to cover inferential data from non-health sources of information.
And then there are the general, non-health-specific state privacy laws that followed suit after California passed the CCPA. These laws do cover health information, typically fitting under their definition of sensitive data, which comes with heightened requirements for use and sharing. But these laws tend to include HIPAA-related exemptions, and that's part of how the interplay happens here. Some flat out don't apply to certain HIPAA-covered entities or business associates altogether; others have data-level exemptions; others exempt PHI specifically; others, I think, have language around information originating from or intermingled with data maintained by covered entities or business associates; and others exempt data that is de-identified in accordance with HIPAA. So those are the types of interplay we're seeing in how one law leads into another, where one picks up and where the other ended.

Speaker 2:

Okay. Wow, that is a lot for entities to think about, honestly. <laugh> Sometimes I'm grateful to be on this side, where we're advising, because it's a lot to consider. I know we are all familiar with the concept of preemption, but it bears repeating as different laws are being propagated from state to state, and as a lot of us are part of organizations that are working across multiple states. Regarding HIPAA specifically, preemption of a state law occurs when the state law is contrary to the Privacy Rule, unless a specific exception applies. And "contrary" means that it would be impossible for a covered entity to comply with both the state and the federal law. So as we're looking at these different health-specific laws being propagated, remember as we're advising or going into different situations that if it is impossible for the entity to comply with both that state law and with HIPAA, then that state law is going to be preempted, unless the state law provides greater privacy protections or privacy rights, or some other exception applies that we don't need to go into. Thank you, Rob, for those thoughts; I'm really loving this discussion. So we've talked a little bit about the state versus federal perspective. Now let's talk about an exciting new proposed federal law. Rob, what is your take on the recently proposed American Privacy Rights Act and its implications for the healthcare sector?

Speaker 3:

<laugh> I guess "exciting" depends on who you ask, but it would likely be a game changer in how privacy is regulated in the U.S., considering how sectoral the laws are, at least at the federal level. Organizations already having to comply with stricter state laws like the CCPA may not feel a huge difference in the way they do business, but for those that aren't already complying with such laws, it may be an adjustment, to say the least. And for those struggling to track the different state requirements, it may actually be welcomed: only having to comply with one comprehensive law is arguably better than having to comply with, say, 15, or, if every state does it, 50. At least from what I've heard and read and seen, a lot of folks are actually pretty bullish on this law having a decent chance of passing because of the support of businesses and its bipartisan and bicameral support. Now, when I say "decent chance," I don't mean this is all but a sure thing; I think there is still a long way to go, but it's further along than previous attempts have been. Also, a federal law on privacy could be viewed as a precursor to any AI-specific legislation at the federal level, which many are eyeing as well, and I think certain folks are anxious to get something on the books to regulate this new, growing technology.

Speaker 2:

Exactly.

Speaker 3:

Yeah, it's very interesting, and I don't want to speculate, but if it does eventually pass, it's not likely to do so in its current form. Even the support I have seen has mostly been conditional, based on certain amendments or changes to the law, so I think there are still some changes to be made. For those who are reading it and panicking about certain provisions, we'll see what happens; I think there's some time before anything passes. The preemption piece and the private right of action, at least from what I saw, appear to be some of the larger sticking points. On the preemption piece, states don't really want to give up control over privacy; I can't see California wanting to give that up. And industry does not want to deal with the deluge of litigation that would flow from a private right of action. So that's at least my initial take on the law itself and its status. But some things I thought were notable on the healthcare side, as it's currently drafted: the law does address certain healthcare concepts. One thing is that sensitive covered data does include health-related information, I think information that reveals health conditions, treatments, and things like that, and it carries heightened requirements, like express consent for sharing with third parties. That's not a surprise to many folks who see health information covered under sensitive data under certain state laws. The law also notably provides that companies in compliance with laws like HIPAA are deemed to be in compliance with it, and data that is de-identified in accordance with HIPAA would potentially be exempt. And the current preemption provisions, which is great that you went into that before, Omenka, indicate that state laws covering health information or medical records won't necessarily be preempted.
So maybe sorry to those folks who were looking forward to Washington's My Health My Data being preempted by this law. But who knows, the draft could change. I do think that healthcare entities, particularly those specifically subject to HIPAA and maintaining mechanisms to comply with HIPAA and other state laws, or those that have a more global footprint and have to comply with international laws like the GDPR, will be in a decent position should this law pass.

Speaker 2:

Wow, thank you for that. There are definitely a lot of considerations for privacy, especially as you're looking at, like you said, organizations that are operating beyond our borders. So let's think about what other considerations may come into play as you're shifting data from the US and beyond. The mechanism for easy data transfer between Europe and the US has gone through a few iterations. The third and current version, called the Data Privacy Framework for Europe and the UK, along with the US Data Bridge for the UK, went into effect toward the end of 2023. We've also seen another long title coming: the Executive Order on Preventing Access to Americans' Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern, and the DOJ's proposed rules, which were issued in February. So for healthcare organizations that share and store data with internationally based partners, what are some major risks to be accounted for? You've already started going down this line.

Speaker 3:

Yeah, you stated those laws beautifully; the titles are a mouthful. <laugh> Just to start off, I think having a well-developed privacy framework at your organization overall is a great place to start to mitigate potential risks. Not to be a cliché, but it's true, and it's something that needs to be taken seriously. That would really help with a lot of the changes that are coming and a lot of the different concepts and concerns you might have with international data transfers, these new laws, et cetera. To take from the GDPR, developing products and a business with the idea of privacy by design is going to get you pretty far. Also, understanding what data an organization has, what it's going to do with the data, and who it's sharing the data with is important: data mapping, conducting audits and risk assessments, segregating data where necessary, practicing data minimization principles. That's not an exhaustive list, but those are things you can do internally to get an understanding of what your organization does. The first step is knowing the problem, and the second step is fixing the problem, as the old saying goes. And then it goes without saying, but really understanding where your data is being collected and sent is key, because we're seeing all kinds of restrictions or guardrails on international data transfers. So there are risks, and an organization can get tripped up by those laws if it's not aware of the restrictions in the jurisdictions in which it operates. The US in many respects did not historically have the same cross-border restrictions as some other countries, but we are seeing movement on that, as you've mentioned.
On the state level, we saw Florida pass a law prohibiting offshore storage of certain health information. On the federal side, you mentioned the recent executive order and subsequent DOJ notice of proposed rulemaking aimed at protecting US sensitive data and US government data from exploitation by countries with a record of using that data to enable cyber operations, surveillance, scams, and other malicious activity. This could impact healthcare organizations in particular, especially those with government contracts or those that want to receive federal funding; those are probably the ones who should be most concerned about this type of law. But yeah, we're seeing movement there, and there are things an organization can do, first and foremost understanding where its data is going and what data it has.

Speaker 2:

Very true, very good considerations. And I'm also thinking, Rob, this ties into your first point that an organization should make sure its data privacy program is secure. They may also want to consider whether they have allocated for consulting or legal costs, because that might be part of implementing an appropriate response to these different laws. But let's shift gears a little bit. Our last question for today deals with how we are planning for our data intersecting with other industry-specific regulations like the FTC Act. The FTC Act prohibits companies from engaging in deceptive or unfair acts or practices in or affecting commerce, right? So this means that companies must not mislead consumers about, among other things, what's happening with their health care information. It also means that, as organizations, we have to ensure that healthcare data practices aren't causing more harm than good. The FTC Act's obligations apply to HIPAA covered entities and business associates as well as to companies that collect, use, or share health information but aren't necessarily required to comply with HIPAA. So practically speaking, what does that mean, and how has the FTC's purview expanded the regulatory risks associated with health privacy matters?

Speaker 3:

I'm glad you brought up the FTC and how it has expanded the regulatory risk, because the FTC has been kind enough to keep us all on our toes as of late, from non-competes to antitrust enforcement to increased scrutiny on private equity in healthcare, really becoming, I guess, a jack-of-all-trades commission. On the privacy side, the FTC has sort of come in to fill the gaps at the federal level with respect to health information use, particularly with its current enforcement against digital health companies using Section 5 of the FTC Act and its Health Breach Notification Rule, which it recently updated. The FTC risk is present in two ways. One, it has been enforcing and updating its rules and guidance on the Health Breach Notification Rule, which governs entities including those not subject to HIPAA or that traditionally didn't specifically worry about restrictions on their use of health data. And two, it has been enforcing Section 5 of the FTC Act regarding unfair and deceptive practices, as you mentioned, even against entities subject to HIPAA, enforcing HIPAA compliance and data use in a different sort of way. On the health breach side, the FTC took a rule that was originally understood to apply to instances of theft or breaches of consumer health records, but the FTC did not bring, I think, a single action enforcing the Health Breach Notification Rule after it was promulgated in 2009. Then, after its September 2021 policy statement, it began taking the position that the Health Breach Notification Rule was much broader in scope and that it applied to consumer health-related website browsing and app usage data shared with advertising vendors, such as via tracking pixels, without consumer consent.
So the rule applies to entities not necessarily subject to HIPAA, though there's application to both, and it also broadly regulates unauthorized disclosure rather than simply a breach due to a malicious actor. There have been numerous settlements here, though it's worth noting that because these were settlements, courts haven't really evaluated the FTC's position, so that's worth considering. Then on the Section 5 part, the FTC can enforce against unfair and deceptive trade practices by entities that are seemingly in compliance with HIPAA. For example, if a telehealth company said its product is HIPAA compliant or HIPAA certified, but it is not indeed fully HIPAA compliant, the FTC is potentially going to have a field day with that. Also, not many organizations can say they're one hundred percent HIPAA compliant, and such a claim could be seen as deceptive. Another example could be the FTC's enforcement against a software provider, I think in 2015 or 2016, which the FTC alleged was touting encryption that was apparently subpar. The practice could have been technically in compliance with HIPAA's Security Rule, but because the provider was using quote-unquote strong encryption to sell its product, and it turned out, allegedly, that wasn't true, the FTC had something to say about it. So in other words, to answer your question practically speaking: if you think you're free from federal enforcement because you're not subject to HIPAA, maybe take another look at your data practices. And just because you are in material compliance with HIPAA requirements does not mean you're free from scrutiny from the FTC. Being transparent and not being deceptive could help you keep the FTC away.

Speaker 2:

Very true. And I also want to give a quick reminder that the Health Breach Notification Rule is very similar to the HIPAA Breach Notification Rule, as you've mentioned, Rob. The FTC's Health Breach Notification Rule applies to vendors of personal health records, personal-health-record related entities, and third-party service providers. So, like you said, just because you are not a HIPAA covered entity does not mean this rule does not apply to you. And one thing I think some organizations may be surprised to know is that if you engage in practices that fall under the purview of the FTC Act and under HIPAA, so this would be regarding a HIPAA covered entity, you could potentially be subject to two investigations: one from the FTC and one from OCR, the Office for Civil Rights. And those two federal agencies work very closely together. So some things to keep in mind.

Speaker 3:

Not surprising; the agencies do talk to each other.

Speaker 2:

<laugh> All right. Well, we have had an awesome discussion today. Thank you so much, Rob, for all of the insights you've shared. This has been excellent information, and I really hope to speak with you again soon. Thank you.

Speaker 3:

Oh, my pleasure. It was great chatting with you, and I was happy to be on. I look forward to doing this again.

Speaker 2:

Yes, definitely. And thank you to our audience for listening. I hope you all have a great rest of your day.

Speaker 1:

Thank you for listening. If you enjoyed this episode, be sure to subscribe to AHLA's Speaking of Health Law wherever you get your podcasts. To learn more about AHLA and the educational resources available to the health law community, visit americanhealthlaw.org.