Soundbites

Overcoming Challenges with the Governance and Security of ePHI and ePII in the Era of Decentralized Care and Data Sharing – The Virtual Fortieth National HIPAA Summit

David Ting, Founder & CTO of Tausight, moderates a panel of healthcare security, compliance, and privacy thought leaders as they discuss overcoming challenges with the governance and security of ePHI and ePII in the era of decentralized care and data sharing at The Virtual Fortieth National HIPAA Summit.

Automated Transcript:

Welcome to this HIPAA Summit session on overcoming challenges in the governance and security of ePHI and ePII in the era of decentralized care. I'm David Ting, CTO and founder of Tausight. I have here three distinguished experts in this field, and I'd like them to introduce themselves and give an opening statement on what they think about this topic.

And I'll start with you, Lynn.

Thank you, David. My name is Lynn Sessions. I'm a partner at BakerHostetler. I lead our healthcare privacy and compliance team, and my practice centers around assisting clients and guiding them through data breaches, ransomware attacks, and the regulatory investigations that follow, along with proactive compliance with HIPAA and other state privacy laws.

This particular topic is near and dear to my heart, because we see a lot of our clients struggle with the amount of data that healthcare holds in its various electronic systems. They have trouble deciding what to do with all of that data and how to protect it, and it is very valuable to them.

So I think this will be an interesting topic to many of those in attendance.

I'm Tom August, basically a 30-year veteran in IT risk, IT audit, and information security. I've spent the last decade as a CISO for various healthcare provider systems in the state of California. The topic is absolutely interesting to me because I'm in healthcare.

I'm a CISO; everything we do is this. So thank you. And Don?

Yes, my name is Don Swick. I'm the Chief Compliance Officer for Nova Tesol. We are a company that addresses security and privacy compliance for healthcare entities. This topic is very dear to my heart, because protecting the PHI, the ePHI, and the PII is the number one fundamental of security and compliance.

Well, thank you. As a CTO and founder, the reason I started Tausight was the same reason you all brought up: ePHI is at the core of what healthcare runs on. I've watched the digitization of multiple verticals over my career, everything from manufacturing onward.

Healthcare just moved from paper records to electronic, and our ability to speed up healthcare delivery and information sharing is one of the benefits of the electronic movement of data. Yet at the same time, when I served on the cybersecurity task force in 2015, what people call the 405(d) committee, the core of the formation of that panel was all around breaches. The previous year we had just gone through 120 million records lost, and I think the expectation was that we could do something about it. And the number of breaches is still going up; it hasn't abated. Records are too important.

I put three important criteria on PHI. One is the exploitative value of PHI: people steal it to exploit its value, and the value on the black market is still very high. The second one, which I think people underestimate, is the extortion value: I take your data, I hold it hostage, and I threaten to release it unless you pay.

And then the third value of the ePHI is: what do we do with it, and how do we effectively use it while increasing the cyber resiliency of our organization? You can't run a healthcare organization if most of your records get breached and lost. And yet we see varying degrees of cyber hygiene as we go around talking to organizations.

I'd love to understand: every one of these folks that I've talked to over the years basically indicates the difficulty of securing ePHI. And the words that nobody wants to hear are "we just had a cyber incident." So I'd like to turn that over to you, Lynn: what happens when that GC hears the words they don't want to hear and they have to call you, and what happens when that clock starts?

And then we can have that conversation around the things you wish you had done. I call it the regret moment: you're the CISO or the privacy officer, and you say, "I wish I had done this to simplify that problem downstream."

Yeah. So that is a real-life call. I get them at all times of the night or day.

There's a ransomware attack that I've been working on since around New Year's, and I got the text message from a longtime client of mine at 5:45 in the morning, so it woke me up. I knew that getting that text from the general counsel, who I hadn't talked to in about five years, was probably not going to be a good day for her.

And it certainly wasn't going to be an easy day for me, let's just say that. And one of the things with respect to it, and I think Tom can certainly talk on the security side of things, is that it's one thing to keep the bad guys from getting into your systems, right?

It's another thing to be thinking about: all right, what's the impact on our patient care? What's the impact on us from a communications and reputational standpoint? And then there are, of course, the legal ramifications around that. At the core of all of that is the massive amount of data that a lot of healthcare organizations have, particularly our large health systems, whether it's in their electronic medical record or, as we often find, in what I call the junk drawer: a part of their systems that is just where all of the transactions may take place.

And it's interesting, when we're talking to the non-IT people, the non-information-security people at the healthcare organizations, they're mostly concerned about their electronic medical record, right?

Like, did the bad guys get into our EMR? That rarely happens, and I'll say that's kind of the good news, at least from an integrity standpoint. But the bad news is that we do find there is data that is very unstructured. There's a lot of information about patients contained on shared drives and the NAS part of the environment.

And there is oftentimes a master patient index's worth of information contained in those systems. So being able to govern and really protect that data becomes very important. And at the core of the HIPAA security risk analysis is an inventory of where all of your protected health information is.

And I think Tom can probably address how difficult it is to even make that determination. But what we find, certainly in our data breaches involving access to large amounts of data, such as a ransomware attack, is that a massive number of patients are oftentimes affected.

I think, from a straight security standpoint, one of the biggest challenges in healthcare is just asset management, data being one of the assets. It's hard to secure what you don't know you have, and until you do a thorough inventory to figure out where your data is, you may not know all the little crevices and nooks and crannies it could be hiding in.

I've seen the junk drawers; they are terrifying. I've seen where data is cached in places it wasn't supposed to be, where you do a DLP scan and all of a sudden you're turning up, internally, hundreds and thousands of records on individual desktops, because somebody viewed something and Outlook decided to store it in a cache file, or the viewer in a PACS would store it in a cache file. All of those are little data nooks and crannies where this stuff can hide. So that's a real challenge. I really like the other question, which is: what are the lessons learned, and how would a security person look at this?
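A DLP-style sweep like the one Tom describes can be sketched in a few lines. This is a hypothetical illustration, not any vendor's product: it matches only SSN-shaped strings in plain text, whereas a real scanner would cover many identifier types and file formats.

```python
# Hypothetical sketch of a minimal DLP-style sweep that flags files whose
# contents contain SSN-like patterns. A real scanner would also look for
# MRNs, dates of birth, names, and parse binary formats like PDFs.
import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scan_text(name, text):
    """Return (file_name, match_count) if the text contains SSN-like strings."""
    hits = SSN_PATTERN.findall(text)
    return (name, len(hits)) if hits else None

def sweep(files):
    """files: iterable of (name, text) pairs; returns the flagged files."""
    results = [scan_text(name, text) for name, text in files]
    return [r for r in results if r is not None]

# Illustrative cached files of the kind a desktop scan might turn up.
cached_files = [
    ("outlook_cache.tmp", "Patient John Doe, SSN 123-45-6789, seen 2023-01-04"),
    ("pacs_viewer.log", "study loaded ok, no identifiers here"),
    ("export.csv", "987-65-4321\n111-22-3333"),
]
flagged = sweep(cached_files)
print(flagged)  # files containing SSN-like strings, with match counts
```

In practice the sweep would walk the filesystem instead of a list, but the shape is the same: pattern match, then report by location so the nooks and crannies become visible.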

To me, an investigation is going to focus on two things, just from a security standpoint; I won't speak to the legal aspects, obviously. I like to know: what's the quantity of data that's potentially at risk, and what's the nature of the data that's potentially at risk? In the case of a breach, that means how many records, and how many PHI elements are contained in those records.

As Lynn alluded to, the Omnibus Rule has a four-factor risk assessment, which we have to go through. But at the end of the day it's really one question, and it's really painful: can I prove that no one accessed the data in question? If I can't prove it, I can't say that it's not a breach. It has to be something more.

If someone had access to it, and I can't prove that no one accessed it, well then it's in the world of potential reportability. We had a real-world example of this that I can give to kind of give you an idea of how it all plays out. At a healthcare system I worked at, we had a nurse who got phished, and unfortunately she gave up her credentials to the attacker a week before we rolled out MFA across the entire organization.

The timing just killed me. But because the logon gave her access to a number of systems, the privacy and security teams had to review each system to see what was at risk and whether the account was used during the time in question. Some of them we could rule out right away. Like Lynn alluded to, we could look at Epic and say, okay, she wasn't logged into Epic during this time.

Pretty good exclusion there; I feel pretty good that this nurse wasn't in that system. However, the account that got phished had access to email, so we had to go through her inbox, which contained years and years of data, including an email with an attached spreadsheet, and the spreadsheet contained several dozen patient records.

It turned into a reportable issue, and, oh man, the longest hours of my life, it seems, were spent with the regulators going over and over and over the specifics of that particular incident. It's like they ask you the same question 27 different times, going over it again and explaining what you've done afterwards to prevent it from occurring again. It's a very painful thing, but that gives you an idea.

All these nooks and crannies where you may not expect the data to be: it's not just in the EHR, it's all over the place. So anyway, that's my two cents.

Don, you're a person who lives that battle every day. What are the challenges you face? The clinicians obviously want all their data: their entire work histories, their email inbox and outbox, the files that they keep in their folders, the shared drives.

How do you balance that against the practical reality that somebody's going to come in? It's not a matter of if; it's a matter of when.

I like to say I don't think it's if or when; I like to say that it can be avoided. But there has to be consistency; you have to be consistent across the board.

So one of the things with security: there cannot be one-offs, right? Whether it's the provider all the way down to the front desk and even the C-suite, that consistency of security has to be there, and there has to be buy-in from the top all the way down to the bottom.

You have to have cultural buy-in for security. It has to be preached, but it also has to be practiced across the board. And the level of accountability has to start from the top: from the CEO to the CISO, et cetera. What we do find is that one of the biggest challenges is really identifying the need for that data, that PHI, and having the ability to say, no, you don't need this data, and finding a workaround: maybe de-identify the data if it's a certain project they're looking to use the data for, et cetera.
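Don's suggested workaround can be illustrated with a minimal de-identification sketch. The field names and the pseudonym scheme below are assumptions for illustration; real de-identification has to follow the HIPAA Safe Harbor or expert-determination rules.

```python
# Hypothetical sketch: de-identifying records before handing them to a
# project team that does not need direct identifiers. Direct identifiers
# are dropped; the MRN is replaced with a one-way pseudonym so records
# can still be linked across datasets. Field names are assumptions.
import hashlib

DIRECT_IDENTIFIERS = {"name", "ssn", "mrn", "address", "phone"}

def pseudonym(value, salt="rotate-this-salt"):
    """Stable one-way token derived from an identifier (salt is illustrative)."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def deidentify(record):
    out = {}
    for field, value in record.items():
        if field == "mrn":
            out["patient_token"] = pseudonym(value)  # keep linkability
        elif field in DIRECT_IDENTIFIERS:
            continue                                 # drop other identifiers
        else:
            out[field] = value                       # keep clinical fields
    return out

record = {"name": "Jane Roe", "mrn": "000123", "dx_code": "E11.9", "ssn": "123-45-6789"}
clean = deidentify(record)
print(clean)  # identifiers removed, MRN replaced by a stable token
```

The design choice is the token: dropping the MRN outright would be safer, but a salted one-way token lets the project join datasets without ever handling the real identifier.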

We've been hearing the challenges of the modern healthcare system, especially those organizations that have researchers, where there's that desire to have a lot of data. They bring the data in with them, the IRBs bring in the data, they couple it with the data that they already have, and yet that's not in most CIOs' purview, because the researchers have their own sphere, their own little network. How does that affect things? I'd love to hear your thoughts on the legal side. Am I responsible for those records? My researcher brought them in. She might be a visiting associate from some other organization, but she brought in data.

She imported it from other organizations. What are the ramifications of that? And Tom, managing that stuff: you've managed academics, and that seems to be a category of concern all unto its own.

So I'm happy to address the legal side of it. For a lot of it, I'm going to give you the lawyer answer and say, it depends, right?

It kind of depends on the researcher's agreement. It's oftentimes clinician faculty members who are coming into academic medical centers or large health systems, bringing that research with them and continuing the research after they get there. So in all likelihood that gets ingested, and Tom can speak from his specific experience, but that gets ingested into the health system's systems, and therefore those CISOs become responsible for it. The other thing we sometimes see, though, is that as part of the research protocol there may be separate computers and other systems set up outside of the health system's protections. So Tom and his colleagues are putting all kinds of parameters around not being able to get into the health system's systems.

And then you've got this research data, which may include clinical research, sitting outside of those protections. If something happens to it, and it happens to fall under HIPAA or other laws that may be implicated, then oftentimes those healthcare entities are responsible for the breach notification and the regulatory obligations around that data.

So there are a lot of contingencies that I've placed in there, but I think it's a little bit of buyer beware when someone is bringing in their own data. Particularly if they're continuing on with that research in their current role, it is a risk to the healthcare organization, and there's a need and a desire to protect it.

And it's not just the researchers. It could be specialists within a hospital system who are hired specifically because they have a particular nuanced skill set that's usually based on research data of their own, and application systems and other tools of their own that they'll bring. And when you hire the doctor, you hire the whole office. So what happens is you get all these ancillary systems that kind of bolt onto your existing systems to support the specialty practice. You'll hopefully absorb those eventually into your EHR and other systems, but you still have legacy data that the doctor may want to maintain on their own. So it becomes very much a legal question: whose data is it?

That's the big question that comes up on all of these. At what point do we draw the line? No, that's health system data; no, that's the private practice's data. He's legally liable for his; we're legally liable when it fits certain parameters in ours. So that's huge. It's guaranteed I'm working with counsel every single time to figure out, okay, where do we draw the line?

Whose data is it?

It's almost like you need a Lynn Sessions right next to you for all these critical decisions, because it's a much more connected world. One of the things that we wanted to talk about was this decentralized delivery model that we have evolved towards.

What are those challenges as your data goes everywhere? It used to be in that medical records room: you go down, you ask for a certain number of folders, and you can walk away with them. It's four ounces per jacket, four patients per pound; 500 records is 125 pounds of records that you've got to steal.

We don't have those constraints anymore; it is easy to steal tens of thousands of records, or lose them. What does that mean for you in this decentralized environment, Don, as you manage your sprawl of data? I call it data sprawl. There might be more polite terms, but it just seems like we have data everywhere.

Mm-hmm. I call it the who, what, when, where: who has access to my data? What are they doing with my data? When did they access that data, and where are they accessing the data from? Being able to identify those four prongs really leads you to the why: why they're accessing the data.
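The four prongs Don lists can be pictured as a simple reduction over an access log. The log format and field names below are assumptions for illustration; real audit logs come from the EHR and identity systems.

```python
# Hypothetical sketch: reducing access-log entries to the who/what/when/
# where prongs so unusual accesses stand out and prompt the "why" question.
from collections import defaultdict

access_log = [
    {"who": "nurse_a", "what": "chart/789", "when": "2023-01-04T05:45", "where": "ward-3"},
    {"who": "nurse_a", "what": "chart/789", "when": "2023-01-04T23:10", "where": "external-vpn"},
    {"who": "billing_b", "what": "claim/42", "when": "2023-01-04T09:00", "where": "office"},
]

def summarize(log):
    """Group accesses by user: the 'who' view of what/when/where."""
    by_user = defaultdict(list)
    for entry in log:
        by_user[entry["who"]].append((entry["what"], entry["when"], entry["where"]))
    return dict(by_user)

def flag_locations(log, allowed):
    """The 'where' prong: accesses from outside the expected locations."""
    return [entry for entry in log if entry["where"] not in allowed]

summary = summarize(access_log)
suspicious = flag_locations(access_log, allowed={"ward-3", "office"})
print(suspicious)  # the off-network access is the one worth asking "why" about
```

A real deployment would stream these entries continuously rather than batch them, which is what makes the baseline Don describes possible.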

Right? This is a huge challenge, but it's not just knowing where my data is; it's also knowing where my data is not. When you can establish that baseline, it gives you the ability to manage access to that data and still hold true to the cybersecurity 101 triad. I call it the CIA, right? Confidentiality, integrity, and availability.

So one of the things the task force had recommended was the broader use and adoption of the NIST Cybersecurity Framework: identify, protect, detect, respond, and recover. How broadly do you see that? And I'm going to turn that question to you, Lynn, from not only the cases that you've dealt with, but also to Don and Tom as practitioners in this field: is that something that's gaining traction? We have the 405(d) HICP guidelines and the safe harbor incentive to basically reduce the penalties if you are following best practices. Do you see the industry getting towards that point?

Those are great guidelines, but it seems like everyone is saying: I've got it covered, I've done my breach assessment, and I've attested that we have the best security for our patients. Everybody's smiling, I can see. This is going to be an interesting point.

So I'll start with you, Tom.

First, I guess, there's a huge difference between a risk assessment and a compliance or gap assessment. That's number one. A lot of health systems, I believe, struggle with the difference between the two. If you look at a lot of the NIST standards and HITRUST and all these other ones, they're basically compliance frameworks: you can be compliant with all the controls in there and still miss the very risks facing your organization if you're not careful.

So to me, that's job one. Specific to the NIST Cybersecurity Framework, it's been adopted pretty broadly, although when I say adopted, it's in various stages of adoption. I don't think people can look at their entire control environment and say, I'm 100% mature at the right level, based on our organization's risk appetite, for every control.

If you are, congratulations, you're the first person I've ever heard of who's been able to say that. That said, there will always be a push towards prevention. It's the preferred place to invest, because you want to prevent something bad from happening in the first place. But the other areas, identification and responding and having some sort of business resiliency, are becoming more and more critical, especially as the tech changes and the cyber threats change and evolve, so you can't prevent everything all the time. And then you add humans into the equation. Humans are creative. Humans in healthcare are really creative. They come up with all sorts of neat ways to make data move, because they have to solve their problem, which is patient care, and I respect that a ton.

I want them to be really, really good at that. I just want to set up guardrails to prevent them from doing something that might cause the organization some damage in the future. So I hope that answers your question.

I was going to say, David, I can tell you this: we get asked this question in every single one of our OCR investigations that involve a security incident.

I think the audience probably knows this, but if you have a data breach that involves more than 500 people, you're guaranteed to get an OCR investigation. If it's a security incident, then you're going to get asked questions about your cybersecurity framework.

And what we've found is that we will oftentimes talk to clients who have assured us they've got everything in place. We provide them with an advance copy of the types of questions that OCR is going to ask them, and then, when they actually get asked these questions,

they may not be able to complete the answer. So we help them marshal all of the evidence they have to be able to respond appropriately. It's a fantastic framework for healthcare entities to be working through. I think Tom's point is spot on, in that there are a lot of folks doing a lot of really good things, but they're not perfect in all aspects on the compliance front.

And even from a compliance standpoint, again, to Tom's point, there are risks that may or may not be identified. So that kind of brings us to the security risk analysis. In the OCR world, that is the fundamental starting point for any security protection, and they look at it very, very critically.

We have found clients who have spent hundreds of thousands of dollars with consultants to put together a security risk analysis, and OCR comes back and says it's not compliant, even though, I will tell you, we look at the guidance on their website and it certainly appears to meet the guidance that OCR provides.

I look at it as a bit of a moving target. It's something that can both hurt and help a healthcare entity. They need to have something in place; by the same token, they need to ensure that they're actually remediating the risks that are identified, in an orderly and appropriate manner based upon their specific revenue level and the size of their organization.

All of those are the types of things that OCR takes into account in determining whether they're going to dismiss an investigation, provide technical assistance, or decide to penalize a covered entity or business associate.

Huh. It's frightening.

Yeah. On one hand, you have to be totally compliant. On the other hand, everybody's after your data, trying to get into your systems, trying to get access to your data. Don, answering the questions that you presented, the who, what, when, where, how, probably has to happen more continuously than a once-a-year assessment that you paid for just to get past the compliance checklist.

How do you do it?

Yes, you have to leverage technology; technology is our friend. Well, first of all, the healthcare industry has to catch up to the rest of the world. We always seem to be five to ten years behind on certain technologies, and you can include cybersecurity compliance in that bunch.

Start leveraging artificial intelligence. I feel artificial intelligence is going to bring a different tool to the playing field, especially when you start to work with machine learning and deep learning, their capabilities and what they can stop: it can stop the ransomware before it even begins to go across the network. Using artificial intelligence, we can verify the different data that are on different devices and whether a device is encrypted or not, and you can do this with less manpower. But I honestly feel, David, that it's going to take the payers coming down on the healthcare industry and saying, look, until you implement these policies and these certain security rules, we're no longer going to do business with you. Unfortunately, that's the bottom line.

Yeah. Well, I certainly see that transition occurring in the space as machine learning and AI technologies become more relevant, and certainly in multiple cybersecurity offerings we're starting to see that, not only in the analysis of events that are coming in, but also in the ability to analyze the content.

One of the things that we do at Tausight is look for PII and PHI in unstructured and structured content using AI. That's been a huge advance, and it's part of what I believe we need to start doing: focusing on how do I discover all that dark PHI. We have dark energy, we have dark matter, and we have dark PHI: PHI that doesn't like to be seen.

It gets created, it gets stashed. One of the things we have discovered in some of our projects with our customers and POCs is the amount of stale PHI that sits out there: the amount of unused ePHI sitting on drives, sitting in the cloud, sitting in different places. Should there be data retention policies that all these organizations have, to basically minimize that exposure?

I'd liken it to this: I was describing it to my wife, and I said, suppose you had all your jewelry in your house and you just scattered it all over the place, ready for somebody to come in. Does that make sense? Or is it a better model to say, gee, the pieces I use and need to keep, I'll put in a protected place,

and all the stuff I don't commonly use or haven't used, I'll put in the safe deposit box. Is that a model that reduces and shrinks that exposure? Is that something healthcare is focused on? I've just been very surprised by how old some of the files sitting on these systems are.
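A first pass at the question David raises, finding how old the files are, might look like the sketch below. The retention window and file list are placeholders; actual retention periods come from legal and records-management policy.

```python
# Hypothetical sketch: flagging files that have outlived a retention window,
# using last-modified timestamps. Thresholds and the inventory are
# illustrative; a real sweep would walk shares and NAS volumes.
from datetime import datetime, timedelta

RETENTION = timedelta(days=7 * 365)  # assume a 7-year window for this sketch
NOW = datetime(2023, 3, 1)

inventory = [
    ("shared/patient_list_2001.xls", datetime(2001, 6, 1)),
    ("shared/q4_report_2022.xlsx", datetime(2022, 12, 15)),
    ("home/old_export_2009.csv", datetime(2009, 3, 3)),
]

def stale_files(files, now=NOW, retention=RETENTION):
    """Return paths untouched for longer than the retention window."""
    return [path for path, modified in files if now - modified > retention]

candidates = stale_files(inventory)
print(candidates)  # files to review for archival or defensible deletion
```

The output is a review list, not a delete list: the "safe deposit box" step, archiving or defensibly deleting, still needs a human decision per file class.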

I can say this: I know that state regulators in particular are very interested in the data retention piece. When we're working with healthcare entities that, for whatever reason, draw an investigation from a state attorney general's office, the regulators do not have an understanding as to why you have to have data that's 20 years old, 30 years old, 40 years old.

But you can talk to the researchers, faculty members, and physicians at many organizations who want that data, because they don't know if they're ever going to need to look back on it to have the next breakthrough for whatever novel disease may be on the horizon.

Or even looking back at older data that can be helpful to them in curing cancer, or whatever that may be. At least that's what we hear, and I hear it when I'm proactively counseling clients. But ideally there would be some policies and procedures in place.

I sit in my ivory tower here in my law firm office saying that; I think the folks who are actually working with this data on the front lines, the Dons and the Toms, probably would push back, even though they may have the same feelings I do. But ideally you would be able to get rid of the stale data.

Because then it wouldn't present the same risk. I can't tell you the number of data breaches in which we've had to notify patients going back 20 years, 30 years even, because that data has been kept by the organization on some type of device that has been breached and we've needed to do notification.

So it's a real risk.

I think, from a practitioner's standpoint, it becomes really challenging when organizations aren't clear on what their book of record is. They have decision-making systems, and that data does have legal ramifications: minimum periods that you need to keep it under data retention policies. So, like your EHR system: depending on the age of the patient it could vary, but it's going to be many years that you're storing data. Same financially, for financial decisions

and board-level business decisions: that data needs to be stored, legally, for many years. However, that's for the books of record, the legal decision-making systems. That's not for my spreadsheet that I just made to whip up a quick analysis for somebody. Yet I'm going to hold onto that, because I might get a question in 47 years.
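Tom's point that retention varies with the record and the patient's age can be sketched as a small rule. The periods below are placeholders, not legal minimums, which vary by state and record type.

```python
# Hypothetical sketch of retention rules keyed to the patient's age.
# ADULT_RETENTION_YEARS and MINOR_EXTRA_YEARS are placeholders: many
# jurisdictions keep minors' records until some years past the age of
# majority, whichever deadline is later.
from datetime import date

ADULT_RETENTION_YEARS = 10  # placeholder minimum for adult records
MINOR_EXTRA_YEARS = 3       # placeholder: keep until age 18 + this many years

def retain_until(record_date, birth_date):
    """Earliest date a patient record could be eligible for disposal."""
    adult_deadline = record_date.replace(year=record_date.year + ADULT_RETENTION_YEARS)
    majority_deadline = birth_date.replace(year=birth_date.year + 18 + MINOR_EXTRA_YEARS)
    # For minors, the later of the two deadlines governs.
    return max(adult_deadline, majority_deadline)

# An adult's 2015 record: eligible after the flat retention window.
print(retain_until(date(2015, 6, 1), date(1980, 1, 1)))  # 2025-06-01
# A child's 2015 record: held until well past the age of majority.
print(retain_until(date(2015, 6, 1), date(2012, 1, 1)))  # 2033-01-01
```

The point of the sketch is the asymmetry: the books of record get a computable deadline, while the ad hoc spreadsheet never gets one, which is exactly why it lingers.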

So I'm just going to keep that spreadsheet, and literally, that's what happens. As a practitioner, when I come around trying to say, okay, let's clean up the shared drives, oh my God, the pushback we get is: that's a great idea, just not for me. And no one wants to be the first person to volunteer their data to be cleaned up.

They say it's great in theory, but the reality, for somebody who actually has to do the work, is that it's a real uphill battle. I've seen counsel in individual health systems float the idea of, we're going to keep email for one year and then delete it. That didn't go over so well, but it was good in theory, because it reduces the entire risk of everything Lynn brought up about the old data.

That's the stale data that's out there. Yeah, it's one way to get rid of it: let me put a timer on your data, and if you don't do anything with it in a certain period, it goes away. But again, I'm not talking about the legal books of record. I'm talking about all the ancillary data: my spreadsheets, my emails, whatever printouts I make that aren't decision-making systems. Because I think there's a distinction between a decision-making system, where records are kept for legal reasons, and all the other data. And Lynn, correct me if I get any of that wrong, but I think that's how I'd view it.

I think you're right on. And I know, just from a practical standpoint, when I was in-house for several years, and even today in my law firm, you want to be able to look back historically at things you've done in the past and replicate them so you don't have to reinvent the wheel.

We all know that's a lot of the reason why we've got work product that a variety of different people within organizations want to keep, and that becomes the basis of our stale data. And for many years, healthcare organizations have enjoyed long tenure from their employees.

Right. I remember my mom worked at her hospital for over 30 years, and that's not uncommon for nurses, and even non-clinical folks, to be working at healthcare systems, because they're good employers and people like working there and helping people. So we do end up with a lot of that historical data, and that becomes your junk drawer.

I mean, there are several junk drawers in a lot of hospitals, and Tom's alluded to a lot of that, and he is exactly right. There's the legal component: we have to keep things, and there may be some contractual reasons that we have to keep things. But most everything else is kind of expendable.

Expendable. But he's exactly right: no one wants to give up their data. Oh, they might need it in the future. It's surprising what we've seen: terabytes' worth of stuff kept on expensive data servers, with unknown amounts of data sitting on private cloud shares and private emails.

I mean, this is a huge FOMO problem that we're experiencing. Don, are you seeing that, or do you worry about that? Well, we don't see it as much; I know it's out there. But I think for those R&D companies, what they really need to start looking at is synthetic data and getting this data converted over. That just removes the possibility of that HIPAA violation or that OCR investigation right away. I know other industries have done it, such as the banking industry.

So, again, like I said, the healthcare industry always tends to be at least five years behind when it comes to technology adoption. Five years? You're being generous. I think it's more like 10 or 15. You're absolutely right, I was being generous. Well, I think it is a challenge.

I think the other side that we are seeing is that when we find the PHI in an organization and look at the archival status, there's a very spotty record of which files people have archived. If you trust the archiving status on these files, it's spotty. So the potential extent of damage from a ransomware attack is even larger, in terms of your ability to recover and go back to a normal working state.

Is that something that you've seen, Lynn, or Tom and Don? If you have experienced that recovery phase post-breach, what do you think through, and how does that translate to, gee, I wish I knew about these things not being archived, or I wish I knew about all these extra data files?

It gets back into, what would I do differently had I known beforehand? In a breach incident recovery, it's a terrible situation for the CISO to have to go to the CEO during a ransomware attack and talk about this unstructured data that frankly no one knew about.

It kind of goes back to what I said earlier on. Everybody worries about their electronic medical record; that's the crown jewel. There's kind of this sigh of relief that Epic or Cerner wasn't impacted by the incident, and then they find out they've got to notify 4 million people anyway.

Mm-hmm, right? That is realistic. And so when we get to those conversations with the highest levels of the organization, when we have to go to the board with this large notification that's going out, the first thing they say is, well, you told us the medical record wasn't affected.

It's like, well, that's right, we did. But all of the business transactions, the financial transactions, take place on this particular system. And now we've got a database that's full of your historical patients; that is your master patient index. Those are real-life experiences that happen

routinely in our ransomware attacks. In fact, we kind of start off with the assumption that they may have to notify all of their patients, and then if their forensics team is able to demonstrate that specific systems or information was not accessed or acquired, then it's a good day, right? So I'd rather start off with the bad news and then be able to step back from that, than be a little too optimistic on the front end and have to deliver the bad news later.

But absolutely realistic situations. And unfortunately, the legal risk associated with that is not just that you have to undergo an OCR investigation or a state AG investigation; the minute that gets posted on the wall of shame at the OCR, the plaintiffs' lawyers are circling, and I call that the second extortion.

Because you are extorted, of course, by the ransomware threat actor group, and then you're extorted again by the plaintiffs' attorneys when they file these class action lawsuits based on what's posted on the regulatory websites. So it's an unfortunate scenario that we find healthcare in, at a time when reimbursement is not as high as it has been in the past, and costs are increasing for everything, whether it's labor,

whether it's storage for data, whatever it may be. There are just a host of expenses that our healthcare clients are facing these days. And then you also have to face bad guys in another part of the world, and the plaintiffs' attorneys that are down the street. There was a large healthcare system that had a very, very large breach, multiple millions of records, and it got announced on a Friday.

By Monday morning, a large class action suit had already been filed for billions of dollars against the organization. It made me wonder, did they already have the whole suit typed up, and they just needed to put a Doe on it and cut and paste? Yeah, that's what I thought: just slap a Doe on here, add the names of all of the board members, and there you go.

File for class, or try to get class certified. So yeah, I get back to the asset management piece: you can't secure what you don't know you have. And with data it's a big deal, because data sprawl is what we've been talking about in large part today. Data sprawl is a real thing, and data is in a lot more places than people may realize, until you've been through a lot of investigations and you're like, oh, I have a plaintext data feed that got put through a router, and I did a debug and it's in my router log.

And oh, by the way, it's got all the patient numbers in it. Wait, what? You wouldn't expect that. It's not an EHR system, it's a network device, but because of the nature of how we communicate, it may be a repository for data, as well as all the bazillions of other loose, unstructured files that may exist in various places.

They could be on thumb drives, laptops, hard drives, your NAS, all this stuff internally, and then you get into cloud storage, and that's a whole other thing. Microsoft by default has you save everything to OneDrive unless you tell it differently. Wait a minute, that's another place to secure data.

Oh, no. And then you get all the other providers out there, which can be secured or not, depending on how you've configured them. So the data sprawl thing is just huge. And you know, David, to your point, I think really understanding where all your data is, is one of the first steps to putting some sort of structure around your risk assessment and then your control strategy.
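As a rough illustration of that first step, understanding where your data is, a minimal PHI-pattern sweep over a share drive might look like the sketch below. The patterns here are assumptions for demonstration only; production PHI-discovery tools use far richer detection (names, local MRN formats, context scoring) than two regular expressions.

```python
import re
from pathlib import Path

# Illustrative PHI-like patterns only (assumptions, not a real detector):
# an SSN-shaped number and an "MRN"-prefixed identifier.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "mrn": re.compile(r"\bMRN[:#]?\s*\d{6,10}\b", re.IGNORECASE),
}

def scan_for_phi(root: str) -> dict:
    """Walk a directory tree and inventory files containing PHI-like patterns."""
    findings = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file: skip here; a real tool would log it
        hits = {name: len(rx.findall(text)) for name, rx in PATTERNS.items()}
        hits = {k: v for k, v in hits.items() if v}
        if hits:
            findings[str(path)] = hits
    return findings
```

Even a crude sweep like this tends to surface the "junk drawer" files the panel describes, which is exactly why the pushback starts once the cleanup conversation gets specific.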

No, I think that whole data sprawl is only going to get worse as information sharing becomes more the norm and as care becomes more decentralized. I mean, even broader sharing is going to cause more and more of the data to be moved through the browser, into vendor sites, into third parties. Perhaps not in huge quantities, but it's a potential for additional lost PHI.

Does that strike you as one of the concerns that we'll see moving forward? Absolutely. Absolutely. When it comes to those third-party vendors, I feel you have to have a strict BAA in place, and you have to have a structured BAA for international companies as well. Within our company, we have two separate BAAs that we use.

One is for domestic, the other one's for international, and there are stipulations. At the same time, we also insist on knowing what type of coverage that company, that business associate, has. Do they have cyber coverage at all? What are they carrying? Have they gone through a SOC 2 audit?

You know, the HIPAA audit should be a given, right? It should just be obvious that they've gone through that security and that privacy review, and they have some way of showing the results. Again, though, SOC 2 has to be mandatory if it applies to their company. If they fall into that category where they need to have that type of audit, then yeah, it needs to be produced in real time.

It needs to show their shortcomings and where they were addressed, because that is one of the biggest worries. So to me, having an asset inventory covers not only the data that you have, but also where all your data's going: internally, externally, on physical drives. If people are using removable drives, which I see a lot of, it's a very complex model of just tracking where is your data, where is it going,

and what are the compliance rules that it should be following? And one of the things that we have seen in recent days, as part of that knowledge of where the ePHI is: how have you secured it? Are you encrypting it? Are you making sure that you have full audit trails? And this all gets back, you know.

A company CIO told me, he said, it's almost like having the pre-breach photos of your house before the house was burglarized. I said, yeah, that's kind of the analogy: if you knew where your ePHI is and you had a complete inventory of where all the stuff is, you're so much better at resolving the extent of the problem, the cyber blast radius, if you will, from an incident.

Lynn, you're the incident response person. What's your perspective on that, knowing where all your stuff is before the potential incident? How helpful is it? I think that would be incredibly helpful. We do find ourselves following recovery by our clients in a ransomware attack,

or even, you know, ejecting a threat actor from their environment who maybe hasn't launched ransomware yet. And then it comes down to trying to figure out who we need to notify. A lot of times the clients themselves don't know what data's necessarily involved until they actually get an inventory of that data from the forensics firm.

So I think it would be immensely helpful for us to know: is this area even something we need to be concerned about from a PHI perspective, or is this something that's just used for different purposes? Because it would definitely speed things up when you need to hire third parties to come in and comb through your data,

do various types of data analytics on the data that may be involved, and try to get your notification out in 60 days under HIPAA, or 15 business days in California. Other states have accelerated notifications as well. So that's a very, very quick time in which we have to do notification; when I work with California entities,

my clock is ticking really, really fast there, and pushing the forensics firms to even get you an answer to questions is really tough. But if the clients were able to tell us, hey, there's no PHI there, and here's our proof, we could get rid of a lot of stress very quickly in those instances.

Or if there was PHI, what's the PHI that's there? That allows us on the legal front to provide good, quick legal advice, particularly to those clients in states that have accelerated notification. Hey, Lynn, if I could piggyback on that: payers may have a contract with a health system that requires notification within 48 hours.

Cyber insurance companies may have a stipulation in their contract that says within 24 hours. So it's not just the regulatory requirements; you may have contractual requirements for even faster breach notifications. The clock's ticking, and it's spinning fast once you realize you have an incident.
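The overlapping clocks Lynn and Tom describe can be made concrete with a small deadline calculator. A hedged sketch: the 60-day HIPAA and 15-business-day California figures come from the discussion, while the 24- and 48-hour contractual windows vary by contract and are illustrative only. The business-day helper also ignores holidays, which a real compliance tool would not.

```python
from datetime import date, datetime, timedelta

def add_business_days(start: date, days: int) -> date:
    """Advance a date by N business days (Mon-Fri; holidays ignored)."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # 0-4 = Monday through Friday
            days -= 1
    return current

def notification_deadlines(discovered: datetime) -> dict:
    """Map a breach-discovery timestamp to the deadlines the panel mentions."""
    return {
        "hipaa_60_day": discovered + timedelta(days=60),
        "california_15_business_day": add_business_days(discovered.date(), 15),
        "payer_contract_48h": discovered + timedelta(hours=48),       # illustrative
        "cyber_insurer_24h": discovered + timedelta(hours=24),        # illustrative
    }
```

Laying the dates side by side makes Tom's point visually: the contractual clocks expire long before the regulatory ones even get close.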

So, I heard from a particularly cavalier CISO: well, if everything fails, I always have cyber insurance to count on. I thought that was pretty cavalier; I won't name who. But how does that figure? It's sort of like, that's my last resort, I'm always gonna be covered. He might be looking for a new job, but what's the role of cyber insurance, and what's the appropriate role for it?

I think it's a security blanket that makes you feel comfy at night, but I wouldn't look to it for anything that's gonna actually restore operations or get your business back to profitability. There's way more cost than that. I will tell you, I am a big proponent of entities having cyber insurance.

We've definitely seen a hardening of the cyber insurance market in the last couple of years, with some of our clients not even being able to get coverage, or taking very large retentions, $1 million to $10 million worth, on their own. So while cyber insurance can provide you with a lot of resources, and we work with a lot of cyber insurance companies, I agree with Tom: it's more of a security blanket as opposed to your plan A. And I realize that this particular CISO was essentially saying it's his fallback plan, not his plan A. But it is a necessity when you can get it, and it's harder to get these days than it has been in the past.

I know. Anything on your side on that? Yeah, I'd just like to add: it's not just what it takes to get the cyber insurance, but also what needs to be done while you have it. A lot of policies require continuous review, whether on a monthly, quarterly, or semi-annual basis, and if you don't follow through, they can easily deny the claim.

So, anything along the lines of, I wish I could have done this if I had known in advance? Any views?

I would say this, just to piggyback a little bit on Don's comments about business associates and other third parties. We track the data on all the breaches that we handle in the year prior, and in any given year, a third to two thirds of breaches, particularly in the healthcare industry, are caused by third parties.

And when you see who those vendors are, they often don't just work with a single covered entity. They work with multiple hospitals across the country, and so if they were to have a breach or a ransomware attack on their systems, it's probably not just gonna impact a single health system.

It will probably impact multiple healthcare organizations across the country, and we definitely have seen that. So shoring up your business associate agreements, I think, is critically important. But also, along the same lines: do your business associates even need to still have that data?

And putting a process in place by which you can terminate that relationship, such that they delete or destroy the data they have in their possession, to the extent that they can. We've started seeing that bubble up as data hygiene's gotten a little bit better with our healthcare clients, in the sense that they're trying to get their arms around these third-party vendors and determine whether they even need to have a relationship with them going forward.

So I think we'll see more to come on that; we may even see some regulatory impact coming from OCR as these third-party incidents continue to rise. From a practitioner standpoint, I look at vendors as potential threat actors. Not that they're necessarily malicious, but they control some of your systems. In healthcare,

they'll manage some of the systems. They may make changes on your network or access data on your network just to support their own systems, without you even knowing, unless you have very rigorous change management controls and have vetted their processes. You know, Don talked about the importance of a SOC 2.

I would double down on that one and say, if you're just doing a SOC 2, you're missing the point, because you haven't looked at the risk of the data on your system as it progresses from your system to the third party. Where is that data going, and what network is it going across? Is it stored in an intermediate location?

Is there a control assessment of your own controls and the data flow before it even gets to the third party? I would say there should be. And then you look at the third party, and that's a whole ball of wax, if you will, because the third party will hand you Amazon's SOC 2,

which is great from a platform standpoint, but that's not where the risk always is. The risk is in their operational controls. So the vendor may need their own independent audit of their processes and controls on top of the platform controls. If a third party that does, say, an analytics platform just hands you Amazon's SOC 2, I'd be right back at 'em going, okay,

thank you, but that's not what I asked for. We need to know your controls. What are you doing for change management? What are you doing for employee background checks, authentication, encryption, and all the things you do within your system of control, and where does that play into things? So there's a lot involved in third-party management.

Again, I treat 'em as threat actors, not in a malicious way, but I need to recognize the impact they have on my environment and the lengths I may need to go to, to get comfort that they're not gonna introduce unnecessary risk. Mm-hmm. Well, this is great. I just wanna let you know that one of the things we've been doing at Tausight is building the ability to go and find PHI and inventory it across drives in different environments.

It's been a real eye-opener in many ways as we start to shine that flashlight. Tom, I know you like to use that analogy of dark matter, dark PHI, with a bright light shining on it. And frankly, there are hidden places; there are things that would scare you thoroughly when you find out where they're hidden, where they're buried.

Getting that inventory is, I think, the first step of any kind of security posture: I gotta know what I'm protecting, and I have to know how I'm protecting it. Is it access rights? Is it encryption? Is it just taking the data off the system? I think what we heard today from all three of you, and thank you so much for participating, is that there are lots of dimensions to this thing.

Everything from how I can prevent an incident from happening or reduce its impact, to what I should think about now in terms of making sure I can recover quickly from it. Thank you all for your insight. Any last words in the last minute from any of you? I'll start with Lynn.

I would just say, know what you've got in your possession, know how you're going to protect it, and keep checking it over and over and over. I could probably say it a hundred times, because it's a constantly changing environment, and I think the bad guys count on that, and OCR knows that.

Hmm, Don? I would just sum it up as saying, if you know the who, what, when, and where, it's a lot easier to find out the why. Hmm. Tom, you get the closing word. Well, I think it's super important as you do a risk assessment, and you are doing a risk assessment, right? But as you do a risk assessment, take into account all the assets that you have.

Computers and data, data being probably the hardest thing to get a grip on these days. But you can't secure what you don't know you have; it's kind of a mantra that I follow. So at least identify it. I like to go around with a flashlight, just what David was alluding to. It becomes a bit of a prop, but it makes the point of, hey, we need to look under the dark, scary rock.

Oh wait, there's a spider there. Oh wait, there's a thousand records of PHI there. You don't know. But you have to really go through your organization and understand what's where before you can put together a good plan of countermeasures and controls and be able to report your true risk of where you stand.

Well, Lynn, Tom, Don, thank you so much, and this concludes our presentation. Thank you.