Brainstorm Health 2024: Data, Security, and a More Resilient Future

Stephen Gillett, Chairman and CEO, Verily
Bob Segert, Chairman and CEO, athenahealth
Andrea Downing, Co-founder, The Light Collective (appearing virtually)
Moderator: Andrew Nusca, FORTUNE
Transcript
00:00 Hey, everybody.
00:03 Thanks for being with us.
00:04 Andrea, thanks for being with us.
00:07 I want to take a minute to talk about the Change situation because so much has happened
00:13 and I want to just underscore the gravity of it.
00:15 So the latest estimates I looked at, one third of Americans have been affected by this.
00:20 The company is still trying to return its claims and payments flow to normal.
00:23 It's still not happened yet.
00:24 The financial losses were estimated at $100 million a day across everybody
00:30 affected by it.
00:32 Obviously there continue to be patient care disruptions, and, by the way, all those inefficiencies
00:36 from the workaround systems that had to be put in place when the primary systems were
00:41 affected.
00:42 So in tech, we always say it's not a matter of if you get hacked, it's when.
00:47 So Andrea, I would love to start with you, if you don't mind.
00:52 This is about trust right now.
00:54 Yeah.
00:55 So what do we do about that?
00:57 Well, first, let me share a little bit of my backstory and how I got here.
01:05 I am a breast cancer advocate, patient advocate turned security researcher or white hat hacker.
01:12 And I started on this path five years ago because we saw that nobody was coming to save
01:18 us as patients.
01:19 And, you know, a lot of the folks that have been in healthcare cybersecurity for decades
01:24 have been worrying about these types of attacks on our infrastructure.
01:28 And now we're only seeing them get bigger and worse and exploited in ways that can cause
01:34 loss of life.
01:36 So is it about trust?
01:37 Absolutely.
01:38 I don't think of trust as an emotion anymore.
01:40 I think of it as a way that we can accurately predict benefit or harm.
01:47 And, you know, I think when we look at the current state of healthcare cybersecurity
01:52 in this country, it's really, really bad.
01:55 And we need to do a lot to move forward.
01:58 Good.
01:59 Stephen, I want to come to you next, because you've worked in cybersecurity before, your
02:04 current role.
02:06 You know, in security, when you create a door, the good guys can walk through it.
02:12 The bad guys can walk through it.
02:14 How do you balance that in this situation?
02:17 Because everybody wants the capabilities to move the data around and enable more care
02:22 and all that sort of thing.
02:23 But we clearly know what happens when it goes bad.
02:27 That's true.
02:28 And security used to be, people would think about it as a perimeter.
02:32 Let's put up this perimeter and let's keep the bad guys from coming inside the perimeter.
02:37 And a lot of the security structures of today are a result of what I would say is an antiquated
02:42 perimeter kind of mindset.
02:45 Because you have to think of it as the bad people are already in the perimeter.
02:50 It's how do you protect the data and the information, the intellectual property that is critical
02:55 to your business, critical to your infrastructure, even critical to your government.
03:00 And to do this, as we were talking about in the break, it's not just about buying
03:04 some spectacular piece of technology that you deploy and everything turns into a secure
03:10 rainbows and butterflies scenario.
03:13 That's not how it works.
03:14 You have to really bring together the people component.
03:17 For example, the Change Healthcare attack, what we do know about it publicly is it was a
03:21 BlackCat (ALPHV) operation, which is a Russian ransomware group.
03:24 But it exploited very simple configuration issues with multi-factor authentication and
03:29 what they call credential stuffing.
03:31 So they basically got a username and password, were able to log in, and it didn't send them
03:36 a text or ask them to use their app to authenticate.
03:39 There's no amount of technology you can buy that will stop the human part of that security
03:44 infrastructure.
03:45 And in this case, I think it's a human and technology together that are going to move
03:51 the needle.
03:52 It's not just your IT department buying some spectacular piece of security software.
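The failure mode described here, a stolen password accepted with no second factor, plus repeated credential attempts, can be sketched in code. This is a hypothetical illustration, not the actual Change Healthcare system: the service, thresholds, and function names are all illustrative assumptions.

```python
# Hypothetical sketch of two controls addressing the failure modes above:
# a per-source rate limit to blunt credential stuffing, and an unconditional
# second-factor check so a misconfigured account can never skip MFA.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_ATTEMPTS = 5  # attempts allowed per source, per window (illustrative)

_attempts = defaultdict(deque)  # source_ip -> timestamps of recent attempts


def allow_attempt(source_ip, now=None):
    """Rate-limit login attempts per source to slow credential stuffing."""
    now = time.monotonic() if now is None else now
    q = _attempts[source_ip]
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()  # drop attempts outside the window
    if len(q) >= MAX_ATTEMPTS:
        return False
    q.append(now)
    return True


def login(source_ip, password_ok, second_factor_ok, now=None):
    """A password match alone is never enough: MFA is enforced for everyone,
    so there is no configuration path that skips the second factor."""
    if not allow_attempt(source_ip, now):
        return "locked_out"
    if not password_ok:
        return "denied"
    if not second_factor_ok:
        return "mfa_required"
    return "ok"
```

The point of the sketch is the structural one made on stage: the second-factor check is unconditional in code, rather than a per-account setting a human can misconfigure.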
03:57 So what do you do about that, if you don't mind me pressing?
03:59 The humans are not going away in our companies, I surely hope not.
04:04 I think there's a few things you need to do.
04:06 One is, if you're thinking about this in your day job, look at the public-private partnership
04:13 opportunities.
04:14 So NIST, the National Institute of Standards and Technology, is a standards body.
04:17 They came out with, I think it was 2014, just over 10 years ago, version 1.0 of
04:23 the NIST framework, which was designed to help organizations navigate creating a security
04:28 profile for critical infrastructure, utilities, power, railroads, et cetera.
04:33 They've now, and earlier this year, did a much-needed NIST 2.0 framework, which takes
04:38 that same expert public-private collaboration of 1.0, but expands it to all of healthcare
04:44 and all of the services and infrastructure.
04:47 Use that, because a lot of companies contributed; we were a part of that work
04:52 at my last cybersecurity company.
04:53 So you have to get a coalition of the willing.
04:56 You have to look at public-private.
04:58 You have to look at academia.
05:00 Then you have to look at your ability, think about most of these regional hospitals and
05:04 these regional organizations that are getting hacked, they can't afford, oftentimes, the
05:09 Goldman Sachs or the Google-level security talent.
05:13 It's very difficult for them, so they have to partner with these companies in order to
05:16 increase their posture.
05:18 And look at the motivations of the bad guys.
05:21 In this case, which sounds like an odd thing to say, the fact they were after monetary
05:27 gain, which is, we're going to encrypt the data and we'll give you a key to decrypt
05:31 it, is a good thing, because monetary gain you can solve for.
05:35 The more dangerous one is the nation-state that's not in it for monetary gain, and they're
05:39 not in it to discuss with you your data, and you may not even know that they did the deed.
05:45 They're the most dangerous tier of security, but the public-private partnership, I think,
05:49 will take average or even below-average security postures and make them good or great, but
05:54 requires you not try to do it all by yourself.
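The NIST Cybersecurity Framework organizes a security program around its core functions (Govern, Identify, Protect, Detect, Respond, Recover in CSF 2.0). As a purely illustrative sketch, an organization's control inventory can be checked against those functions to spot gaps; the control names below are hypothetical examples, not NIST terminology.

```python
# Illustrative sketch: tally a hypothetical control inventory against the six
# NIST CSF 2.0 functions to find functions with no mapped control at all.
CSF_FUNCTIONS = ["Govern", "Identify", "Protect", "Detect", "Respond", "Recover"]

# Hypothetical control inventory; in practice this comes from an audit.
controls = {
    "mfa_enforced": "Protect",
    "asset_inventory": "Identify",
    "edr_deployed": "Detect",
    "incident_runbook": "Respond",
    "offline_backups": "Recover",
}


def coverage_gaps(controls):
    """Return the CSF functions that have no mapped control."""
    covered = set(controls.values())
    return [f for f in CSF_FUNCTIONS if f not in covered]
```

Running `coverage_gaps(controls)` on this sample inventory would flag the Govern function as uncovered, the kind of gap the framework is meant to surface.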
05:57 So real rainbows and butterflies here you're painting for us.
06:00 More little waterfalls and maybe a stream.
06:03 Okay, fair enough.
06:04 If it was just that easy.
06:05 Yeah, all these rainbows and butterflies.
06:07 Look, the NIST framework, we use that at athenahealth as well.
06:10 I think it's a great framework.
06:11 I would encourage everyone to do that.
06:13 But as you think about what has occurred with events like this, and the reality, I think
06:17 you said, Andrew, is that you either know you've been hacked or you don't know you've
06:21 been hacked.
06:22 But people can be inside your structure, and I think the real element is about resilience.
06:27 And what do you do about it?
06:28 I mean, resilience is about what do you do when conflict arises?
06:31 What do you do when something goes wrong?
06:33 And I think in the healthcare system, it's not just about technology.
06:35 I mean, everyone's working on endpoint detection and multi-factor authentication and logical
06:40 separation of databases and making sure that only certain people have all the access codes
06:45 to the data, active-active replication, putting things in a situation where you can easily
06:50 recover.
06:51 But it's also a lot more than technology.
06:53 When I think about resilience in healthcare, it's about these practices that almost lost
06:59 their business.
07:00 We talk about 50,000 short primary care physicians in the United States.
07:05 Think about all the independent docs in rural America serving underserved communities that
07:10 literally couldn't make payroll.
07:13 So we got to think as an organization, and a set of organizations as an industry, around
07:17 how we drive alternative pathways for connectivity.
07:20 So if something like Change happens, what is the alternative?
07:23 Electronic health records.
07:24 How do you go to paper if you have to?
07:25 I mean, it's a terrible thought, but how do you do that?
07:28 And how do we build an economically resilient system that allows these practices to thrive
07:32 and to survive?
07:33 There's also, just to add to what you said, and I think those are great points, what's
07:37 the government response to Change?
07:38 In my world, that's almost an act of war.
07:42 That's that level of aggression towards infrastructure.
07:45 Those are people's lives, their personal information.
07:48 That is not something that should just be a tech issue that we're solving for.
07:52 There needs to be, I think, an updated geopolitical disposition among countries.
07:58 If you were to sink a US ship off the coast of a different country, it should be taken
08:04 with that same level of seriousness, above and beyond the technical response.
08:09 And I don't know what the US and other countries are doing to elevate this.
08:13 When someone compromises the US or our allies, we should view that as a much more elevated event
08:20 than I believe it is today.
08:21 Yeah.
08:22 Well, that one's above my security clearance.
08:24 Andrea may have one.
08:25 I just want to tag onto a couple of these comments.
08:29 First of all, I think cybersecurity and healthcare truly takes a village.
08:34 We can no longer assume that what happens within our firewalls, within the walls of
08:40 a HIPAA covered entity, is going to be good enough.
08:43 If we're checking the boxes on SOC compliance, on HIPAA compliance, those things are going
08:48 to work.
08:50 Hackers don't care, especially if they're nation state actors.
08:54 We're reaching a point where our ability to protect and respond, if we're thinking about
08:59 the NIST framework, is not keeping up.
09:04 These attackers are outpacing what is actually out there within the ecosystem.
09:09 And what I'm finding, and I'm just going to give you a couple of anecdotes, we were navigating
09:15 earlier this year something with CISA, where we're starting to see infiltration of clinical
09:20 research.
09:21 And when you think about it, 80% of ransomware attacks gain their first leverage the way we talked
09:29 about with Change, with the credential stuffing issue, or with phishing attacks or social
09:34 engineering attacks.
09:36 We're starting to see that with patient communities and clinical research, where attackers
09:40 are impersonating patients and impersonating physicians.
09:44 And going back to this idea that it truly takes a village, we need to start training
09:48 up our army within our workforce and recognizing the second that they leave the walls of a
09:54 clinic and outside of our firewalls, we are also consumers who are being targeted with
10:01 the data that we post on social media, where we are buying things and using these consumer
10:08 driven technologies that have really poor privacy.
10:11 These are ways that we can target our workforces and our population in ways that are not protected
10:19 just because it wasn't part of the checkbox for SOC compliance and HIPAA compliance.
10:26 So I have a bit of a soapbox here, but I would definitely agree with everybody on this panel.
10:32 I'm also noticing that going back to that example of infiltrating clinical research
10:38 and having to navigate this, we've got to ask ourselves, where are the coordinating
10:42 bodies?
10:43 Today we have NIST; within healthcare cybersecurity, we have Health-ISAC and BIO-ISAC.
10:50 We have to expand capacity beyond just one healthcare system sharing intel with another,
10:57 but really to build resilience, we're going to have to shore up, as you were saying, the
public-private partnerships and funding and infrastructure to really get ahead in
11:06 detecting and responding to these attacks.
11:09 I think to add to that, the unemployment rate, if you do anything in security right now, is
11:14 0%.
11:15 There's more need than there are people with security expertise.
11:21 For that to change, I think, as Andrea said, we need to have education and a workforce
11:25 so it's not just your security team that worries about this, but it's all of the company.
11:29 There needs to be a culture of security and privacy, particularly in healthcare.
11:34 The kinds of attacks, okay, we see Change Healthcare, but for every Change Healthcare,
11:38 there's very sophisticated attacks that you don't read about that are quite scary.
11:42 I'll share one story we talked about in the green room, started to.
11:46 This was shared with me by a security executive, and it's okay to share publicly.
11:53 It's a story about the Dalai Lama.
11:54 The Dalai Lama had prostate cancer a few years ago, was going in for external radiation treatment
12:00 for his prostate cancer.
12:02 Well, there's a specific nation state that doesn't support the Dalai Lama, you can guess
12:07 who, who had worked to infiltrate the radiation system so when the Dalai Lama went in for
12:12 his treatment, they would deliver a lethal dose of radiation in order to take out a political
12:16 opponent or a political dissenter.
12:19 Now they caught that hack and they were able to protect him and he was able to get his
12:23 treatment, but had that happened, you would just read the Dalai Lama passed away due to
12:28 radiation exposure and you'd never know the sophistication of the attack that was going
12:33 after that political dissident for that particular country.
12:37 When you start, and I live in this world for many years, when you start hearing these kinds
12:40 of asymmetric stories, that's why the term warfare comes to mind because we have to respond
12:47 to this almost like a NATO for security and have all of the organizations around the world
12:53 working on the things.
12:55 No one country, no one city and definitely no one industry can solve this problem on
12:59 their own.
13:00 You know, the stakes are different for nation states and they have different motivations
13:05 and it's not always economic and like you said, the non-economic is the most pernicious
13:10 and frankly the one that scares you the most and you start thinking about AI and we're
13:14 talking about AI all over this conference, right?
13:17 And I start thinking about China and I start thinking about other nation states, they don't
13:21 have HIPAA.
13:22 They don't have this concept of the privacy of data.
13:26 In fact, all they want is data.
13:28 So they're coming in, they're actually taking data on the US population through our systems
13:34 and they're using it to train their AI models and they don't have constraints around that
13:38 and that allows them perhaps to even accelerate their models in advance of countries that
13:43 have kind of a different value system or way to think about it.
13:46 So I think the whole issue around security and data and training employees and making
13:52 sure that people are not allowing these phishing attacks to happen is so, so critical because
13:57 it's just one little thing.
13:58 At Athena we have 180 million patient records.
14:02 That's a lot of data.
14:03 So you got to make sure that you're covering that.
14:05 >> Yeah, sensitive data too.
14:06 And if you look at one of the interesting stats, we're either in the year now or it
14:10 happens next year where the cyber crime, the money these bad guys are getting from these
14:16 exploits is now surpassing all of the illegal drug trafficking in the world combined.
14:22 So you're seeing all of these cartels and these organized crime units shift to have
14:26 digital arms so they can create malware, they can go after where the real money is which
14:31 is not the opioids and the drugs that we see.
14:33 Those are deadly and important, but it's these cyber crimes and that's only going to exacerbate
14:38 these issues particularly going after the data.
14:40 >> I'd love to come to the audience for a question in just a second, but I have one
14:42 more for you.
14:43 Folks, are we worried about, I mean, we've been talking about these concerns for years
14:48 and years and years and years, especially at the Fortune 500 level.
14:52 It's not new.
14:53 It's a matter of when, not if.
14:54 So are we concerned about the additional fallout from the Change incident, whether that's regulation,
15:03 God knows there will be fines, just as business leaders, what are you concerned about in the
15:10 aftermath?
15:11 >> Look, I don't know if there's more fallout that I'm concerned about.
15:14 I mean, look, cyber attacks are an inevitability.
15:17 To think it's going to abate and go away I think is just being naive.
15:20 It's going to continue to happen.
15:22 I think the benefit of Change is the resilience hopefully that is coming out of the system.
15:27 So it's additional redundant connections and connectivity points.
15:30 It's people that actually had a chance to exercise going to paper and thinking through
15:34 how they're actually going to operate in a non-digital world.
15:37 So I think if anything, there is a small silver lining that comes out of this, which is the
15:41 ability to practice and learn how to react to cyber incidents that actually drive a shutdown
15:46 of a major kind of national system like that.
15:48 So I'm actually optimistic it'll be helpful long-term.
15:51 >> Right on.
15:52 All right, let's go to the audience for a question.
15:53 Yes, we've got one right back here, please.
15:55 Your name and your affiliation, please.
15:57 >> Yeah, Venk Varadan with Nanowear.
15:59 I want to touch upon, Bob, your last point, and I think it's a tie-in to the previous
16:04 panel when we talk about consumerization and the human element that you guys all talked
16:09 about.
16:11 We're getting numb to these hacks, right?
16:13 I mean, AT&T was just hacked, Experian, UnitedHealthcare.
16:19 The consumerization of every other industry, we're just voluntarily sharing data, the human
16:25 aspect that you said.
16:27 Maybe from a policy standpoint, how do we kind of shift that mindset into healthcare
16:33 where, Bob, your point about HIPAA and not existing in other places is actually in some
16:39 respect holding us back in being able to deliver better care because we get so up in arms about,
16:47 oh, my healthcare data is gone, but all my Instagram data is hacked and that's fine,
16:53 or we're numb to it.
16:54 I'm curious on the cultural aspect and the policy aspect.
16:57 >> Oh, I have a lot to say about this.
17:00 >> Take it away, Andrea.
17:01 >> Thank you.
17:02 >> Go ahead, Andrea.
17:03 >> If I may go.
17:04 So I'm going to talk about my origin story here.
17:09 Back in 2018 after Cambridge Analytica, we started the Light Collective because we saw
17:15 how Facebook profiles could be scraped to create predictive algorithms and analytics
17:20 from our consumer data about us, to predict our behavior.
17:27 And I was asking a simple question, which is how does that impact our health data when
17:32 we share from vulnerable patient communities on social media?
17:37 That moved on to doing some research in 2022 and a proof of concept around how the Meta Pixel
17:43 and other cross-site trackers or surveillance trackers could be installed in HIPAA-covered
17:48 entities in order for us to target ads to people.
17:52 And as it turns out, these pixels or ad tech tracking technologies are rife within the
17:59 healthcare industry, leaking data at a scale that we haven't yet seen.
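The audits Andrea describes come down to inspecting a page for third-party tracking resources. As a minimal illustrative sketch, not her group's actual methodology, a scanner might parse a page's HTML and flag scripts or beacons loaded from known ad-tech hosts; the domain list here is a small illustrative sample, not a complete blocklist.

```python
# Illustrative sketch: flag third-party tracking scripts and beacons in a
# page's HTML by matching resource hosts against known ad-tech domains.
from html.parser import HTMLParser
from urllib.parse import urlparse

TRACKER_DOMAINS = {
    "connect.facebook.net",      # Meta Pixel loader script
    "www.googletagmanager.com",  # illustrative additional example
}


class TrackerScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        # Trackers typically load via <script src=...>, an <iframe>,
        # or a 1x1 <img> beacon.
        if tag not in ("script", "img", "iframe"):
            return
        src = dict(attrs).get("src") or ""
        host = urlparse(src).netloc.lower()
        if host in TRACKER_DOMAINS:
            self.findings.append((tag, host))


def scan(html):
    """Return (tag, host) pairs for tracker resources found in the page."""
    scanner = TrackerScanner()
    scanner.feed(html)
    return scanner.findings
```

On a hospital page, a hit against a domain like `connect.facebook.net` is exactly the kind of signal the research turned up: patient page visits flowing to an ad-tech third party.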
18:03 There's been a wave of federal and state class action lawsuits and a definitive ban on tracking
18:11 technologies on HIPAA-covered entities' websites.
18:14 The irony here is the American Hospital Association is fighting this ban on trackers, even though
18:20 it's leaking data at a scale that we've never seen.
18:23 And they're fighting it because of the liability.
18:25 So I really think we need to take a different approach when it comes to thinking about health
18:31 systems as a business entity and who we're protecting.
18:37 When there's a fiduciary duty to a company to maximize shareholder value, are we really
18:43 talking about protecting the patients and the physicians who are at a point of care,
18:51 or are we talking about protecting our liability and our business assets?
18:56 I really think that we need to shore up our consumer privacy laws, especially health privacy
19:03 here in the U.S., because if we're leaking these data, and as our other panelists mentioned,
19:11 using those data to train AI, and there's this arms race going on, we are in a very
19:16 difficult place where ransomware attacks are only going to become more prevalent and easier.
19:23 We're nearly at time, so I'll ask you just the lightning round version of that, which
19:26 is, Bob, Stephen, can we change culture sufficiently to address these problems, if I may?
19:31 Boy, can you change culture?
19:33 I mean, even in an organization, it takes like decades to change culture.
19:37 Look, I believe consumers have an individual liberty to do what they want, and to try to
19:42 start to dictate what consumers can do with their information and what they consider to
19:46 be hypersensitive or not sensitive is a really slippery slope.
19:49 I think the best thing that we can do is just making sure that we're protecting people's
19:54 options and where people do have tighter restrictions around the way that they want to manage their
19:57 personal data, that the industry is responsive to that.
20:00 We are not a facilitator of that data being exposed in a way that they don't want, but
20:05 if someone wants to expose their medical record online to somebody, I mean, who am I to say
20:09 that they can't do that?
20:10 So I do think it's tricky.
20:11 I think the industry has to make sure that we can match the privacy needs of each of
20:15 the individual consumers.
20:16 Right on.
20:17 Stephen?
20:18 I would just add a quick one. There need to be alliances; public, private, academia, and individuals
20:24 need to work together to create, over time, a much better posture for bad actors to have
20:31 to go against, but also force them to understand what is the real important data of a company.
20:37 Do I treat my medical record the same as I treat my credit card statement, that I treat
20:41 my social media profile?
20:42 It also forces everyone to say, "These are the crown jewels.
20:46 These are what we protect at all costs.
20:48 Not every data structure is going to be treated the same."
20:51 And then we can put a lot more controls once you have that identity done in your organization
20:56 or your health system.
20:58 And I think that's important, too.
20:59 So you're not protecting everything the same, and you bring together these constituents
21:02 to form a much stronger posture for bad people to have to go against.
21:07 Well said.
21:08 This was a wonderful conversation.
21:09 Please, a round of applause.
21:10 Thank you, all of you.
21:11 [applause]
21:13 [end of transcript]
