Kitecast

Jacqui Kernot: When AI, Security, and Compliance Collide

Tim Freestone and Patrick Spencer Season 2 Episode 32

Jacqui Kernot, the Security Director at Accenture for Australia and New Zealand, boasts over two decades of extensive experience in cybersecurity, spanning multiple industries. Recognized for her authoritative voice on diversity and inclusion alongside cybersecurity risk management, Jacqui is a well-regarded speaker who frequently addresses these pressing issues. She is committed to pushing the boundaries of cybersecurity and focused on integrating cutting-edge AI and technological advancements into the security domain.

In her recent Kitecast appearance, Jacqui illuminated the transformative impact of AI on cybersecurity. She pointed out that although AI technology is still emerging, the foundational steps organizations take today to build robust infrastructures will be pivotal. Jacqui stressed that companies that anticipate future technological needs and begin laying the groundwork for AI integration now will likely lead the industry. This strategic foresight is crucial for fully realizing AI's potential and maintaining a competitive edge in cybersecurity.

A significant portion of Jacqui's discussion centered on the imperative of data sovereignty and stringent management practices. In an era increasingly dominated by large language models and cloud-based technologies, securing and responsibly managing data is paramount. Jacqui advocated for strict data governance frameworks that ensure data is accessible only by authorized personnel, emphasizing that responsible AI deployment is fundamental to future security architectures.

Jacqui also delved deeply into the role of Zero Trust architecture in today’s cybersecurity landscape. She explained that as organizations increasingly migrate to cloud services and face more complex cyber threats, adopting a Zero Trust approach is crucial. This methodology is not only essential for blocking unauthorized access but also vital for building resilient security protocols that can robustly counteract potential breaches.

Looking forward, Jacqui shared insights on the evolving challenges and opportunities within cybersecurity. She highlighted the necessity for security strategies to remain adaptive and vigilant against new threats while also leveraging emerging technologies. The discussion touched on the need for more sophisticated security measures that can effectively safeguard against the evolving landscape of cyber threats, ensuring that organizations can protect their critical assets in an increasingly digital world.

LinkedIn Profile
www.linkedin.com/in/jkernot/

Accenture
www.accenture.com/us-en

Check out video versions of Kitecast episodes at https://www.kiteworks.com/kitecast or on YouTube at https://www.youtube.com/c/KiteworksCGCP.

Patrick Spencer (00:03.342)
Hey everyone, welcome back to another Kitecast episode. I'm Patrick Spencer. You'll notice my co-host Tim Freestone is not with us. He is overseas traveling and hopefully getting some sleep right now; I'm not awake at this hour. That said, we're going in the other direction from where Tim is right now, across the Pacific Ocean to Australia. And we're gonna have a conversation today with Jacqui Kernot. Jacqui, thanks for joining us today.

Jacqui Kernot (00:31.141)
So great to be here, Patrick, and as we discussed earlier, I think Tim, we're all in recovery from RSA, and if Tim's having a sleep, I'm very grateful that he gets a sleep. Definitely need to catch up.

Patrick Spencer (00:39.694)
Yeah, he was not very wise, I think, to plan a trip to Europe the week after RSA, so a two-week trip on top of that. He'll be worn to a frazzle. But there's always Memorial Day for him when he returns. So.

Jacqui Kernot (00:47.141)
Yeah. Wow. Yeah, I,

Jacqui Kernot (00:55.045)
Well yeah that is good. I reckon I really would need to sleep for a week after that. It's been, I'm glad I came back for the weekend.

Patrick Spencer (01:01.902)
I don't do two-week business trips. I think the last one I did, I was told when I got home, you're not doing that again, and I didn't argue. So, a quick introduction for Jacqui. This is going to be a really fascinating conversation. Let me start over; we can fix that in the show, obviously.

Jacqui Kernot (01:10.341)
Yeah, very sensible at home, I think.

Patrick Spencer (01:29.774)
we have a real treat today. And Jacqui, as I mentioned, joins us from Australia. She currently serves as the security director at Accenture for Australia and New Zealand. She's been in the role for not quite two years, I think a year and nine months or something along those lines. She has more than two decades of experience across a wide range of industries, and speaks frequently on diversity and inclusion, cybersecurity risk, and other topics. And as she just mentioned, she

just returned from RSA over in the Bay Area, so she hopefully is over her jet lag by now. And maybe that's a great segue, Jacqui. I was not able to attend; I had to stay home and help run marketing operations while Tim and others had fun at the event. But you were there. What are, let me say, two or three takeaways that you might have, based on what you heard or saw at the event?

Jacqui Kernot (02:25.575)
So, you know, as we talked about, Patrick, I didn't see you, but I did pet the puppies at the stand. That was very cool, the Kiteworks stand. So, awesome to do. And look, I think one of the things that was interesting to me about RSA was, you know, obviously everyone was talking about AI, and what became really clear is that AI hasn't fully landed yet, right? We're just...

kind of starting to get a taste, and it's already what everybody's talking about. But it's the people who are starting to think two, three years into the future, which is really difficult to predict, who are starting to build the infrastructure and the capacity to really make use of what AI can do. They're the people who are going to kind of win, right? And there are some companies that are really forward-thinking around infrastructure. And you know, what I,

It's funny, we think so much now. In fact, I was talking to one of my grads the other day and I said something about the OSI model and he said, what's that? And I said, you've just come out of uni with a master's and a bachelor of computer science, and he reckons they don't teach that anymore. I reckon he might have been sick on the day, but apparently they reckon they don't teach it, which is quite hilarious. But I do think in today's world, we spend a lot of time,

Jacqui Kernot (03:45.8)
you know, maybe not security practitioners, but we spend a lot of time in that top layer, you know, in the app layer. We spend a lot of time thinking about social media and all the threats there, in that application layer. But, you know, you can't get past the transport layer; it's still really important. And I think as we start to put everything into LLMs and into cloud workloads, to be able to run those LLMs better and access data and transfer it better, we start to think about

how we don't mix data, how we get, you know, whether it's data sovereignty or data control, how we manage all that stuff. You know, we're back to thinking about the transport layer, right? So we're really back at the bottom of the stack. Like, how do countries do something if undersea cables are disconnected and the processing capacity of the AI is mainly in the US, for example? What happens there? And so I think the people who are seeing the hype, not necessarily feeding into it,

doing a little bit of stuff now, but who are really starting to think about how we build infrastructure that will take us into a new era, really, for humanity, frankly, are the people that are going to kind of leap ahead pretty quickly, because what's becoming clear is that very quickly, especially when we start talking about exploding malware, et cetera, we're going to actually run out of capacity and tokens.

Patrick Spencer (05:08.334)
Yeah, yeah, interesting. Well, before we started the actual recording, you and I had an interesting side conversation where we discussed the fact that, you know, we really don't know how to plan for the future. I cited an example of one or two of our marketing applications: we're on a second or third LLM iteration, and we may change again in another three months to something completely different.

So, from a security and governance standpoint, how in the world do you plan for that? Or is there a way to plan for that? And that's probably one reason, you know, there's so much uncertainty out there.

Jacqui Kernot (05:49.801)
Absolutely. And look, I think we were talking earlier about our chief AI officer, Lan Guan, who's just, you know, what she doesn't know about AI, I think we can safely forget. It was really interesting to be speaking to her a couple of months ago when we launched our GenAI studio here in Australia. And she sort of said, you know, if you'd asked me nine months ago which LLMs we'd be using and how we'd be taking advantage of them and what we'd be leveraging, I would have told you something completely different

to what I'm telling you now. So the technology, in three- to six-month blocks, is literally changing that quickly. And it's very difficult to predict what to do. And I think that's why, in terms of your question, it's really hard to answer, right? And it's why we've got to be so robust about responsible AI deployment. We use kind of five pillars for every AI deployment. But, you know, so many

clients are really still at a position of not really understanding where all their data is and what it's doing. And I think, and I hope, we are finally getting to the stage of understanding how dangerous data can be and that you just shouldn't be collecting it for no reason. And if you do collect it, you really need to think about how you make sure absolutely no one but the right people have access to it, right? And I think, unfortunately, despite the conversations we've been having for a decade, probably about zero trust,

We're still probably not there. So yeah, the state of data readiness and governance kind of, you know, worries me a little bit. But we're in the age of AI. Yeah. Yeah.

Patrick Spencer (07:28.686)
Well, for good reason. You probably saw the IBM report; everyone releases a report. We actually hang on to ours; we do ours like a month after the RSA event, because there's so much darn noise that no one pays attention to it. IBM came out with one, and I wrote down a data point here: 24% of organizations in this big survey they did, I think several thousand organizations, say their AI projects are secure. So that means what? Over 75%

Jacqui Kernot (07:33.738)
Mmm. Yeah.

Patrick Spencer (07:57.742)
aren't secured. So to your point, there's a lot of risk out there. I mean, you talk to a lot of customers; you guys do a lot of audits and assessments at the same time. Is it just employees being stupid, lack of process? And what kinds of data are being exposed, or is it all different types of data?

Jacqui Kernot (07:59.369)
Yeah.

Jacqui Kernot (08:18.987)
Look, you've got everything but the kitchen sink, so yes is the answer to all your questions. I think the big thing is that most people still don't know what all their assets are in a CMDB. Like, if I get to an organization that has more than 70% of its assets listed in the CMDB, I'm like, wow, you are doing well, right? So, you know, if you don't even know what you've got kicking around on the network, it's very difficult to make sure that whatever those devices are on the network, they don't have data in them

that they shouldn't have, you know. We went through that era where we were like, collect all the things, you know, collect all the names, collect all the numbers, collect all the stuff, put it somewhere; we'll be able to monetize data at some point. So we did this big process of collecting it. And, you know, we need to do that kind of Marie Kondo process: if it's not sparking joy now, it probably never will, so get rid of it.

Jacqui Kernot (09:16.236)
but also kind of understanding what you have and who's got access to it. You know, I think the big thing that we still haven't solved around Zero Trust is identity. And I think that's why a lot of clients are really going into big identity programs. And, you know, whether we'll ever get a reasonable-sized organization's data into a shape that we can reliably, you know, run an LLM across

with 100% confidence? I'm doubtful that anybody will ever get to Nirvana. Maybe we can ask the AI to do that once it's trained, but yeah, I don't think we're gonna do it any other way. But I do think that's why a lot of clients are starting to think, right, we really need to start getting identity right. And actually, interestingly, that's some of the interesting stuff that we see coming out from some of the identity vendors. So some of the tools, and some of the ones that we've been working on

Patrick Spencer (09:51.95)
Yeah.

Jacqui Kernot (10:11.916)
with the vendors, are actually all geared around accelerating identity projects. Because, you know, I think it's not ideal if you've got stuff that you probably don't need and shouldn't be keeping. But at least if no one has access to it, including the AI that's built on the same governance rules, you know, you've got a level of protection. That doesn't necessarily make me feel comfortable, but at least it's better than nothing. And of course, there's the other thing, which is just

really making sure you're the boss of your LLM, not the other way around. As you say, I think you used the term "hermetically sealed container" that the LLMs access. I think that's a really good term. Making sure that you've boxed it in, and that you at least run some solid data governance processes against that set of data, is critically important. And really, even if you're not going to deploy

AI on anything for a little while, it's, you know, get your data ready, get your data ready, is what we're telling people all the time. Unfortunately, we then get a lot of responses back, which is, what do you mean by that? But at least that's a good place to start.

Patrick Spencer (11:24.302)
Now, one of the items: we did a bunch of work around the NSA here in the States releasing "Advancing Zero Trust." Once again, zero trust is talked about everywhere, and how you define it varies, obviously. But for advancing zero trust across the data layer, they have seven different data pillars, and one of them is data tagging and classification,

which we do some work in. Actually, we went through all seven of those data pillars, and Kiteworks enables organizations to align with each of them, which we're proud of. We just published something on it, actually. But if you look at data classification and tagging, back to identity, I assume that's an important piece of the puzzle: knowing what data you have, where it's actually at, and who has access to it.

Jacqui Kernot (12:14.766)
100%. So, you know, the five knows, if you like: who has access to it, where's your data, is it encrypted, is it encrypted in motion and at rest. Super important. It's so bad to think that that feels like Nirvana, you know, because it should just be the base level. But unfortunately, you know, I don't think many people could put their hand on heart and say, I know where all my data is and who has access to it. I think

there are some great tools coming out now that can help you start to map and classify some of that data for you, and security AI is one that we use. And I think because of that exact problem space, it's going to be a lot more popular than it otherwise would be, because actually no one, you know, has effectively identified and mapped all their data. And data mapping on an organisational scale is a huge exercise, right? It's a...

Patrick Spencer (13:12.91)
Yeah.

Jacqui Kernot (13:14.542)
It's a compute-level exercise, right? I don't know how many security people or data people you'd need to throw at that problem to get it done. And look, the other problem is, of course, you know, we're back to the same old catch-cries: we've got no people. There are not enough data and AI people; there are not enough cybersecurity people. So even if you were going to throw people at the problem, where are the people?

Patrick Spencer (13:35.662)
Yeah, you don't have them. Or you may have people, but they don't have the right skill sets, the people you're looking for. There's a finite number. Yeah, yeah. We talked about the...

Jacqui Kernot (13:41.518)
100%.

So being able to automate is good, I think, if you can find a way to automate some of that. I don't actually see any other way to do it. Like, we've all stored so much data on the promise of data, the promise we could somehow monetize it. I don't know who came up with that, but anyway, I hope they feel sorry now.

Patrick Spencer (14:04.078)
Yeah, well, there's no way to tackle these digital problems without automation. Whenever someone starts talking about manual, it's just facepalm. No, please don't. Please don't go there, because it will never achieve the objective, as you well know. Now, we talked about the risk associated with AI, and it's certainly significant. Did you hear much talk about

Jacqui Kernot (14:15.023)
Yeah. Yes, exactly.

Patrick Spencer (14:31.374)
using AI for cybersecurity, for good? Any cool things you heard about? You probably saw the Darktrace report that came out at about the same time as RSA. I think it was 96%, which is no big surprise: 96% of people think that AI will advance cybersecurity in the coming year. I don't know where the other 4% are.

Jacqui Kernot (14:53.134)
Did not respond to a survey. Didn't understand the question.

Patrick Spencer (14:56.878)
Yeah, I guess, I don't know.

Jacqui Kernot (15:01.68)
So yeah, look, I think one of the things is that at a conference like RSA, obviously you've got a lot of people trying to get their products bought, which I strongly support. We've all got to earn a living, and that's why we all go: to hear about all the new shiny things. I do think cybersecurity people tend to be a little bit obsessed with new shiny things, and I understand the appeal, right? Because I'm one too. But I do think one of the things we need to think about is how we consolidate and

cut down, you know, how we declutter. Like I said, that kind of, you know, The Life-Changing Magic of Tidying Up, I think it is, isn't it, by Marie Kondo. I don't know if you've seen that Netflix special where she goes around and cleans people's houses. Apparently her house isn't that tidy now either, so maybe it's the same. Yeah, but yeah, exactly. So I do think we need to do a lot more cleaning up and shutting down than buying new stuff to do new stuff.

Patrick Spencer (15:47.278)
Yeah, it's grown since she made more money. She bought more stuff.

Jacqui Kernot (16:01.616)
However, you know, I think obviously at a conference like RSA, you're going to hear about how amazing it is and how it's going to transform everything. And I actually spent a bit of time amusing myself going around stands labeled "AI-enabled," asking, what do you mean by that? What are you actually doing? What's the AI doing in it? And actually, fewer than 50% of the people could answer the question effectively. So I think...

Patrick Spencer (16:21.87)
Yeah, they're about to make that bet.

Jacqui Kernot (16:24.849)
Yeah, I don't know whether they just had terrible people on the stands or whether their product's not really doing much actual AI. But I do think there are some really promising technologies starting to come out. And as I said, what I think is critically important is actually the people who are starting to think about the future power of it and the capacity. And we spoke to Google, actually, and we had some really good discussions with those guys. And

the way that they are thinking about planning and capacity. One of the interesting things that struck me: we're seeing a lot of carriers and the like looking at undersea cables, you know, how they have resilience and backup, because they don't want to be disconnected from some of the processing capacity. As they start to plan data centers, they've started to run, you know, these

AI-enabled data centers. And Google actually, after the Aurora attack in 2009, rebuilt its hardware from the ground up, so they run custom hardware, right, to run these data centers. That's really forward-thinking in terms of understanding how to build this. And they said they, you know, built a whole data center, which sounded terribly wasteful but is probably true, and then realized that the physical power it needs is, like,

an exponential power requirement relative to the data center capacity, once you start, you know, really leveraging the AI and running huge numbers of tokens. And so with this data center they basically had to go, right, we just can't actually get the power to it, it's pretty much useless, we're going to rebuild it, start again, and run more power to it. So they're not sure what they're going to do with that one, but the new ones are built with this incredible power capacity, right? So the,

Patrick Spencer (18:10.094)
Yeah.

Jacqui Kernot (18:21.17)
Just the physical limits of this stuff are immense. And I haven't seen many vendors actually thinking that way, like, how do we plan for the physical limits that we're going to need to push? And look, maybe quantum will assist with that, right? Because it'll give us that processing capacity. But, you know, that's also extremely power-intensive.

Patrick Spencer (18:43.822)
That was going to be one of my questions. I mean, you saw the latest ChatGPT release, which I downloaded and started using the same day. And it's what, three, four times faster than the GPT-4 release? And probably the next one in three months will be three or four times faster. So perhaps those, along with quantum, will help us overcome some of those power challenges. Those have not been figured out yet, to my knowledge anyway. I haven't seen it.

Jacqui Kernot (19:14.835)
No, 100%. Yeah, and I think that's the problem: we haven't really started to think about the physical limits of what we're going to be able to do with it, right? So, you know, if you take reverse malware engineering or malware sandboxing, for example, and you're trying to explode some malware, you don't want to have to break it into parts and explode it in different sandboxes because you haven't got the capacity, because that won't give you a true reflection of what it's actually doing in the network. So to be able to kind of explode the full...

you know, full chunk of malware in the same space is so much more powerful, right? And it's going to be much more effective at actually reverse engineering it. So I think when you're starting to plan for capacity, you know, say you've got five bits of malware, you want to explode it all at once, right? How do you actually get that done? So hopefully quantum will bring a new layer to that

kind of physical requirement. As you say, I don't know what the physical capacity requirement of the new ChatGPT is, but even then, what happens if that gets cut off and we're already relying on it? You start to think about those things, maybe by force majeure, right?

Patrick Spencer (20:25.518)
Yeah.

Well, I can't do my job in marketing today, I hate to say, if ChatGPT and Claude and some of the others, you know, Perplexity, go down. We can't do our jobs; I have to go get a cup of coffee and wait for it to come back up again. That's where we're at today. So, you know, we need to figure out those challenges, obviously. And, you know, I listened to a podcast last night, and I think they're spot on where

Jacqui Kernot (20:33.587)
Mmm.

Jacqui Kernot (20:42.227)
Yeah.

Patrick Spencer (20:56.302)
the smaller organizations are so much more agile; they're able to embrace AI much faster. And we still have way too many, and they do a lot of consultation in the marketing space with the Fortune 500 companies, there are so many of those that have not pivoted, that are still doing things the way they were doing them, you know, pre-November 29th or whatever the date was when, you know, ChatGPT 3.5, the advanced version, came on the market. So.

It will be interesting to see what transpires there. There was one other transition I wanted to make. We could talk about AI forever, obviously; there's plenty to talk about.

Jacqui Kernot (21:32.66)
Yeah, no, totally. And look, to be honest, I think enough people are talking about AI. There's probably other things to discuss.

Patrick Spencer (21:37.966)
Yeah, we talk about AI on about every podcast. I think it's inescapable, just because of how ubiquitous it is today and it's top of mind for all the thought leaders we talk to. If you looked at the Verizon report that came out, I wasn't surprised, because we watch this. In fact, I did a webinar with our chief product officer and Charles, the CTO over at

Mandiant. Charles Carmichael over at Mandiant, I don't get it right yet. Charles Carmakal over at Mandiant. Dyslexia, I guess, or something. On the MFT breaches that happened last year. They were all over the Verizon Data Breach Investigations Report, as you probably saw. And it's supply chain related. And we see more and more, particularly the nation-state attackers, but also some of the sophisticated cybercriminal groups,

Jacqui Kernot (22:08.564)
Yeah, yeah. I knew who you meant.

Patrick Spencer (22:34.414)
figuring out that one-offs don't get them that far, but if they hit one of these software supply chains, suddenly they can attack hundreds or thousands of companies and access millions and millions of actual records. That happened with the MOVEit breach, and even with the GoAnywhere breach; it wasn't quite as large, but it was quite substantial. The Verizon report showed that 15% of all the data breaches last year were related to the supply chain,

a 68% increase. So what's your perspective on that? And is that threat also real in Australia and New Zealand? I assume it's a global threat; it's going to be the same here as it is in Australia and New Zealand as it is in the UK.

Jacqui Kernot (23:16.693)
Absolutely, 100%. And look, I think here our regulators, you know, the financial regulators, have been really focused on supply chain risk, and we've been seeing more government regulation come out, as well as action from regulatory bodies like APRA. So APRA's CPS 230 and CPS 234 standards here are all about supply chain risk and kind of understanding, you know, what they call third-party risk, but largely supply chain risk. I think

we're seeing a bit of focus on it, but I'm certainly still seeing low maturity levels across, you know, non-financial-services companies. And I think that's problematic, right? Because, you know, one of the dynamics that you alluded to with supply chain and some of the other stuff we were talking about is that we are all becoming so interconnected. Even separating LLMs can be very difficult, right, depending on what data you're using in them. So,

you know, sometimes we can be our own risk in the supply chain if we're running that. So I think there needs to be a significant amount of focus on supply chain risk, and really on the amount of money that various companies are spending on it. You know, I was talking to the minister a while back in the consultation process, and over here we have, I don't know if you have it in America, but we have like a Heart Foundation tick for things that

Jacqui Kernot (24:43.446)
are, you know, low cholesterol and good for your heart. So food products have, you know, the Heart Foundation tick. And we actually just need a, you know, good, better, best kind of t-shirt sizing for suppliers, and everybody just needs to fill that questionnaire out once, and it needs to be comprehensive. The fact is that we're all designing separate programs, and there are no standardized supply chain management and risk quantification processes. You know, we're actually seeing, and this is another RSA thing and

Patrick Spencer (24:46.03)
Hmm.

Jacqui Kernot (25:13.207)
another dynamic, we're starting to see the rise of automated risk quantification tooling, which is really good. Because I think with the amount of risk that sits in a supply chain versus the amount of risk that sits in the business, people sort of seem to rate it as less, but depending on how interconnected those systems are, as we start to automate more with AI, we're exposing a lot more risk.

You know, things will be faster, services will be faster, but the risk increases exponentially when you add a hundred suppliers to a data set that previously perhaps only five had access to, in order to speed up transactions, for example. But, you know, it doesn't just increase the risk linearly. When you've got a hundred suppliers, it's not just, you know, 95% more risk; it's not like that at all, the risk compounds.

And I think really there has been a very long history. You know, I've had a military past, and damaging supply chains has always been a great strategy, right, from the 15th century. And the situation's no different really if...

if it's too hard to get into the original organization, even if you have robust defences, utilizing the supply chain is so straightforward, right? Especially because you can usually tell from the products and services offered what that supply chain facilitates, you know, just how interconnected those systems are. That's easy for an attacker to work out, you know, from the internet.

Patrick Spencer (26:57.006)
Yeah, well, think about Hannibal. He didn't have a supply chain because he couldn't extend it over the Alps. He got over into Italy toward Rome and, you know, muddled around there for three or four years, but without a supply chain he finally ran out of gas and had to go back to Carthage, and the rest is history.

Jacqui Kernot (27:03.928)
Yeah.

Jacqui Kernot (27:15.32)
Yeah, well, 100% right. And, you know, I always think these lessons are very old, but we need to learn them, right? And so this idea that your supply chain is, you know, not to worry, that it's less than the risk to your organization: I mean, that's the way people will get in. If you defend yourself really effectively, people will go, great, we'll get in through the supply chain. And even now, you know,

third-party contractors, et cetera, are the easiest way to get into an organization, right? So managing those people physically, managing what they do on the network, et cetera, because they are part of the organization, but not in the same way that a full-time employee is. Those things are really obvious.

Patrick Spencer (28:03.406)
The Verizon report found 87% of all data breaches are tied back to, guess what, end users. It would be interesting, and hopefully they add it to next year's report, to know what percentage of those end users are actually outside the organization: third-party contractors or partners, or even further out at the fourth or fifth level of the supply chain. There have been a couple of interesting reports over the past year on that, but...

Jacqui Kernot (28:09.464)
Yeah.

Patrick Spencer (28:30.446)
The weakest link, to your point, is the actual human.

Jacqui Kernot (28:34.809)
100%. And look, one of my dear friends, Jinan Budge at Forrester, has been really working on human risk management, and her work is amazing, the depth to which she goes. But it's amazing how little we focus on human risk management, on people who are tired, you know, when post-COVID everything's accelerating.

Things are pretty hard, the market's hard, economic conditions aren't great. And I'm not ashamed to say, a shout out to Kris Burkhardt, our CISO, who designs great phishing programs with his team. I did get phished yesterday by an internal program, because they sent me a thing about some rewards I need to spend before the end of the quarter. And I thought, someone's given me some rewards. No, they hadn't. But I'm back from RSA and I'm tired. And it's not that I'm...

Patrick Spencer (29:14.318)
I'm just kidding.

Jacqui Kernot (29:31.674)
not educated, but I'm trying to chew through all the stuff, catch up, I'm tired, I've got a bit of jet lag. So think about the amount of focus we need on that human risk management: how we manage burnout, tiredness, the inability to really focus on one thing. It's not that I need to do more security training; I'd argue that's not the problem.

Patrick Spencer (29:40.878)
That's a great one.

Jacqui Kernot (30:00.538)
And all the indicators were there. It clearly said external, blah, blah, blah, and our rewards thing doesn't come from an external source. But when you're trying to run through things quickly, it can be tricky even for the most educated user. So I think about how we focus on risk quantification in that human element, how we don't let people get too tired, how we don't give them difficult,

complicated things to do without support. Because when a social engineer calls them, they think, this is another of those difficult things I'm supposed to do without support, that someone's asking me to do from somewhere I don't know. But this time it's actually someone socially engineering their credentials, right? We make life complicated, in the era that we're in, for humans

in the space, and they're under a lot of pressure, and they're worried about things like economic conditions. So how do we support people so that they make better choices at work? I don't think we think about that type of thing enough, and it's something that Jinan's research is really bringing to the fore. I hope we start to think about it more, because, you know, 83% of breaches come through human vectors. I'm surprised it's not higher, to be honest.

We have got some really good security products and tooling in place, and humans are a lot easier to get access to than those tools are.

Patrick Spencer (31:34.51)
Now, there's that social engineering element; you typically can't socially engineer responses out of a machine, to your point. So that's a great point. On that topic, and you have a lot of experience in this space as well: in Australia you've had IRAP, and it usually starts at the government level. Here we have CMMC; you've heard a lot about that over the past year, and it's gonna start to have some teeth. You have certification programs like,

Jacqui Kernot (31:41.755)
Yeah, well exactly.

Patrick Spencer (32:03.374)
SOC 2, and then you have FedRAMP Moderate, FedRAMP High, and so forth. Can we begin to implement security protocols and standards? You had the CISA pledge, which is very high level; that was signed at the RSA event, and we were one of the pioneers. But to get more granular than that, you start to dive into 110 controls or 300 controls or whatever might be the case.

Will that help us to mitigate some of those risks in the supply chain?

Jacqui Kernot (32:35.612)
Look, I think regulation always helps. I'm a huge fan of regulation, because the reality is there are a bazillion things to focus on, and until we get regulated, we often don't think about things, sometimes even when we know they're key risks, right? So regulation really helps to solidify that in people's minds and bring that attention and focus. I think that's important.

And it also brings top-down focus, which is super important as well. I do think that regulation and frameworks have a limit, though, largely because the struggle for compliance often creates people who are more focused on compliance than on the problem, or a situation where you're so focused on compliance that you can't focus on the problem. And certainly, when we look at crisis response and resilience here...

We've added a lot of regulation in a short time. Arguably, in Australia we were under-regulated, and it's great that that's happened quickly. But it was really interesting to hear, in some of the large breaches we've had, that because of the overlapping legislation it's quite difficult to work out how to comply, or how to actually run the business, in the event of a breach. At one of those organisations, I was talking to the lawyers who were helping them

manage all the ASIC regulatory halts and trading halts here. They would make a mandatory notification to the Australian Cyber Security Centre, which of course is a good thing to do, and there's a time frame on that. And then half an hour later, or 15 minutes later, they would get a call from ASIC: you've been on a trading halt for too long, start trading again. But we've just sent the ACSC notification; this is going to kill our share price; can we let it rest for 45 minutes? And ASIC's like, no.

You can't. And so now they're looking at whether those mandatory notifications should perhaps not be made public, so that you can encourage people to really report. But, you know, a CEO who's already in the middle of a massive crisis is dealing with the press, and he's got lawyers saying to him, ASIC's telling us to start trading, and he's got the CISO saying, I've got to report to the

Jacqui Kernot (34:57.533)
to the ACSC, I've got mandatory reporting, and there might have been other mandatory reporting obligations. There's something like 45 different regulatory reports you've got to make within the first 72 hours of a crisis, some CISOs were telling me. I won't stand behind that number, because I don't know that it's been quantified, but it's in that vicinity. That is a lot of regulatory reporting, and you've got all sorts of different people who need to do it. And the poor CEO at the top is going, I've just killed the share price of my company. Like,

you know, I'm going to lose my job. So he's got all that existential angst, plus the board, hopefully not yelling but probably not happy, whom he's got to brief and manage. That is a lot for any group of humans, or one human being, to focus on and manage. So I think about how we simplify reporting and regulatory frameworks. And I think...

The way that we think about security, and really the way we operate in corporate life anyway, is quite individual. In Australia we say cyber is a team sport, and there's a lot of individual collaboration between security people. But what's really becoming clear is that we need to start thinking about how we manage a lot of this regulatory framework stuff together, how we share data on

suppliers and make it easy to digest, so that not everybody has to run a multi-million-dollar third-party supply chain program. How we establish a professional framework and certification for people is really important too. I actually had a funny conversation with one of my mates who heads up Army Intelligence here now. I was talking to him about doing some work, and he said something, and I said,

I'll have to get my clearance back; I don't have a clearance at the moment. And he was horrified. He goes, what do you mean? How do you have any assurance? How can you be doing what you do with no clearance? And I'm like, well, you only need a clearance for government work, right? And of course I made a bit of a joke of it. I said, well, you've got a lot of cleared people who seem to still be getting pitched by foreign intelligence services, mate, so I'm not sure a clearance is actually going to fix the problem, which he agreed might be true. But, you know, it is

Patrick Spencer (37:08.142)
I'm sorry.

Jacqui Kernot (37:15.903)
It is a really valid point, right? What's our certification process for people, unless you're actually sitting under a clearance framework or one of those regulatory regimes? And we see it all the time. There are so many conversations about things like TikTok and Temu here, and internally we're having one about whether we should ban it. Every time I get on a call with the Australian team, I'm like, just ban it, just ban it, stop. And,

Patrick Spencer (37:42.958)
Good one.

Jacqui Kernot (37:44.287)
Yeah, just move on. And they're like, well, Jacqui, there's some nuance there. And I'm like, no, there's not, get it off. It's an Accenture-owned device; just no. And luckily that idea seems to be carrying now. But look at all the ways that people are able to be socially engineered. I was talking to a guy who's doing a lot of operations out of Papua New Guinea and the Solomons at the moment, and he was saying people are handing out free phones, you know,

Patrick Spencer (38:13.102)
Hmm.

Jacqui Kernot (38:14.27)
that go back to Huawei devices and that sort of thing. So there's a lot of collection attempts going on, right? And take someone who, for example, is a vCISO. With this war for talent we're having, interim CISOs and people are contracting into businesses. If you could get one of their devices, you could collect a lot of information, right? If you could

collect on them or socially engineer them somehow. So I think we're going to have to see some kind of broader framework around people, because the level of collection activity going on globally is pretty hectic, certainly. Sorry, sorry to interrupt. But our...

Patrick Spencer (39:01.39)
Yeah, you don't hear much being talked about on that front, to your point. That's a risk few are evaluating.

Jacqui Kernot (39:12.478)
Our Director-General of ASIO here, Mike Burgess, did a National Press Club speech and has been talking about it quite a bit. He's getting really frustrated, and I understand his concern, with people putting "I've got an NV2 clearance" on LinkedIn. It's like, mate, get that off. Please don't make it easy. So you've got this incompatibility with people who...

Patrick Spencer (39:34.574)
Yeah.

Jacqui Kernot (39:40.446)
who are trying to be employable, et cetera, but who are then exposing themselves to collectors, et cetera. It's really tough. And then you've got your kids, who all want to download these apps. I had a funny one here, and I actually did do this. My daughter came home in grade four and said, you know, Mum, I...

Patrick Spencer (39:49.07)
Yeah.

Interesting.

Jacqui Kernot (40:04.414)
want to have TikTok, all my friends have it. So I rang the school and said, I'm going to do a cybersecurity talk for all the parents at the school. And we had a chat about the evils of TikTok, and I came home and said, there you go, honey, I fixed it. And she's like, so I can have TikTok? I said, no, none of your friends have TikTok now either.

Patrick Spencer (40:19.502)
It's all been banned now.

Jacqui Kernot (40:22.398)
She wasn't very happy with me, but it got the job done. Not everybody has that option, though, right? So...

Patrick Spencer (40:28.686)
So, we're almost out of time. One last question for you here, and I'll combine three of the questions I had put together beforehand into one. When it comes to the skills issue, you mentioned going to the school to present and so forth. How are we gonna solve that? Because we do have a lot of cybersecurity professionals out there, but they may not have the right skillsets that organizations need today for the gaps they're trying to fill. Are we gonna fill that with,

you know, mentoring and coaching a new generation? You've done a bunch of work around diversity and inclusion; is that another area? Is it gonna be in the universities? How are we gonna fill those gaps from a security skills standpoint?

Jacqui Kernot (41:12.734)
Look, Patrick, I think I could write a book on this topic. Maybe I should write a book on this topic. But in fact, there are books written on it already. Jane Frankland, a good friend of mine, has written a book called IN Security, which is a really good book on exactly this topic; I recommend you read it. But I do think our industry has to become more inclusive. I don't think we're great at that. It's been...

very male-dominated. And the other thing that struck me was something a young graduate said to me when I was working at Telstra. She did telecommunications engineering, and she said, you know, there are no sexy engineers in movies. They're the weird guys in the corner, right? Maybe The Girl with the Dragon Tattoo created some noise, but,

But I do think about how people are depicted in the media and in real life, and I think we need to really work on making career roles more visible. Because until we do... There's been a bit of a funny thing here: I always laugh that we've made cybersecurity into a dark art, which is really good because it helps us all get paid more, so I'm totally in support of that. But

we might've pushed it a little bit too far, right? It's actually not an arcane art. There are lots of different roles in cybersecurity, some of which can be filled by reasonably non-technical people. And, frankly, the training I had at uni: I did first-year computer science, and my degree is actually in biochemistry, yet I seem to have muddled through quite well. There wasn't a degree in cybersecurity back then, because I'm carbon-dating myself now, but,

Patrick Spencer (42:46.862)
Exactly.

Jacqui Kernot (43:05.95)
But, you know, what I learned then is largely immaterial now. I can't think of much that I would have learned that still applies.

Patrick Spencer (43:14.67)
For those getting cybersecurity degrees today, what they're learning will probably be immaterial in three years as well, because things change.

Jacqui Kernot (43:21.63)
Exactly. A hundred percent. And I really wonder some days whether universities are the right place to teach it, just because of the model they apply, right? That three-year residential, on-site degree. The other thing with university is that it fundamentally reapplies privilege. And I have a strong problem with that, because we really need people who can think differently, who can think

like attackers think, and attackers are not usually rich white people in first-world countries, okay? So you actually don't want only those guys doing resilience and defense, because maybe they can learn some of the tactics, techniques, and procedures, but can they really put themselves into the attacker's headspace? Why would the attacker do that? What would they go after? You need people who think like those people.

Patrick Spencer (43:57.038)
Precisely, yeah.

Patrick Spencer (44:15.054)
They work.

Jacqui Kernot (44:20.228)
And these are all concepts we've known from military models for ages, right? When you go to a conflict in another country, the first thing you do is pick up some people who can speak the language and can explain the customs and how people do things. And so if we don't start to attract a broad range of people, our defenses will be ineffective and we'll be constantly surprised by the new techniques and tactics of attackers. And

so I think it's kind of existential that we really start finding a way to reduce the privilege entry requirement into cybersecurity. We need to think of ways to do that structurally, and ways to get free training to people and help them understand the field. And we also need to role-model and make visible different kinds of people in cyber.

You know, most young women, for example, don't necessarily want to go and join a big group of men. This young graduate was telling me she was the only telecommunications engineer among about a hundred blokes. Her parents went to the opening event and said, honey, are you sure this is what you want to do? And she was having doubts herself, right, when she saw there were no other women in the room.

Patrick Spencer (45:39.342)
It's like

Patrick Spencer (45:43.054)
Male-dominated group, right.

Jacqui Kernot (45:45.509)
And certainly for me, I'm used to that. I tell a funny story about how it was two years before I went to a meeting with another woman in it. And when I did, I got such a surprise that I thought I was in the wrong room. I actually said, sorry, I think I've got the wrong room. And she said, no, you're here for the authentication meeting. I said, yeah. And she said, great, I'm Lisa. We're still friends today, and she's still in cybersecurity as well. But two years without seeing anyone who looks like you is a very long time.

Patrick Spencer (46:14.989)
Yeah, yeah. We had hoped we would see some change. Back when I was running a publication at Symantec, I would interview a lot of women CISOs for it. But that was a rarity; I think we were just fortunate that they were the thought leaders in the space, so those were the ones I interviewed. We thought that, because of that, we would see more progress, and a few years ago it almost seemed to go backwards. I don't know if we've made any progress since COVID.

Jacqui Kernot (46:16.612)
So.

Patrick Spencer (46:45.006)
We certainly haven't seen a whole lot of progress over the past 10 years. So lots of work to do. Lots of work to do.

Jacqui Kernot (46:49.926)
Yeah, definitely. And I think making those people a lot more visible is really critical, right? A critical part of what we have to do, because we've got to amplify the message that someone who looks like you can do this. You can't be what you can't see, as they say, and I think that's a really important tenet to remember. So it's not necessarily that we're giving...

Patrick Spencer (46:58.702)
helps.

Jacqui Kernot (47:16.422)
you know, women, say, unfair exposure; we need to level things up so that we get the right diversity, and that applies to all diverse minorities. It's actually funny: at EY, we had a whole program to get neurodiverse graduates on board, and I got asked to interview. We had this interview set up, and everyone joined, and there was a bit of a cast of thousands, and I thought, this is a bit weird. And

then I'm like, where's the candidate? And they said, no, this isn't the interview; this is us explaining that you have to do a neurodiverse interview. And I laughed and said, I work in cybersecurity; I have interviewed quite a few neurodiverse people. Are you kidding? I think I'll be okay. It was great that they were doing that training, but it was a bit of an eye-roll for me. But how many leaders are actually out as neurodiverse and feel comfortable saying, I'm neurodiverse? And

you know, despite the fact that it's a bit of a hidden secret, we've got a huge chunk of our industry, I think far more than the roughly 20% of the general population, who are probably neurodiverse, in cybersecurity and probably in broader computer science and IT generally. It's kind of a running joke, and yet we don't necessarily have many leaders calling it out and saying, this is what I'm role-modeling. And I think that happens across all diverse minorities.

Patrick Spencer (48:37.902)
Thank you.

Jacqui Kernot (48:41.191)
And we need to start getting structurally better at that, because it's not about our individual journeys. It's about making sure that people see a role model, someone they can look at and connect with and feel that could be them one day. And if they don't see that, they make a different decision when they're in high school. I think we can't afford for that to happen. We can't afford to lose those bright people who

Patrick Spencer (48:55.374)
That could be me in a few years. Yeah, that's a good point.

Jacqui Kernot (49:10.983)
might think about a career in cybersecurity, or something that would facilitate positive change in our industry, but decide to do something else because there's just no one who looks like them and they can't see how they could ever be that. How we solve that is an interesting question, but I think every company really needs to commit to a level of structural, programmatic change. And it's funny because, on my leadership team, for example,

Patrick Spencer (49:33.998)
Mm -hmm.

Jacqui Kernot (49:39.624)
you know, it is diverse: I've got one man, and the others are women. I kind of love that. But I'm like, you're my diversity pick. And what's interesting about that is the reaction from the market: the number of people who think it's quite strange, who comment on it. And I'm like, actually, guys, this isn't...

This is just the reverse of what happens in most teams, where it's simply normalized, right? So it's really interesting to see the market reaction. And look, I haven't deliberately picked X or Y; I just pick whoever is the best candidate. And I find a lot of women have had to work a lot harder to get where they are, so the bang for buck is pretty impressive, actually. But it's really interesting to see that, even in 2024, what people think about that

is, you know, really shocking and surprising. A three-quarters-female leadership team in 2024 shouldn't be that horrifying, really. So, yeah, I really hope we get to a day when these things are pretty normal.

Patrick Spencer (50:51.598)
Yeah, yeah. And it can be in certain areas, but it isn't in cybersecurity today, as you well know. Well, we could go on for another hour or two talking about different topics. In fact, of the topics I wrote down that we could cover, I think we got to about half. So we're going to do this again, Jacqui. I really appreciate your time. I know our audience is going to find this podcast quite interesting. Tim missed out on a great episode; you'll have to listen to it now.

Jacqui Kernot (51:21.449)
Thanks Tim and thanks Patrick, really appreciate it. It's been great having a chat.

Patrick Spencer (51:25.39)
Thanks. For those who would like to check out other episodes, you can do so at kiteworks.com/Kitecast. Thanks for joining us. We look forward to the next conversation.

