94. The role of technology in cyber culture change with Chris McNaughton
Closing out the theme of this season, Claire is joined by Chris McNaughton. They discuss how data protection and security awareness are linked, the challenges of insider threat, and how leaders across your business can promote more secure behaviours.
Chris is a Director of SECMON1. Chris' career commenced in law enforcement, where he was a recognised expert in digital forensics and the management of electronic evidence. Moving into the corporate world in 2007, Chris accepted a global role with General Electric (GE) Capital, where he was responsible for electronic discovery, digital forensics and investigations. In his position at GE, Chris implemented and managed a number of e-discovery platforms for GE Capital, and reviewed and improved the corporate e-discovery platform. In his current role Chris provides advisory services to government and corporate clients in the cyber security areas of insider risk, data analytics, digital forensics and workplace investigations.
Transcript
CP: Hello, and welcome to The Security Collective podcast. I'm your host Claire Pales, and today's guest is Chris McNaughton. Chris is a director of SECMON1, which he co-founded in 2017. Chris's career commenced in law enforcement where he was a recognised expert in digital forensics and the management of electronic evidence. Chris and I covered some ground in today's chat, including how data protection and security awareness are linked, the challenges of insider threat, and how leaders across your business can promote more secure behaviours. Chris has a wealth of experience, and I really enjoyed our discussion. Please welcome Chris McNaughton to The Security Collective.
Chris, it is great to welcome you to The Security Collective podcast today.
CM: Thanks Claire. Thanks for having me.
CP: So you are the co-founder, co-creator of a data protection tool which, put very simply, works to monitor staff actions and forensically analyse data. It identifies potential data security risks, but tell me about how the tool allows security awareness and influence to really be at the core of the work that you do.
CM: Yeah, that's a good question Claire. It probably comes down to timeliness, so being able to react quickly to the things people do. But to take one step back, when we started designing this tool a few years ago, what we found is that people were putting security logs and so forth into SIEMs and correlating all these logs, and that was all great. They had various monitoring tools, but honestly, they simply weren't using them. They had masses of logs of people's activity, and it would just disappear. And there's a couple of things that probably exacerbated that. There was too much noise, way too much information coming through, and there was no real way to sort the white noise from the actual activity. Hence, we started developing this tool. Because it's so responsive, we're able to refine very quickly what's noise and what's the actual concerning stuff, and do it within, sometimes, only a couple of hours. What it means is that when someone does something, let's call him Bob from accounting. When Bob from accounting does that thing where he sends some information to his Gmail account, or puts it on USB, or uploads it to Facebook, or whatever it might be, he's probably doing that because he's trying to do his job. Bob hasn't been given the tools or the knowledge or whatever he needs to achieve what he's done as part of his job. With our tool, we detect that quickly and we're able to alert Bob's manager, and I'm picking on Bob but he'll do for today. We alert Bob's manager and say, hey, I think Bob needs some steering here, some guidance about things that he's doing. And he'll hear about it that day, or maybe the next day. And invariably, what Bob does is he'll say, I didn't know that. I didn't realise I could do that. I didn't know I could access it from home. Geez, thanks very much. And he'll turn to his colleagues and say, hey, did you know that? He'll actually spread the message. And the important thing there is timeliness. So from a security awareness perspective, it's right on point. It's addressing the problem that exists today for that particular user. While we have security awareness posters in the lifts, and we have our annual security awareness presentations and so forth, and we need to do that, it's often way above people's heads. It's just the compliance thing, you've got to tick that box and get through it for the year, and then you're done. Whereas when it's timely and relevant to them, it has much greater impact. So as much as it's having a direct impact on specific users and many teams, we also find CISOs are now using the tool to say, where are our point problems? Where do we direct our security awareness campaigns? What's our problem? So that's actually much more relevant.
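To make the workflow Chris describes a little more concrete, here is a minimal, hypothetical sketch in Python. This is not SECMON1's product or code; the domain list, channel labels, and the send callable are all assumptions, purely to illustrate the idea of separating noise from concerning activity and routing a same-day coaching note to the user's manager rather than burying an alert in a SIEM queue.

```python
# Illustrative sketch only -- not SECMON1's implementation.
# Shows the detect-and-notify idea: flag a risky action quickly and
# send a coaching prompt to the user's manager the same day.
from dataclasses import dataclass

PERSONAL_MAIL_DOMAINS = {"gmail.com", "outlook.com", "yahoo.com"}  # assumed list
RISKY_CHANNELS = {"usb", "facebook_upload", "messenger"}           # assumed labels

@dataclass
class ActivityEvent:
    user: str
    manager_email: str
    channel: str        # e.g. "email", "usb", "facebook_upload"
    destination: str    # e.g. recipient domain or device id
    summary: str

def is_concerning(event: ActivityEvent) -> bool:
    """Separate the 'white noise' from activity worth a conversation."""
    if event.channel == "email" and event.destination in PERSONAL_MAIL_DOMAINS:
        return True
    return event.channel in RISKY_CHANNELS

def notify_manager(event: ActivityEvent, send) -> None:
    """Route a coaching prompt to the manager -- guidance, not punishment."""
    if is_concerning(event):
        send(
            to=event.manager_email,
            subject=f"Heads-up: {event.user} may need some guidance",
            body=(f"{event.user} did the following today: {event.summary}. "
                  "This is usually someone trying to get their job done -- "
                  "please have a quick chat about the approved way to do it."),
        )
```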
CP: And I think the context, you talk about the timing and the relevance, but the context within which poor Bob is working matters too. As you said, and we talk about this all the time, people are just trying to get their jobs done. They're taking the path of least resistance. They've got a lot of pressure. Things have changed in the world, obviously, in the last few years. It's about giving him, or her, or anyone in the business, an understanding of what they can achieve and the tools that are available to them, in the context of their job. As you said, it's not blanket awareness training for everyone. It's not a blanket phishing email that goes out. It's on point. It's timely. It's coming from leadership. And I think what I like about it, too, is that it's not covert. It's: we're looking at this stuff, we want to support you. It's certainly not a punishment. It's let's work together and get this more secure.
CM: Yeah, absolutely. And we do have organisations who say to us, let's put this in covertly and let's catch people out. And we say, let's please not do that. The reason we say that is because in the vast majority of activity we see, people are actually just trying to do their job. They're not malicious, they're not trying to steal customer information. We do get them, of course, we do see those, they do exist. But the vast majority don't. And the opportunity lost in many organisations is that, while they have lots of monitoring tools and various ways to see what users are doing, because there's so much noise, what they do is they detune them. And they detune them to the point where they're only looking for the Edward Snowdens of the world. They're only looking for the most serious of matters. And what they're missing is a great opportunity to guide the vast majority of the population back on track. In terms of impact, we measure the impact, of course. We're starting one next week for an organisation, and we'll benchmark first to say, okay, what are we seeing in terms of undesirable activity, which starts up here, and we'll measure it, and we see pretty much a linear decrease month on month of 15 to 20 per cent. So you can see it, it's actually working. That's really positive.
CP: And I think those cases, you know, that malicious intent, is hopefully less than 1%. It's not the people that are trying to get their job done. It's the people who are disgruntled, or who are inside the organisation through malicious means. What type of situation generally triggers bad behaviour in internal staff?
CM: You're right, there is often a trigger. It could be something as simple as someone just missing out on a promotion. They are almost invariably a flight risk, so they'll probably be out of the organisation within six to 12 months. Someone who asks for a pay rise and doesn't get it, they are a risk. And someone who's just resigned is obviously a massive risk. There are some other interesting demographics and experiences that we see, such as someone in finance who won't take leave; they are a huge fraud risk. And we see that time and time again. We've had CFOs or financial controllers who've been in a role, they're hard working, they never take leave and so forth. Watch them. Because they're not taking leave not because they don't want to, but because they can't. And the reason they can't is that if they take more than two days' leave, their scheme unravels and starts to fall apart. So they can't take leave. But there's a number of key indicators like that that we watch for, and we know from experience which of them are really high risk.
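As a purely illustrative aside, and not something Chris describes building, the indicators he lists could be thought of as inputs to a simple risk score. The names and weights below are invented for the example.

```python
# Toy scoring of the insider-risk indicators mentioned above
# (missed promotion, declined pay rise, resignation, finance staff
# who never take leave). Weights are invented for illustration only.
INDICATOR_WEIGHTS = {
    "missed_promotion": 2,
    "pay_rise_declined": 2,
    "resignation_submitted": 5,
    "finance_role_no_leave": 4,
}

def insider_risk_score(indicators: set[str]) -> int:
    """Sum the weights of whichever indicators apply to a person."""
    return sum(INDICATOR_WEIGHTS.get(name, 0) for name in indicators)

# Example: a financial controller who never takes leave and just missed
# a promotion scores 6 and would warrant closer monitoring.
print(insider_risk_score({"finance_role_no_leave", "missed_promotion"}))
```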
CP: So we talked about Bob from accounts earlier. And staff like that often say that sending customer data to their Gmail address happens because they're trying to get their job done. Once staff know that your tool is there, can you see a measurable decrease in that behaviour? Do you see it as a deterrent in the same way that a police car makes people slow down on the roads, for example?
CM: Yeah, that's a good analogy. So yes, we do. An interesting aspect we've seen is that, while we're very overt with users about what we're monitoring, we tell them specifically that we're looking for data going to email or USB, or whatever it might be, we're quite open about it, what we actually see is that things outside what we're monitoring decrease as well. For example, phishing click rates go down. And we hear from CISOs saying, hey, since we started monitoring, I'm actually having teams come to me and say, can you present to us on this, because we're not sure what we can do in this space. Can you come and actually let us know. Or people saying, hey, Mr CISO, I want to do this, am I allowed to do it? So we see more of an uplift right across the board with awareness, and people just being a little bit more security minded than they ever have been. And we often see that happen within the first month. We think the reason for that is the way we use the tool; it scales really well across larger organisations. There's no bottleneck, so it's not going to a security team who then has to assess it. Typically we send it directly to the managers; it's at the coalface. So the people who need to be aware are becoming aware very quickly. It's a good question, because you absolutely do see that deterrent effect. People have actually said to me in the past, this isn't a proactive tool, this is reactive, isn't it? Yes, it is reactive to some degree, but you've got to look at the results of it. And the results are that it becomes not just a reactive tool, it becomes a cultural change. The security culture of the whole organisation starts to change, and changes reasonably quickly.
CP: And to achieve cultural change, most cybersecurity influence and education programmes really need to have that buy-in right at the top of the tree, especially if they're going to be supported enterprise wide. What has been your experience in getting that tone from the top when it comes to influence and change in the cyber culture?
CM: It really does need it. It needs to work from both ends, actually. So there's a level of nervousness sometimes about what we might see and what we might detect. But we try and talk about the fact that what we will see is 99 per cent, as we talked about before, people just trying to do their job. And it's a tool to give people the opportunity to do their work better and more efficiently, and certainly more safely. But they're also concerned it might slow the business down. And what we say to that is, well, it doesn't actually slow the business down, what it actually does is speed the business up. That's because people are working much more effectively; they're much more security minded, they're thinking about it. And what it means is, you can actually ease off some of your security controls. Security controls that block all USB, or block this and block that, aren't really that effective. It's a very blunt instrument. And as we know, people find their way around these controls. We call it walking the fence. We've seen it a number of times recently, where we'll be monitoring for a particular organisation and they'll use us to monitor the effect of various controls. One particular organisation said, we're going to block all USB, which honestly wasn't a big deal, it wasn't a big problem, but we're going to block all USB. And they did that, and within days we saw data leaving via email, or data being uploaded, in some cases to Facebook, and all sorts of things via Messenger. Hence our expression, walking the fence, because people walk the fence: it's blocked, it's blocked, there's a gap in the fence, and that's where they'll take it. So again, if you stop someone doing something, they will find another way to get around it.
CP: And have you seen a difference over the last couple of years? We've got a guest on this season, Amy Ertan, and she did some research in the UK into the impact of COVID on people's cybersecurity behaviours, and how the psychological contract with the company, loyalty, onboarding, so much of that influences people's desire to toe the line and to meet the compliance requirements of an organisation. Have you seen, during COVID, a different set of behaviours, or more walking the fence? What's been your experience over the last two years compared to other years that this tool has been out there in the world?
CM: Yeah, we think the big one has been people working from home. And it's not the fact that it's less secure working from home, it's the fact that there's a slightly different mindset. People are a little bit more relaxed when they're in the home environment, and they do things they normally wouldn't do. So there are some practical things, such as: I need to print that out, and printing is blocked here, so I'm just going to send it to my Gmail account. There have also been some issues we've seen around people whose home environment is just not very secure. They've got documents or things on screen, or they're in a share house, or whatever it might be, and what we'll see from that is housemates seeing things that they wouldn't normally see. But in general terms, people are just more relaxed at home.
CP: You talk about the difference between home life and the workplace. I'm interested to know the difference that you've seen, if at all, between government and corporate. They're quite different environments in my experience, with different mindsets, a different approach, and a different purpose to the organisation. What have you seen in terms of the difference in engagement and cyber behaviours between some of your government clients and maybe your corporate clients?
CM: I think there tends to be a little bit more loyalty in private organisations, particularly if they're run in an open sort of fashion, where we're all part of one team and so forth. We haven't really seen that level of loyalty in the government sector. That won't be exclusively the case across the government sector, but that's generally our experience: you can sort of engender a sense of loyalty in an organisation, particularly when we're talking about using Shadowside, our tool, when users are brought into the 'you're part of the machine, you're part of this, you're a key part of the organisation' message. So levels of loyalty, I think, are a bit better there. It might be a bit unfair, but that's what we see.
CP: And I guess because of that, people who are maybe transitioning from corporate to government, or vice versa, potentially have a different level of cyber education. Would that be true?
CM: Yeah, it's a big leap. I've done it. I went from government, I was ex law enforcement, and I went to the corporate sector. And I must say, I realised I wasn't in Kansas anymore; it was a very big leap. I think it's healthy going between the corporate and government sectors, because you bring different mindsets, and that's always a healthy thing. But we probably think a bit differently in the corporate sector in terms of our information security. I think there's a sense that we're all in it together, we're all protecting our customer and other sensitive data. Whereas I think the overarching theme in government is that someone's there whose job it is to do that. I'll do my bit, but I think it's someone else's job to do it.
CP: And coming from law enforcement, I'm sure that you have a slightly different mindset to maybe some other people out in the cybersecurity industry, because obviously you've come from a place where there are controls and rules and structure. I'm really interested to know, coming out of law enforcement into cyber, and I've asked all my guests this season, what do you do in your personal life to protect yourself online? Is there one thing that, without a doubt, you always do to protect yourself or your family or your own personal information that maybe you didn't do before? Since you've become more present and aware of cybersecurity and the risks, is there something you do personally?
CM: Yeah, you know what, it's the security 101 stuff that is the most important by far. There are all sorts of things we can do. We can put in firewalls and personal VPNs and all sorts of things. But at the end of the day, it's about making sure you've got your AV, your antivirus, set up. It's making sure that you are patched, that your operating system and all your applications are patched, and it's password management. So make sure that you're not using the same password over and over again, and we've all probably been guilty of that, but try and reduce that as much as possible. It's the ASD's Essential Eight. You just can't go past it, because it will make you more secure than 99 per cent of people out there, and just too difficult to compromise. So in terms of an analogy, it's making my house secure so that the burglar will go next door, or to the next suburb, or the next town. And that's honestly what it's all about. That's it for me. There are other things of course we can do which are a bit more sophisticated, but get the basic things right first.
CP: And I think that's true for our personal lives as much as it's true for every organisation and government agency out there.
CM: Yeah, definitely. And we sometimes overlook that. I was talking to an organisation the other day, and they're struggling with their vulnerability management. There's always friction: you run a vulnerability scan over your servers and so forth, and there'll be patches and critical updates missing. And there's always that pushback from the business, like, look, hang on, if I patch that server there'll be an outage, it might break things. So let's not do that, let's make that an exception. You're heading down a really risky path when you start doing things like that. Because if not now, at what point is the right time to patch that machine? You've honestly just got to get it done. In business and in our personal lives, they're very simple things.
CP: Chris, I love your approach of getting the basics right. And I love the use of your tool to help people not just reactively respond to cybersecurity threats and potential harm, but also to build that sort of herd immunity, where people find out that they're doing something that might be causing risk, and then they tell others: hey, did you know that we shouldn't do this, and here's a better way to do it. So I think what you're doing is brilliant. And certainly from a tech perspective, you're turning your technology into something that can make people more aware and educate them. So thank you so much for joining me today. I've really enjoyed the chat, and I know that the more organisations that look into how to influence their staff through contextual behavioural change, the better. Absolutely, thank you.
CM: Terrific, thanks Claire, it's been a great chat.