Episode #63 The Human Factor with Gabe Gumbs
"While ROI might be the board's love language, equally we need them to speak ours - how can security be a driving force for business rather than simply a drag on it?"
- Gabe Gumbs
Despite having held a range of leadership positions in security technology, Gabe Gumbs considers his most valuable experience to be the time he spent on the ground as a security practitioner. Now he’s spearheading Spirion’s vision for data privacy in the next decade and beyond, leading the way to a more secure and private tomorrow for us all.
Gabe has a deep-rooted passion for technology, information security, and problem-solving. As Chief Innovation Officer of Spirion—a leader in rapid identification and protection of sensitive data—he’s channeling that passion to make the digital world a safer place. Join us as we discuss this, how data and technology will protect businesses in the future, and the need to uplift everyone’s security literacy.
Transcript
CP: Hello and welcome to The Security Collective podcast. I'm Claire Pales, and today's guest is Gabe Gumbs. Gabe has a deep-rooted passion for technology and information security. As Chief Innovation Officer at Spirion, a leader in rapid identification and protection of sensitive data, Gabe is channeling that passion to make the digital world a safer place. Wielding a unique mix of technical vision, marketing and business acumen, Gabe is shaping the future of data security, and protecting the sensitive personal data of customers, colleagues and communities around the world. Gabe, thank you so much for joining me today.
GG: Claire, thank you for having me and for that warm introduction.
CP: So a good place I like to start for our listeners is what brought you to this place in your career in security today?
GG: So I spent the better part of my career as a practitioner in the security side of the world, actually implementing security controls, implementing security policies, things of that nature, before taking a lot of what I'd learned there over to the security tool development side of the world, if you would, and creating security technologies that solve for those same problems that I was solving for as a practitioner, an architect, etc. And the thing that kind of got me here is, the one constant has always been that exploration of problem space. Taking the time to understand the problem, versus just running towards any given solution. I happen to think it's what made me a very good security architect and defender, and what makes me uniquely adept at looking at the problems that security practitioners face, and addressing those head on.
CP: I think the word uniquely is an interesting one, because I've found throughout my career, and speaking to guests on the podcast, that most security leaders are unique and there's no true path or trajectory that you take. You know, lots of people fall into security, or they build skills over time that take them into security through transferable skills. And we've talked a lot on the podcast about this skills shortage. And particularly, I know for you, there's maybe a skills shortage that might be coming next in the privacy area. And I've had other guests talk about this as well, that they've got concerns that privacy is the next wave of the skills shortage. I'd love to hear your thoughts about this. How do you think we might start to resolve this crisis? Because we've been talking about it for a while. And I guess, is there even a crisis, in your eyes?
GG: I'll just let my religious cat out of the bag: no, no crisis. I happen to think that the thought that there is a crisis is a limitation in our ability to see transferable skills, and you used that phrase, so I want to hover on that one for a second. Identifying talent to help us solve all these challenges does not mean having some homegrown pool of people that understand all the esoteric ins and outs of security. And those of us in the infosec world exist with a healthy amount of hubris around us, and sometimes think that that's the case. There are a lot of transferable skills that would make for excellent security practitioners at all levels, from the analyst level all the way up to the CISO level. We are seeing, and we have seen, some of those transferable skills at the top end of the security reporting structures, right. So you know, we have seen developers move into CISO roles, and we have seen lawyers, a lot of lawyers, moving into CISO roles, and that would make sense, there are some transferable skills there, for sure. And we even see some of that happening in the privacy world, but we don't see it nearly as much further down the reporting chain, if you would, and I happen to think it's two problems. One of them is very much not identifying and embracing those transferable skills. The other one is one that people similar to myself have a very real responsibility for, which is creating solutions that make it accessible to those people with those transferable skills, right? I don't know anything more esoteric than all of the many acronyms that we use in this industry, or the technologies we use to solve these problems. They are not always designed with the operator in mind. They're usually, obviously, designed to sell themselves, right, like look at this pretty tool and look at these dashboards.
But they don't make it easy and accessible for, say, an entry-level analyst to understand the problems inside the environment, triage them, solve them, etc. We've spent decades trying to solve this. We're somewhere on SIEM 5.9 now, still looking for single panes of glass, but it's a larger technology problem across all of the security controls that we have. So no, I don't think that there is a skills shortage problem, I think we have a technology problem. And I definitely think we have a lack of willingness to embrace and identify those transferable skills.
CP: It's possibly a podcast for another day, but I loved your comment about the tools that are built and how they're not always built with the operator in mind. They're built to solve for metrics or a measurement or monitoring, but they're not necessarily intuitive as to how the analyst might want to operate or what they might need to draw out of that pool of information. And that's probably a topic we could go deep on, but I want to talk about some of the other skills that security leaders need. And without a doubt, what I find is that one of the most sought after skills, especially in security leaders, is this "board readiness", in air quotes. You know, having that executive presence and being able to communicate with a board. And what I've found that the board really wants to understand, which is another skill security leaders need, is this commercial acumen around things like return on investment. The board gives you a bunch of money as a security leader, you go away, you come back and you ask for more money. And I think CISOs really struggle with this: how do you give the board a sense of return on investment or value for money? What are your thoughts on that? And how could CISOs approach this question when it comes from the board, or from the executive, about ROI?
GG: I think the first step in that challenge is not discussing ROI from a cost avoidance standpoint, which is a trap that many fall for, and which comes back to those dashboards just wanting to show metrics, right? Trying to discuss how many breaches we didn't have, or how many fines we didn't have. Those types of cost avoidance metrics are not a very good way to express a return on an investment. Unless, of course, you're in the insurance business, in which case, yes, the fewer of those things that happen, the better for you. So first and foremost, we need to not look through the lens of cost avoidance, and we need to look through the lens of business empowerment. So by implementing these sets of controls, A, B, C, we've allowed the development workflow to be more streamlined, and they can get code from commit to the customer in 10% less time, right, because now we've consolidated all of their environments and secured them. Talking about the business outcomes that matter to the business, and I just use the example of a software company because I happen to work for a software company, is a much better way of expressing returns on investment. So if you were, say, in the pharmaceutical industry, I'm certain that security has a similar challenge: how do I empower the business to move fast and secure? By looking at those types of measurements, we can start a more empathetic conversation with the board with regards to returns on investment. But then we should equally not spend too much time talking about ROI, even though, yes, that is their love language. We equally have to get them to speak ours, which ultimately comes down to, and I use the word empathy here too, you know, what did we protect? How do we know we're protecting it? And how can we share that message with our customers and maybe have security be more of a driving force for business than simply a drag on it?
CP: And when it comes to measurement, I think it's very difficult for boards and executives to understand cyber, and from my perspective, a small amount of education would go a long way for these groups. And so I think we try to use models and structures to help them to understand, because it's a visual aid, but it also shows we started here, and we're going there. Whether it's investment or business enablement or getting faster at something. And so, you know, anchoring security to a maturity model as a measurable way to show progress, I guess, helps organisations by default to become more secure, because you can show we were here and now we're there. What's your thinking around security models or maturity models? And what advice would you give to CISOs about how to use these models and get the best outcomes?
GG: Yeah, models, everyone's got one. Everyone's got a model and some of them are better than others. Not all models are created equal, or, all models are created equal but some are more equal than others. Whichever quote you'd like to use for my Orwellian plug. Models, I think, are good in lieu of having absolutely anything else, and they certainly are a good reference. And for those of us that have to comply with certain regulations, models that can align us to that compliance are especially good at making the needs of that compliance very black and white. And I don't say that to say just check the boxes, but to conform to those needs. But for our unique individual businesses, models don't do a very good job of mapping to the outcomes that we always need. They can certainly, again, serve as a good framework, a good reference. But copying them note for note is not going to yield the same results for everyone. In fact, for many it will have negative results, since it's not tailored to their business. That being said, I certainly would advise everyone to understand the business that they're in, as well as the compliance regulations and regimes that come with that, and start there, looking for models that can help them improve their programs. I equally wouldn't look at it as kind of a "start here, end there", because that end state is nebulous. But I feel like I'm preaching to the choir, I'm sure your audience knows it's always a continuing state, it's an evolution. It's not just, we are now secure, we are there. But for anyone that may have had that thought, no, it is continuous. And so again, models do kind of fail to capture that evolutionary process, right, because I've never seen a model that is infinite in its capabilities.
And if you ever had one of those, you would go, what? I am not going to use this as a model, this is crazy talk. So you start to see the problem break down on both ends: it's neither complete enough to take you into perpetuity, nor, if it were, would you ever want to use it. So you've got to tailor them to your own needs. Reference the one that best suits your business and your industry, and it starts there.
CP: I want to zoom out a little bit from that, because, you know, I've talked very much about the CISO and very particular things that they might want to educate parts of the business on or speak with executives about. But if we come out to sort of the enterprise wide level, you know, for a long time, we've talked about the need to uplift the security literacy of everybody. So not just the board or the executive or the decision makers, but actually getting everybody to think about the implications of their choices, and then take action, change behaviour. Some progress has been made in this space. But a lot of people are still, I wouldn't say doing the wrong thing, but maybe taking action that causes additional risk to organisations, could be intentional, could be accidental. I know you're very passionate about problem solving and using data to solve problems. So if we jumped forward three to five years, how do you see data and technology being used to help protect businesses?
GG: I actually think that what we'll start seeing is more focused data, for starters. We started from a place of, we don't really have a lot of memory and storage to keep data around, so we'll get rid of things. We saw this expressed quite literally in the programs and software we wrote. And then we watched those things get bloated. And now we watch technology move back towards the edge and into the cloud, and so we're watching that re-contraction of bloat in the actual data that programs are comprised of. And we're seeing a similar arc in terms of how much data companies collect and process and use for their own business gains and advantages. It went from, we couldn't really keep a lot of this stuff around, it's very expensive to do, so we'll keep what we need, to, storage is so cheap, and we've got the cloud, aka other people's machines, so we'll keep everything and we'll analyse all of it. We'll get all deep learning and neural-networky on it, because we can learn so much and we can mine this data. And then privacy has taken centre stage in the security conversation over the last several years, vis-a-vis GDPR, CCPA, all those things. And so now you're starting to see a number of businesses question: how much data are we going to keep? And why are we going to keep it? So to answer your question a bit more directly, the role I think data is going to play is a larger one in terms of its outsized influence on our decision making process, but with less data, with more focused data. Because organisations aren't going to want to keep around more data if they don't have to, making it a liability. And that in and of itself will protect people, by not keeping around as much data as we did, by not retaining all that sensitive data about Claire Pales that we didn't need to, where we've changed the purpose we originally collected it for versus how we're processing it today.
By simply reducing that, we will have already protected people in better ways, first and foremost. I could keep going, actually, but the only thing else I would add is that better stewardship of that data is going to come through simply reducing that footprint, just eliminating how much sensitive data we keep around.
CP: And through that process, I suppose, the less data we've got, the less data is at risk. And then, looping back to the start, we still need to manage the tools we're using to monitor that data, and the value that using those tools is bringing, and then also our employee communities, teaching them how to deal with that data. It doesn't matter whether you've got a lake of data or a small cup, it's really about looking after that data. And, yeah, I like your thinking around better ways to manage how much data we collect on people and what we keep. But we've still got that age-old question of how do we stop people making decisions that make that data leak?
GG: I feel like there's an empathy answer to that one also, but maybe that's my big hammer and everything starts looking like a nail. But when the people handling that data really have a sense for how it impacts other human beings, I think they start to make different decisions. When they don't just see it as this amorphous data set where I'm just moving bits and bytes and documents around, but there's a human being attached to this data set, many of them, and there's real impact to my decisions. I think when that becomes a bit more part of our fabric as data handlers, all of us, we'll see some changes there. And there's some effort to do that, right. Making it personal amongst engineers is a movement I've seen, even just as privacy starts becoming more pervasive, at the coder level even, right. Privacy by design, which is where a lot of that is born out of, started with this notion that, by design, our systems need to respect and protect privacy. And so that is a very good vision and mandate to start with, to get it all the way down to the ground floor, making it personal at an individual level, and that will directly impact actions. That will directly impact someone going, you know what, maybe I shouldn't leave that car in the parking lot with that laptop and all those records on it. Or maybe I won't just Gmail that to myself, because I know it's only going to save maybe 10 minutes, but if this gets out, man, that's a lot of people whose lives are going to get screwed tomorrow morning. So it starts with those little things.
CP: It's just humanising that information.
GG: Yeah, humanising it indeed.
CP: Gabe thank you so much for your time today. I've really enjoyed hearing your perspective on all things security. We've covered quite a few topics. I really appreciate your time, it's been great to chat.
GG: Claire, thanks for having me.