
Strategy Meets Reality Podcast
Traditional strategy is broken.
The world is complex, unpredictable, and constantly shifting—yet most strategy still relies on outdated assumptions of control, certainty, and linear plans.
Strategy Meets Reality is a podcast for leaders who know that theory alone doesn’t cut it.
Hosted by Mike Jones, organisational psychologist and systems thinker, this show features honest, unfiltered conversations with leaders, strategists, and practitioners who’ve had to live with the consequences of strategy.
We go beyond frameworks to explore what it really takes to make strategy work in the real world—where trade-offs are messy, power dynamics matter, and complexity won’t go away.
No jargon. No fluff. Just real insight into how strategy and execution actually happen.
🎧 New episodes every Tuesday. Subscribe and rethink your strategy.
Strategy Meets Reality Podcast
When Strategy Meets Security: Glenn Wilson on Technical Debt, Developer Voice, and Defensive Thinking
Cybersecurity isn’t just about protecting data—it’s about enabling execution under pressure.
In this episode of Strategy Meets Reality, Mike Jones is joined by Glenn Wilson—cybersecurity expert, OODA loop practitioner, and founder of Dynaminet—to explore how the challenges of security mirror those of strategy. From technical debt to tool overload, commander's intent to learning loops, they unpack how poor decision-making, misaligned incentives, and lack of feedback erode both resilience and execution.
This is not a technical talk. It’s a strategy conversation about where things break—and what to do about it.
🔍 In this episode:
- Why strategy and cybersecurity face the same execution challenges
- The hidden cost of technical debt and over-engineering
- Why developer voice matters for resilience and tooling
- Using OODA and chaos engineering to build adaptive responses
- What commander's intent looks like in security
- How to embed security without killing innovation or joy
🎧 Keywords: Strategy Execution, Cybersecurity, OODA Loop, Technical Debt, Developer Experience, Chaos Engineering, Mission Command, Distributed Decision-Making, Leadership, Resilience
📘 Learn more: https://dynaminet.com
📬 Connect with Glenn: Glenn Wilson on LinkedIn
👂 Enjoying the show?
Subscribe and leave a review on your favourite platform — it helps more people find the podcast.
🔗 Full episodes, show notes, and resources: https://www.lbiconsulting.com/strategymeetsreality-podcast
📺 Watch on YouTube → https://www.youtube.com/@StrategyMeetsReality
🎧 Listen on Spotify, Apple Podcasts, and Buzzsprout
💬 Connect with host Mike Jones → https://www.linkedin.com/in/mike-h-jones/
Glenn Wilson (00:00)
A learning organization is one that is constantly looking at ways to evolve and understand their position in the environment.
We need to build that psychological safety into the system so that basically someone can say, I've just done this.
I've made a real bit of a mess up, you know, and that person shouldn't feel that they're going to be fired for doing that. That is actually an opportunity to learn from the experience.
Mike Jones (00:18)
Yeah, yeah.
Glenn Wilson (00:22)
Security comes from being creative, and you need to be able to build that creativity into your teams, not build up technical debt, so they can be proud of what they're delivering.
Everyone's entitled to have pride in their work. If we take that away from people, they're not really developing.
Mike Jones (00:44)
Welcome back to Strategy Meets Reality podcast. Today I am delighted to be joined by Glenn Wilson. We first met on LinkedIn. I think I seem to meet most of my guests on LinkedIn. It's just a good way to meet guests, but Glenn, it's great to have you on.
Glenn Wilson (00:57)
Hi Mike, pleasure to be here.
Mike Jones (00:59)
Cool. Just for our listeners, can you please provide a bit of background, a bit of context about what you've been up to lately?
Glenn Wilson (01:05)
Yeah, sure. So, yeah, I'm Glenn Wilson. I'm predominantly a cybersecurity consultant, a freelance consultant, so I don't work for any of the big companies. I've been in cybersecurity for a number of years, although, you know, no one maps out a career into cybersecurity. It's just incidental. So I started off many years ago as a market researcher. That evolved into crunching numbers. Crunching numbers meant writing code, and writing code then turned into more of a security slant as the code became much more important to my clients, when you're writing code that goes into chip and pin devices, for example, or healthcare and stuff like that. So security became a thing for me. And over time, I evolved into somebody more attuned to the leadership side of security: how do leaders integrate security into what they're doing, as opposed to being hands-on with the developers and showing them how to do security? I'm much more around the strategy of it. I'm currently doing a masters in systems thinking and practice at the Open University, and my dissertation is taking a focus on software security from a holistic, systemic point of view.
Mike Jones (01:58)
Mm.
Cool. And for the listeners, who know this is a strategy show where we talk about strategy and execution, they're probably wondering why we've got a cybersecurity expert on. But from conversations we've had, there's a lot of commonality between the challenges you face as a leader in cybersecurity, or trying to help organizations and leaders with cybersecurity, and the challenges we face in executing strategy across an organization. So from that point, let's explore those commonalities. What are the challenges, do you feel, with cybersecurity in organizations?
Glenn Wilson (02:45)
I think the main challenge we're facing at the moment, and I'm going to say it, is that we don't have a strategy. We don't know what we're doing. With some of the clients we work with, you have strategies for certain parts of security. And when I say strategy, I really mean policy; I think strategy and policy are closely linked. So think about a policy for doing code scanning, for example, to identify vulnerabilities,
Mike Jones (02:51)
Yeah.
Mm.
Glenn Wilson (03:09)
it could be a policy or strategy around how you manage vulnerabilities. And then if you go into incident management, you know, what's your strategy for dealing with incidents? We don't really seem to have a good way of doing this at the moment. And that's borne out by some of the reports we're seeing. So for example, the data breach reports show that we're seeing more incidents than ever before. The cost of incidents is rising.
Mike Jones (03:30)
Mm.
Glenn Wilson (03:32)
more companies are being affected by data breaches. So effectively, we're not doing a great job of cybersecurity. And I think it boils down to the fact that we're stuck in a paradigm, stuck in some way of doing stuff that doesn't quite work out, which is the essence of my dissertation really: why are we doing things in a way that just doesn't seem to be bearing fruit? And I'll give you an example, vulnerability management.
We quite often see developers develop software with vulnerabilities inside that code. And that vulnerability needs to be prioritized and potentially fixed if it's considered a problem. But we have this conflict between the security priorities and the priorities of the business, and it can be quite difficult to balance those out.
What happens is we're delivering features, we're delivering products faster and faster and faster. So we're building up more and more technical debt and that technical debt is much more aligned to the security challenges we're facing at the moment because we're creating vulnerable code. We're not fixing it. We develop more code on top of that vulnerable code. And then eventually, you know, we end up with the data breach. We end up with a situation where an attacker has been able to
get into our systems and either compromise our whole business through a ransomware attack, or steal data, or weaponize our company against someone else. So it is a challenge, and there doesn't seem to be a clear, cohesive strategy for how we defend against this type of attack. And that's borne out, as I said, by the numbers that we're seeing.
Mike Jones (05:09)
Yeah, that's an interesting challenge, that tension. I think it's the tension between, you know, let's look at the very short term and get this stuff out, without balancing that against the future, the future risks: what's going to happen? How's that going to constrain us later? And I think that short-termism has become really challenging for organisations.
Glenn Wilson (05:35)
Yeah, definitely. Again, that's a massive tension, isn't it? The short term versus the long term. You know, we are trying to develop products at a high cadence. And I'll come on to DevOps and DevSecOps in a moment, a topic that I've written about. But the whole idea is that we want to deliver features faster. Not every team does that; a lot of teams develop code over a period of time and then release something. And what happens then is they accumulate technical debt, because although the release cycle is, say, every six months or one year, whatever that slower cadence is, within those six months they're still developing code that's accumulating debt. Maybe the testing isn't up to scratch and so forth, because again, there's pressure from the business: we haven't got time to do testing, we just need to get that piece into the final release, and then we keep on building that way.
Mike Jones (06:24)
Yeah.
Glenn Wilson (06:25)
Yeah, and so whether you're doing DevOps and DevSecOps or Agile development, where you're working at a higher cadence, say releasing daily or multiple times a day, or perhaps even just once a week, you're still accumulating debt because of this speed of delivery. It doesn't matter which methodology you're using, you end up in this strange situation where you're accumulating security debt. And it's quite worrying, because these things linger. Attacks happen a long way down the line, affecting code that was written many years ago. Those developers have moved on. They've forgotten their code. So who's going to come along and fix it? Yeah, sure.
Mike Jones (06:52)
Yeah, yeah.
Yeah. Just a quick one, to go back on it. What are you calling technical debt? What does that mean to businesses?
Glenn Wilson (07:10)
Yeah.
Yeah, technical debt is a good one. I've got a friend who actually doesn't like to call it technical debt; he likes to call it business debt. So technical debt is where you have code, let's talk about software, that's been written in a way that was really just to get the feature out. So we've missed some checks and balances within the code. Maybe error handling hasn't been dealt with properly: the code needs to flow to a nice ending, but sometimes we leave that ending out, and if the application fails, we don't know what's going to happen. It's unknown because the developer hasn't written it. So that's a technical debt piece, because the developers missed something out. Not because they want to, but because they're just under time pressure. And it's those types of bits of code that are easy to miss out, or they've...
Mike Jones (07:47)
you
Glenn Wilson (07:58)
written something for, say, a proof of concept, or they've written something very quickly just to show that they can do something, and they've written it in a much looser, more relaxed way, not so many checks and balances. And then it ends up in production, and again you've accumulated that technical debt, because now production contains code that really shouldn't be in production, and that means you end up with this security debt. The reason why I've heard it called business debt is because
Mike Jones (08:20)
Yeah.
Glenn Wilson (08:25)
it's the business driving this prioritization. So it really is a debt to the business. It shouldn't be considered a debt that the developers need to sort out; it's a debt that the business needs to sort out.
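To make the error-handling shortcut Glenn describes concrete, here is a minimal, hypothetical Python sketch (not from the episode); the function names and URL are illustrative assumptions, and the point is only the contrast between the happy-path version that ships fast and the version that handles failure.

```python
import json
from urllib.request import urlopen
from urllib.error import URLError


def fetch_profile_fast(url: str) -> dict:
    # Proof-of-concept style: assumes the request succeeds and the body is JSON.
    # If the service is down or returns garbage, behaviour is undefined - that
    # gap is the technical (and security) debt the business inherits.
    return json.loads(urlopen(url).read())


def fetch_profile_safe(url: str) -> dict | None:
    # The version that costs a little more time up front: failures are handled
    # explicitly instead of being left for an outage or attacker to discover.
    try:
        with urlopen(url, timeout=5) as resp:
            return json.loads(resp.read())
    except (URLError, TimeoutError, json.JSONDecodeError):
        return None  # caller decides how to degrade gracefully
```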
Mike Jones (08:36)
Yeah, because
when it goes wrong, the whole business is going to suffer, because that's obviously going to lead to a breach and probably to fines from the information commissioner's office and stuff like that. But it's interesting, what's driving that debt? Is it that the leaders making the decisions and setting the timelines are disconnected from what it actually takes? Because that's a challenge generally in business, but when you get to more technical stuff like coding, it's not something you can just get a handle on. You can't just go down the shop floor, get a couple of briefs from the lads and ladies about what's going on, and then go back up to make decisions. There's a lot to it. And I think that's a big disconnect that happens: my worldview about what it takes to develop this product and this code is probably disconnected from the reality of the coders who've got the time pressure to deliver it. And that time pressure can amount to what, upstairs, may just seem like a bit of missing code, but that missing bit of code could be a vulnerability that causes massive ramifications. You've just got to look at what Co-op and Marks and Spencer have had. That was huge, millions and millions, and that's being generous about the cost to recover from that breach.
Glenn Wilson (09:59)
Yeah, definitely. And there's a supply chain issue there as well. It wasn't so much that they were breached as that a supplier was breached, and that had implications at either end of the supply chain. Farmers weren't able to get their produce to the shops quickly enough because the supply chain was broken, and of course Marks and Spencer and Co-op weren't able to sell the stuff because it wasn't coming through to the shops. And that shows the impact of perhaps just one line of code being wrong. But yeah, this
Mike Jones (10:03)
Mm.
Mmm.
Glenn Wilson (10:25)
disconnect is a challenge. And I see it all the time: developers seem to sit in this siloed area where they develop their code, and it's a mysterious area of the company. No one really knows what goes on there. But these developers are under pressure. The business is constantly trying to develop new features, new code, new products all the time. It harks back to Taylorism: we know how many lines of code they can write, so it's like Taylor going through with a stopwatch, how many lines of code can you write in a day? Okay, we'll give them a few more because they don't seem to be filling the whole eight hours. And I think developers end up in that situation. But coding, as any developer knows, is a really difficult thing to do. You know, you're
Mike Jones (11:04)
Thank you.
Glenn Wilson (11:13)
You're piecing together lots of different pieces of code from other places. You've got abstract knowledge that you're trying to work out: how is it going to work, what does the business want? You've got to test it with the users to make sure you're actually delivering something they can work with. So yeah, it's very difficult to align those two, especially if you don't have that technical background. You literally are looking at a mysterious black box where these special people are building stuff, but they're not building it quickly enough, so let's get more stuff done anyway, because they're wonderful people. And yeah, I think that is a difficult challenge.
Mike Jones (11:48)
Yeah. You brought up a good point there about the black box. I talk about this with people a lot. I think we're too quick to try and jump into that black box and start to interfere, and that's where we limit the decision space. Especially with coders, that decision space is the space to be able to do what you want them to do. You can avoid getting into the black box by understanding what inputs they need and what outcomes they should be producing, especially the inputs. I often think in leadership, and you've been studying systems thinking, you think about the viable system model, the inputs are really important, but they're overlooked. The inputs that people, be it coders or anyone in the organization, need to enact decisions and develop things are often overlooked. They're the first things that get taken away, but we still expect the outcome to be exactly the same.
Glenn Wilson (12:40)
Yeah, and that boils down to tooling. So if we think about the business, and I've been using the word business quite a bit to mean really anything that's non-development, think about, say, the purchasing team. The purchasing team will say, we're going to purchase this software that our developers can use, certain IDEs or whatever, and they feed that into the developers. But it's usually cost-based.
Mike Jones (12:50)
Mm.
Glenn Wilson (13:06)
It's usually: what is the best deal we can get from a vendor so we can give it to our developers? And that comes down to security tooling as well. There's no interaction with the developers to say, you know, what do you use? How do you use it? What would you like in order to improve the security of the products you're developing? Instead we tend to be much more prescriptive as business people and say, we're going to buy this because it's the cheapest deal, we've got a deal with this vendor, and you're going to use these tools.
Mike Jones (13:09)
Yeah.
Glenn Wilson (13:31)
And it is multiple tools as well. And the tools aren't really fit for purpose for what the developers are trying to do. So you end up with an input that's wrong, and therefore you're not going to see the output you're expecting. Developers are going to take more time to learn these tools or try to work around them. When they're doing the coding, they might not be able to do the stuff they want to do in this particular type of IDE, or there's a new language they need to learn, and it's a challenge for them to work out how to use these tools. So that input doesn't quite work, and then the output is not the same. And also the instructions about what to build, you know,
Mike Jones (14:04)
Yeah, yeah.
No.
Glenn Wilson (14:18)
There's this element sometimes, and I've seen this, where developers are sort of given the freedom to build what they think they should build, but developers aren't really people who engage with the users that frequently. I know the Agile Manifesto and the Agile movement were trying to change that back in 2000, 2001, but we don't really do that in business. I don't see it very much, especially in large organizations. You end up having a product owner, or a business analyst in the old days, telling you, the developer, what to develop. And that might be wrong, that might not be the right thing to develop, but developers go ahead and develop it anyway, and the output's not quite what the initial intention was. So we go back to that technical debt again. We build up debt because we're producing something that the users don't want, that the customers don't want.
Mike Jones (15:06)
Yeah,
yeah.
Glenn Wilson (15:06)
Yeah, so input and output are very important. They correlate. It's rubbish in, rubbish out. Yeah.
Mike Jones (15:13)
Yeah, yeah. That's a good point. You've got two really useful things there. One is about the involvement of the people at the edges of the organization who are actually doing this stuff. They are the experts, because they're the ones doing it and using it, and we need to make sure they've got the right tools to do the job. That's something to reflect on for leaders who are listening: how often do we engage with those people when decisions are made about what tools we use? So I think there is a tension around cost, but that's the conversation. That's the whole point of the conversation. Here are the success criteria of what we're trying to get at, now here are the options. Let's find, I would say, the least worst option, because it's never a perfect option, but the least worst option that balances the needs of the people at the front line and also...
Glenn Wilson (16:03)
Yeah, yeah.
Mike Jones (16:07)
the business need, normally a financial constraint or a policy constraint that they've got. That's a good one to reflect on: how often do we actually involve the people who use it day to day in that decision-making? And the other one is around that connection with the customer and the user. The Agile Manifesto was all about getting as close to the customer as possible. Sometimes that isn't possible, but if it isn't, how do we handle that constraint? How do we enable that coordination so that we're not wasting our time providing value that no one wants? The whole point of a business is to provide value that people want, that they want to pay for, and if they love that experience they're going to keep coming back. If someone wants a Ferrari and we give them a Skoda, it's not going to work.
Glenn Wilson (17:00)
Yeah, and you just mentioned the word value there. And that's an important concept in security, right? Because how do you measure the value of security? And I think that's very difficult because is it the value of not being breached? You know, if you've got a secure system and it's not breached, or is it the value that you're delivering to your customer in terms of we are a secure company?
Mike Jones (17:10)
Mm.
Yeah.
Glenn Wilson (17:24)
there's a lower risk of your data being stolen if you work with us or use our products. What is the value to the business? Because a lot of it seems to be quite abstract. You're building value on something that might not happen.
Mike Jones (17:30)
Yeah.
Well, yeah. This is the problem around risk and innovation, along with cybersecurity. When you're talking to organizations about this, it's that tension between today and tomorrow, really. The pitch is: do this and we can potentially make you safer in the long term. But they're like, well, it's going to cost this. It's all about efficiency. I suppose it's a bit like that Taylorism, it's all about efficiency, let's get as efficient as possible. And it's a hard sell. And often when things are struggling, the first things that go are stuff like cybersecurity and innovation, all those things that are going to either protect them or enable them to adapt in the long term. Yeah, so it's a difficult one.
Glenn Wilson (18:04)
Hmm.
Mike Jones (18:23)
Yeah, it's interesting to think about what's actually considered the value of stuff like cybersecurity and innovation.
Glenn Wilson (18:29)
Yeah, yes, it is difficult, and you quite often hear that cybersecurity is a cost to the business rather than a value. There have been rounds of redundancies across a number of organizations I've been involved with over the years, and quite often the security teams are the ones struggling to hold on to their numbers, even though it's well documented that there's a huge imbalance between the number of security people and the number of developers and engineers. And I see that now with some of my clients. You're looking at, say, 30 security engineers looking after 1,000 or 10,000 developers. And that ratio just doesn't scale. So what are we doing to try and...
Mike Jones (19:12)
No.
Glenn Wilson (19:16)
close that gap? So that's another one we struggle with. And again, it boils down to the whole idea of value, doesn't it? What is the value of a security engineer if it's seen as a cost? So yeah.
Mike Jones (19:26)
Yeah, yeah. But then you've got, I suppose, the perceived comfort of governance. So in security you've got governance; in the cyber world you've got ISO 27001, and I've had the joy of being through that myself. But then there's that perceived thing of governance: well, we've got ISO 27001, we must be fine. And I think it's the same in the cybersecurity world as it is in...
Glenn Wilson (19:35)
Yes, okay.
Yeah.
Mm.
Mike Jones (19:53)
strategy execution, always come down to common, you've got humans involved in that system, which I don't think you need to account for.
Glenn Wilson (20:01)
Yeah, exactly. It doesn't account for the complexity, right? I'd say ISO 27001 is a checkbox exercise. It's a static piece of paper, essentially, some questions. But it's trying to measure a dynamic system with lots of complexity, and that system is changing all the time, which is the reason why you see companies that are ISO 27001
Mike Jones (20:09)
Yeah, yeah.
Glenn Wilson (20:22)
compliant and being breached, because it's not really fit for purpose. My feeling about compliance standards such as ISO 27001, and there are others, is that if you follow what they're asking you to do, then you are building some element of security in. But it's not the be-all and end-all. It's a starting point. If you check the boxes, you can give it to the auditors and they'll say you're ISO 27001 compliant, but that is where you start from.
Mike Jones (20:37)
Mm.
Glenn Wilson (20:46)
It's like learning to drive a car. You learn how to drive with an instructor, but you don't really start learning to drive properly until after you've passed your test, when you're out in the real world driving on your own and dealing with all those situations you've never seen before. That's how you learn. And it's exactly the same thing, I think, in this situation. So, yeah.
Mike Jones (20:57)
Yeah,
Yeah, it's that balance between compliance and quality, isn't it? I can be very compliant, but do I really have the quality around that? Are my people... and I suppose this comes back to thinking around stuff like Adam's third law, you know, with risk: the more we centralise risk, the more we create risk across the system, because we're trying to control it. And we try to control it with policies, processes,
Glenn Wilson (21:10)
Yeah.
Mike Jones (21:33)
strict rules, where actually the opposite is what you need: develop the people to understand and be aware of the challenges you've got, to enable them to think. It's not about limiting their decision space, it's actually about increasing the capacity for decisions, which is going to help us in the long term.
Glenn Wilson (21:38)
Yeah. Yeah.
Definitely. That's exactly how I try to talk to my clients. I'm wary of words like enablement and empowerment, but that's what it is. It's giving the developers the ability to make their decisions and to know that they're making good decisions, the right decisions. And that autonomous
Mike Jones (22:03)
Yeah.
Glenn Wilson (22:13)
way of working is, again, going back to the Agile Manifesto. That's what it was trying to do as well: build autonomy into the development lifecycle. DevOps has done the same by integrating operations and developers into the same type of structure. So having that autonomy is good. But you also need cohesion. You need to be a cohesive organization, otherwise you're going to end up with chaos, everyone off doing their
Mike Jones (22:37)
Mm-mm.
Glenn Wilson (22:38)
own thing with their own autonomy. So you do need some element of alignment between these different teams to make sure they're doing the right thing and following the right process. I know you were in the military, and there's this thing about commander's intent. I think that's the same sort of thing we're looking at here: developers need to know what the intention is. What are we really trying to deliver, rather than having this prescriptive nature of what's happening right now? So yeah, it definitely is something that needs to be looked at. Yeah.
Mike Jones (23:12)
Yeah, it fits exactly with what we're talking about. The commander's intent is really crucial, because that's what enables initiative, enables autonomy. Without it, the choices are endless. There's no scaffolding or cognitive framework to be able to go: well, what have I been asked to do? What's the situation saying? How do I adapt from that? But my fear is that I think we do need
Glenn Wilson (23:23)
Yeah.
Mike Jones (23:34)
coherence, obviously, because I've talked about that anyway: we need some sort of structure to bring us together. But we've got to be mindful with things like ISO 27001, and I'm not saying it's a bad thing by the way, I think it's good, it provides that structure, but there is a risk that you end up removing thinking from people, and then people become stupid in the system. Not because they are stupid, but because they walk through the door and they're not allowed to make decisions. It's all done by policy: the policy says that, or the rule says that. There's no learning or understanding around it. Rather than that, see something like ISO 27001 as the coherent thing, the minimum we're going for, something that gives us enough structure to know roughly how we organise ourselves. But we do need to provide that space for people to have a good understanding of the context, to actually make decisions and know why they're making those decisions. I think that makes you more adaptable, because you said earlier that the cyber world, the real world, is dynamic. If it's dynamic, then we need thinkers. If it's not dynamic and we're in a factory, then you could probably get away with a high division of labour and hierarchical policy.
Glenn Wilson (24:36)
Exactly.
Hmm.
Yeah.
Yeah, yeah. There's another side to security as well, which we haven't really touched on. We talked about like the preventative side of security, which is about writing good products in the first place. But there's also that whole side of defensive security and how do we protect ourselves when there is an incident, when code does get breached. And that's another area that is a challenge for a lot of organizations at the moment. And so again, as I say, you see
Mike Jones (25:11)
Mm.
Glenn Wilson (25:20)
A lot of companies are being affected by ransomware, and there are some parachutes, like cyber insurance: people pay money into security insurance, and that requires compliance with some standard, ISO 27001, to show that you are doing security. So we have insurance to help us get through it, but actually that's a cheap way of doing it. What we should be doing is looking at how we build resilience into our organizations so that if we are breached or there is a problem, we know how to deal with it. And I think that's an area where there hasn't been much work, and we're not exploring it as much as we should. There is a concept called security chaos engineering. Chaos engineering was actually created years ago at Netflix, and
a friend of mine, Aaron Reinhart, took it further into security chaos engineering, and Kelly Shortridge wrote a good book with him on it as well. It's about building that resilience. So security chaos engineering is a bit like, I guess, red teaming in a military context. Red teaming in a security context is loads of what we call white-hat hackers.
Mike Jones (26:22)
yeah, yeah, yeah.
Glenn Wilson (26:28)
They're internal hackers. They try to find holes in your code, and then in theory you fix that code or fix that product. And that's what red teaming has become in the security world. But I believe in the military world, red teaming is about scenario training, training for different scenarios. And that's what security chaos engineering is about: creating the resilience to say, well, if this type of thing were to happen, how would we get around it?
Mike Jones (26:38)
Yeah, yeah.
Yes. Yeah.
Glenn Wilson (26:55)
Let's build that into our system so that we can get around it, we can actually overcome it. Netflix gave a really good example of that when chaos engineering first started out: the whole of the eastern AWS region went down, and they were able to carry on streaming when no one else was able to do what they wanted to do; other businesses were badly affected. People said, well, how did you do that? And that's when the whole idea of chaos engineering came out. They had built that resilience into their system so that they could actually deal with it. And I see the same thing should happen with security. As I say, there is a good book on it, but we don't practise it enough, and I don't know why, because it just seems like the right thing to do in my mind. I've been talking about security chaos engineering for a long time: scenario building, scenario training, and helping developers or incident
Mike Jones (27:38)
Yeah, yeah, yeah.
Glenn Wilson (27:44)
managers actually understand what could happen and how they get around it.
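As a concrete illustration of the scenario-driven practice Glenn describes, here is a hedged Python sketch of a security chaos experiment; the hooks (inject_condition, alert_raised, rollback) are hypothetical placeholders you would wire to your own tooling rather than a real library, and the experiment shape is only one plausible example.

```python
import time


def run_experiment(inject_condition, alert_raised, rollback, timeout_s: int = 300) -> bool:
    """Inject a condition in a safe environment, wait for detection, always roll back."""
    inject_condition()          # e.g. enable a test-only misconfiguration
    deadline = time.monotonic() + timeout_s
    try:
        while time.monotonic() < deadline:
            if alert_raised():  # did monitoring or the on-call process actually notice?
                return True
            time.sleep(1)
        return False            # hypothesis falsified: a detection gap was found
    finally:
        rollback()              # leave the environment as you found it


if __name__ == "__main__":
    # Toy stand-ins so the sketch runs end to end without real infrastructure.
    state = {"injected": False}
    detected = run_experiment(
        inject_condition=lambda: state.update(injected=True),
        alert_raised=lambda: state["injected"],  # pretend detection works
        rollback=lambda: state.update(injected=False),
        timeout_s=5,
    )
    print("detected" if detected else "detection gap")
```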
Mike Jones (27:47)
Yeah, I think that's crucial, because I was looking for people to come on the show to talk about red teaming, and I got Marcus Dimbleby on, but it's quite difficult, because when you talk about red teaming to people now, it's almost lost its original idea. It's now the cyber version of it, which is, like you said, I'm going to test around the edges and you're going to fix that, rather than what it should be, which is...
Glenn Wilson (27:55)
yeah.
Yeah.
Mike Jones (28:11)
really going through and testing things. What if this happens? Start simulating these problems and learn from them. There's a guy I know called Don Vandegrift, a great guy, and he talks about outcome-based learning, which is exactly that. Let's think of a problem, a scenario that's plausible, and put people under pressure to think about what they would do in that situation.
Glenn Wilson (28:20)
Yes.
Mike Jones (28:34)
And it's not about having a definitive answer. It's the same in the military. We don't do this stuff to say, yeah, we've got the answer to everything. What we do it for is so that when an incident happens, we're not suddenly expending all our cognitive capacity trying to think about how to deal with it. We're building enough scaffolding to make decisions. So we know, roughly, what happens if we take a casualty here. We won't predict exactly how it's going to be, but it's enough so that people know what they're doing and can just get on with it. And the leaders, and everybody else, still have that spare capacity to look forward and go, right, okay, we're dealing with that, what's next? It makes them more adaptable. And I think that's probably really useful, definitely in your world.
Glenn Wilson (29:10)
Yeah. Yeah.
Yeah, I actually love
this. I really do love this, because I think this is the essence of what security is about, really. It's about adaptability. We're trying to be too prescriptive in what we're doing, but we're dealing with a threat landscape that's changing all the time. I mean, ten years ago there was no such thing as ransomware. Now we've got AI, we've got new threats appearing. It's a constantly changing landscape and we need to...
Mike Jones (29:23)
Mm.
Mm.
Glenn Wilson (29:42)
be able to adapt to that, adapt to the different types of threats that are going to affect us, and work out how we actually deal with them. Because we're always going to write vulnerable software, we're always going to develop vulnerable products. It's never going to be 100% secure. But what we need to be able to do is build that resilience into organizations so that when there is a breach, it's: ah, we can deal with that.
Mike Jones (30:04)
Yeah,
yeah. And you saw the two different responses recently with Marks and Spencer and Co-op. I think both struggled with that adaptiveness about what to do, but I think Co-op were a bit bolder in their decisions. They decided to take the hit: they shut all their systems off, disconnected completely and dealt with it, which meant that, yeah, they had nothing.
Glenn Wilson (30:09)
Yeah.
Mike Jones (30:29)
They literally had to shut everything down, isolate it and deal with it, which in the long term was quicker. Whereas Marks and Spencer tried to almost fight through it, and it just kept dragging on and dragging on, which I think caused them more issues than being a bit more decisive and saying, right, let's just cut it and move on. I don't know which was right or wrong, because I'm not really in that world, but
Glenn Wilson (30:45)
Mm.
Mike Jones (30:58)
it's quite interesting to see two different responses to a similar situation.
Glenn Wilson (31:02)
Yeah, definitely. Both were reactive, but one obviously decided that in this situation perhaps the best way forward was to just stop, see what's going on and then rebuild from there. It sounds to me like Marks and Spencer were trying to be resilient but didn't really have the capacity to be resilient. So yeah, I like those two different
Mike Jones (31:18)
Yeah, yeah, yeah.
Glenn Wilson (31:23)
views of the same thing. It was more or less the same incident, but different ways of dealing with it. Yeah, it is an interesting area. And as you said earlier, it's not just about showing people how to do something when something happens. It's more about building that conditioning into people. I like to use the example of the fire brigade. The fire brigade go through drills daily, you know, about how to
Mike Jones (31:41)
Yes.
Glenn Wilson (31:48)
get people out of cars, how to deal with fires, lots of different scenarios that they practise. But every incident they go to will be unique; they'll never have seen something exactly the same. It might be similar, but they might have different casualties, someone trapped in a car in a different way, a fire developing in ways they couldn't have imagined and putting people at risk, and they have to deal with that. As you said, they have the capacity to know what to do because it's been trained into them. It's become more of a behavioural instinct, as opposed to having to think about what they need to do. And I think cybersecurity could learn from that. We could definitely learn from that: how do we build that resilience into our organizations so that when there is a cyber attack, we can deal with it? We know what to do. We may not necessarily have seen that exact attack before, but we know what to do, or at least we can try some stuff out that could work. But yeah.
Mike Jones (32:48)
I know. There's an organisation, I don't remember which one now, something to do with the cyber world, that provides tabletop exercises you can download. I'm going to say the NCA, but that's not it. Often those things are good, but they're not context-specific.
Glenn Wilson (32:58)
yeah.
Mike Jones (33:11)
I think if you really want to get the value, you need something that's really context-specific to you. And also, what I've seen with tabletop exercises and stuff like that is that you've got leadership, and it's the leadership doing it. Whereas the point with the fire service and the military is that everyone's involved. And I'm not going, well, you're now going to do this.
Glenn Wilson (33:14)
Yeah. Yeah.
Mike Jones (33:34)
I say to the people there, like I said: we have a casualty at this point, and I just point at someone and go, you know, what do we do? Because in those situations it matters very little what I think, because I'm not the one doing it, and I don't have the capacity to direct everyone in those situations; we just wouldn't have that. So I need to have the confidence that the people
Glenn Wilson (33:43)
Yeah.
Mike Jones (33:59)
who are actually going to do it have assimilated it. I'm not a very technical person, but say we've got a threat on this server: well, what would you do? Have we done that? Then I need to do this, this, this and this. Okay. Well, what impact would that have on that team? And you point to that team and they go, well, that means this to us, so we would probably need to do this. Okay, cool, how would you communicate that? It's really getting into the detail of how you do those things. So when we actually go ahead and do it, I know things won't run smoothly, because they never do, but it will run a lot smoother than me having to personally direct every step of the situation, because then you're relying on one person, and that's not a good place to be.
Glenn Wilson (34:47)
Yeah, you mentioned tabletop exercises there. Tabletop exercises, in my experience, are once-a-year activities, and they involve a whole bunch of people who would not really be involved in an incident. As you say, there'd be lots of managers there. Whereas the fire service train daily, they drill daily, and I guess the military
Mike Jones (34:55)
Yeah, yeah, yeah. Yeah, yeah.
Glenn Wilson (35:09)
train daily, they drill daily, so they can deal with this. So yeah.
Mike Jones (35:13)
And we teach people outcome-based learning. It's really simple. You literally do it in five minutes. And I think that's it. It's like after action reviews in organizations. They're done like once a year or after a major incident when something goes wrong, but they're just not practiced consistently throughout the year. So you're not learning. You're only learning when something goes wrong. And that's the same with this resilience. You're only doing it
Glenn Wilson (35:35)
Yeah, yeah, good. Yeah.
Mike Jones (35:41)
because, oh right, we now need to demonstrate to the ISO people that we've done a tabletop exercise. And I think that's the essence of what you're talking about in cyber and I'm talking about in strategy: we do things because it's almost scripted to do them, rather than doing things because, underneath, that's the activity that's going to help you in any of these situations.
Glenn Wilson (35:45)
Yes.
Yeah, I don't know whether you've worked in organizations where they've said to you: right, before you can start here, you need to do this 20-minute test that shows you're cybersecurity aware, and then we'll come back to you in about a year's time and teach you this again. That's not a learning organization. A learning organization is one that is constantly looking at ways to evolve and understand
Mike Jones (36:17)
Yeah.
Glenn Wilson (36:31)
you know, their position in the environment.
You also mentioned communication. Communication is so important: between departments, between certain roles within an organization. It's so important to have that communication, because that's also where you learn. How can we build more resilience into what we're doing, and work collaboratively to do that? And then another communication channel, which is very important, is knowing when internally something's not quite right, so you can act on it quickly and you're not going to leave yourself open. How many times have we heard of breaches where no one knows what's actually going on until a couple of days later, and then they say: damn, I think we've been breached. And then it's announced as a very sophisticated cybersecurity attack on our systems, when it was just a ransomware attack that some script kiddie probably sent off to the server and got in through a phishing attack. So yeah.
Mike Jones (37:13)
Yeah, yeah, yeah.
Yeah, yeah, yeah.
And that's that feedback loop, isn't it? It's rarely closed. And that trust, that people can speak up and say, I think something's happened here. I often think a lot of this stuff in organizations, in the cyber world, in the security world, just generally in organizations, it's the fear
Glenn Wilson (37:28)
Yeah.
Mike Jones (37:48)
that I've done something wrong, I've clicked that, I shouldn't have done that, so I'm going to try and keep quiet, rather than just opening up about it. People are going to make mistakes, and we can learn from them.
Glenn Wilson (37:51)
Yeah.
Yeah. Yeah.
And you hear that with phishing simulations. A phishing attack is where someone sends an email containing a link to a malicious website; the person clicks on the link and ends up giving away credentials or some important information. We do these phishing simulations in organizations where we send people emails with links like that, but they don't go to a malicious website, they go to an internal page that explains you've just clicked a link you shouldn't have clicked. Now, we send these phishing simulation emails out all the time. That's not training. That is just trying to catch fish. We're not really developing people's ability. I quite often get those emails and I report them, and I get a message back saying, well done, you've just reported a phishing simulation. What have I learned from that? I've learned nothing, you know?
Mike Jones (38:32)
Yeah, yeah, yeah, yeah,
No, no, no, no.
Glenn Wilson (38:48)
Tick the box. What about the people who are opening emails all the time, opening documents or clicking on links, because that's how they interface with suppliers or customers? How do you tell them not to click on a link? Are they meant to check every single one to make sure it's absolutely valid before they open it? Or do we need to develop a system that allows them to click on these links and do it safely every time? That's another way. And the only way you're going to do that is by building that resilience into the organization, not by training people not to click on links in email.
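As a rough sketch of what "a system that lets people click safely" might look like, here is a hypothetical Python fragment (not from the episode); the trusted domain list and the routing decision are illustrative assumptions, and a real deployment would sit in the mail gateway or proxy rather than in a script.

```python
from urllib.parse import urlparse

# Hypothetical partner domains; a real list would come from policy, not code.
TRUSTED_SUFFIXES = {"example.co.uk", "example.org"}


def is_trusted(link: str) -> bool:
    """Return True if the link's host is a trusted domain or a subdomain of one."""
    host = (urlparse(link).hostname or "").lower()
    # Anything untrusted gets routed to a sandboxed viewer or flagged for a
    # second look, rather than relying on the reader to spot it by eye.
    return any(host == s or host.endswith("." + s) for s in TRUSTED_SUFFIXES)


print(is_trusted("https://portal.example.co.uk/invoice"))       # True
print(is_trusted("https://example-co-uk.attacker.net/login"))   # False
```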
Mike Jones (39:10)
Yeah, yeah.
Yes,
and I think this is where that compliance thing we were talking about earlier comes in. It's almost like we just don't want people to think; we just want them to not do something, without them understanding it. Even in my business, a small one, we had an issue with that: someone clicked on something. It wasn't a big issue, because we sorted it out, it was relatively small, it wasn't a big drama.
Glenn Wilson (39:25)
Yeah.
Yeah.
Mike Jones (39:45)
But after that, it was more about giving people the confidence to have a look and go, well, I'm not sure about this. And it's quite obvious once you teach people what they're looking for. How do you know this is a potential phishing email? Because it's meant to be from this organisation, but the domain ends in .org, not the .co.uk I'm normally used to. It's about getting people thinking in that sense, rather than putting a lot of fear into people: don't click this or the world's going to fall apart.
Glenn Wilson (40:17)
Yeah. Yeah.
Or when someone does click on a link, they feel that fear of actually telling someone they've done it, especially if they've gone as far as entering their username and password into a hacker's website. They'll keep that quiet: oh no, I don't want to tell anybody. And that's wrong as well. We need to build that psychological safety into the system so that someone can say, I've just done this.
Mike Jones (40:25)
Yes.
Glenn Wilson (40:41)
I've made a real bit of a mess up, you know, without that person feeling that they're going to be fired for doing it. That is actually an opportunity to learn from the experience
Mike Jones (40:47)
Yeah, yeah.
Glenn Wilson (40:51)
and understand: okay, what can we do better? Someone's clicked on this link and given away their credentials. What does that mean for the business? What does it mean for the individual as well? Because it could be quite significant for the individual. But what can we do to limit the risk now that this has actually happened?
As opposed to someone clicking on the link and being very quiet and all hell breaks loose because no one wants to speak up about what they've done.
Mike Jones (41:11)
Yeah.
No, and that's the thing with a lot of this. Part of resilience is initiative. The sooner you know, the sooner you can take the initiative away from the hacker, because of your speed. But you're not going to get that, and I think the two are interlinked: one, if you don't have the resilience and people aren't ready to adapt to those situations and don't roughly know what they could do; and two, if people aren't going to tell you, then you can't seize that initiative and you're always going to be on the back foot. And that's not where you want to be as an organization. You want to be able to understand what's gone wrong and seize that initiative really quickly so that you can minimise the risk.
Glenn Wilson (41:47)
Yeah, yeah.
Yeah, dare I say it, but in my head I've got the OODA loop. The attackers are trying to disrupt your orientation. You need to reverse that and try to disrupt the attacker's orientation. I think that's probably the way to look at it.
Mike Jones (41:56)
Yes, yeah, yeah, exactly.
Yeah.
And all this stuff we're talking about, the practice around red teaming or outcome-based learning, making yourself adaptable, having the right sort of frameworks in place that aren't too restrictive, giving people the space to make decisions: that all helps the orientation phase. So it enables that quickness, that speed, from observation to
Glenn Wilson (42:23)
Absolutely.
Mike Jones (42:29)
orientation, to make these quick decisions, to act, and then constantly adapt to how that's unfolding. Yeah, I think that's really closely aligned to the OODA loop. I could speak about it so much; I love the OODA loop, I could go on about it all day.
Glenn Wilson (42:38)
Yeah, definitely.
I know. I think I mentioned to you previously that five or six years ago I got the wrong end of the stick with the OODA loop. I think I saw one of those linear circle versions of the OODA loop, and then I got introduced to John Boyd's OODA sketch. Let's call it a sketch; it's not a loop, it's a sketch. And then listening to people like Brian McGraw, one of your previous guests, realizing that John Boyd actually had a really
Mike Jones (42:51)
Yes. Yeah.
Gotcha.
Glenn Wilson (43:05)
deep understanding of business. He studied the Toyota Production System; he had studied the way that businesses work. A lot of people think it's just his fighter pilot instinct coming out, but it wasn't anything like that at all. He was a non-academic academic. He studied this stuff in depth, and his OODA loop was the outcome of that. So if anyone's listening, go and check that podcast out, because that's a good one you did with Mike McGrath.
Mike Jones (43:14)
Yeah, yeah, yeah.
And
I think it's really useful, definitely, if we're talking about strategy and cybersecurity. There's so much commonality in the challenge we have: it's all about protecting that ability to think, that decision space, rather than trying to restrict it, which is often counterintuitive to what people are taught. It's not just in the cyber world, it's in leadership, governance, all of it: how do we restrict decision space? Because if we restrict decision space, there are going to be fewer risks. But we know that not to be true. Actually it creates more bureaucracy, more inertia, and it prevents you from adapting to situations.
Glenn Wilson (44:13)
Yeah,
yeah, yeah, definitely. Yeah, it really does. Yeah.
Mike Jones (44:18)
So you said you're looking at this stuff for your masters. What are some of the key things you've found in taking a more systemic view of it?
Glenn Wilson (44:28)
Yeah, so what I'm looking at really is vulnerability management. We have a challenge in cybersecurity, the technical debt we talked about, that can manifest as a huge database of vulnerabilities within our own software. And we've tried systematic ways of dealing with this. You often hear: how do we prioritize? There are different ways of prioritizing. We have CVSS scores, we have EPSS scores, we have the KEV catalogue. We try to prioritize on those, but they're just different flavours of the same systematic way of dealing with it. And what I'm trying to say is maybe there's a systemic way that can support us as well. I'm looking at the viable system model, the VSM, by Stafford Beer: how do we build a viable vulnerability management system, using the model that Stafford Beer came up with? So my dissertation is exploring that area. How do we get the right communication channels, not just between the systems but between the different layers of the system as well? What does governance look like? What does the System Three and System Four homeostasis look like in vulnerability management, with System Four looking at the future and outward threats, and System Three looking at the inside and now, the developments we're doing? And then the communication between those as well. So I'm trying to work out how that looks and whether there's something there we can learn from: aligning systemic and systematic ways of doing stuff.
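For a sense of the systematic prioritisation Glenn contrasts with the systemic view, here is a rough Python sketch that blends the three real signals he names, CVSS, EPSS and the KEV catalogue; the weighting, the CVE identifiers and the scores in the example backlog are made-up illustrations, not a standard formula or real data.

```python
from dataclasses import dataclass


@dataclass
class Vuln:
    cve_id: str     # made-up identifiers below, for illustration only
    cvss: float     # 0.0 - 10.0 severity base score
    epss: float     # 0.0 - 1.0 estimated probability of exploitation
    in_kev: bool    # listed in the Known Exploited Vulnerabilities catalogue


def priority(v: Vuln) -> float:
    # Hypothetical blend: severity scaled by likelihood, with a hard bump for
    # anything already known to be exploited in the wild.
    score = (v.cvss / 10.0) * (0.4 + 0.6 * v.epss)
    return score + (1.0 if v.in_kev else 0.0)


backlog = [
    Vuln("CVE-0000-0001", cvss=9.8, epss=0.02, in_kev=False),
    Vuln("CVE-0000-0002", cvss=7.5, epss=0.89, in_kev=True),
    Vuln("CVE-0000-0003", cvss=5.3, epss=0.01, in_kev=False),
]

for v in sorted(backlog, key=priority, reverse=True):
    print(f"{v.cve_id}: {priority(v):.2f}")
```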
Mike Jones (45:45)
Yeah, yeah, yeah.
Yeah, I can imagine how System Two plays into that too, around the coordination between the elements, between DevOps and the different teams. That sounds really good. And it's great that you can use these sorts of methodologies as a different lens on the organization and ask: well, what do we need to do? What are our capabilities now? Who looks after that?
Glenn Wilson (46:12)
Yeah.
Yeah,
I was just going to say, something I've already identified that could be missing is identity, the System Five piece. What does identity mean for vulnerability management as a system? How do we build the policy around vulnerability management? We build policies around lots of different things around the circumference of it, but it doesn't really have its own identity.
Mike Jones (46:26)
okay.
Yeah, yeah.
Glenn Wilson (46:42)
So we have policies around scanning code, policies around time to fix code, stuff like that. But there's no real overarching policy within that system. Think of it as a little system of its own. Stafford Beer's systems are fractal, they're recursive, so I'm looking at this model here, this particular system, and then you've got other systems that build around it.
You've got other systems that do seem to have their own identity, but vulnerability management itself doesn't. This is based on my research, obviously, and I would recommend that people explore it further in their own context, but that's something I've found so far.
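As a rough illustration of the gap Glenn describes, the fragment below sketches what "policy as code" often looks like today: isolated activity-level policies (scanning, time to fix) with no overarching statement that gives vulnerability management its own identity, the System Five role in VSM terms. All names, thresholds, and wording are hypothetical assumptions, not taken from any real policy or from Glenn's research.

```python
# Illustrative only: activity-level policies exist, but nothing gives
# vulnerability management an identity of its own (System Five in VSM terms).
# All names and thresholds are hypothetical.

fragmented_policies = {
    "code_scanning": {"runs_on": "every pull request"},
    "time_to_fix": {"critical_days": 7, "high_days": 30},  # hypothetical SLAs
}

# A sketch of what an overarching, identity-giving policy might hold: purpose,
# scope, and principles that the activity-level policies hang off, instead of
# each one living in isolation.
vulnerability_management_policy = {
    "identity": "reduce exploitable weaknesses without destroying developer flow",
    "scope": ["first-party code", "third-party dependencies", "infrastructure"],
    "principles": [
        "prioritise by evidence of exploitation, not severity alone",
        "feed every fix back into how code gets written (a learning loop)",
    ],
    "activity_policies": fragmented_policies,
}

print(vulnerability_management_policy["identity"])
```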
Mike Jones (47:14)
Yeah.
I assume that's similar in organisations. I think there's a bit where it's almost just assumed. It's like execution: it's just assumed that we make strategy and we execute it, rather than thinking, no, there should be, and it sounds like a shift from what we've been talking about, enough coherence or structure around what we're doing to guide by principles.
As in commander's intent, we're not on about rules. It's all about guiding principles that enable people to do this stuff, because that gives them a sufficient cognitive framework to adapt from. Whereas when there's nothing, as you've probably found, there's no identity around it, it then has a real risk of fracturing, because you get divergence of action, people doing all different things, which then
Glenn Wilson (47:54)
Yeah.
Hmm.
Mike Jones (48:18)
makes you more vulnerable, or just as vulnerable as if everything were so tightly governed that you couldn't adapt. So it's about finding that boundary between too rigid and too chaotic, that bit of stability in the middle that you're trying to tightrope along.
Glenn Wilson (48:32)
Yeah, definitely. It's been an interesting year and a bit studying this, and it's certainly given me an appetite for drilling further into systemic ways of dealing with security.
Mike Jones (48:40)
Yeah.
Yeah. And when you think about this stuff, and the same in execution as we were just talking about, with that vulnerability management and security, same as with execution, I need people. And if my people aren't engaged or don't have a good level of commitment towards the organisation, i.e. if I treat my people pretty badly, treat them like idiots,
don't give them the resources they need to do the job, put massive strains on them, like the DevOps people you were talking about, really time pressured, then that's potentially going to increase what we call counterproductive work behaviours. In the execution world, counterproductive work behaviours mean people become a bit despondent and put barriers up to execution. Whereas I suppose in the cyber world, correct me, but
Glenn Wilson (49:23)
Hmm.
Mike Jones (49:36)
those counterproductive work behaviours can start to look like insider threat, people actively looking for malicious ways to bring down the system.
Glenn Wilson (49:44)
Yeah, definitely. As Deming said, everyone's entitled to have pride in their work. And if we take that away from people, they don't have that pride in their work, they're not really developing something they really love and feel happy about. And that's where the problems occur. So yeah, I think there's a learning there: you probably do need to give developers the freedom to actually enjoy what they're doing, enjoy their code, enjoy the development process, and do it in the right way, so they can actually say, I've written some really secure stuff and I'm really proud of that.
Mike Jones (50:18)
Yeah, yeah,
You said that word enjoy, and I totally resonate with it. I think it's really important that people enjoy their work and have pride in it. I just think the word enjoyment gets captured in a different corporate way, and it becomes: how do we create all these other initiatives? I always joke about tabletop yoga and all this other crap, but all that does then is
Glenn Wilson (50:24)
Yeah.
Mike Jones (50:43)
it reduces the capacity people have, it takes them away from what they want to do, and then it puts more pressure on them, which decreases their enjoyment. So it's interesting how different perspectives on enjoyment can actually have a counter-effect. But I totally resonate with that. If you get that enjoyment, that's fantastic.
Glenn Wilson (50:47)
Yeah.
Yeah.
Yeah.
One of the principles of DevOps, actually there are five principles of DevOps, and one of them is focus, flow and joy. That's DevOps as written about by Gene Kim and Jez Humble, and I can't remember who the third author was, but that's basically what they talk about: focus, flow and joy.
Mike Jones (51:15)
That's cool. I wonder if they mean flow as in Mihaly Csikszentmihalyi's idea of flow. Yeah, yeah.
Glenn Wilson (51:20)
Yes, exactly. That's exactly where Gene Kim got his influence. He's influenced by him. I can never pronounce his name, but yeah.
Mike Jones (51:26)
Alright, yeah, it took me a while to learn his name.
He appears quite a lot in the stuff I talk about, so I had to learn how to say his name. I put it into Google and it broke the pronunciation down, and I listened to other people say it, so it's Mihaly Csikszentmihalyi. For listeners, he's written a book called Flow.
Glenn Wilson (51:36)
Yeah.
Okay, that's the second word for it.
Mike Jones (51:51)
And it's all about that idea. Really, if you boil his concept down to its simplest part, it's that moment when you're so engrossed in what you're doing that you've just lost track of time. That's really what we mean by flow. And you only get flow if you've got the right resources, you've got clarity, we're talking about commander's intent, you've got the support and you
Glenn Wilson (51:51)
Yes.
Yeah.
Mike Jones (52:17)
generally enjoy the task. If you get those things, you get flow, and that means people are so engrossed, so in tune with it, that they get a sense of enjoyment in what they're doing. Yeah.
Glenn Wilson (52:27)
Yeah, definitely. And I think Deming did actually say everyone's entitled to joy at work. I think, yeah, we do.
Mike Jones (52:33)
Yeah, I agree. We spend a lot of time there.
Cool. I thought it was fascinating, this concept of bringing those two worlds together, cyber and strategy, and how there are actually very similar commonalities and challenges. But Glenn, is there something you want to leave our listeners to think about or reflect on from this episode?
Glenn Wilson (52:59)
Yeah, I guess what I would say to people is learn to enjoy your work. And if you're a leader, build that capacity into your organization. Because if you don't build that capacity, you'll be too prescriptive, telling your developers what to do, or your staff what to do, and you're just not going to build that ability to remain secure. Security comes from being creative, and you need to be able to
Mike Jones (53:03)
Yeah.
Glenn Wilson (53:23)
build that creativity into your teams so they learn how to write secure code, not build up technical debt, and be proud of what they're delivering.
So I guess that's my key takeaway there really.
Mike Jones (53:33)
Yeah, yeah. I like the idea of technical debt, and I think it's very relevant outside of cybersecurity. It's that debt of things we allow to happen in an organization that are probably going to constrain us or create further risk in the future. So I think that's really good. And your idea of giving people that space, that ability to act, I think is crucial in an organization, especially in the more dynamic
Glenn Wilson (53:52)
Yes.
Mike Jones (54:01)
environment we find ourselves in. It's been an absolute pleasure. Thank you very much, Glenn, I appreciate that. No, no, it's fantastic. And for people listening, if you've enjoyed this conversation as much as I have, then please like, subscribe, and share it with your network so they can get value from it. And I look forward to another conversation next week. So thank you very much, Glenn, and I look forward to speaking to you again soon.
Glenn Wilson (54:05)
I've really enjoyed this. Thank you.
Thank you, Mike. See you soon.
Mike Jones (54:27)
See you soon.