Excellent.
So we can move to the next... well, it's about noon now, and time for introducing our next guest. It's Tom Langford of Sapient. Give him a warm welcome, please. He's traveled a lot today; I think it wasn't easy for him to be here in time, but he's managed. Welcome, Tom.
Shall we start, Tom? I've introduced you as the CISO of Sapient. Maybe you want to say some more about your position, your person, your role, before you introduce your thoughts to people.
Sure.
So I run the global security office for Sapient, and now for Publicis Groupe, because Sapient has recently been acquired by Publicis Groupe, a French-based media agency.
Okay. So maybe you want to just jump into the slides that you have prepared. Thank you for being here. Absolutely.
The slides are coming up. Marvelous. Good afternoon, ladies and gentlemen. Thank you very much. What a wonderfully large room with, you know, a few people in it, but no matter, it'll make it a more intimate affair. And trust me, I've got a few intimate questions for you. So here we go.
So first off, I'm here on my own merits; everything I say to you is based on my own opinions, not my company's. This is great because at the end, when you break into rapturous applause, it'll all be my own work. If you don't like it, it's because I haven't had the support of my company. So it's a win-win for everybody. A note to the gentleman at the back: the front monitors are not working, they're on a blank screen. So then, we're gonna talk about three things.
Firstly, our interpretation of risk; secondly, how we measure risk; and then how we subsequently, effectively treat it. My premise is that we don't do any of this particularly well.
Now, the interpretation of risk. We all know this question. The question is: which is more dangerous to human beings, the coconut or the shark? Now, I think we all know in our heads, but please shout out any answers. Which actually kills more people every year? Coconuts. Exactly. And yet when our little monkey brains look at the shark, we have a desire to run away.
And when they look at the coconut, we have a desire... well, we actually get hungry, to be perfectly honest with you; we have a desire to eat it. So what we have here is this monkey brain inside our head versus the rational brain, the modern brain, that understands the difference between the two. Now this is a very simple example, but we know which one of these is riskier.
However, there are many, many other examples where we get it so very, very wrong.
So I said we were going to get intimate. Hands up in the air: those of you who flush your toilets with the lid open. I'm not talking about the seat, I'm talking about the lid open. Come on, we're all friends here. Okay, so there's a few, about 50%; that's normally about right. So, 50%, great. So when we flush the toilet after we have done our business, what actually happens to the contents of that toilet?
Well, I can tell you: what it does is aerosolize the contents of that toilet and throw it up to 12 feet in all directions. That's quite an interesting thought given, you know, what you have just placed within said receptacle. What else do we keep in the bathroom? The toothbrush, normally on display, and a nice glass. So hands up, those of you who brush their teeth with fecal matter; should be the same ones.
Come on. Okay.
So, it's an interesting conundrum here, because you should really be placing... well, your thoughts right now are: I'm going to close the lid on my toilet tonight before I brush my teeth, I'm not going to play that game anymore. And it's not surprising, really.
So a study at the University of Arizona found that there are 49 harmful bacteria per square inch on the average toilet seat. That's not too good, but then let's actually look at the real risk here. Because when we look at an average computer mouse, there are 1,700 harmful bacteria on that mouse compared to your toilet seat. It gets better: nearly three and a half thousand on your keyboard, nearly 21,000 on your desk, and over 25,000 on your phone. Now that really puts things into context.
So the next time you have your lunch at your desk, while you're on the phone, using your computer, you are actually putting yourself at far greater risk of illness than flushing the toilet with the lid up.
But I dare say that next week, when you get back to the office, you will have closed the lid on your toilet and you will still eat your sandwich at your desk, because actually this one doesn't feel like it should do you more harm, but actually it does. So we have this concept here: we have the perceived risk, the coconut versus the shark. We know, we've worked it out, because we are smart.
Stand up straight, monkeys; basically we know what we're doing. We then have the hygiene risk. These are the risks that we fix even though technically we don't have to. Now start applying this to your day-to-day risk assessment roles. You start thinking: what risks am I actually fixing because I feel like I should, not because they are more risky? Because you actually have quite a few more actual risks out there than you thought you had before.
So our actual perception of risk is wrong in the first place.
It actually comes about quite poorly. Moving swiftly on, however: the way we measure risk is also very interesting.
Now, a little brief background here. A very good friend of mine, Javvad Malik, who now works at AlienVault, wrote a book supporting the CISSP qualification. And in the book he was describing risk models, and so he came up with the Malik risk model, which basically measured impact and likelihood. And he likened it to a fight in the pub; you can tell he's English, he likes that sort of thing. And as you can see, everything ranges, for the impact, from "doesn't hurt" at the best through "ouch" to "holy crap".
And the same again with the likelihood: "ain't happening", "possibly", up to "holy crap".
And the measurements increase the further down and to the right you go. So anything from, you know, "oi, my drink" to "Steve, get a cab to A&E" or "need an ambulance". Now, I thought this was a good example, because it actually shows in real terms what those numbers mean, which we're gonna come onto in a minute.
But, you know, I use the ISO 27005 model, and there's a different element in there. So I came up with the Langford-Malik model, which he has agreed to put into his new book, I'm glad to say. What it does, again following ISO 27005, is work on likelihood, ease of exploitation and asset value. So likelihood: "ain't happening" all the way up to "holy crap". Ease of exploitation: I'm a ninja, I'm a drunk ninja, or I'm just plain drunk; so how easily am I gonna be exploited?
And the asset value: my arms, my legs, my chest, et cetera, et cetera.
And so you can see that you start to get a pattern within here. Now, what this is very good at is showing your risk appetite, all the way from nothing happening to what is completely unacceptable; going to the mortuary is unacceptable. Risk appetite is such a flexible thing and therefore such a very difficult thing to ascertain. Let's put this into a slightly different context. For anybody who's between 18 and 24, a little bit of fisticuffs in the pub on a Saturday night is probably the sign of a fairly good night.
If, however, like me, you are, you know, 38... sorry, 40-plus, married with kids, et cetera, your risk appetite is completely different.
You know, if someone even nudges into me and spills my drink, that's my whole night ruined. I'm going home; I'm not having any of it whatsoever. Try and liken this to your company: how do you measure your risk here?
Now, let's put that straight back into the context of the ISO model: zero to eight, simple as that. Many of you know this, but actually trying to work out where along that matrix your tolerance for risk sits is one of the most important things. And it could change on a very, very regular basis.
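To make that concrete, here is a minimal sketch of how such an ISO 27005-style score could be computed in code. The additive zero-to-eight scoring, the label sets and the risk-appetite threshold below are illustrative assumptions in the spirit of the model just described, not the exact slide.

```python
# Illustrative sketch of an ISO 27005-style scoring matrix in the spirit of
# the Langford-Malik model: likelihood + ease of exploitation + asset value.
# All labels and the appetite threshold are assumptions for the demo.

LIKELIHOOD = {"ain't happening": 0, "possibly": 1, "holy crap": 2}
EASE = {"I'm a ninja": 0, "I'm a drunk ninja": 1, "I'm just plain drunk": 2}
ASSET_VALUE = {"my drink": 0, "my arms": 1, "my legs": 2, "my chest": 3, "my head": 4}

RISK_APPETITE = 5  # scores above this are unacceptable; tune per organisation


def risk_score(likelihood: str, ease: str, asset: str) -> int:
    """Combine the three factors into a single 0-8 score (higher is worse)."""
    return LIKELIHOOD[likelihood] + EASE[ease] + ASSET_VALUE[asset]


if __name__ == "__main__":
    score = risk_score("possibly", "I'm just plain drunk", "my head")
    print(score, "unacceptable" if score > RISK_APPETITE else "tolerable")  # 7, unacceptable
```

The point of the threshold constant is exactly the appetite question above: the arithmetic stays the same whether you are 18 or 40-plus; only where you draw the "unacceptable" line moves.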
Now, we use numbers in there. Numbers are great. Numbers are very simple and easy to understand. They're easy to put across to board members.
We're a zero, we're a five, we're an eight. But there is a problem with numbers, just as there is a problem with colours, for instance. So if we talk about traffic lights, for instance: high, medium, low; it's no different, zero, one, two, you could say the same thing. But with the traffic-light system, nobody's gonna use green, because it means there's no more budget, and conversely nobody's gonna say it's red, because it means they've failed, right?
So therefore just about everybody chooses amber, or somewhere in between. And quite apart from that fact, if you look at it from the other way, green may actually be too expensive; amber may actually be perfectly acceptable.
If green is seen as too expensive, it's all about the scope and the context with which you communicate. Green is above industry average, red is below industry average. Where do we want to be in our risk models, or in our security models, relative to our industry? That's one way of looking at it.
You may reach green, but you're not allowed to do so by spending more money; that's fine. If you are spending boatloads of money to reach green, you are gonna have that money cut from your next budget, for instance.
So again, it's contextual. There is a problem with RAG ratings, there is a problem with traffic lights, if they are used incorrectly. If they're communicated effectively, and if the scope and the context are communicated effectively, they can be very effective. You could use percentages, you could use DEFCON values; it doesn't really matter, as long as the people you are talking to understand what you are talking about. If you just tell them that everything's green, the moment something goes wrong you're gonna be fired, because they don't understand the context of that green.
And yet... I understand there was a talk yesterday about black swans, about Nassim Nicholas Taleb's book on black swan events. And yet you do your risk models, and actually most of your risks come out as black swans. The black swan model, for those who didn't hear it, is the one that says these are risks or incidents that happen that were not foreseen at the time, and yet in hindsight were perfectly clear. So stock market crashes are very often seen as black swan events. He also described, just to put this into context, an industry that is actually very, very good at risk management.
Can anybody give me a suggestion of what this industry might be? An industry that's great at risk management.
Banks is a good one. It's not on my slide, so it's wrong. Next. Insurance? It's good, but it's not good enough. Medical? Same, good, but not good enough.
All right, let's move on. It's the gambling industry in Las Vegas, right? They know risk. The house always wins. They understand risk. And yet in this particular casino, four things happened... three of which... oh, sorry, three things happened, and a fourth thing was avoided, thankfully, one that nearly put the business out of business. The first one being that the manager's son was kidnapped, and he used casino funds to secure the release, which is illegal. And so they got fined millions and millions of dollars for doing that.
You know, well, such is life. The other one: there's somebody whose job it is to file tax returns to the IRS. All the big rollers who win money, the casino has to notify the IRS, the tax people, of those winnings.
This guy was dutifully filling in all the forms, and then for 15 years kept them in a box under his desk and didn't send them to the IRS. You can imagine the size of the fines that came out of that. The third one: Siegfried and Roy, those marvelously blond-maned magicians, had a pet tiger.
As you do. Said tiger mauled one of them on the stage. They had insurance for if the tiger mauled a member of the audience, not for if it mauled one of the acts. So again, they lost millions there; who would've thought it, right? And the fourth one, which is probably the most concerning one: there was a construction worker with the casino. He was laid off because of an injury, he didn't like his payoff, so he decided to get his own back by strapping plastic explosives to the supporting pillars of the casino and was going to set them off.
But he was fortunately caught just in time. Now, which one of those incidents would go on most people's risk registers? It certainly wasn't on the risk registers of these very professional people within the Vegas environment. So black swans are the things that are really gonna hurt you, not the things that are gonna come across in your risk registers half the time. So we move on to the final part: the treatment of risk. How we actually treat risk is often wrong.
And it ties in the previous two sections here. So there's a story; it's either an old story or it's a myth, I'm not sure, but it makes sense. The story is: there are six chimpanzees in a room, and there's a ladder in the room with a banana at the top of the ladder. Obviously one of the chimpanzees makes a break and runs up the ladder to get the banana. Freezing cold water is poured into the room,
drenching all of the chimpanzees. They all run off and cower in a corner. A monkey tries it again; sorry, a chimpanzee tries it again.
They're drenched in water again, to the point where they don't go up the ladder. Obviously, they then take a wet chimpanzee out and put a dry chimpanzee in. First thing he does is go up the ladder for the banana, and the five wet ones beat him up, because they don't want to get wet, and so on. So you pull another wet one out and put a dry one in; the dry one goes up the ladder, they all beat him up. Then nobody gets the banana, until you get to a point where there are six dry chimpanzees in this room and they put another one in.
And when the seventh chimpanzee goes up the ladder, they all beat him up, but they don't know why.
Now, if you don't think that you do this, I'm just gonna say one word, very carefully: antivirus. But you know, we do this. Who's gonna be the first person to take antivirus off their systems? I know it's not gonna be me. But we do things even though they're not necessarily the right things to do in our environments. Here's another example: lock leads. So I calculated, for 15,000 people, that if we removed lock leads I could save half a million dollars every three years.
I can tell you that we have not lost half a million dollars' worth of laptops that were not locked within a three-year, five-year, even six- or seven-year period. But we are still insisting on locking these laptops down, when we should be relying on what's actually important to us, which is the data. We should be relying on encryption, not the hardware. Most people's insurance excesses are $10,000, 10,000 euros.
So when a single laptop goes missing, you can't do anything about it anyway, right? But if your data goes missing, that's gonna cost you a little bit more probably.
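For what it's worth, here is the back-of-the-envelope arithmetic behind that lock-lead figure. The only numbers taken from the talk are the 15,000 people and the roughly half a million dollars every three years; the laptop replacement cost is an assumption added purely to show the break-even logic.

```python
# Back-of-the-envelope arithmetic for the lock-lead example.
# Figures from the talk: 15,000 people, ~$500,000 saved per three-year cycle.
# The laptop cost below is an assumed value, not from the talk.

people = 15_000
saving_per_cycle = 500_000       # dollars saved by dropping lock leads
cycle_years = 3

per_person_per_year = saving_per_cycle / (people * cycle_years)
print(f"~${per_person_per_year:.2f} per person per year")                  # ~$11.11

laptop_cost = 1_500              # assumed replacement cost per laptop
break_even_thefts = saving_per_cycle / laptop_cost
print(f"break-even: ~{break_even_thefts:.0f} prevented thefts per cycle")  # ~333
```

In other words, on these assumptions the locks only pay for themselves if they prevent on the order of hundreds of laptop thefts every three years, which is the speaker's point: the hardware is the hygiene risk, the data is the actual one.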
So we focus on the wrong thing there very often; we focus on the lock leads, not on the actual data. We should be encrypting, not locking stuff down. Think of the balance and the trade-off there. Because what we are looking at here is the difference between causation and correlation. And just to describe this: did you know that the more cheese you eat in America, the more likely you are to die by being tangled in your own bedsheets? It's true.
Look, the graph's there; it says so. There's another one: it says that the number of water deaths on the eastern seaboard of the US goes up according to the number of films that Nicolas Cage appears in every year.
So, you know, if we want to reduce sea deaths, we need to try and stop Nicolas Cage from acting. But this is the distinction: these are correlations, there is no causation between the two. There's a great website, Spurious Correlations; it's worth looking at. But this is the problem that we often face: we think we are fixing something and we see a result, but it's actually not even related to it. It's a correlation, nothing else. There is no direct link between the two.
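A toy sketch of why a strong correlation on its own proves nothing: any two series that happen to trend the same way over the same period will produce a high correlation coefficient. The figures below are invented for the demonstration; they are not taken from the talk or from the Spurious Correlations site.

```python
# Two made-up, unrelated upward trends still correlate strongly.

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries needed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

cheese_per_capita = [13.1, 13.4, 13.8, 14.2, 14.5, 14.9]  # invented trend
bedsheet_deaths   = [327, 456, 509, 497, 596, 717]        # invented trend

print(f"r = {pearson(cheese_per_capita, bedsheet_deaths):.2f}")  # high r, no causal link
```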
And it's very important that we understand the difference between the two. So therefore, what I've just said is that your risk registers are useless, you can't do anything about it, incidents are gonna happen, and they're gonna knock you out of the water.
You know, it doesn't matter how much money you throw at it, you are still going to be at risk.
So, you know, what's the point of continuing now?
Well, the point is you need a flexible risk response. You need to tie your risk management activities very closely to your incident management activities. They are circular activities: your risk management table-stakes activities of finding risks and putting them in the risk register need to be fed into the incident management team, and then when incidents happen, they need to feed that back to you to go into your risk register. The risk register is the theoretical, the incident management is the actual, and you're feeding it through.
So it's making your risk register far, far more accurate, far less theoretical, and far more practical.
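A minimal sketch of what that feedback loop might look like if you wrote it down, assuming a very simple register keyed by risk name; the field names and the likelihood-bump rule are illustrative assumptions, not a description of any real tooling mentioned in the talk.

```python
# Sketch: incidents (the actual) feed back into the risk register (the theoretical).
from dataclasses import dataclass, field


@dataclass
class Risk:
    name: str
    likelihood: int          # 0-2, mirroring the scoring model above
    times_realised: int = 0  # how often an actual incident matched this risk


@dataclass
class RiskRegister:
    risks: dict = field(default_factory=dict)

    def add(self, risk: Risk) -> None:
        self.risks[risk.name] = risk

    def record_incident(self, name: str) -> None:
        """Feed an actual incident back into the theoretical register."""
        risk = self.risks.setdefault(name, Risk(name, likelihood=0))
        risk.times_realised += 1
        # bump the likelihood once theory has met reality
        risk.likelihood = min(2, risk.likelihood + 1)


register = RiskRegister()
register.add(Risk("laptop left on train", likelihood=1))
register.record_incident("laptop left on train")   # incident management feeds back
register.record_incident("DDoS on public site")    # a black swan becomes a register entry
```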
So, three things I'd like you to take away from here. First of all, let's recognize and understand the difference between hygiene risks and actual risks. A hygiene risk, for me, is the lock lead; an actual risk is the data being lost.
So let's focus more on actual risks versus hygiene risks. Look through your risk register: how many of these are hygiene risks? The second one: start to spot patterns in your risks over time, so what's become a commodity versus what's become a black swan. If you go back 15 years, DDoS attacks were a black swan event. They were brand new; nobody had even heard of them. They're now a complete commodity: you can just download something off the internet, put in an IP address, hit go, and you are now the DDoS king. So things will change.
So what has actually turned from an incident that is being labeled as a black swan into something that needs to sit quite squarely in your risk register, because it's a commodity and it happens very, very regularly? And finally, a risk doesn't go away just because it hasn't happened.
Don't forget the difference between causation and correlation. If you mitigate a risk or avoid a risk, that risk doesn't actually necessarily go away. Just because it hasn't happened doesn't mean that you are in the clear; it can still happen to you in ways and means that you may not have experienced or even thought of before.
Thank you very much. I can be contacted on any of these internet residencies, although I'm sure I'm about to be talked at very loudly by one of these gentlemen as well. Thank you.
That was great, Tom.
Maybe you want to sit with us for a minute. You were pretty fast; we're a bit ahead of schedule. All of you have the opportunity to pose questions if you feel like it. Sure.
But let me start with one. How do you tell people not to eat cheese?
I mean, how do you make sure that your organization will follow your thoughts, understands your metrics, and maybe has its own understanding of whether there is a correlation or a true link between incidents?
I think something else I'm very, very fussy about is the way we communicate to the people in the organization; the way we educate them and make them aware of things.
We don't sell information security to people anymore, we market it to them. So let me put it into context. In the old days you had washing powder that you were told: use this, it washes whiter than white, and if you don't use it you're an idiot. Job done.
And that actually worked in the old days, the old black-and-white Mad Men days. Our clients, our people, are so much more sophisticated now that that simply doesn't work. They have this innate spider sense for when they know they're being talked down to, when they know they're being patronized, and they will just blank it out, because as far as they're concerned it's just getting in the way of their jobs.
Now, if you start marketing to them... I saw an advert for a car the other day, and it was a full-page, full-colour advert. Probably the lower 15% was the windscreen and maybe the top of a wheel of the car, and the rest of it was this open vista, this beautiful open vista. They were selling a lifestyle: buy our car, this is your lifestyle. They didn't touch on any of the technical specifications, they didn't say why it was better than the competition, they didn't even say why you should buy it. Just that innate thing of: this is the lifestyle you will lead when you buy our car.
And that's how we need to do it. We need to pull out the value that we have, the actual lessons that we wish to teach people, and combine that with a story, combine that with a story in order to create the experience that they want to adopt.
Okay. Is this story necessarily a more emotional perspective? Or might that, in the end, be a technical figure again?
I think, well, if it's the right technical figure, it will create a visceral response. So you can use technical figures; it depends on your audience, it depends on your business. Or it could just be something that evokes a response in somebody. So for instance, we use short humorous films, and there are characters in there that virtually everybody can relate to, doing stupid things that you shouldn't be doing. And of course they relate to them and go: I wouldn't do that. At least I don't think I would do that.
And I'm definitely not gonna do it now, because I kind of saw myself in that position a little while ago. You're evoking a feeling that they are remembering, if you see what I mean.
Is that because security, and with that the risk of security and the cloud, is becoming even more complex? Is it that, in the end, you take over the complexity?
Maybe not only liability and other things, but is it that, by selling this emotion, or at least by having this approach, you make people feel better and you take it over? Or are they mature customers, as you mention, in a constant dialogue? Is it half and half? How do you see that? I think
it's more about creating that dialogue. I think that there are huge complexities, and you can't teach people to respond to every single situation they're gonna come up against, every single complex environment they're gonna come up against.
So you teach simple basics, and you teach, as I say, an attitude and a lifestyle, but you are also maintaining that dialogue. So the campaign that we are running is a two-year campaign; we have stuff going out every week for two years, and then in the last six months we review and we'll revamp. We've just started this, by the way; it's not like I've been doing this for years.
We've just started this. But it's about maintaining that dialogue and maintaining the onion-skin approach: you've got your training, you've got your films, you've got your core messages right at the center, and things become less security-relevant as you go out.
But they sort of pull people in, towards your core messages. So we have film nights, for instance, where we show films that may or may not have a security theme to them. We sponsor drinks evenings.
We run competitions. We put up some cardboard cutouts of some of the characters, and the first thing people wanted to do was, well, decorate them, to be honest with you. So we ran competitions about who did the best decorations. Somebody turned one of the characters into a pirate, including her ID badge.
It was quite impressive. So it's about engagement with people; they just know that it's about security and they know who we are. And then they start to know more about the situations that they find themselves in.
Ah, we should speak to these guys. So rather than trying to teach people that in two months' time you must do this, this, this and this, you build them up. It's like trying to teach your child to walk within the first three weeks of it being born; it's not gonna happen. It takes a couple of years of regular, ongoing communication and building up of other simple skills in order to get to the point where your organization is, you know, able to stand on its own two feet.
Okay. The way you describe it, it seems to be the ongoing dialogue that you're explaining.
And from my legal perspective, having mentioned it in the keynote, we have a more theoretical approach to things. Your distinction between hygiene and actual risk: looking at it from a legal perspective, I felt right away that I wasn't sure where the hygiene risk would end up.
For me, from my perspective as a lawyer, again, that is a true risk, especially if you look at the upcoming security legislation, both in the US and Europe, if you look at the 24-hour duty to notify customers. Are you even going that far in your dialogue, that you explain or help people, take them by the hand and guide them through a process after a possible breach? Or would that be even more continuous? How far do you go with that?
I think as far as you can, yes. We will always look to educate people as a result of a breach; there's no question about that.
I think when it comes to risk, the industry I'm in is probably less risk-averse than most others. But what I have seen is that organizations that are risk-sterile, for instance, actually create risks as a result of that. So the rules that say nobody can do anything, stop doing everything: that's the moment that shadow IT pops up, right? And all your data is being spread to the four corners of the earth.
You need to have a far more balanced approach to risk and allow people to do their jobs, because actually their jobs are what's bringing the money in and, you know, paying your bills at the end of the day. You obviously need to create some boundaries, but you also need to ensure that it's a boundary of dialogue, not a 12-foot wall. It's got to be something that people can actually contest and discuss with you, because whatever your risk attitude is, it may be that you need to take a little bit more risk in this area, or a little bit less risk, right?
And it's important that that dialogue shifts constantly. Any policy that says there are no exceptions is a policy that's doomed to failure.
Okay. How would this approach work for a bank? Let's say they have a very rigid authority at their back.
Yeah, absolutely. I think it could work; there's no reason why not. It's about measuring that risk appetite, and as long as you are comfortable with where that risk appetite is, that's fine. But I also think that, as a result of those attitudes... well, two things. One, I think banks take more risks than we actually know. I've spoken to a few people who've worked for some British banks.
So you do know,
Yeah.
I've been utterly appalled by some of the things they've done, far too risky. But secondly, I think it's only the new challenger banks that are innovating, and the reason they're innovating is because they actually have a slightly different approach to risk. The big traditional banks, certainly in the UK, have taken a long time to move. The fact that I put a cheque into my bank and it takes five days to clear before it hits my account is utterly absurd. And it's only now that that's starting to change, because they're taking on more risk. Now, I believe there's a bank
where you can take a photograph of the cheque and it will be deposited in your account straight away. Now there's risk in that, right? Of course there's risk in that, but it's an acceptable risk as far as they're concerned, in order to attract new customers and to deliver an innovative service. So I think, yes, it does work in a bank, but I also think risk attitudes are shifting in those industries as well.
Okay, thank you. Are there any questions from any of you? Not at the moment.
So Tom, thank you very, very much. That was my pleasure. It was very interesting, not only from a legal perspective again, to understand more about your practical approach; I think from every perspective it's interesting to see how your personal Langford metric works. Thanks for showing that to us, thanks for your time, and I hope to have you later on on the panel with some continuing thoughts. Thank you very much.
Thank you. Thanks. Don't forget to put your lid down.