Our next speaker is talking about something completely different. No, I'm joking. Remember the talk earlier this afternoon about the three important activities of information security departments for actually managing the security of organizations: prevent, detect, and react. We did not hear so much about react, and one way of learning how to actually react is to learn from people actually practicing this and trying to implement these processes in organizations. Please welcome Howard Mannella.
So Howard, you've been working with Expedia. Is that right, as it says in your, in your vita?
I was. I started at Halloween of 2005 and I left the organization in March of 2012, so now it's my second year of operation as an independent.
Okay. And the role, I was stumbling over the role you had there. It was called Principal Resiliency Strategist.
Yes. They thought that a fancy title would serve in lieu of salary.
That's a good argument
For them,
Obviously. So welcome, and the floor is yours.
Thank you very much. Good afternoon. I am a black... Good afternoon! Sit up, please. Okay, we can only safely conclude that swans are deaf. Okay, there we go.
This is a different kind of a presentation, trust me. I am a black swan. I am misunderstood, I am misdefined, and not everyone believes I exist. And I'm okay with that. I'd like to share a little bit about me, a little bit about yourselves, and why you can't see me, and give you some practical what-to-dos about me. In other words: how do you plan for the unplannable? This is a different kind of presentation. It's not technical. I'm going to speak English to you. There is not an acronym in the presentation.
There's not a bullet list in the presentation, and there's only one disaster-bad, ugly picture in the entire presentation. The first person that notices it and yells "bingo" will get a prize. So,
a little bit about me. I'm going to modify the definition from the expert and authority on the theme, Nassim Taleb. He says that I am rare.
I say I am increasingly common, and I should know. I am unpredictable. I am massively game-changing. And in hindsight, I am completely foreseeable. And I'm not always bad; people think of black swans as bad things. An example of a good black swan was the advent of the internet and the World Wide Web from a geeky little defense network, and the ability for billions of people to have 75 ways to buy a book. Okay. So you've all seen me. Let's go through a little 21st-century history, and remember that the 21st century is only 15% over. I won't go into detail on all of these events.
You're all familiar with them. 9/11: an asymmetrical attack against the world's superpower using a known threat vector. 2003, the US Northeast blackout: 55 million people impacted because a wire overheated and shorted on a tree and took down our grid. 2010, Eyjafjallajökull: if it were not for the presence of an ice cap on that volcano, we wouldn't have even read about it, and that caused 6 billion worth of damage. Fukushima:
again, we're all familiar. I want to read a quote from a University of Washington seismologist, so this is a grownup speaking about that: "Waves this high are completely predictable after such a large earthquake, but they're still almost unimaginable." So, completely predictable yet unimaginable; that sounds like cognitive dissonance to me. And 2012, Superstorm Sandy: 65 billion worth of damage from a storm that was not even hurricane strength when it hit the US coast. Now let me briefly mention my sister, the white swan.
These are events that do not meet the criteria: they are not unpredictable, massively game-changing, and in hindsight foreseeable. Y2K was predictable using seventh-grade algebra.
Obviously I won't go into detail in the interest of time, but regarding the US recession and the upcoming collapse of the Euro currency: mistakes, gaps, mismanagement, politics, or incompetence (excuse the repetition) do not a black swan make. They are not excuses. So why am I hard to see?
Well, because I'm dark. I'm only kidding. I'm hard to see because standard risk management, as you all know it, deals with knowns and known unknowns. Take risk management by questionnaire, for example: you send out a questionnaire to all your executives and say, tell me your risks, and it'll come back with their knowns, or maybe some of their known unknowns. Boeing engineers have coined the phrase "unk-unks" for unknown unknowns: you don't know what you don't know. And here's an example. Let's imagine that you're a turkey, and you are doing risk management for a turkey farm to raise your family.
If you're doing risk management like all of you normally do and are familiar with, you're gonna look at supply chain resiliency: is there an uninterrupted supply of turkey food and water? You're gonna look at perimeter security: are the fences high enough to keep the coyotes out? You might do criminal background checks on the farm staff. And you will safely and logically conclude that that turkey farm is a great place to raise your family, right up until the day before Thanksgiving, when you will experience the black swan.
And for those non-Americans: Thanksgiving is when 300 million Americans eat turkey.
So let's talk about some trends. Technology: 2002 was a digital-age tipping point. That was the year that there was more digital information in the world than analog information. And there was more data generated in the last 20 years than in the last 2,000 years.
And again, most of it digital. Now, I'll agree that 80% of that data was cat videos and Justin Bieber pictures, but some of that data was really important. And if you look at the data that Christopher Columbus needed to do his voyage versus the data that Neil Armstrong used for his voyage, the difference is staggering. And remember when floppy disks went from 360K to 1.2 meg? Anybody here remember what a floppy disk was? Show of hands. All right, we got a room full of grownups. This is amazing.
So we used to talk about data in terms of kilobytes, and then megabytes, and then gigabytes, and then terabytes. Anybody know what comes after terabytes? Petabytes, correct. After petabytes? Exabytes, thank you. And then zettabytes, and then yottabytes; that's 24 zeros. Now, humans: if you look at humans, it took us thousands of years to reach a population of 4 billion, and that was in 1980. We invented IP addressing in 1974; that was 4 billion addresses, and we've already run out. So think about the internet of things.
We'll touch upon that later, but it's been given very good treatment by Jackson and Hans. So, disintermediation of technology, let's talk about that. In the 1980s we had mainframe green screens. I want to ask: who remembers what a mainframe is? We had one environment; it weighed several tons, cost millions of dollars, was water-cooled, and was centrally managed.
Okay, that's fine. Nowadays we have tablets and smartphones. We have BYOD; you're all wrestling with BYOD strategy and policy. You have thousands of environments. They weigh several ounces, they cost nothing, they're uber-cool, and they're managed very poorly. So it's a very different paradigm. Who here, show of hands, who here has ever accidentally left their mainframe at a Starbucks? Anybody? All right, one.
Yeah, thank you, it wasn't only me. Who here, show of hands, has lent their mainframe to their girlfriend or boyfriend, and then they broke up and now they can't get it back? Okay. It's a very different paradigm. Social and business speed is also much faster. We have Google Glass, we have wearables, we have disposables, we have the internet of things. I won't spend any time on this; Han and Jackson did a great job. The speed of business is also accelerating.
So you talk about just in time, you talk about the internet. An internet year is three months.
Email is now too slow for a lot of things. And if you think about social networking and how quick that is: your CEO will get caught in a compromising situation, and the Twitterverse will be lighting up before he can even straighten up and close the blinds. So things are moving much faster, and when the world moves this fast, it's gonna trip, and there will be the black swan. Black swan events are becoming more frequent. We have the rise of global extremism; the world's getting crankier and faster. We have the globalization of economies.
If you think about 75 years or so ago, a crisis in a single country did not really impact the entire European continent. Well, we had a couple of exceptions.
We had that certain Archduke and we had that Austrian paper hanger. But other than that, a problem in a country remained in that country.
Nowadays, as you all know, a problem with any PIIGS country will affect the entire Eurozone. Events are also becoming more impactful. So, outsourcing and the cloud: we spend a lot of time talking about the cloud. I remember in 1996 I went to a call center in Salt Lake City, a warehouse-size building, hundreds of people serving one client, Marriott Hotels. Nowadays you walk into a call center in Manila or Cebu, and it's a warehouse-sized building, hundreds of agents, and they're servicing hundreds of customers. They're servicing all your organizations and your competitors.
And if you really wanna stay up at night: after the shifts are over, they're all dating each other and sharing their secrets. So single events can provide more skew.
Let's talk about two worlds: the world of Mediocristan and the world of Extremistan. Has anybody heard these terms? Nobody? Okay. Wow. Okay.
One, thank you. Okay. Mediocristan is the world where a single data point does not appreciably shift the distribution. So let's talk about this group and the distribution of IQ. It's pretty much a bell curve: some of you are really smart, some of you are not so smart, and you know who you are. But if we bring in a Marilyn vos Savant or a Stephen Hawking, a single data point won't really shift that distribution. Now let's talk about Extremistan. Looking at this group, there's a distribution of, say, net worth. I'm sure it's all over the map, but there's some clustering to it.
If we bring in a Zuckerberg or a Gates or a Buffett or a Ma, that single data point will radically shift what that profile looks like. So I will submit to you that the world is moving from Mediocristan to Extremistan, and that's where I live.
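To make that concrete, here is a minimal sketch with invented numbers, not from the talk: one added data point barely moves an IQ average, but a single billionaire swamps a net-worth average.

```python
# Invented numbers: one added data point in Mediocristan (IQ) versus
# Extremistan (net worth).
room_iq = [100] * 500                     # 500 people, IQ 100 each
room_iq.append(160)                       # add one genius
print(sum(room_iq) / len(room_iq))        # ~100.1: barely moves

room_worth = [200_000] * 500              # 500 people, $200k net worth each
room_worth.append(50_000_000_000)         # add one Zuckerberg-scale outlier
print(sum(room_worth) / len(room_worth))  # ~1.0e8: the outlier IS the average
```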
Okay. Now I'll start talking about probability. It's misunderstood. There's a difference between one specific event versus a set of related events. And here's an example, the birthday puzzle. How many of you are familiar with the birthday puzzle? Take 30 people in a bar: what are the chances that two of them share a birthday?
Okay, great, and for those of you that aren't: 30 people, 365 days in a year, so you think maybe the probability is 8%. It's actually over 70%, because it's not the odds that you will share my birthday, or that you will share mine; it's the odds that any two people in the room will share a birthday. So that's different.
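For anyone who wants to check that figure, here is the standard birthday-puzzle computation as a short sketch (the function name is mine):

```python
# Birthday puzzle: chance that at least two of n people share a birthday.
def shared_birthday_probability(n: int, days: int = 365) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (days - i) / days
    return 1.0 - p_all_distinct

print(round(shared_birthday_probability(30), 3))  # 0.706, not the intuitive ~8%
```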
Who here, this is audience participation, who here wants to create an absolutely impossible event in your hotel room tonight? Come on, Munich, I need a volunteer.
Okay, sir, please come up. A brave soul indeed. I'm gonna give you a gift, okay? Thank you. This is an ordinary deck of playing cards. It is yours. And thank you very much; you may sit down. Thanks. Okay. So go to your hotel room, go to your bed, open the deck of cards, shuffle it well, and lay it out,
face up, all 52 cards. What are the odds of getting that exact sequence? Anybody want to guess? Okay, it's 52 times 51 times 50 and so on, 52 factorial, a number with 68 digits. And yet the odds are a hundred percent, because he said he would do it.
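As a quick check of that claim, a sketch of the arithmetic using only Python's standard library:

```python
import math

# Count the possible orderings of a 52-card deck: 52 * 51 * 50 * ... * 1.
orderings = math.factorial(52)
print(f"{orderings:.2e}")     # 8.07e+67 possible orderings
print(len(str(orderings)))    # 68 digits
# The odds of dealing any one pre-named sequence: 1 in ~8.07e67.
```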
So that's a different way to look at probability. Now, risk management suffers from a number of cognitive biases. I won't go into the details; I don't want to get tripped up in the psychology. But I want to get you thinking about how human beings are flawed in understanding risk, so we can move on to the real meat of this discussion, which is mitigation. Okay? So we're gonna go really fast.
Now, gambler's fallacy, the easiest one to understand: flip a coin nine times and it comes up heads; tails is not "due" (there's a quick simulation of that just below). Anchoring fallacy: the fixation on a previously known risk, without moving on to current or evolving risk. So I promise you, nobody will ever take over an airplane with a set of box cutters anymore, ever, but our friends at the TSA still pat you down and make sure you don't have one. Now, the business analogy there is that you have executives that say, oh, I remember that event in 19-whatever; that's why we do business continuity, that's why we do cybersecurity, or whatever.
That's great, but they're not focusing on the future, as you are in this room.
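Circling back to the gambler's fallacy for a moment, here is a small simulation, mine rather than the speaker's, showing that after nine heads the next flip is still 50/50:

```python
import random

# Gambler's fallacy check: after nine heads in a row, is tails "due"?
random.seed(0)
flips = [random.random() < 0.5 for _ in range(2_000_000)]  # True = heads

# Collect the flip that immediately follows every run of nine straight heads.
after_nine_heads = [flips[i + 9] for i in range(len(flips) - 9)
                    if all(flips[i:i + 9])]
print(sum(after_nine_heads) / len(after_nine_heads))  # ~0.5: heads is still 50/50
```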
We have normalcy bias, which is the tendency to discount the probability of events that are not normal to you. It's kind of like somebody moving to Seattle from somewhere else and saying, why are you planning for earthquakes? This isn't California. We have bias blind spot. Here's audience participation: who here thinks they are a better-than-average risk manager? Wow, you're too modest. Okay. Bias blind spot is the belief that one is less biased than others.
Normally I do this speech and everybody's hand goes up, because everybody is better than average. We have choice-supportive bias, which is remembering one's choices as better than they really were. We have zero-risk bias, which is the psychological tendency to prefer the complete elimination of one risk over a larger overall reduction across several dimensions. And we have availability bias: the overestimation of vivid or memorable events because they're emotionally available to us.
And we have examples like child abductions and airplane crashes.
They're thankfully very rare and very low probability, but they're all over the news, so they're emotionally available, and we think that they happen more than they really do. Okay.
Now, if you're flying home from the conference and you discover that your pilot is a killer Terminator robot from the future, you can get off and rebook your flight. Otherwise you will probably survive the flight. The probability of an airplane crash is about 10 million to one, and an airplane crash actually enjoys a 95% survival rate. So that's a little bit different from what we might think.
Now, show of hands: who flew here versus taking some other motor transportation? Okay, keep your hands up. How many of you expect to survive the flight home? Okay, two say they're out. How many of your organizations have policies against concentration of key-command risk, as in, you can't have more than X executives on one flight? Does anybody here have that as a policy? Okay, yeah, a couple. How many of those organizations encourage carpooling?
Okay. It's a different way of looking at it. Okay.
We have base rate bias, which is the misestimation of probability due to ignorance of the base data and a psychological focus on the edge data. We have an example here, which we can't get into because time is my enemy, so I apologize. But just looking at this slide, I promise you that the 99% rate is misunderstood and misrepresents what will really happen. I can go over this example offline and would be happy to have the conversation.
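The slide's example isn't captured in the transcript, but the classic base-rate illustration, with invented numbers, shows how a "99% accurate" figure gets misread:

```python
# Classic base-rate illustration (invented numbers, not the slide's):
# an alert that is "99% accurate" scanning events with a 0.1% base rate.
base_rate = 0.001      # 1 in 1,000 events is actually malicious
hit_rate = 0.99        # P(alert fires | event is malicious)
false_alarm = 0.01     # P(alert fires | event is benign)

p_alert = hit_rate * base_rate + false_alarm * (1 - base_rate)
p_malicious_given_alert = hit_rate * base_rate / p_alert
print(round(p_malicious_given_alert, 3))  # 0.09: the alert is right ~9% of the time
```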
Okay, the Texas sharpshooter fallacy. A Texan wants to prove he's a sharpshooter, so he takes out his six-gun, he shoots at the side of a barn, he runs up to wherever the bullet hit, draws a bullseye target around it, runs back and says, yee-haw, I'm a sharpshooter. The Texas sharpshooter fallacy is the use of results to mistakenly prove your biased hypothesis.
And the example which probably resonates in this room: I'm sure you have executives that said, well, that last breach or that last outage, we survived it, so therefore it wasn't really an emergency. And that's radically flawed thinking.
I will submit to you there is an example here, but again, time is not our friend, and I promised George that I would stick to 20 minutes and not a second more. Confirmation bias: this is the tendency to direct thoughts toward confirming a hypothesis rather than disproving it, or to needlessly collect additional data to confirm when you don't have to. There are a couple of examples here, and we won't go into them, again because of time, but I will say that context helps confirmation bias. Okay. Now let's talk about how to mitigate, because, excuse me, that's really what you want to hear.
So, how to mitigate. Let's move past "a bad thing happening" and think about "some bad thing happening." If you have 30 suppliers in your supply chain and you're doing a risk assessment or security assessment of them, do you look at the probability of something happening to this supplier or that supplier, or do you look at the probability of something happening anywhere in that supplier population, or among their suppliers?
Okay.
Assuming yes: I think the technical data security term is "get over it." Don't rely solely on your ability to predict, because improbable events happen all the time. The chances of a particular event are very low, but the chance of some event is actually very high. We could talk a little more about that if we had time.
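A sketch of that "particular event versus some event" arithmetic, with an invented per-supplier probability:

```python
# "This supplier" versus "any supplier": assume each of 30 suppliers has
# only a 2% chance of a serious incident this year (invented number).
p_one = 0.02
n = 30

p_any = 1 - (1 - p_one) ** n
print(f"one given supplier: {p_one:.0%}")   # 2%
print(f"at least one of {n}: {p_any:.0%}")  # 45%
```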
Sorry, let me go back. That's the latest thinking in cybersecurity: you're all thinking it's not a matter of if, it's a matter of when, and that's right thinking. How many organizations here have antivirus, IDS, or firewalls? Awesome; some hands did not go up, which is amazing. How many of you who have your hands up know what your company's business continuity plan is? Okay, not as many hands. How many of you have practiced a breach or an outage through an exercise?
A lot of you. How many have integrated that with a business continuity exercise, to say: now the business is impacted, what are we doing about it? Okay, a couple. All right, that's progressive. Okay: reactive, to proactive, to preemptive.
I had a vice president at a household-name company tell me, well, our crisis plan, our business continuity plan, is we'll find all the VPs, we'll get them on the phone, and we'll figure it out, because we're really, really smart. That's being reactive. Proactive would be to at least publish the conference bridge number in advance so you're not hunting these people down, or maybe use emergency notification to contact them. Preemptive would mean you've thought through what the responses are; you've done some choreography.
You've done role-based assignments to the team, with three-deep redundancy, so when they're getting on the phone, they're not making things up. As you know, the two things you don't want people doing during a crisis situation are thinking and making stuff up. Okay? Balancing all of these dimensions: risk management deals with prediction, and we talked about that. High availability and redundancy deal with prevention. But you still need business continuity, disaster recovery, and crisis management for response. So risk management, high availability and redundancy, security:
they do not obviate the need for planning for "what if." Let me say that again: security, redundancy, high availability, and hardening do not obviate the requirement to plan for what happens if it fails. And I have examples.
Okay, focusing on the cause. That's what a lot of us do in risk management or cyber. We talk about: what about this attack? What about that attack? What if it's this other attack? What if our headquarters is attacked by a bomb, or a dirty bomb, or a flood? And, you know, how dirty is the bomb?
A better leading practice is to focus on the impact and not the cause. Regardless of cause, you lost your workplace: let's figure out what to do. Regardless of cause, your technology is compromised: let's figure out what to do.
Okay, supply chain. I'm gonna move really fast, 'cause I think the third-base coach is gonna wave me in. We all know that your suppliers' problems are your problems; a tsunami in Japan can close down factories in New York. So some of the best practices around supply chain are to assist them with building their plans and to assist them with recovery.
How many of your organizations assist your suppliers when they have an outage or a breach, and spend money on the recovery? Okay, that's kind of a progressive thing. Look to be preemptive and not reactive in your assessments: assess them beforehand. Don't accept their word for it as far as their resiliency or their security. If you ask them for a copy of their plans and they send you the glossy PowerPoint or PDF about how they're really, really cool, throw that away and dig harder. Joint planning, joint exercising: that's a great best practice.
I don't have time to go over this example, which is kind of unfortunate.
And nobody yelled bingo, so nobody gets the prize.
Okay: train, train, train. Only a couple more concepts to go. The most important thing you can do is to train and exercise. A pitfall of training and exercising is to focus on the exotics of the scenario and the details, and not on the expected outcomes that you want to train your people to exhibit, such as thinking two moves ahead, such as dealing with ambiguity and breaking events, such as managing with purpose under stress. Those are more important than dealing with: well, what happens if it's this exact attack through this attack vector and it compromised these servers?
Okay, lastly, counterbalancing your blind spots. Admit you have a problem; admit that you need to compensate for cognitive or experiential biases or gaps. Solicit diversity of inputs into your enterprise thinking, with lateral thinkers, obnoxious dissidents, and others that you can include. And if anyone here is in the market for an obnoxious dissident, please see me at the reception. And lastly, we don't have time, unfortunately, to talk about what your black swans are. Mine are: the internet goes away for two weeks, because that'll never happen.
Or: extremists can bring down a plane without detection, and we talked about that.
Of course, that could never happen. And what really keeps me up at night are my unk-unks: I don't know what I don't know. So in conclusion, hopefully I've given you some things to think about. Hopefully you've learned a little bit about me, a little bit about yourselves, and how to reduce risk. And hopefully I've given you some to-dos, some stop-doings, and some takeaways, so that you can take the conversation we've started back to your stakeholders, your peers, and the industry. So in conclusion: I'm a black swan.
It's been a pleasure to have you accompany me on this, well, mental walkabout, and I thank you.
Thank you, Howard. Absolutely. A very short question.
Yes, sir. How long does it take to raise the maturity level of an organization to what you think is an acceptable level?
I've seen it happen in six months or a year, and I've seen it be a journey. The key to me is that it's not in talking about the disasters and being a disaster geek or one of the cybersecurity paranoids; it's talking to executives in terms they understand. The C-suite is not afraid of ISIS; they're scared to death of plaintiffs' attorneys.
So if you talk to your executive stakeholders about the ability to provide an affirmative defense, the ability to demonstrate a standard of due care, and the limitation of their liability, not the company's but theirs, and their wives' and their homes, then you've got their hearts and minds, and you can get their wallets.
Yeah, I would agree. Thank you very much.
You bet. It's a privilege.