Thank you very much. Yes, that's me.
And that's not me — I would be scared to do this. But I think this is a good picture to think ahead about how the role of security professionals, and especially CSOs, will change. We have been in the digital corner for the last years, and I'm going to do a quick history of what we did. But think forward to the Internet of Things. In the background you see a couple of things: those wind turbines — GE is a major manufacturer of wind turbines. These are things.

And if you go into one of these wind turbines, you would be surprised how many IP addresses are in there. They're remotely controlled, we get data from them, and so forth.

And the guy there is standing on another thing: a train. Here in Europe we don't see so many GE locomotives, but in the US they're the de facto standard. And believe it or not, the locomotives are also driven by software now.

I mean, they still have their engines, but they have firmware. And I only recently learned that our engineers take a USB stick every now and then and just plug it into the locomotive to change its settings. So, same disclaimer as yesterday: this is not an official GE talk.
Yeah, nothing else. So, originally it was IT security. I don't know how long you have been in this business. I started in security pretty much the day of 9/11 — pure coincidence; just a few days before, on the 1st of September, I had started a part-time role in security. I was a local IT leader, and security at that time was very low profile. When 9/11 came along, that was when it all sort of exploded. So those were the days of IT security, and the technology was what the focus was on. We were talking about firewalls and all these things that you can read here.
You know this very well. Just a remark: here somewhere it says "IPv6 network." I mean, today we all do network scanning, but in the previous presentation this morning somebody pointed out that there are more IPv6 addresses than there are atoms on this Earth — more IP addresses than there are atoms in the universe, even. So how do you scan such a network? But that's just a side note.
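Whatever the right atoms comparison is, the arithmetic makes the scanning point. A quick back-of-the-envelope check (my numbers, not from the talk), as a Python sketch:

```python
# Why brute-force scanning IPv6 is hopeless: back-of-the-envelope numbers.
ADDRESSES = 2 ** 128                  # size of the full IPv6 address space
PROBES_PER_SECOND = 10 ** 9           # a very generous one billion probes/second
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

years = ADDRESSES / (PROBES_PER_SECOND * SECONDS_PER_YEAR)
print(f"{ADDRESSES:.3e} addresses")            # ~3.403e+38
print(f"{years:.3e} years to probe them all")  # ~1.079e+22 years
```

Even a single /64 subnet — the standard size handed out to one home — holds 2^64, about 1.8 × 10^19 addresses, so naive sweeps fail long before you reach the full space.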
So this was a very technical area. Eventually it morphed into information security, where you realized the technology is not really what we want to protect. You really want to protect the information — digital rights management and these things. The driver, obviously, was that the perimeter became fuzzy: the firewalls got more and more pinholes, bring-your-own-device and all these things. And I guess that's where we still mostly are: information security. Because of this change, I saw that a lot of the CSOs were no longer technical guys — they came from the business. Even CIOs come from the business; they're not IT people anymore, to a large degree, because when you manage information it's not so much about the technology anymore. It's about the content of the information. And then eventually it became IT risk, which goes beyond just security — security is just one part of IT risk. We had the classical IT security triad: confidentiality, integrity, availability, the famous CIA. But now you add other things like software license management and retention periods. We had a three-day workshop five or six years ago at GE where we discussed whether information is an asset or a liability. In other words: do you want to store it as long as you can, or do you want to delete it as soon as you can? We did not come to a conclusion, I have to say. And then there are laws and regulations. We heard, just before the coffee break, about the upcoming changes in the EU privacy laws. All of this is IT risk: the very fact that you have IT poses all sorts of risks.
And there's probably more. Here's the latest thing, and this is what I want to focus on: product security. When we say Internet of Things, the things are products, manufactured by us and many others.

I mean locomotives, windmills, light bulbs — somebody had a slide yesterday with the Siemens light bulbs, where every single light bulb has an IP address — your refrigerator, your toaster, your car. All these things are products that have embedded software and IT, and that's why product security is the next thing we need to look at. GE actually started a software center in San Ramon, in the Bay Area near San Francisco, just to deal with software development for embedded software in products.

It's not the traditional IT.
The big difference between the classic IT — I mentioned that yesterday — and the Internet of Things, or products with embedded software, is that all of a sudden the products are part of your analog world, the real world. A presentation earlier this morning was about all the sensors. I'm less worried about the sensors than about the actuators — the things that change something. If you can control the behavior of things, of products, through IT, then all of a sudden you're touching the analog world.
Let me just take a side step and talk about Stuxnet and how it actually worked. You all know about Stuxnet: it destroyed the centrifuges in the Iranian nuclear plants. The way it worked was very simple. It falsified the readings of the centrifuges — the RPMs, how fast they were spinning.

So the readings on the meters in the control room always said: the RPM of the centrifuges is within spec, everything is just perfect. At the same time, it controlled the motors to spin the centrifuges up way above the specs. But nobody noticed, because the readings said everything was just fine. And then they simply fell apart, because they were spinning too fast. So that was a great example of how, through a digital attack — malfunction, whatever you want to call it — you all of a sudden cause physical damage.
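To make that mechanism concrete, here is a toy simulation — not real ICS code, and all numbers are invented — of the pattern: the actuator is driven out of spec while the operator display receives replayed, in-spec readings.

```python
import random

SPEC_RPM = 1_000     # nominal operating speed (made-up number)
FAILURE_RPM = 1_400  # speed at which our toy centrifuge breaks

actual_rpm = SPEC_RPM
for minute in range(10):
    # Attack: the actuator is quietly commanded further above spec...
    actual_rpm += 50
    # ...while the control room gets a replayed, in-spec reading.
    displayed_rpm = SPEC_RPM + random.randint(-5, 5)
    print(f"t={minute:2d}  display: {displayed_rpm} rpm (looks fine)  actual: {actual_rpm} rpm")
    if actual_rpm > FAILURE_RPM:
        print("Centrifuge destroyed -- and the operators never saw it coming.")
        break
```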
So let's go through CIA, because really nothing has changed — we're still talking about CIA. Let's talk about confidentiality first; that's the easy part. Light bulbs, toasters, refrigerators, coffee machines: they reveal your patterns. When you get up in the morning, when you leave the house — you're turning all these appliances on and off. If somebody has this information, they will know when your house is unoccupied, and what the best time is to just break in. They're not after changing anything; there's no integrity breach. They just want this information to then attack your house. Very simple, right?
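A toy illustration of how little it takes to turn appliance chatter into a burglar's timetable — the readings and threshold below are made up:

```python
# Toy example: inferring occupancy from hourly smart-device power readings.
hourly_watts = {
    6: 900,   # kettle + toaster: someone is up
    8: 40,    # baseline only: house likely empty
    12: 35,
    18: 750,  # cooking: they're back
    23: 120,  # TV and lights
}
BASELINE = 50  # standby draw of fridge, router, etc. (made-up threshold)

away = [hour for hour, watts in hourly_watts.items() if watts <= BASELINE]
print(f"House likely unoccupied around hours: {away}")  # -> [8, 12]
```

Note that nothing here needs to be tampered with; just reading the data is enough.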
The question is: who gets fired? Is it the engineer who built the light bulb or the toaster or the refrigerator? Or is it the CSO, because ultimately it was the software that was not properly encrypting the data, or had other security leaks? I think that question is not answered yet, but these questions will come up and will need to be resolved.

And if you are the CSO, you want to make sure you are not the one who gets fired. Then there are integrity risks — the manipulation of things. There was another presentation this morning that alluded to the German steel mill.
So I don't have to go into this. The steel mill could not be taken down in an orderly manner. The damage was pretty heavy. It was not very well publicized, for multiple reasons.

Electric cars — and this is a favorite. I drove a BMW i3. I don't know whether you have had any opportunity to drive an electric car. The first thing I noticed when I drove it and took my foot off the pedal: the car goes into recuperation, which is the technical term for braking and feeding the energy back into the battery. That surprised me. I assumed the car would coast, as any normal car does when you take your foot off the pedal. I almost got rear-ended by somebody, because I was braking the car at a moment when nobody would expect it. I saw the red traffic light far away and thought: okay, I'll just coast to the traffic light — foot off the pedal — and it braked, and somebody almost hit me in the back. It seems many people fed this back to BMW: we don't like this feature. So eventually there was a software update that changed this behavior — but the car owners were not told. Now, how would you like it if your car all of a sudden behaved differently? Instead of braking it coasts, or instead of coasting it brakes — whatever it is, it behaves fundamentally differently because a little change was made in the software, and an accident is caused.
So again the question: who gets fired? Is it the car engineer, the software developer, or the CSO? I could go on and on with examples like this, but I think they relate to everyday life, and you can probably extrapolate from there. Then you have availability risks — the ones that are most thought about. If you speak German, I highly recommend the book "Blackout."

It is, unfortunately, to my knowledge only available in German. The plot plays in a time when the smart meters currently being rolled out across Europe are everywhere. The smart meters get attacked such that they simply shut down the electricity supply to the house — not to a power plant, just to the house.
Now, those of you who have a little background in electrical engineering know: electric power that is generated must be consumed. You cannot throw it away. If it's not consumed, you must either find other consumers — the windmills you sometimes see turning even though there is no wind are actually consuming energy, simply because they eat power that is generated and not used someplace else — or you have to shut down power plants.

So what the attackers do in this plot is take down all the households. All of a sudden there's way too much electricity in the grid, so power plants need to be taken down very quickly. Then they turn the meters on again, so the houses are back on the grid. All of a sudden there is a power shortage, and power plants need to be ramped up. The whole thing starts to oscillate, and before you know it, all of Europe is without electricity. I'm not going to go into more details.
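A crude sketch of the oscillation the plot describes — real grid dynamics involve frequency control and inertia and run much faster, so treat this purely as an illustration:

```python
# Toy model: attackers toggling millions of smart meters at once.
generation = 100.0  # GW currently being produced
for step in range(6):
    # Load swings as whole neighborhoods are switched off, then on again.
    load = 60.0 if step % 2 == 0 else 130.0
    imbalance = generation - load
    print(f"step {step}: load {load:5.1f} GW, imbalance {imbalance:+6.1f} GW")
    if abs(imbalance) > 25:
        print("  -> plants trip offline faster than operators can react")
    generation = load  # operators chase the swing, always one step behind
```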
The book then describes what happens to society when, on a very large scale — say, all of central Europe — there's no power for one day, two days, three days. Eventually the diesel for the generators in the hospitals runs out; the fridges start to melt the stuff inside; there's no gasoline, because the pumps at the gas stations don't work either; the traffic lights don't function. Society basically falls apart after about ten days. It's a great book, definitely worth reading if you can read German.
So, I think I mentioned this yesterday: I'm convinced that hacks will happen. And others have said this as well.

Yes, we will need to do all sorts of things to avoid this, and we will do all sorts of things — but hacks will happen, and they will be successful. So we need to be prepared for the black swans. I liked that presentation yesterday: things will happen, with completely unknown consequences, maybe.
Yeah.
You know, this picture, I mean, sometimes the heck is very trivial. That was the TV five mourn situation where hackers took down the TV station. And it was actually in a broadcast in the background on those yellow pads, you could read the passwords of the Facebook account and the Twitter accounts of this thing.
I mean, that was a very trivial hack. So what do we do? We need to understand the end points and the end points are no longer smartphones in PCs alone. They're actually by majority things, the one here is actually the control unit in a car with an IP address. So we meet, we need to understand all these, these endpoints that access data generate or manipulate or store data and there, and the number will simply explode.
I mean, traditionally in an enterprise, you had an inventory of your PCs and maybe your smartphones maybe, and your servers, and that was about it. And all of a sudden you have all these other things.
Oops, there it is. So, first step: inventory. Some call it digital identity; there are many words for this, but I think we're all saying the same thing. We need to know what's out there. You cannot protect what you don't know. There's really no difference between a human who takes some sort of action and an autonomous thing that triggers activities based on data it collects through sensors — there isn't. So you also need identities for these things. But you can only assign identities to things if you even know they're out there. Yesterday I was very impressed by a presentation — I wish I could remember the name of the gentleman — where he showed that in his house, 5 or 8 percent of his traffic went to China, and he had no clue why. And he had enumerated some two dozen things in his house: light bulbs and so forth.

Now just imagine you have a hundred or two hundred, because everything you buy in the future has an IP address. I bought a new stereo a few months ago, a Yamaha, and I was stunned: I could not just simply plug it in and play. I had to activate it — without an internet connection I could not activate the new stereo. I have no clue what this thing sent over to Yamaha. So inventory is a key thing — for you at home and, of course, in a big company. Understand what these things do, and then have an incident response process.
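As a minimal sketch of that first inventory step at home — assuming a Linux-style `ping` and a typical /24 home subnet (adjust `SUBNET` for yours) — something like this already surfaces devices you forgot you owned:

```python
#!/usr/bin/env python3
"""Minimal home-network inventory sweep (assumes Linux-style ping flags)."""
import ipaddress
import subprocess
from concurrent.futures import ThreadPoolExecutor

SUBNET = "192.168.1.0/24"  # assumption: adjust to your own home network

def is_alive(ip: str) -> bool:
    # One echo request, one-second timeout; success means something answered.
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def main() -> None:
    hosts = [str(h) for h in ipaddress.ip_network(SUBNET).hosts()]
    with ThreadPoolExecutor(max_workers=64) as pool:
        for ip, alive in zip(hosts, pool.map(is_alive, hosts)):
            if alive:
                print(f"{ip} is up -- do you know what this device is?")

if __name__ == "__main__":
    main()
```

It only finds devices that answer ICMP echo; many IoT gadgets won't, so a real inventory would also consult the router's ARP and DHCP tables, mDNS, and so on.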
So the challenge now for the CSO is the incidents that happen outside your normal world. Normally, incidents happen within your company — or, if you have your stuff in the cloud, you at least still know where it is. But if you deliver products — toasters, fridges, windmills, light bulbs, whatever — the damage will happen somewhere out there, and you have no clue where.

I think we can learn from the car manufacturers. If something goes wrong with the gas pedal, they recall a million cars to change the gas pedal, or the airbag, or whatever it is. We might actually see similar things. I think yesterday somebody reported about some gadget — I forgot what it was — where the certificate expired on a certain day, and all of a sudden all these gadgets didn't work anymore, so you had to send them back. I think we will see more product recalls.
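As an aside, the expiring-certificate failure mode is easy to monitor for before it bricks a fleet. A minimal sketch using Python's standard library (the hostname is a placeholder):

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_cert_expires(host: str, port: int = 443) -> int:
    """Connect over TLS and report how many days the server certificate has left."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'.
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

print(days_until_cert_expires("example.com"), "days left")
```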
Maybe even light bulbs — maybe all of a sudden light bulbs have to be recalled, I don't know. And then, for future products — and this is what we do in our San Ramon software center — we need to apply the same rigor to embedded software development that we are used to in the classical IT world: code reviews, penetration tests, and all these things. This largely does not happen today, but it is something we must get to. And it will be a massive problem, because there are a lot more embedded products out there than we have servers and applications in our traditional IT.
Just a few closing comments. Basically, everything we have learned we should apply to the things — to the software in the things. Transparency to the customers is very important: tell them what you collect.

The Samsung TV thing is a great example. Allow opt-out as much as possible.

Of course, the less data you collect, the lower your risk. So data minimization is not only a legal requirement within the EU; collecting less data may actually be a very good way to protect you and your company.
Let the users become part of the risk decision. If they know what data you collect and what actions these things can take, and you let them choose which of these features to opt in to or out of, they become part of the risk decision.

I'll give you an example. Yes, it's great to have a stove you can control remotely and say: hey, I'm leaving work now, just heat up my dinner. Great. If this is of high value to you, then you may want to enable it. At the same time there is a risk — somebody could potentially even set your house on fire — and you, the end user, should make that decision, not the vendor.
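What that could look like in a device's configuration — secure by default, opt-in by explicit choice; all names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class StoveConfig:
    """Hypothetical connected-stove settings: risky features stay off
    until the owner explicitly opts in."""
    remote_preheat: bool = False        # high value, but also high risk
    usage_telemetry: bool = False       # data minimization: collect nothing by default
    firmware_auto_update: bool = False  # owner reviews changes instead

    def enable(self, feature: str) -> None:
        # The opt-in is the user's explicit risk decision, not the vendor's.
        if not hasattr(self, feature):
            raise ValueError(f"unknown feature: {feature}")
        setattr(self, feature, True)
        print(f"Enabled '{feature}' -- you accepted the associated risk.")

config = StoveConfig()
config.enable("remote_preheat")  # a deliberate choice by the owner
```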
And lastly: don't hide behind obscure terms of service.

Again, the Samsung thing — the link is there. If you read those terms of service: yes, you did agree that the Samsung TV will listen to you and send everything you say to some third party. But did you really read it? If you look at the way it's formulated, you overlook it.

Mostly, people don't read this. When I buy a TV, I have no desire to read the terms of service — and even if I do read them, do I really understand what they say? Protect yourself and protect your company by having very clear terms of service that really spell out what data you collect and what happens with it. And that was it. Thank you.
Questions? Sure.

We're going to keep this very informal, even though we have this big room. Carson is going to get a handheld mic, and I have a couple of questions, Hans — we're going to keep this as informal a discussion as possible. But let me ask a couple of things that occurred to me. That was a fantastic presentation, thank you.
On the notion of product security: in operational technology you have employees — a centralized set of personnel using the devices — whereas in an Internet of Things it's essentially a distributed set of personnel. Any thoughts about that? How might we recruit people? Is it an education process? Is there an incentive that can make people adopt behaviors, kind of like hand washing to prevent disease in a society? Are there ways we can start to recruit people into behaviors like data minimization, so that they don't provide as much data into the system — so that where the products are actually distributed, we start out with good data sanitation at the front end?
We saw a slide earlier today — I think it was Carson's; no, I can't remember whose it was — which showed that 90 percent of European citizens are worried about all these things, but 70 percent use them anyway. Right?

And I have two kids, ages 26 and 24, and we have these debates. Yes, they understand all these risks, and yes, they're scared — but they use all that stuff anyway. So I think it's education, and the education has to be everywhere: in the family, in the school. It will be a long way until people make that risk assessment and consciously accept the risks that go with these things.

Yes, indeed, sometimes it's mundane things like WhatsApp. My kids use WhatsApp. I don't, because WhatsApp has no value for me — but for my kids the value is very high. So it's an educational thing. Yeah.
It was interesting: in our school locally in Seattle, they started a program where they had older children teaching younger children about these issues, because of their role as role models. They had high schoolers going into middle school, for instance, to teach the younger students. And the kids would listen to them, rather than to teachers who didn't know the technology as well as the high school students did. It was interesting.
Yeah.
The one thing I want to mention is that there are always turning points in history. I mentioned 9/11 yesterday. There are always things that happen that change awareness — and very often cause overreaction. Maybe it takes a big happening of some sort that highlights certain risks, which will then drive a change in behavior.
Yes, yes, indeed. Those events become a narrative that everyone understands at the same time. Yeah. You also mentioned the question of who gets fired, and that's a really interesting way to put it. One of the things we're going to have in the track is a discussion of the legal and economic issues and the crossover there. In US law, at least, if there's a causation question — if you have a chain of different things that happen and lead to a harm — there are two different approaches.

One approach is called contributory negligence: if I'm 90 percent liable and you're 10 percent liable, I have to pay the entire damages. The other is called comparative negligence: if a court finds I'm 90 percent liable and you're 10 percent liable, then we pay 90 and 10 percent respectively.

What are some of the ways we might think about measuring causation, from an engineering perspective or other perspectives? Because, as you said in your later comment, things are going to get hacked — so there will be liability. Yes.

And if there will be liability, essentially what we're talking about is insurance. We're going to have liability, and the question is who pays. So one of the questions is: how might we measure those relative responsibilities? Is there any work being done on this from an engineering or business angle? I know the legal issues are fairly established, but there's so much unknown in terms of that allocation of risk. Do you have any thoughts on that?
I think that will be a legal decision in the end; it will go to the courts.

If you're in a position like we are, where we develop the product — the hardware and the software — then it's very simple: it's one company that is liable.

I mean, ultimately you go to whoever sold you the product. If it has embedded software that malfunctions, you still go to the person or the company that sold you the product. It's going to be interesting when the software comes from a third party — a subcontractor, a supplier. But then there will be a whole chain: you sue the vendor of the product that malfunctioned and did damage to you; that vendor will assess whether it was caused by something they bought from somebody else. These sorts of things will ultimately have to be resolved in court.
Yeah.
And it's not a technical question. Yeah.
And it's interesting, isn't it, that you'll have all the different variations — all the different situations where something happened because of external circumstances that led to harm in one case and not in another. And it feels, quite frankly, like an opportunity for insurance, because you know you're going to have risk, and the question is the allocation. Yes. Yeah. So people will want to buy premiums for that. So that's a business model.
Anybody who's looking for new business models out there... You also mentioned the experience of the regenerative braking — the recharging of the batteries — and how that slowed the car and changed its performance. One of the things I've heard discussed in the hallways, and anticipate in some of the presentations, is obviously the question of trust — trusting systems.

I've heard people talk about trust starting with reliability. In your circumstance, you had an expectation of the performance of the car; it performed differently, and was less reliable as a result — at least in that instance. Can you talk a little bit about the extension of that? There, you were at least aware that the braking action happened. But with the information flows — as the gentleman was saying, the information going to China — there's no awareness at all, right?

So, on the idea of these systems earning trust: might there be something like making the stream of data visible, or some surrogate for the stream of data, so that people can experience it and trust can be earned? That is, intentionally making it available — not collecting it secretly, but being clear about the stream — so that people can start to see the reliability of the stream going out, and trust is earned.
I mean, these cars have big displays now, right? So it should be fairly simple to just flash a message to the user that something has been changed. If you think about it — I don't know how it is in your country, but when I switch my car from summer tires to winter tires, they put a big sticker on my dashboard that says: maximum speed 160. My car can go 200, but those winter tires are only specced for 160. So they put this big sticker there just to remind me. In later models they even set it electronically, so that the car simply stops accelerating at 160 and sounds some signal. So why not flash something on the display saying: your car has received an update, and this is the changed behavior?
So at least there's a notice at the time of the change.

So we are back to the topic of notice.
Yeah.
Again, that's really a form of education, essentially.

And maybe even consent: no, I don't want this update; I want the car to behave like it did. I don't want that change.
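A sketch of what notice-plus-consent for an over-the-air update could look like in code — the interfaces here are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class UpdateNotice:
    """Hypothetical over-the-air update descriptor shown on the car's display."""
    version: str
    behavior_changes: list[str]

def apply_update(notice: UpdateNotice, ask) -> bool:
    """Flash the changed behavior on the display and require explicit consent.

    `ask` is whatever UI callback the vehicle provides (an assumption here);
    it returns True only if the driver accepts the new behavior.
    """
    print(f"Update {notice.version} available. It changes how your car behaves:")
    for change in notice.behavior_changes:
        print(f"  - {change}")
    if ask("Install this update?"):
        print("Update installed.")
        return True
    print("Update declined -- the car keeps behaving as before.")
    return False

# Usage: the BMW-style recuperation change, as a worked example.
notice = UpdateNotice(
    version="2.1.0",
    behavior_changes=["Lifting your foot off the pedal now coasts instead of braking."],
)
apply_update(notice, ask=lambda q: input(q + " [y/N] ").lower() == "y")
```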
That's right — or exceeding the speed the tires are rated for. Another thought, from when you were talking in general about both the value of the data and the challenge of the data: I mentioned in my keynote yesterday the idea of dual-use technology, and data has good and bad uses.

And I was thinking about other distribution channels of things. Take pharmaceuticals, for instance: you can have pharmaceuticals and drugs that cure illness, but they can also be used in criminal activity, things like that. Do you have any thoughts about the distribution channels of data, given these dual-use notions? Pharmaceuticals are regulated to make sure they are kept in the proper channels — a chain-of-custody kind of idea. Is that something scalable enough that it could work in a data setting? Or might it crimp the use of data — the big-data idea — such that the value wouldn't be there? Do you have any thoughts on that?
I don't really have any thoughts on that. The only thing that comes to mind is synchronous versus asynchronous. I mentioned our locomotives: we do not update their firmware online; we do it asynchronously. Just the fact that you do it more slowly, have review processes, and keep it air-gapped, for example, is something that helps. But is it scalable? I mean, how many windmills do we have, versus how many light bulbs are there?
Indeed.

So, are there any questions from the audience for Hans? Yes, we have a question over here — Jeff, over there.
Yeah. I'm wondering what your thoughts are with regard to software updates when they aren't air-gapped. A lot of times, software updates include security fixes that are important, but they also bring feature changes, and you don't get to select which you get. A device can be compromised pretty quickly just through an update.
Yes, that was exactly the case I mentioned: there was a software update after which the behavior of the car changed drastically when you took your foot off the gas pedal — which is no longer a gas pedal. I come back to consent — informed consent. Ideally, and I don't think it's going to happen this way, but ideally: if you look at your Windows PC, you can configure it so that it only tells you there is an update, and you can look at it in detail. It will actually list every single KB that it installs. So if you have the time and the energy — most people don't — you can go through it and say: yes, I want this; no, I don't want this. But is the average human consumer capable of doing this? I doubt it. And I include myself.

Especially when we each have thousands of devices.
Yeah, exactly. Your fridge, your toaster, your light bulbs — you just can't handle it all. So it comes back to trust. Trust is the key thing: you have to trust that the manufacturer applies a certain diligence in testing these things. And will things still go wrong? Yes.

We're humans; things will go wrong. But we need this trust. There's a great book by Bruce Schneier, my favorite author: "Liars and Outliers," if you have read it. In that book he explains that societies — whether in our age or in very old times — only work when people trust each other. And yes, there will be liars and outliers. The outliers are those who break the rules for good reasons, and the liars are those who break the rules for bad reasons.

And you have to tolerate a certain percentage of liars and outliers in the system, because if you restrict it too much, instead of simply having trust, the whole society collapses. It's absolutely the same here: we need a certain level of trust to make this digital society work. If we restrict it too much, it's not going to work. Imagine: I have to approve every single software change on my light bulb? It just doesn't work. So it only works with a certain level of trust.

And with that level of trust comes the danger of malfunction — and we have to accept that. It's a risk assessment.
I think we have another question here.
Yeah.
I found it interesting to learn that you, as a CSO, are also responsible for product security. In my world it's more like the CISO covers enterprise IT, and product owners cover the product security topics. So what changed in your role when moving from the enterprise IT responsibility to the product things? Because I would say that things like risk-driven architecture designs maybe don't work too well if your customer then carries that risk — what data are we actually moving around, and how critical is it? So: what changed, with respect to enterprise versus product security?
No, I'm not responsible for product security. Okay? When I said CSO, that was a generic term. GE is a large company, so we have that separated: I'm responsible for the enterprise IT, but we also have a CSO for product security. It all rolls up under IT risk.

You know, I mentioned IT risk — there are more things in there. Under IT risk, one thing is enterprise IT security, which is what I do, and there is a CSO for product security. We talk to each other a lot. We share common practices, so the guys who develop product security can learn from what we did in the past — and sometimes vice versa. Talking to engineers is very interesting.

So it's separate, because GE is gigantic. But again: I'm not product security. That was a generic term.
It'll be interesting, as the Internet of Things ramps up: the data that comes off the Internet of Things will feed the marketing function internally, the finance function internally, and a variety of other functions. So one would imagine there might actually be a blending of those roles.
And that's why we talk to each other, exactly.

I mean, think about it: we produce aircraft engines, right? The data we get from the aircraft engines we can use in many ways. And ultimately these are our customers, to whom we may want to sell, say, a different maintenance schedule, because we see that this particular engine wears out quicker — maybe because they fly through an area with more dust in the atmosphere.

So yes, there's a connection between these things, and that's why we talk to each other.
Well — do we have another question?
Oh, thank you.
Hi, yes. You said that you were more concerned about actuators than sensors. But I think, as the Stuxnet example showed, if sensors are compromised and are sending faulty data in, that can trigger all sorts of bad things. So the question is: are people doing enough about protecting sensor devices — making them tamper-proof so they can't be compromised and secrets can't be stolen from them? Is that an area you think we should do more in? It's a bit of a leading question.
I don't think they're doing enough. You're right: of course you have to protect the sensors as well. But to take Stuxnet as the example: changing the sensors — attacking the sensors — was one thing, but that alone didn't do damage. Those centrifuges would still have run absolutely normally. The damage was done because the actuator was attacked; it just wasn't detected in the control room, because the sensor readings were manipulated. The damage was done through manipulating the actuators.

But again, think about light bulbs, thermostats, fridges. If you have these gadgets at home and do some sniffing — I do this every now and then — you would be surprised how much data is sent unencrypted to God knows where, and I have no clue what they do with it. So are we doing enough? No. And I think we are only at the start of this.
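The kind of sniffing he describes takes only a few lines — for example with the scapy library, run with capture privileges on your own network; the port watch-list is an assumption:

```python
# pip install scapy -- run with root/capture privileges on your own network.
from scapy.all import sniff, IP, TCP, Raw

# Ports where traffic is plain text by design (a hypothetical watch-list).
CLEARTEXT_PORTS = {80, 8080, 1883}  # HTTP and unencrypted MQTT

def report(pkt):
    """Print a line for every packet that carries a readable payload."""
    if pkt.haslayer(Raw) and pkt.haslayer(IP) and pkt.haslayer(TCP):
        if pkt[TCP].dport in CLEARTEXT_PORTS:
            payload = bytes(pkt[Raw].load)[:60]
            print(f"{pkt[IP].src} -> {pkt[IP].dst}:{pkt[TCP].dport} {payload!r}")

# Capture 100 packets and see which gadgets talk in the clear, and to whom.
sniff(filter="tcp", prn=report, count=100)
```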
Some mishaps — large mishaps — will come to light and be reported, and then people will start paying attention. I think it will eventually become a selling point: I sell you a fridge which does everything with SSL and certificates, and then you might prefer that fridge over the one that sends unencrypted traffic and tells the world how much butter and beer you have in your fridge.

Well, thank you. Please join me in thanking Hans for his presentation.