Thank you, gentlemen and lady, for joining us. Connie's back with us virtually. I'd like you each to first introduce yourselves, say where you're from, and then give me your opening statement. So there are three parts: who you are, where you're from, and an opening statement.
We'll start with Enrico, please.
Okay. Okay. Sorry.
I'm working for Cefriel, which is a spinoff of Politecnico di Milano and an innovation center that transfers the results of academia into innovation projects, and I work on cybersecurity. My statement is more or less the same as in my talk before: we need to rethink cybersecurity with more focus on the human aspects.
I nearly forgot that I'm here. Hi. I'm a security awareness evangelist and head of channel for Cyber MRI.
And I think, as long as we don't treat the employees respectfully, as long as we don't treat them the way we ourselves would want to be treated, we are not able to use any technology to motivate them to do security awareness or to help themselves survive. Let's say it like this.
So my name is Jonas. I'm the managing editor of Tagesspiegel Background Cybersecurity, a daily briefing on cybersecurity. And my opening statement is that cyber awareness is not something you can just tick a box on and then it's done. It has to be a constant challenge for a company and also for its employees.
Yeah. My name is Boris Boster. I'm leading the governance, risk and compliance department at E.ON, a utilities company in Germany. And my opening statement would be: plaster the world with awareness, because we all need to become a human firewall.
And, sorry, we should have begun with you, Connie, but hey, we'll save the best for last. Shall I spin it that way? We've saved the best for last.
What a beautiful entry. Thank you. Connie McIntosh, head of security for Ericsson's Market Area. My statement would be that it is time we brought the social and psychological sciences around human behavior into cybersecurity and realized that this is much more than a technological challenge. We've been fighting with technical tools and mitigations and it's not working, so it's time we really broadened our horizons.
Great.
Thanks. At the risk of getting a very similar answer to what I got this morning, I'd like to start with this: seeing that this is so basic, seeing that so many things are traced to the human factor, and I'll start with you, Connie, why is it so poorly done? Why is it so often neglected or overlooked, or people just don't get round to it? Perhaps we understand the importance of it, but hey, there's just so much else to do.
Yeah.
Look, I still believe it's not widely recognized, despite the human factor being, year on year, you know, 95% of the cause of all breaches. And I still think it is not recognized that we cannot solve this issue with technology alone, that really it's the human that we must engineer rather than our systems. We are really good at systems engineering now, I think, but we can't stop the human factors, so we must start engineering that side.
And I think there is a way we can do it, but we're not there yet; we're not evolved enough to recognize that this is the challenge.
Great, thanks. And I would agree. I'd just like to ask the panelists in the room, though: why aren't we recognizing this, and why is it taking so long?
I mean,
I'm not sure we don't recognize it, but we are just too slow, and it's also too hard to measure. So when is the human really aware, and how much awareness is enough?
Just this morning there was the perfect example; I have it on my phone. I sat in the hallway, and we are at a cybersecurity leadership summit, and someone left their chair with a laptop unlocked for 10 minutes. Hello? Where are we?
So awareness is one part, and it's hard to measure.
And we also cannot really influence people and make them disciplined and constantly aware, because we are just humans. That is what makes it so hard, because we will make mistakes anyway.
Okay. You wanted to say something?

Yes, thank you. I just wanted to agree with you and also emphasize what Connie just said, and what Enrico said in his speech earlier.
I think the main problem is that we didn't focus on things like psychology, behavioral studies and nudging, which are important parts of bringing awareness into a company and to its employees. I mean, just think of it this way: when the seat belt was invented, it took years of campaigns to convince people to put it on, and that was with the risk to their own lives, right?
So it needs to get into behavior. Getting better at technology is all well and good, but a focus on technology alone is not enough; we also need to change the mentality and the risk awareness. And this is done with a whole bunch of different approaches.
Enrico, from your studies, did you come across any way that you can make this real, though? Because, I mean, I know in Britain there was a strong campaign around seat belts and so on, but it was very graphic; we had very graphic images of crash dummies and that sort of thing.
So what have you seen that is effective? Because you've studied people's responses, you've done a lot of research around this area. What is an effective way to get that message across, that yes, you are important in this equation?
Yeah, well, one of the things that I observe a lot of the time from companies is, let's call it this way: they are most of the time aware that humans are vulnerable and that things like advanced social engineering attacks exist, but at the same time they have no way to, let's say, solve the problem. So most of the time they are left alone, without concrete solutions, without a way to concretely do training, without a way to concretely assist the humans in one way or another.
So it is a sort of open problem: they know they have it, but they also know that there are no solutions for it. And, for example, just one thing about training.
One thing is doing training; another thing is doing training in the proper way. For example, if you do training for adults, there is a discipline called andragogy, which is about how to train adults. Most of the time it's not used in cybersecurity, but it is used in other human sciences.
Okay.
No, I think it starts somewhere else, a little bit earlier. If we don't make sure that our employees feel good about what they are doing, if they don't have a 'we' feeling, like 'I protect my company because I'm a part of my company, and I take responsibility to be aware that there's something out there', then as long as we are not able to do that, we can install as much technology as we want. It doesn't help if the employee doesn't give a shit about the company, excuse me for saying it like this, but that's what I see out there.
You need to motivate the employees to become a part of what they are doing and not to be just an employee. That's my point of view.
Okay.
So, Connie, from your statistics earlier, I think you said phishing was one of the highest problems. I mean, we've said a lot about phishing, but is that still the very biggest threat from the human factor, or, you know, where should we be concentrating our efforts?
Yeah, no, it absolutely is still the largest threat vector, still called out by the ENISA 2022 threat report, which was released last week. And I think it still shows that 90% of APT actors actively use phishing as a method of infiltration into networks. So there is absolutely still a massive problem in that regard: despite having email filtering in many organizations, they still get through.
Okay, so
But Connie, when you're doing phishing, that is the first step. You get the victim on the hook and you have to reel it out of the water. So I would say that's level one. Why don't you do lobstering? That's what we are doing: we let the victims go into the cage themselves, you know.
Absolutely.
So, okay, I don't want to spend too much time in the problem space, so let's look at the solution side. The question I want to ask is: is modernizing and automating IT processes enough?
I mean, I guess the theory is that if you make it safer, like the automotive analogy we had yesterday, where over time we've now got safety glass, we've got seat belts, we've got superior braking systems and so on, then we're not putting so much reliance on the driver keeping themselves safe; we're removing a bit of that responsibility.
So is that the way forward in IT, to automate systems, or do we need to go beyond that, or do something in conjunction with that?
I think it's a good example with the automated car, but you still have to ask the question: do we need a driver's license when you have an automated car? And I think you do, because there will be, or might be, cases where the car is not driving automatically.
So you need to be sure of what you're doing. And even if technology gets better and better, I think we still have the question of the mentality of being responsible for the security of your company. But also, as Connie said, one person can be a huge threat for the whole company, which is something we need to address even more, and also in politics, for example: to realize that consumer security is directly linked to the security not only of the nation, but of companies as well.
Just a little thing: I saw a picture with a lot of ducks on it and one seal, and the AI was not able to identify the seal. I can. That is the answer.
Okay.
Yeah.
You need somebody who's helping them. Technology alone is nothing, you know.
Actually, just automating doesn't help. The question was provocative, definitely, but it won't help just to automate everything, because the human still needs to deal with the automated processes; we are still interacting. So wherever we have a human-machine interface is where the failure will start, and therefore just automating the human away?
So, nope.
So, Enrico, I guess there is only so much automation can do and so much processes can do. The flip side of that coin, and I think you touched on it briefly earlier, is security awareness and training. But how can we make sure that that's effective and meaningful, and get buy-in from both stakeholders, i.e. management and end users?
Yeah, well, to say something about training: in my view there are two limits to doing it effectively. One, which could be solved, is that we need to involve the human sciences, because we are dealing with humans. So we have to bring into cybersecurity more profiles like psychologists, cognitive scientists, design and marketing experts, all the types of people who are not used to talking with, let's say, security experts, and we need to create a sort of shared playground where these things can be discussed.
And the second thing is that in the previous talk we tried to demonstrate how easy it is to push the pedal on deception and how easy it is to deceive people. On the security side we have several constraints, ethics, legal and so on, while the attackers instead do not have any type of constraint, so they can push the pedal up to 100% and deceive people, and we are not able to test people in the same way.
On the technology side we can, because we can replicate malware, see if the systems are resistant and so on; on the human side, we cannot do this completely.
More from anyone?
Well, I think there is not much to add. I think with continuous training we can develop a security culture, and that is more in the interest of what we all, I think, want to achieve, because it's not just the phishing mail. Even if we manage to get fewer than 1% clicking, and there will always be the 1% that clicks on the link, we need to get the people who are phished to report the emails first.
That is at least my opinion: if someone reports an email, the organization has the means and the possibility to react, to ban the mails from the accounts and the Exchange servers, to delete them, to follow up on the URL that is requested, and so on.
But continuous training also means our software developers need to be made aware of what they can do for the company, and here I'm totally with you: we need to have this mindset of 'I'm doing this for the company and for myself, to make the company more secure', because all of this patching and coding is all a kind of mindset.
And only if we can increase the security values in the mindset of the people, and gamification is probably one of the things we can work with, only then can we be successful.
Okay.
Connie, in your earlier presentation you unfortunately had to rush the end because you ran out of time. I guess what you were driving towards is this whole question: what does truly effective use of people as your front line of defense really look like, and how is it achieved? So maybe you could just get back to your conclusion and walk us through it.
Yeah, look, I think we just touched on some of the topics: design thinking needs to incorporate the social factors, the human factors, the psychological factors. But, you know, knowledge is power. We know that, and we know that average users don't understand how big the risk is.
I mean, think about how many of you actually tell users whether you've been breached, whether or not your network has been penetrated. Most of the time we find organizations hide this information, like it's a big secret and a bad thing. And if you don't inform people about the real threat, they don't really see it as real to them. I would say the other human problem is trust. People trust implicitly, unless you're security people like us, who get how bad it is.
And we know that you can't trust. I think security people generally think differently and assume everybody else thinks with trust in mind, but humans trust. And that's why, in terms of social engineering and phishing, you can lure people very easily, because people are trusting. I hate to say this, but I think an absolute carrot and stick is needed. I think you need to name and shame people, you need to make it visible and a deterrent, and you obviously need to encourage those who are doing a good job, because leading by example is how you build that culture.
You know, there are always people who will say AI and automation are a great thing and they stop some of the human errors, but there are humans coding, and there are thousands of hackers hacking against that, and they only need one bug, and it happens. So we just need to focus a lot more on it, and I'm sure this great panel has even more ideas on that, so I'll let you guys speak to that.

Thanks.
Yeah,
I want to say one more sentence, because what I think we have to think about is this culture of fear that we always have around phishing and all those things. We have to change something. We have to give the people who report an email, for example, a benefit. They should have fun with it, because fear paralyzes and fun gives dopamine, and that helps us make them more... beware? Is that the right word? Beware. Aware.
Aware, yeah. Awareness works. Yeah.
I'd like to stay with this as long as we can, this idea of what the first line of defense looks like.
I mean, can any of you give great examples from your organizations, or any organizations you've worked for? You don't have to reference them, just tell us what you've seen work really well. Because, you know, Connie said name and shame, but one of my earlier speakers said it's way better to use fame, to praise the people who do really well at things rather than pointing out the ones who don't do so well, because then that engages them, and the cybersecurity folks come across as friends rather than enemies.
And you don't want to alienate your user group. So I'd really appreciate some great examples of things that you've found have worked well in organizations, or something you've come across in your studies.
Maybe I can add to this, because we just ran our human firewall cybersecurity month in October, in alliance with ENISA, the BSI and so on, and we had it completely virtual. What was really impressive: we have been running a live hacking show for a few years now, with different modules, and these live hacking shows are still the hit when it comes to what people get interested in. We had several occasions where the live hacking was by far the most frequented session. And what is always interesting is what really happened.
So we actually do pen testing and purple teaming, we use external hackers to get into the company, and then we speak about it, we make it public knowledge in our company and spread the news of how we got hacked.
And this is something people get interested in, that they learn from, and where they take note that hackers don't care whether you are too busy, hackers don't care whether there is an old, end-of-life system or whatsoever. They take note of it, and that makes them more aware and more curious about the topic, and then we have nice advertisements and so on. So they love it.
Yeah, no, that's a great point, and I really like that idea of letting people know what is happening and what the threats are in their own environments. And also, we heard from one of the earlier speakers: make it real in their environment and their experience. And I think that echoes what Connie was saying earlier too.
I dunno, you wanted to say something?
Yeah, I wanted to jump in and say, in addition to Boris: I think it's very interesting when you talk to companies that were affected by a ransomware incident, for example. They tell the story that the whole team just gathers, and the people in communications, in legal, even the people in manufacturing, are so aware and want to be part of saving the company, of helping. And I think you should use this energy before it happens, in something where people can learn how it works and what they can do even if they're not in IT.
I think that's a great idea.
What we did in September, for example, for a customer: we ran the security awareness training in person with 35 people, and we showed them what happens if we have credentials. We just created a man-in-the-middle attack in about six minutes, live on the table, and they were really shocked at what happened with their credentials.
And then, when it doesn't cost only money, when it costs, for example, a fake profile on LinkedIn, because you are suddenly there a second time, that kills your brand. We do things like this that really make sense for the people, and then they understand what needs to be done.
Yeah, and from your studies?
From my studies, what comes out is that we have to stop victimizing the persons; I mean, stop blaming rather than victimizing, stop blaming the persons. Because at the end of the day, take a wall of shame as an example: apart from the fact that putting up the shame is illegal and violates the relationship between the employer and the employee, the point is that most of the time you are blaming people, and the design is also made to blame people.
So, you remember: 'change the password, otherwise you will be locked out', and so on. There are much more constructive ways of presenting the problem. For example, in our training sessions we most of the time merge together the personal advantages for their own private lives, for their children,
and at the same time for the company: first of all letting them understand, for example even with simulated attacks, how easy it is, but at the same time linking their private lives and the energy they put into their children with the company, trying to create a sort of virtuous circle.
Yeah, I think that's also one of the examples that I've heard of, from a banking group in the UK, where they did cyber awareness for families.
So they didn't even do it in the work setting, and they said, you know, these are the principles for how you can keep your family safe. And then they made the analogy that, well, your company is part of your family by extension, and you have the same responsibility towards your company.
I mean, have you seen things like that, and is that what we're talking about?
Boris?
We are not doing something like that, but of course E.ON is too big for this. We have 70,000 employees across Europe.
But of course we try, like you said, not to blame the person but to tell them: hey, even if you clicked on a phishing mail, you have been tricked. You are a victim, not someone who needs to get blamed. You got tricked. And this is a mindset that is also coming through.
So, one of the things I was driving at earlier when I was asking why we don't do it, because this morning, Boris, you were on that panel where I asked why we don't do it, and the answer was, well, the business gets in the way.
And I just wondered, here, in this context as well, the broader human factor, is that also true? And how do you engage with the business, and how do you make them invest as much as they would in technology? Because it's perhaps easier to sell another bit of kit to stop the x, y, z attack.
So how do you sell greater investment in people, and in things like social analysis and psychology and all the things we've been talking about?
My investment, if I understood your question right, is that I help my employees to become a member of this human firewall, let's call it that. Because if they start to become responsible themselves, if they have this 'we' feeling, like I said at the beginning, then I have this awareness in the heads and in the hearts, and they play this game with us.
It's not an abstract thing, it's a real thing, you know. Okay, maybe I got the question wrong, but...
We try to make it tangible. So whenever we do our live hacking, we perform it, and we don't tell them this is to secure E.ON, just E.ON; we tell them, hey, this is something you can also use in your private life. When we talk about password cracking and how to create good passwords: hey, it's your Amazon account that is at risk. That's not E.ON, that's not your company.
That is your private life, your account, that is at risk as well. And if you use a password manager, then use it wisely, use it stringently and constantly. So we try to make it tangible and also give our employees the means to act on it.
Okay. Well, as was inevitably going to happen, we're rapidly approaching the end of the time slot.
So I will start with you this time, Connie. Just give me your key takeaway or takeaways: if people watching or participating in this session take away nothing else from it, what would you like that to be?
Really, really educate your users about the threats and risks that your organization is under and don't be afraid to share incidents widely.
I think we need to debunk this. You know, I think everybody knows that at some point everybody is going to have some sort of major incident, so let's be more open with our people, so that they take it as real to them, because only when it is real to them is it going to actually mean something.
Enrico?
Yeah, I would say, from my experience, and I'm working mostly with medium-sized companies, the first thing to do to prevent social engineering is to use social engineering to convince the people inside that social engineering needs to be investigated.
Because most of the time I have to convince not only the C-level but also HR, the lawyers and others; there are a lot of other roles in the company that do not consider social engineering a problem at all, because they come from years in which someone told them that the only problems are antivirus and malware and so on. So the first thing is to social engineer your own company and convince it to investigate the problem.
And while we're still with you: I wanted to ask you earlier, was there one thing in your research that surprised you? Once you'd gone through all this, was there anything that surfaced where you thought, well, I really didn't expect that, or was everything in line with your expectations?
Well, one of the things that surprised me is the difference between the claims of some of the training companies, which say that if you keep up the training, the click rate of your phishing campaigns drops drastically over three or four weeks or a month. What I see instead is that if you do phishing campaigns that are very targeted, or even ad hoc on a single person, the average level of people who click remains the same over months, if not years.
And today, after 30 years of testing like this, I have tested more or less 200,000 persons, and the average is 30%.
That is very surprising for me in every sector.
Okay, that's really interesting. Okay, your closing statement?
My closing statement is that we must make people responsible for what they are doing, I must repeat it because it's absolutely important, and let them become a part of the whole thing. And the whole thing is that every human needs to be aware that the enemy could be your neighbor, for example. Yeah.
One second: we regularly send individual phishings to our employees, and then we call them and ask them, hey, what was the motivation to click on this one? That way we learn much more about the people, and then we show them how they can do better. That's what we do in the company.

And you also made the point to me earlier about identifying with the company; you said that was very important.
Yeah, that's what I repeat the whole day, the whole night: make your employees become a member of the company, not just an employee. It must be a family. Yeah.
Sorry, not just sheep. Yeah.
You know, they often call me the sheep, you know.

Jonas?
Yeah. I think with all the regulation coming up, like the NIS2 Directive, which will mean so many restrictions and new rules for companies that are not aware that it is coming, and with more new regulation on the way, I would love for all these learnings to reach those companies, so they understand that being cyber resilient is not only about talent and technology; it's also about investing in your human resources.
And so what does that look like, though?
You know, if you say investing in human resources, what does that look like? I mean, is that just training, or is it other things, enabling people? How do you empower people?
I think something very important is, and I don't know if you have that too, but I have to do the data security training once a year, and it's like a video which is played, and yeah, I have to do it, but I don't think I get anything out of it. So I just need to be aware myself.
And I think we need to understand that getting people on board with the security culture is something that requires identification, constant training, being aware of the threats and also being aware of your personal risk when you are digital.
But the training once a year, you can forget about it. Really, it's rubbish; let's do less of that.
Well, you can't forget about the training once a year. Once a year you must do it, because you need to be compliant with regulations and so on.
But if I want the people to learn something from this session, it's: embrace the people. Take them where they are and make them aware that they have the power to change things, so they can make the world a little more secure.
You could also sum it all up as 'plaster the world with awareness', because if there is one person less who clicks a link, the world is a safer place. The second one: create an environment of trust with your employees. They don't need to fear, they must not fear, that they will face repercussions if they click on a link, even if there is an incident behind it.
They have been tricked; that's fundamental. And the third one: do provide continuous training, but not only on the typical awareness topics. Think big: think about the patch processes, about your business owners who need to know how to classify their applications, and so on.
So think bigger, make this a holistic approach, and then create a culture.

But in your experience, who drives this?
I mean, you know, you don't just create a culture; it's something that has to be agreed throughout the organization. So what is it, top down, bottom up, both? How does that work?
I think if you don't have role models, it won't work, and the role models must be the leaders.
So yes, you need their buy-in, period. But you need to go bottom up from the activities and from the plan and program you are creating, because operations is where things go wrong. It's not in building the strategy, which is what the CEO and the CISO do.
It's down there, where the admin clicks on the link and gives away their credentials; that's where it goes wrong. So you need to pick people up from operations and then go bottom up, and also include the specific cases. Someone in HR probably needs and deserves a different awareness training, a different phishing campaign, than someone in sales or, in our case, someone working in a substation or a control room. They have quite different job profiles, quite different tasks, and therefore pick the people up where they are.
Okay, thanks. Did you want to add something?
I wanted to say: a lot of companies do not have the resources or the time to do things like that. But there are a lot of companies out there offering this as a managed service.
You can use them, and that often helps, also for small companies, for example. Yeah, sure, I think so.
That is also another thing that needs to be underlined in one way or another: even the most trained person will sooner or later fail, because we are not 100% efficient all the time. So stop doing risky things at the wrong time, for example late in the evening, and things like this. You can only probabilistically reduce the risk, not eliminate it altogether.
Okay, great. Thanks very much, everybody. I think the general message was clear: once-a-year training, we all know, doesn't work, and it's really important not to name and shame, but perhaps to work on the wall of fame instead, and to involve people, include them in the community of the company, and focus on the individuals and their circumstances, where they work and the kinds of threats that affect them, rather than just theoretical threats out there. Tell them when things go wrong. Be honest.
I think that goes to trust: be honest with your organizations. Okay, please join me in thanking the panel for their wonderful contribution.