Yes, we have a super long title about proving your success, and it's hard to spell out anyway. This is one of those sessions where I wasn't exactly sure whether it would be super packed or super empty, because on one hand it's a totally boring topic, metrics. On the other hand, it's interesting to see that some of our most frequently downloaded reports are about KRIs and KPIs.
So what I tried to do for this session is not to put together a list of 100 KRIs and KPIs; there will be relatively few. But you have research access, and there's a report on KRIs and KPIs for cybersecurity, and there's one on KRIs and KPIs for identity management and access governance.
I want to talk more, and look more, at what it means to come up with meaningful things. How do we do it right? What is the stuff behind that?
So I'd like to talk about the what, the why and the when, so the value of KRIs and KPIs; the how and the where, so where do we collect them and where do we display them; the which, a little bit, so which ones should you select; and the who, so who should work on these topics of KRIs and KPIs. It is a journey, so to speak, across all the areas I found to be relevant around KRIs and KPIs, and that's where I want to start. As usual, I think for these things it's always good to start with a little bit from Wikipedia.
What do we mean by KRI and KPI?
A key risk indicator is a measure used in management to indicate how risky an activity is, or how much at risk an activity is: a metric used by organizations to provide an early signal of increasing risk exposure in various areas of the enterprise. I think, yes, it's a metric that helps us understand whether risk is going up or going down, where we are with our risk status, et cetera. And a performance indicator, or key performance indicator, is something which is more about success: the progress we are making on certain activities, such as a project or a program, et cetera.
I'll talk more about this in a minute. I'll give you a fair chance to take a photo; all the slides should be available for download afterwards.
Anyway, a question? Yes. It might come with a little bit of a delay, but these slides definitely will be available, and I think there will also be a video available from this session. So KRIs and KPIs are complementary, and some things can act as a KRI as well as a KPI. In some way they are inseparable.
So I think it doesn't make much sense to say, I do a KRI project and then I do a KPI initiative. I think you should always treat it as one. And by the way, if you move more towards the front row, it might be easier to read what is on the screen. I saw some of you struggling with this. I know it's always the same: the first row is usually empty, everyone tries not to sit in the first row, but feel free to move closer to me.
I relatively rarely bite during my presentations, so you don't need to be really worried about that.
Now, of these measures, some are better indicators for risk. The number of orphaned accounts in a directory or a system is a relatively good indicator for risk, a strong indicator when it's about risk, and others are better indicators for performance. How long does it take to provision a new user? That's a strong indicator for performance. But if it's about onboarding a user, the risk only occurs once the user is onboarded; as long as the user isn't onboarded and doesn't have any entitlements, there's little risk associated.
So that one sits probably more down there, while the average time for deprovisioning goes into both equations. How long does it take to remove entitlements? That's a risk thing, and we can clearly discuss whether it's a little higher or a little lower, but it's also performance: how good are you in your processes?
So I think it can make a lot of sense, when you think about indicators in general, to create such a matrix where you place them roughly. You don't need to be super exact, but it truly helps you to understand: the ones which are really far up here might still be relevant even if they're not relevant for performance, because they're the best risk indicators, and the ones up here on the right are your best general-purpose indicators.
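To make that placement concrete, here is a minimal sketch in Python. The indicator names and the 0-to-10 scores are purely hypothetical educated guesses, not a reference list; the thresholds are just one way to cut the quadrants.

```python
# Minimal sketch: place indicators on a rough risk/performance matrix.
# Scores from 0 to 10 are educated guesses, not exact measurements.
indicators = {
    "orphaned accounts":            {"risk": 9, "performance": 4},
    "time to provision a new user": {"risk": 3, "performance": 8},
    "average time to deprovision":  {"risk": 8, "performance": 8},
    "number of printed badges":     {"risk": 2, "performance": 2},  # bottom left: drop it
}

def classify(scores: dict) -> str:
    """Rough quadrant logic: keep strong indicators, drop weak ones."""
    if scores["risk"] >= 6 and scores["performance"] >= 6:
        return "general-purpose (keep)"
    if scores["risk"] >= 6:
        return "KRI candidate"
    if scores["performance"] >= 6:
        return "KPI candidate"
    return "weak indicator (drop)"

for name, scores in indicators.items():
    print(f"{name:32s} -> {classify(scores)}")
```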
You might have some where you say: okay, I add this one for the KPIs, for performance. And behind that there is always the thinking: limit the number of indicators you have, because you need to measure them, you need to report on them, et cetera. The ideal would be to have a relatively low number of indicators in place, but good indicators, because then you have a good balance between what you get out of it, the information you gather, and the effort you put into it. And you clearly should avoid having indicators that sit towards the bottom left: an indicator that's not a good risk indicator and not a good performance indicator doesn't make sense for you. I think this is a simple exercise. I'm an analyst, and analysts love quadrants and matrices and spider charts and so on.
Yes, but honestly, I think the advantage is, especially when you construct the matrix right, that everyone of us is used to thinking: upper right is good, lower left is not good. It helps you to visualize some of these things. And as I said, don't go over the top with the matrix in the sense of: is it 8.2 or is it 8.4?
That doesn't matter; here it's rough. And you know, analysts always do educated guessing, as we call it, which is better than guessing.
So we know something. And at the end, part of it is: yes, you can measure things, but sometimes you need to do an educated guess, and here an educated guess is definitely good enough to look at which of the indicators might be good ones for you and which are lesser good ones.
And as I said, the advantage is fewer metrics for more insight. That is the target to start with.
And the approaches are also quite similar: it's metrics for improvement. We measure, we analyze, we report, and then we improve; either we improve by mitigating risks, which is the improvement, or we improve by reaching a better performance. It's the same thing we do everywhere at the end of the day. Sometimes we don't know that we do it, but we do it: we look at things and think, okay, this is not so good, how can we improve?
And this is, at the end, the point which always needs to be kept in mind: we always start with the measurement, we must analyze it, and we will have to report on some of these things. And every one of you probably is used to being asked, for one or the other or maybe both, for risk metrics.
Yesterday evening we had our CISO council, and I think three or four couldn't attend because they had a supervisory board meeting the same day, because every CISO has to report regularly these days about their risks, and identity risks are cybersecurity risks are business risks. So it's about reporting. And then it's about improving, because when you come up the next time and say, okay, it got worse, it will be an unpleasant supervisory board meeting, or it'll be unpleasant for you.
If your CISO has to go to the supervisory board meeting and comes back to you and says, okay, that wasn't very pleasant, then that's no fun for either of you. And as I've said, we need both, because in some way both are about doing the right things.
So: what do we need to do to get better or to mitigate risks, and are we doing the things right?
KRIs are probably more on the "doing the right things" side of the equation, while KPIs are about: do we do the things right? Are we really getting better with all the investments? I don't know how many of you have been in charge of an identity management program already; who has been in charge of a program? Then you might have received the question: oh, we already spent X euros, why aren't we done yet? Or: where did this money end up?
A very common question: oh, we spent 17 million for identity management and we are still struggling. And then it's very good to have metrics that help you show that things have changed.
So this is an important thing.
And there are different layers of KRIs and KPIs. I think we as identity or tech people, and many of us are, have a tendency to only look at this one layer, the operational layer. And this is one of my key messages for this talk: think broader, think bigger. Don't just think about the technical. On the first slide I have technical KRIs and KPIs, but it's more, it's definitely more. We need to look at other things, because we need to demonstrate that we are moving forward in our IAM program overall, or in cybersecurity.
You can apply this to every area; it's pretty simple. These principles and these methods all work across everything, so there's nothing where you could say, okay, it doesn't work here. The point is, it's not just operational risk and performance, because the business leaders will focus on the strategy.
Do we make progress? Do we solve the strategic themes? Are we ready to support the new digital service we need to deploy; do we have the part of identity management ready that helps us in doing so? The program leaders ask: is my project on time?
The line managers focus on operations. And for the strategy it is, for instance, the state of the strategy: where do we stand? I'll talk about this in more detail in a minute. The execution, enforcing the strategy: we have a plan, are we following that plan, how far are we in this plan, which gaps do we have, where are the things we need to do in identity management to be ready to serve the digital business over the next years which we haven't done yet? The progress and speed in implementing; the maturity level we have achieved.
And also the level of maturity we have achieved compared to standard maturity metrics or to our peers.
It's interesting: we do a lot of benchmarking projects for identity management or for cybersecurity, and organizations are way more interested in knowing where they stand compared to their peers than in saying, okay, we are above CMM three, or we are reaching CMM four. The main question always is: how do we stand compared to our peers?
Because then the leaders say: okay, we're roughly as good as our peers, or maybe a little above, depending a little bit on the organization. Some say, we just want to be at the level of our peers; others say, we want to be always a little ahead because we are the coolest and the most innovative ones, et cetera. But this is what they are most frequently interested in. And I think this is one of the reasons they also tend to ask us, because we see so many organizations and so many contracts.
So we can come up with, I think, quite a good perspective on where you stand compared to your peers, leadership and competition. At the end, this is part of the peer thing here. For the projects, it's the typical project metrics: on time, at budget, in quality, complete and distinct. What do I mean by that? That's maybe the less clear part. Does this project or program cover what you want, and does it do it in a way that you don't do the same thing you already have somewhere else?
So distinct would mean: do things which add to what you need, instead of having too much overlap. This is maybe particularly important for identity management; take consumer authentication versus workforce versus partners versus things, et cetera. There might be some quite big overlaps.
It's quite good to avoid these overlaps where you can do it in a meaningful manner. For cybersecurity it's even more so. For the ones who are around tomorrow morning, I'll do a 30-minute session on reducing the number of species in your cybersecurity zoo of tools, so that's the non-vegan session of the conference, so to speak.
In that session I will talk about this problem, because in cybersecurity we very frequently have the situation that we have far too many overlapping tools, way more than even in identity management. User friendly: yes. At the end of the day, I rarely talk with companies about success in identity management without them saying, okay, one of our problems is that users complain. Yes, complaining users, users which are unhappy, are a problem, because this is what pops up immediately.
And you need to be user-friendly, and you need to be extensible today. This is, for me, a measurement for success of a project: is it something you can grow, you can extend, you can integrate, or is it something where you say, okay, it's done and I can't do anything more with it?
So, I always like to bash MVP approaches a bit.
An MVP approach is good when you can grow from that MVP. When you say, okay, I demonstrated that this is feasible, but unfortunately I can't take the next step from what I coded, then something went wrong. So it must be extensible. And then we have the operational layer: the operational risks, so downtime; the performance, so how fast are you in onboarding people; the quality, so how good do you do it, like how many orphaned accounts do you have?
So that would be the orphaned accounts, that would be onboarding and offboarding performance. But also, as I've said, things like: does the system work as expected?
And you know, when your system is down again and again, then you have some trouble and you have questions to answer. So this is what I see here. Moving forward: what are the benefits of KRIs and KPIs?
I think it's management and it's actions: we can improve the way we manage our operations, our projects, how we do strategic management, and we can prove success.
And this proving of success is, at the end, a very important point. I've seen this so frequently, and I touched on it already: someone says, okay, where did the money go? And then you say, okay, it got better. But if you can't prove it, you're in trouble, because then they can say, okay, blah, blah, blah.
No, you need numbers. Efficient management means management by risk and management by performance: focusing on the areas that require your attention. Where are the biggest risks? If you have a couple of metrics, you see where your biggest risks are; and if you have a couple of KPIs, you see where your weakest spots in performance are.
And that helps you to say: okay, I put my focus on that. That could be project performance: okay, this project is hanging, then let's throw our best people on that project.
For instance, if it's required. Or: we have a gap here, we have a huge risk of not delivering what the business needs over the next two years, so let's focus on that, let's concentrate on that. And you know, I've seen this in some large programs, where the overall lead came and said: I don't want to talk with you, I don't have time, your project is running, I don't care; I focus on the things which are going wrong. This is exactly what metrics help you to do. The other benefit is targeted controls. You need to understand what your controls are,
because a metric is very close to a control. And the term control is a bit in trouble.
We have the problem that the term control usually gets translated, at least in German, as supervising, instead of what "to control" in English means: making things better, taking the right action. So the typical German connotation of control, due to the wrong translation, is really something negative, even while a control is something positive in theory. A good controller, for instance, shouldn't just say: hey, your numbers have come down. The good controller should say: this is the measure to take, this is my advice. We all know that most controllers don't do it, but that's a different story.
In German, we have this nice term of pea counting for that, counting peas.
Anyway, it helps you to implement the right controls. What are the things which put you in trouble? Selecting the right KRIs and KPIs is also about what you will be asked. Think about what you will be asked. I talked, for instance, about convenience, et cetera, so do you have something like a customer satisfaction rate that you measure by regular queries?
If this is something you always get asked, then implement something: say, once every half year or so, I send out a poll to selected users asking, how do you feel with it, how satisfied are you, et cetera.
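As a minimal illustration of that poll idea, here is a sketch in Python; the answers below are hypothetical, and the 1-to-5 scale is just one possible choice. The point is only that poll responses can be boiled down to a single score you track over time.

```python
# Minimal sketch: turn half-yearly poll answers (1 = very unhappy .. 5 = very happy)
# into a single satisfaction score you can report and track over time.
poll_answers = [4, 5, 3, 4, 2, 5, 4, 3]   # hypothetical responses from selected users

satisfaction = sum(poll_answers) / len(poll_answers)                      # average rating
share_happy = sum(1 for a in poll_answers if a >= 4) / len(poll_answers)  # share of ratings >= 4

print(f"Average satisfaction: {satisfaction:.2f} / 5")
print(f"Share of satisfied users: {share_happy:.0%}")
```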
It helps you prove success, and this is, I think, the point for both quick wins and big wins. I talk a lot about quick wins, for sure, but I also talk about big wins. I believe everyone should always define a list of quick wins and a list of big wins.
The problem is: if you only talk about what you want to achieve in the future and don't have the quick wins, then you end up in this "oh, 17 million spent, where are you?" situation. You might have achieved a lot of quick wins, but you need to prove that the investments are well spent. It goes back to what I already said: tangible improvements.
It also helps you, when you measure regularly (some things you might measure very frequently, others you look at in longer intervals), to get better day by day by day. This goes back to the management thing: understanding, okay, this is where I need to focus. Okay, so: strategy. I created this slide for the ones of you who are still sitting in the back of the room, to make it impossible to read from back there.
And I put in some things which some of you might have seen.
I'll try to talk a bit more about some of the aspects, and when you look at the printouts, you'll see it better. From a strategy perspective, at the end it's about maturity and completeness; this is, I think, where it nails down. What are your risks caused by gaps? What are your risks caused by non-execution of a plan? Or what is your performance compared to peers in the market? I touched on it a little already. At a higher level, this is commonly more of a C-level thing:
for the CISO, for the CIO, for the chief digital officer, for the CEO, for the chief financial officer, et cetera. These are the people you meet when you have a larger program in your steering board, or some of them at least. And you can use this across different domains, like digitization, cybersecurity, IAM; it's always the same what you're doing.
So once you're good in KPIs and KRIs and what is behind and above that, or however you'd like to phrase it, you can apply it everywhere.
And one element is, from my perspective, the state of the strategy, the mission, the architecture: where do you stand with that? It's interesting: in our webinars, and I also hear this for the online video channel, we regularly run polls, and one of the polls includes, among four other options, the option "most important priority: creating a comprehensive blueprint for identity management". And this is usually number four or number five on the list when I look at the poll results.
So it's not the first thing organizations mention when they're asked what they are doing. The positive interpretation could be:
the other 92 percent already have it done. But I've seen too many projects to believe that it's really 92 percent which have a great blueprint and architecture and vision and all that stuff in place.
In identity management it's frequently just a lot of disjointed initiatives. And I put in here this picture of what we call the identity fabric, as sort of a sample.
This is something we use in many projects, for instance with customers, to say: okay, which capabilities do you need, into which services do you group them, which tools enable access of everyone and everything to every service, seamlessly? How does it integrate with your legacy, with your digital services, et cetera? And you can clearly measure: once I have this plan, where do I stand? You can look at the execution level; you can compare the maturity level of the implementation.
In that case it's compared to a sort of standard maturity level, between three and five, but as I said, you can also compare to your peers. As I mentioned, organizations really love these things, specifically when it's about peers. You can also look, another methodology, at how good you are regarding emerging technologies.
So where do you stand compared to the maturity level of a technology? Did you already start working on certain types of technologies? Are you maybe ahead of the state of the market? Where do you intend to be in, whatever, three or four years from now? How do we as analysts, and the users, expect the technology to grow? So you can measure yourself. And how ready are you for the future? You say: these are the trends, I look at how relevant the trends are, I select them, and I look at where I stand.
And how big is the gap from where I want to be to where I am? For instance, here quite a big gap, there a small gap between this bar and that one, a bigger gap here, et cetera. The bigger the gap, the more attention it requires. And that helps you:
it's sort of a metric. You can measure this, you can even put it into numbers if you want, and you can say, okay, I know where to put my focus, also because you want to be better in some areas and less so in others. And then you can also think about prioritization.
These things always help you to understand: I can't do everything, so where should I put my focus? How complete is your tech stack? This is the reference architecture, the identity management reference architecture, of which we recently published an updated version. It is the updated version because it has things in it like decentralized identity issuance: are you able to issue tokens into wallets already? This is something we would expect in a complete technology stack over time.
So we raise the bar a little. And then you can look at where you are and run exercises where you say: which of these things do I need?
That's the first step. And for the ones you need, you look at the gaps; then you know where your biggest gaps are and which have the highest priority. Gaps that are big and have a high priority are the things you need to address first.
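A minimal sketch of that prioritization idea follows; the capability names, gap sizes and priorities are purely hypothetical placeholders, and the simple "gap times priority" weighting is just one possible ranking.

```python
# Minimal sketch: rank capability gaps by size and priority, so the big,
# high-priority gaps surface first. Names and numbers are hypothetical.
gaps = [
    # (capability, gap size 0-5, priority 0-5)
    ("decentralized identity issuance", 4, 2),
    ("access governance for SaaS",      3, 5),
    ("passwordless authentication",     2, 4),
    ("workforce provisioning",          1, 5),
]

# Simple ranking: weight the gap by its priority; address the highest scores first.
ranked = sorted(gaps, key=lambda g: g[1] * g[2], reverse=True)

for capability, gap, priority in ranked:
    print(f"{capability:34s} gap={gap} priority={priority} score={gap * priority}")
```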
You can measure whether you get better when you compare it next year and say: that's the measurement, did I close gaps or didn't I? So that was strategy. Then there are the projects, for sure; that's the other level, the one most of us are probably a little more familiar with. The success is measured by what is here in the center. I've mentioned it: time, budget, quality, complete, distinct, user-friendly, extensible. And this requires certain types of inputs, et cetera.
So you can measure that, and you should measure that, to say: okay, we are making progress,
we are on plan. Or to understand where the biggest gaps are, whether these are problems, whether we need to fix them, what we need to address. Ideally this is something a project manager does every day, creating wonderful plans with Microsoft Project. I've written a couple of Microsoft Project books in the past, when I was young, so I have some background on that.
But I would also dare to say that in most cases, yes, time and budget are fine. But how good are you usually in measuring quality over the course of a project? That could be a weaker spot. It's honestly also not super easy to measure; that is something where you probably need to think about which metric for quality you have. Surely there can be something like the number of incidents you have in deployment, things like that.
How frequently do you need to go back and redo things, et cetera.
So you can figure out metrics. Complete and distinct is a bit more at the strategy level: are you really doing the things you need to do? User-friendly I would measure with regular polls: do you like it, don't you like it, what can we improve? Extensible, again, is a little more difficult to measure, because at the end it's probably about looking at: do we have the APIs in place?
Do we do it the right way? You can look at software quality; you could drill down and do that, and you can potentially even come up with metrics where you say: how small are my distinct microservices, are the APIs constructed right or not, stuff like that.
So you can do things in that space, but clearly, if you go beyond the two standard things, it gets a little more tricky to come up with good metrics. But I believe it's worth doing. And then we have operations.
So we have KRIs and KPIs that look at risks and implementation. This is our day-to-day business; this is something where we go out on a more regular basis and say: which risks are we looking at? Operational risks include system availability, misconfigurations and their impact. Did you roll out a change, and did the change lead to a downtime, did it lead to something where, whatever, your SAP system wasn't operational anymore?
I've seen things like: oh, we accidentally deleted something in the Active Directory, it propagated to SAP, and then a certain share of our users weren't able to access SAP anymore.
You learn very quickly about these incidents, because someone from very high up in your organization will call you. Technical system performance: there are tons of measures we can have around technical performance.
You need to look at which ones are really meaningful and select a few. Security risks like unmanaged accounts, excessive entitlements; or users which have a lot of direct assignments of entitlements could be another one. When users only get their entitlements through groups, fine; but if you have users with many directly assigned entitlements, it's more likely that you have some risk of over-entitlement here. Authentication fraud attempts would be another one: how many fraud attempts do you measure for authentication? That is the risk.
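As a minimal sketch of that direct-assignment indicator: the user names, counts and threshold below are assumptions for illustration, not a recommendation, and in practice the counts would come from your IGA tool.

```python
# Minimal sketch: flag users with many directly assigned entitlements
# (instead of group-based ones) as a potential over-entitlement risk.
direct_assignments = {
    "alice": 2,
    "bob": 27,
    "carol": 0,
    "dave": 14,
}
THRESHOLD = 10  # pick a value that fits your environment

at_risk = {user: n for user, n in direct_assignments.items() if n > THRESHOLD}
print(f"Users above threshold: {len(at_risk)} of {len(direct_assignments)}")
for user, n in sorted(at_risk.items(), key=lambda item: -item[1]):
    print(f"  {user}: {n} direct entitlements")
```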
And if the number of fraud attempts goes up, then you should look at it. For performance, it's things like the time for onboarding new users, the time for deprovisioning, the time for handling authentication issues. So how long does it take when your CEO is in China, calls you and says: I can't log in?
I think that's a very good metric, because if you're good on that, it can really save you a lot of trouble, especially with the upper management. Time for fixing technical issues, operational efficiency:
how long does it take, if you have an incident, to get it solved on average? How long does it take you to fulfill a change request, maybe grouped into types of change requests, depending on what you do? What is the workload for running a system? I think we have one customer which has some 40 or more production servers for certain parts of its IGA environment, and others might have four. Then the question is: how good is this? And at the end it means: okay, if you're well above the standard, how can you fix it?
What can you do to improve it? Because again, metrics are there to improve; that is the purpose. Don't only collect them; take actions on them. So it's: the right metrics, a baseline, collection, and then continuous improvement.
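A minimal sketch of how two of the duration-based KPIs just mentioned could be computed from open/close timestamps; all records here are hypothetical.

```python
# Minimal sketch: compute average deprovisioning time and average incident
# resolution time from request/completion timestamps.
from datetime import datetime
from statistics import mean

def avg_hours(records):
    """Average duration in hours between opened and closed timestamps."""
    return mean((closed - opened).total_seconds() / 3600 for opened, closed in records)

deprovisioning = [
    (datetime(2023, 5, 2, 9, 0),  datetime(2023, 5, 2, 17, 30)),
    (datetime(2023, 5, 3, 8, 15), datetime(2023, 5, 5, 10, 0)),
]
incidents = [
    (datetime(2023, 5, 4, 11, 0), datetime(2023, 5, 4, 15, 45)),
    (datetime(2023, 5, 6, 7, 30), datetime(2023, 5, 6, 9, 0)),
]

print(f"Average deprovisioning time: {avg_hours(deprovisioning):.1f} h")
print(f"Average incident resolution: {avg_hours(incidents):.1f} h")
```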
And again, try to work with relatively few of these, but understand what really helps you; this is where we started on the first slide. Select the right metrics, which provide you the insight you need and which you are able to measure. These are two different things.
So look at it: define your targets, what you want to measure, think about how you could measure it and whether it delivers value. And this "how to measure" is a very important point, because when you spend too much time on measuring, then you have a miserable KPI for your KPI, so to speak.
And that doesn't make sense; you should also have a good KPI for your KPI and KRI management. You need to create a baseline. This is what is most frequently forgotten.
Honestly, this is where things really fail: you start your KPI and KRI process when you're half a year into your program, because then someone starts asking, okay, where are we, can you prove that the money is well spent?
Well, then you're too late, because you can't fix it anymore. So it's extremely important to make this a standard methodology, a standard concept in what you are doing: start when you start, at whatever level, as early as you can, because otherwise you'll not be able to demonstrate the full success.
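A minimal sketch of the baseline idea follows; the metric names, values and the file name are hypothetical placeholders for whatever you actually track and wherever you actually store it.

```python
# Minimal sketch: capture a baseline of a few metrics at program start and
# compare later measurements against it.
import json
from datetime import date

baseline = {
    "orphaned_accounts": 412,
    "avg_deprovisioning_hours": 96.0,
    "systems_connected": 10,
}
# Persist the baseline so it still exists when someone asks a year later.
with open("iam_metrics_baseline.json", "w") as fh:
    json.dump({"date": str(date(2023, 1, 15)), "metrics": baseline}, fh, indent=2)

current = {"orphaned_accounts": 180, "avg_deprovisioning_hours": 24.0, "systems_connected": 55}

for metric, now in current.items():
    then = baseline[metric]
    print(f"{metric}: {then} -> {now} ({now - then:+})")
```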
If you start later, you can say: okay, from here on we've made progress. But that still leaves this a bit open.
So you still have a bit of an attack surface towards your management: okay, you got better now, but what did you do in the first year? Do it continuously, in a meaningful interval, and as I said, do it with things you can gather easily, on a scheduled basis. Risk is probably measured more continuously in many areas than performance; strategy in larger intervals,
the rest in shorter intervals. Have a standard reporting on that, something that is easy to understand, a little bit dashboard-like.
Managers like dashboards: one look at it. Or maybe managers don't like them, I don't know exactly. Vendors like dashboards, I have to say, because everyone is doing a demo saying: oh, we have this cool dashboard thing here. And then I ask: can you drill down? Sometimes they say yes, sometimes they say no. If you can't drill down, forget about it, because at the end you always want to go into the details anyway. But come up with something they understand easily, at one glance. You always have to understand that managers have a relatively low span of attention.
So they need to understand at first glance: oh, they've got better, I don't need to care about this. This is what you should achieve: they look at it and say, oh, great progress, I can spend my time on something else.
If your report delivers that message, you're halfway through, unless you're faking too much; never trust a statistic you didn't fake yourself. You also need to be prepared for reporting when something is going wrong and someone says, hey, this is not working. Then you need to be able to say: yes, it is working, I can show you. And then think about your metrics, learn about them yourself. Again, improvement: improve not only by metrics, but improve your metrics as well. Your approach to KRIs and KPIs is not static.
So a little bit more on the slide.
Again, for identity management there are some guiding principles you can apply from strategy to projects to operations. Who is responsible? For strategy, the IAM lead; for the projects or programs, the IAM program lead; and then the operations lead. You have recipients: on strategy, you commonly report to the CIO or CISO; the project lead reports to the IAM lead and the steering committee; and so on.
I don't want to read out everything, but at the end it's always the strategic level, project delivery, efficiency and effectiveness, and operations.
How do you do efficient management based on that, and what is your target for efficient management? Here it's workload efficiency and effective delivery: doing the right things, providing what they need, delivering on time and at budget. And here it's typically benchmarking against peers or the like, which is a typical targeted control.
You can implement cost, time and quality here, technical indicators there. What, at the end, proves your success in identity management? Strategically, it is when you deliver to the business needs, I would call it: you have an identity fabric that suits the business needs and you deliver on this integrated perspective of identity management. For projects it's easy, and here it's cost optimization, reducing friction and incidents. And then you can look at tangible improvements: closing the gaps, the biggest ones, the most relevant ones, and getting better in project delivery.
So when your next project is based on what you've learned from these metrics, and the optimization based on that is getting better, then you're making progress, et cetera. So that brings us to, or not, okay, the clicker doesn't move anymore.
Ah, here we go. So that was the third part, the longest part, by the way. I will not run over and spoil the coffee break afterwards, don't worry.
How and where: collecting and displaying KRIs and KPIs. It's again a bit of a matrix, a slightly different type of matrix here, because I say we have automated versus manual controls, and we have detailed versus summarizing.
And in some way we have a sort of correlation along this axis, plus or minus: we should automate as much as we can. For operational KRIs, collect as much data as possible in an automated way, based on scripts and other stuff, or via BI, or simple probes, stuff like that. KPIs tend to be a bit more on the side where we need more summarizing, which is easier to do, but sometimes also comes with a little less automation.
When you look at some of these things: measuring the number of orphaned accounts is something you should automate.
If you need your intern to count that number, then you're doing something wrong,
your intern who counts that. But then, that's the reason why we have internships, okay, politically incorrect. Operational KPIs, that is sometimes a bit more manual measurement: you look at how long it takes a process to execute. But a lot of that can be automated as well.
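A minimal sketch of what such an automated orphaned-account check could look like; the account and identity sets below are hypothetical stand-ins for exports from your directory and your HR system.

```python
# Minimal sketch: count orphaned accounts by comparing directory accounts against
# the identities known to HR, excluding accounts that legitimately have no HR owner.
directory_accounts = {"alice", "bob", "carol", "svc_backup", "jsmith_old"}
hr_identities = {"alice", "bob", "carol"}
known_service_accounts = {"svc_backup"}

orphaned = directory_accounts - hr_identities - known_service_accounts
print(f"Orphaned accounts: {len(orphaned)} -> {sorted(orphaned)}")
```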
Overall, I'm a believer in automation, but it doesn't always work. For projects, the project manager probably always does something manual. Project managers have Excel, as we all know, and then they do these big Excel sheets which consume half of their time; the other half is fixing the problems, and in between they chart and do their metrics here.
Then we have the strategic KRIs and KPIs, close to each other; this is usually more manual. This is saying: okay, I run a benchmark, which is in some way a project in itself.
I do an analysis of where I stand with my blueprint, where I need to improve my blueprint; it's a small project on its own.
Honestly, all of these are small projects compared to what you spend overall in identity management. But here you are more on that side of things.
And I think the guiding principles are very clear: the more towards strategy, the more manual and the more summarizing; the more down here, the more automated you should be, because this is what you collect frequently, while that is what you do, whatever, once a year or once every half year. So that's how it should look.
So where do you get your data from? From the IAM tool; from security, from the SIEM tool, which should also have a lot of data around, for instance, fraud attempts, stuff like that. You could use BI to collect some of the data and analyze it, so you just democratize the data you have; probably also from the project management tools and from IT service management. What is the effort in the IT service management tool? Very important for performance; performance metrics show up here quite frequently. And then you present it:
in a presentation, which is more for the board; in a dashboard, which shows, okay, this is changing that way; and in some statistics, sometimes even more technical files or stuff like that, where you say, okay, this is just technical information I provide somewhere. Some of this is also clearly done more in a manual manner, so you're not on the automated side here. And now we come to the point of "which". As I've said, I won't go into hundreds of potential KRIs and KPIs.
There are reports out there, and even though the report for access governance, I think, dates back to 2017, it doesn't become outdated that fast. That's the point. Clearly we will do an update sooner or later, but it's not something where you need totally new measures every year, because the number of orphaned accounts was a good metric 20 years ago and it will be a good metric in 20 years; I don't believe we will have fixed the problem of orphaned accounts 20 years from now, at least not across everything.
Oh, for sure, decentralized identity and the next version of the web will solve it, and AI. But no, it will not solve everything.
My top five, first for strategy. And yes, a top five: analysts also like to do top lists, and consultants as well, by the way. Benchmarking results compared to peers; benchmarking results compared to maturity levels; these are clear favorites. Then the completeness of coverage of required IAM capabilities.
I could have underlined "required" here, because it's not coverage of IAM capabilities, it's coverage of required IAM capabilities. You always need to understand your requirements. When you think back to the project picture I had, with the project success in the middle, at the top there are the requirements. Look at the requirements you have: the requirements that come, so to speak, from trends, from what is emerging, and the requirements that come from emerging regulations.
So take a look into the future. Then: the completeness of coverage of the IAM capabilities with high priority.
So how good are you in delivering on the high priority items?
And then, I think strategically seen it's a risk indicator, the number of identity-related security incidents. So how many ransomware attacks hit you based on phishing attacks? A wonderful metric; maybe you can't measure it anymore after that ransomware attack, but that's a different story. Now, projects.
Project cost, time. A very good project metric, for IGA specifically but not only, also for access management and for privileged access management, is the share of systems connected out of all applicable systems. A wonderful metric, where a lot of projects go out with an unrealistic promise: oh, we will connect these systems within two years. And then two years later you look at it and say, okay, we have done 10 out of 400, or something like that. The interesting thing is, the first problem is always: how many systems do you have?
Correct. Yeah.
And you know, this recurring thing: you go to a customer and ask, okay, how many systems do you have that we need to cover? And they say, oh, probably 400.
And then we say, okay, let's collect them. Oh, more than a thousand. It's always the same.
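A minimal sketch of the "connected systems out of all applicable systems" metric, including the effect of that discovery gap; all numbers are hypothetical.

```python
# Minimal sketch: connection coverage against the believed scope and against
# what an inventory actually discovers.
applicable_systems = 400      # what the customer believes
discovered_systems = 1100     # what the inventory actually finds
connected_systems = 10

print(f"Coverage vs. believed scope:   {connected_systems / applicable_systems:.1%}")
print(f"Coverage vs. discovered scope: {connected_systems / discovered_systems:.1%}")
```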
Now, some customers really have good asset management, good IT monitoring, so some have it. But I've seen it
so frequently that there were more systems than people had in their worst dreams, so to speak. Next: capabilities implemented as planned, and the level of sophistication. So did you do it? I think you can always, if you don't cheat yourself too much, say: okay, I did this, but I plan to do it better. Or you say: I did it,
and I did it really well. So clearly a bit of a non-exact metric here. And then the user satisfaction score,
which I brought up already. Operations: KRIs and KPIs, split here. KRIs: orphaned accounts; user deprovisioning time, which is on both sides because it's a good one for both; authentication fraud attempts; account-to-identity-to-person ratios. If you're dealing with a thousand employees but you have 1,500 identities for your workforce, then this ratio is lousy. And things like unused entitlements; that is something some systems are quite good at measuring, so they monitor which entitlements are used.
If you do that, by the way, you must measure for at least 12 months, because sometimes you have a lot of entitlements in your, whatever, finance department which are only used for the fiscal year closing, and if you say, okay, not necessary, I'll remove them, you might end up in trouble. KPIs: user onboarding time, user deprovisioning time, the average manual workload in a mover process. A very lovely thing, the mover process, and even more the relocation process. Who of you says: in my organization, mover and relocation processes are running smoothly?
Okay.
Zero hands; that fits exactly my expectation and my experience. These are the really problematic processes, I know. Again, this ratio of accounts to identities also relates to performance, because it's also about doing too much work. Authentication-related help desk requests: yes, if you go for passwordless authentication, you can bring that number down. It's not that easy, though, because the unfortunate thing about passwordless today is that you usually still need to do the user-to-device binding or identity-to-device binding; so again, one more thing which can fail.
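Pulling together two of the operational indicators above, the identity-to-person ratio and entitlements unused over a 12-month window, here is a minimal sketch; the headcounts and the usage log are hypothetical.

```python
# Minimal sketch: identity-to-person ratio plus a 12-month unused-entitlement check,
# long enough to avoid flagging entitlements only used at fiscal year closing.
from datetime import datetime, timedelta

employees = 1000
workforce_identities = 1500
print(f"Identity-to-person ratio: {workforce_identities / employees:.2f}")  # 1.00 would be ideal

last_used = {
    "finance_year_end_close": datetime(2023, 12, 30),
    "legacy_report_viewer":   datetime(2021, 3, 2),
    "crm_sales_role":         datetime(2024, 5, 1),
}
now = datetime(2024, 6, 1)
window = timedelta(days=365)

unused = [name for name, ts in last_used.items() if now - ts > window]
print(f"Entitlements unused for more than 12 months: {unused}")
```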
So: who? Systems can do the automated controls. IAM teams will work a lot on the manual controls, project managers on the project KRIs and KPIs. And sometimes you need advisors to help you with benchmarking and other things, or to guide you through the process. This is something where, yes, manual work is required.
And I think this is already my closing slide. It's always about details versus overview, depending on the purpose. IAM risk management is very much about the details.
If you want to get a budget, then it's probably more on an overview level, as it is for stakeholder management or for the overall IT risk management. Your recipients are then more the corporate leaders, the governance teams and the departmental risk managers. Some things you then do in more detail on operational dashboards; or, if it's really for the board, it might be an individual summary, one or two slides they look at. And if you convince them with those one or two slides, you don't need the other 46 slides you have prepared as backup.
Then you say: okay, good that I did this work, but I didn't need it. Be happy about it. Even if you sometimes feel it's wasted time to create the other 46, it's always good if you don't need them. Thank you.