So I think it's also very good that Benno spoke about the challenges for risk management as well. We continue from that, and I will describe the challenge we have today: OEMs like BMW, Volkswagen, and Mercedes need to make the global supply chain more resilient. As you can see, the OEMs today have contracts with their tier 1 suppliers, but they have no contracts with the tier N suppliers.
But today, with the supply chain due diligence acts, they are required to exchange information between the participants, from tier N to the OEM to recycling. So we expect more than 500,000 legal entities worldwide to exchange information, and that means they exchange information about human working conditions, material composition, and the product carbon footprint. So we need to secure this data collaboration in a network of millions of suppliers worldwide.
About 70% of these suppliers are small and medium-sized companies, and they now have to handle a much larger attack surface if they need to share this information.
As you can see, data spaces should protect these assets, and we are using W3C ODRL, an open digital rights language, to define the restrictions on accessing and using this data. Yesterday I learned from my colleague Peter that we have more than 25 different policy definition languages, and none of them is easy to use, while at the same time they are not expressive enough to fulfill the requirements of the business.
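To make that a bit more concrete, here is a minimal sketch of what such an ODRL usage policy could look like, expressed as JSON-LD inside a Python dict. All identifiers, party names, and constraint values are purely illustrative assumptions, not taken from a real data space.

```python
# Minimal, illustrative ODRL usage policy (JSON-LD structure as a Python dict).
# All URIs and identifiers below are hypothetical examples.
odrl_policy = {
    "@context": "http://www.w3.org/ns/odrl.jsonld",
    "@type": "Agreement",
    "uid": "https://example.com/policy/pcf-exchange-001",
    "permission": [
        {
            "target": "https://example.com/assets/product-carbon-footprint",
            "assigner": "https://example.com/participants/tier1-supplier",
            "assignee": "https://example.com/participants/oem",
            "action": "use",
            "constraint": [
                {
                    # Only participants holding an active membership may use the
                    # data -- a typical data-space style restriction.
                    "leftOperand": "https://example.com/vocab/membership",
                    "operator": "eq",
                    "rightOperand": "active",
                }
            ],
        }
    ],
    # Explicitly forbid passing the data on to third parties.
    "prohibition": [
        {
            "target": "https://example.com/assets/product-carbon-footprint",
            "action": "distribute",
        }
    ],
}
```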
So when we ask customers to define policies for joining data spaces, fulfilling electronic contracts, or accessing and using data, we constantly get the feedback that they don't know how to protect their data and guard against the risk of losing this information. On the other hand, we also want a free flow of data that brings value to all the suppliers here.
So in this data sharing journey we will learn more and more how we can share this content with each other, but we are not fully experienced yet.
And today we need an intelligent, self-learning approach, like a copilot, to protect our data. So we need a balance between content-centric controls and people-centric context. Data spaces use connectors to exchange data with each other, and we use non-human identities to connect machine to machine. These managed system identities need to be monitored via artificial intelligence. So we believe Microsoft Purview is a good approach, and Martina will speak about it.
Thank you so much.
So we already talked about data spaces, and now we are talking about data and data protection. This is the platform we call Purview in the Microsoft environment, the Microsoft world; maybe you have heard about it. Why is that so important? We think that by 2025 there will be around 175 zettabytes of data moving around the world, and this data is crucial to your business and to our business as well. On top of that, 83% of businesses will experience a data breach at some point, and this will cost a lot of money.
But on the other hand, all of us are end users as well. We don't just want to protect our data, we also want to be comfortable. Around 90% of end users actually say that they are ready to take on the risk if it makes their work more comfortable and maybe a little bit faster as well.
We want to be optimized in what we're doing, right? So therefore we have adaptive protection. Adaptive protection basically combines three parts of Microsoft Purview. First, there is protecting sensitive data.
That's the part in which we categorize data based on labels, and based on those labels we can, for example, apply encryption that comes along with them. That's probably the form most of us know when it comes to compliance: we put a label on, say, an email. Then we have preventing data loss, where we try to protect data from being spread outside of my company; and within my company there may even be data so sensitive that not even my employees are supposed to share it with each other.
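As a rough illustration of that idea only, and not the actual Purview API, a label-driven protection scheme boils down to something like the following sketch; the label names, settings, and the dlp_decision helper are all invented for this example.

```python
from dataclasses import dataclass

# Conceptual sketch: label names and protection settings are made up to show
# how label-based encryption and DLP rules fit together.

@dataclass
class LabelPolicy:
    encrypt: bool          # apply encryption when the label is set
    block_external: bool   # DLP: block sharing outside the organization
    block_internal: bool   # DLP: block sharing even between employees

LABEL_POLICIES = {
    "Public":              LabelPolicy(encrypt=False, block_external=False, block_internal=False),
    "Confidential":         LabelPolicy(encrypt=True,  block_external=True,  block_internal=False),
    "Highly Confidential":  LabelPolicy(encrypt=True,  block_external=True,  block_internal=True),
}

def dlp_decision(label: str, recipient_is_external: bool) -> str:
    """Return 'allow' or 'block' for a sharing attempt, based on the label."""
    policy = LABEL_POLICIES.get(label, LABEL_POLICIES["Public"])
    if recipient_is_external and policy.block_external:
        return "block"
    if not recipient_is_external and policy.block_internal:
        return "block"
    return "allow"

print(dlp_decision("Confidential", recipient_is_external=True))          # block
print(dlp_decision("Highly Confidential", recipient_is_external=False))  # block
```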
And there's insider risk, which we will talk more about in the next slides. So let's look at data protection even further. As Matthias said before, those policies cannot be set just by the data or just by the identities; you have to mix them, especially because it is the identities that create the data and that use the data. Here, for example, somebody is receiving a confidential email, sending out an email, maybe using a cloud app. We are producing data with everything we do, and we have to protect this data along the way.
So this is why we have to look at the identities.
I already talked briefly about insider risk management, and maybe you're asking yourself: insider risk, why should an insider be a risk for my company? But there are real-world stories for that. Here, for example, a Jane Doe from 2012 to 2017 stole data from their employer to sell it on the market, and some people are willing to actually pay for this. Same with another Jane Doe, who took pictures of documents so it would not be noticed that anything had been moved, and sold that data as well.
And there is a more recent case, from 2021, where someone tried to trade data worth over a hundred million dollars, and the person was convicted. So be careful: the people within your company can be a risk as well. That's why you should connect this to, for example, HR systems, where you can pick up that a person is leaving the company soon.
Then the data movement of that person can be looked into more deeply. Let's look deeper into that.
So this is how it would look from your perspective if you're using adaptive data and identity protection. First, the user is classified through a risk score. Maybe you know that from identity protection; maybe some of you are using Azure Active Directory, so you already have a risk score there. This risk score is based on how you move data, how you produce data, how you upload data and where you send data to. And not just this, but also data from your HR systems: are you leaving soon?
Is that something which has been communicated through HR yet? And as well Defender for Endpoint, so endpoint data too, in terms of what you are doing with your USB sticks, for example.
And it's not just the risk score: based on it, DLP policies will actually be triggered, which may stop you from sending out data or from using data in a certain way. It's not just one second or one moment in which you maybe open a confidential file and immediately get a high risk score; probably all of us have done this before, maybe clicked into the wrong file.
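In very simplified form, and not the real Purview policy engine, the adaptive idea is that the enforcement level follows the user's current risk level rather than being fixed. The thresholds, level names, and actions in this sketch are illustrative assumptions.

```python
# Simplified sketch of adaptive protection: the DLP action applied to the same
# activity depends on the user's current insider-risk level.

def risk_level(risk_score: float) -> str:
    """Map a numeric risk score (0..1) to a coarse risk level."""
    if risk_score >= 0.8:
        return "elevated"
    if risk_score >= 0.5:
        return "moderate"
    return "minor"

ADAPTIVE_DLP_ACTIONS = {
    "minor":    "audit_only",            # just log the activity
    "moderate": "warn_with_policy_tip",  # show a policy tip, let the user proceed
    "elevated": "block",                 # stop the email / upload entirely
}

def enforce(activity: str, risk_score: float) -> str:
    action = ADAPTIVE_DLP_ACTIONS[risk_level(risk_score)]
    print(f"{activity}: risk={risk_score:.2f} -> {action}")
    return action

# A single risky click barely moves the score; a sustained pattern does.
enforce("share confidential file externally", 0.2)   # audit_only
enforce("share confidential file externally", 0.9)   # block
```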
So it's actually driven by machine learning. We want to move away from the static policies Matthias talked about and go toward dynamic policies. These are based on the risk score, and this risk score is created through machine learning. First there is the context of the data: who am I? For example, I'm a cloud solution architect in engineering, so I'm in a customer-facing role, so I will probably be sending out quite a lot of emails, and sometimes I am at the customer as well.
On the customer side I maybe use USB sticks, as I'm not always allowed to bring my laptop into every part of other people's organizations. So the context is really relevant.
The second part, as I said before, is the sequence: am I just storing files on a USB stick, or am I also deleting the files that are in OneDrive? That would maybe be something to look deeper into. Also very important if you want to look deeper, or if you're not a hundred percent sure: on the data side there is no clear-text name at first.
You can look deeper into it while investigating.
And of course the anomaly detection. As I said before, I'm a cloud solution architect and everybody in my team is a cloud solution architect, so we will probably move data in quite the same way and use it for the same things. But somebody in HR is probably using data very differently. So anomalies can be detected based on my department as well.
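A very hand-wavy sketch of that peer-group idea: compare my activity against the typical behavior of colleagues in the same role or department and flag large deviations. The numbers, counts, and the z-score threshold below are made up for illustration.

```python
from statistics import mean, stdev

# Toy peer-group anomaly check: flag a user whose weekly activity count
# deviates strongly from colleagues in the same department.
def is_anomalous(user_count: int, peer_counts: list[int], threshold: float = 3.0) -> bool:
    mu, sigma = mean(peer_counts), stdev(peer_counts)
    if sigma == 0:
        return user_count != mu
    z = abs(user_count - mu) / sigma
    return z > threshold

# Cloud solution architects all send lots of email, so 120 is normal there...
print(is_anomalous(120, [100, 110, 130, 95, 125]))   # False
# ...while the same volume would stand out in a small HR team.
print(is_anomalous(120, [10, 12, 8, 15, 11]))        # True
```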
As we don't have that much time, we added some links about insider risk and adaptive protection, how to quickly set up some policies, and some more resources.
So connect with us on LinkedIn; we will post this presentation afterwards. And maybe one comment: this adaptive protection is also very important for identifying hacked accounts, to see what happens, because usually unusual things happen with such an account. So it also detects compromised identities.
It's very, very important to know. So if you have questions, feel free.
Yeah, we have plenty of time for the Q&A session. But before that, maybe I could tell you one thing: we actually shared a poll for you to participate in. You can access it through your mobile apps. The question is: are you affected by regulations as you collaborate with suppliers to share information? It's really easy, yes or no. Just take a moment to answer it, and meanwhile maybe we can see if anyone has a question for our two speakers today.
There's a question about the story, the story that Martina told before about the data that were sold. Is that the real value of those data?
Yes, you can actually even Bing it. There was a big beverage producer that this happened to in 2021. Yeah.
And is that value the price they sold the data for, or the value of the data itself?
The person was caught; that's why they were convicted.
But yeah, the value of the data was 120 million.
Great.
Anyone else? Maybe we still have five minutes or something. So all right.
There's a question. Yeah.
All right, just one second. I need to give you a mic.
I think you mentioned that Purview can be used even if the data is already outside the organization.
Yeah, and I'm just wondering what measures are used to govern access to that data outside the organization. Is this just encryption? Are there any other measures to govern the access?
Yeah, so basically there are different parts to data getting out of my company. First there are policy tips: as an end user, when I'm sending something out and wondering why I can't send it, I will get a policy tip, a message which lets me know that the message will not go out. Same in Teams, for example. But if it's really confidential data, for example a patent, you can use fingerprinting as well, so there is also the possibility to use fingerprinting.
So it's actually being detected even if it's not inside the cloud environment. Yeah.
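Roughly speaking, and leaving aside how Purview actually implements it, document fingerprinting means deriving a compact signature from a sensitive document and matching outgoing content against it. The sketch below is only a toy illustration of that concept; the function names, shingle size, and threshold are all assumptions.

```python
import hashlib

# Toy illustration of fingerprinting: hash overlapping word shingles of a
# sensitive document and check how many show up in outgoing text.
# Real DLP fingerprinting is far more robust; this only shows the concept.

def fingerprint(text: str, shingle_size: int = 5) -> set[str]:
    words = text.lower().split()
    return {
        hashlib.sha256(" ".join(words[i:i + shingle_size]).encode()).hexdigest()
        for i in range(max(1, len(words) - shingle_size + 1))
    }

def looks_like(document: str, outgoing: str, min_overlap: float = 0.3) -> bool:
    doc_fp, out_fp = fingerprint(document), fingerprint(outgoing)
    return len(doc_fp & out_fp) / len(doc_fp) >= min_overlap

patent_draft = "method and apparatus for adaptive risk based data loss prevention"
email_body = "please find attached our method and apparatus for adaptive risk based data loss prevention"
print(looks_like(patent_draft, email_body))  # True -> would trigger the DLP rule
```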
Right.
How would you say the APIs are for someone who maybe has a risk engine that wants to consume the classifications, to know the risk scoring of user access to data based on the sensitivity of the data from your classification? Can that be consumed in bulk by an external system?
Yeah, the API?
You mean if you can extract the...
Yeah, like let's say you have a bulk endpoint we can call to get a list of the files or folders and then the classification metadata that you've discovered by crawling it in Purview.
I'm not sure about this. I know that there are parts where you can export parts of the data, especially if you have a SIEM or some other kind of system you want to use for monitoring. But I'm not sure if that works with everything, especially because adaptive protection for identity is still in preview at the moment. Yeah.
So this is a Purview product. When it's generally available, there will also be a public endpoint. Yeah.
Do you know if the data classification tagging is going to be tied into ABAC, the new attribute-based access control that's in preview for data access, so that if you have ABAC rules you could use Purview to populate the ABAC tagging?
It's based on the tagging from M365. Yeah. Okay. Cool. Thanks.
Is anybody using insider risk management in here?
Or if not, you wanna maybe tell us why not?
Well, I think we're doing okay with the time. Okay. So maybe I can share the poll results; maybe you would like to say something about it, and I'll tell you how many people said yes or no. Poll results, please.
Yeah, I can tell, because I have the results. All right, so 12 people said yes and three people said no. Yeah, more than I expected.
Yeah, that's really a lot. Good. Thank you very much. Thank you so much. Thanks so much.