Good afternoon, everyone, and welcome to this KuppingerCole webinar sponsored by Protegrity. My name is Mike Small, and I'm a senior analyst with KuppingerCole, and my co-presenter today is Alasdair Anderson from Protegrity. And this afternoon, the subject is about how to enable data-driven business with secure cloud migration. So here are the basic rules of how this will work. The participants, you're all muted centrally, and you don't need to mute or unmute yourself.
During the webinar, we're going to run some polls, and we will be able to show and discuss the results during the question and answer at the end. There is a panel that allows you to enter questions at any time, so please put your questions in through that panel, and we will get round to answering as many as we can at the end. The webinar is being recorded, and both the recording and the slide deck will be made available to you shortly. So we're going to start off with the first poll.
So this is to see if you're tuned into all of this, and what we'd like you to answer is: where is your organization's data held and processed? Is it mostly in your organization's own data centers? Is it evenly split between data centers and public cloud services? Is it mostly in public cloud services? Or are you not sure where it is? So please can we start the poll, Cindy? We'll give this a few moments for people to respond. This is very interesting, because in a way it is very apposite to the talk that we're going to give today, and I'm going to move onwards.
So the agenda that we have today is, I'm going to start off by talking about the challenges of moving data to the cloud, and that will be followed by Alasdair from Protegrity, who will talk about a modern approach to leveraging this data for both innovation and for profit. So let's start off by looking at where all of this has come from: in today's world, we have this dynamic data that is constantly being generated, which is being exploited by organizations under the heading of digitalization, because data is the most valuable asset of the modern organization.
And you can see how we can use data, and this is not a new idea. Data has been at the forefront of how you can develop new products and services: in the 18th century, people were collecting data about illness and about deaths, and when people died, and that allowed the development of financial services that had a solid foundation for things like pensions, life insurance, and annuities. In 19th century London, data was used to identify where cholera was coming from, and throughout the 19th and 20th centuries, this has increased.
But now, we have such a vast amount of data that the whole way in which this is available, the amount of data and how we can use it is changing. So what has basically happened, indeed, over the last 20 years, if you go back to the 1990s, the great databases represented the forefront of how you could manage and use data. But in fact, social media, the cloud, the vast increase in the capability to store data, as well as communication technology, has made it completely different as to how we can create and share and use that data.
And organizations have realized the opportunities from this, and this has been put together with agile development technologies based on containers and so forth, which are flexible, but which depend upon the use of cloud to be responsive. When I was developing software only 20 years ago, I spent most of my time looking for computer hardware that my teams could use to develop it.
Now, if you have a credit card, you can get your servers and your storage in milliseconds. The use of the cloud has changed the whole environment. So when you look at this, customers are concerned about using the cloud, not necessarily because the cloud itself is particularly insecure, but simply because the accessibility and the complexity of their services make it difficult to manage security and access.
But at a business level, when you talk to business people about this, they're not really interested in the minutiae of the technology. They want to avoid certain business risks, and those primarily center around data. Your business does not want to suffer a compliance failure. And almost every week, we see in the press that another company has been fined millions or billions of dollars for misuse of, or compliance violations around, their data.
When you look at privacy and data protection, again, there is the problem of the vast number of laws and regulations that you have to deal with, which lead you to the problem of data breaches. And those affect your reputation as a company. They also can lead to the first problem, which is monetary penalties. And finally, one thing that is often forgotten is business continuity. The more you depend upon data, the more you depend upon the data being delivered through an IT service, the more vulnerable you are to anything that impacts on it.
And the cyber criminals, the threat actors have recognized this. Ransomware is a nice little earner because organizations need their data. And as they have digitalized, they have become more and more incapable of managing without their IT to deliver that data.
So, preventing access to it has become a lucrative criminal exercise. So, when we look at all of this dynamic data, we then have to start to ask: how does this change the landscape? And one of the pivotal moments, I think, was the long-running dispute between Dr. Maximilian Schrems and Facebook, which has led to more and more clarification of what is of concern around data protection.
But basically, the Schrems II judgment hinged around the risks associated with moving data outside of the jurisdiction of the European Union into other places, such as the US, where the laws are not the same. And basically, the result of that ruling was that standard contractual clauses, i.e.
contracts, were not considered to be sufficient. Now, that may well have changed as a result of the latest agreements. But the concerns are still there, that if you put your data in the hands of a third party, there are a set of risks, which you may be able to resolve with contracts. But ideally, you should resolve using technical measures. And those technical measures are what were defined by the European Data Protection Board. And I would like to suggest that those measures apply not only to data that is protected under GDPR, but potentially also to other data.
And if you are paranoid, like I am, I'm not paranoid, really, I just know that everybody's trying to steal it. So what about your business data?
Now, there is an awful lot of data that organizations hold that is business sensitive. This is intellectual property. It is customer lists. It is pricing. It is things like your patents and how your products work. So if your data is valuable to you, it's also valuable to your adversaries, to your competitors. And there is a history of certain other countries in the world of trying to steal that data.
The second thing is that if your data is compromised, then if, for example, it is your intellectual property, then as RSA found out some years ago, that when they found that the technical secrets behind their identity chips were stolen, it cost them $60 million or thereabouts to redesign them and to reissue them. The whole of our business at the moment depends upon secure data transfers, which are based in turn upon effectively digital signatures.
And all of this complexity is in fact being put at risk by quantum computing, which is potentially going to fundamentally disrupt the encryption technologies that we are using. So when you use the cloud, you as a user of the cloud have responsibility. And in 2012, the AICPA, the American Institute of Certified Public Accountants, put forward the notion of complementary user entity controls, i.e.
you are responsible when you use a cloud service or another third party delivered service for controlling access, for controlling where that data is stored, for making sure that if it moves, it's secure, and you're responsible for making sure that the data is resilient. And so you need to do this throughout the data lifecycle. And so people often focus on a particular point in the data lifecycle, and that isn't sufficient. It isn't sufficient to just simply say, well, it's all encrypted when I've stored it somewhere, because data is created, is used, and then disposed of.
So each of those stages needs to be thought about and treated. And so when you look at all of this data, we have to start with where you acquired it.
Now, there was a time when people created the data. You had data input clerks who took data and input it. That's the minority nowadays. A lot of data is, in fact, collected through the users inputting themselves. There is a lot of data that is bought. There is a lot of data that can be harvested from the internet. Some of it is sensitive, some of it is regulated, some of it you may have permission to use, and some of it you may not. So when you have all this data that comes from different ways, then you have a problem.
And interestingly, one of the stories from GDPR was at the beginning, people realized when they had to find their personal data that a lot of this stuff was being shared around in terms of marketing lists and so forth, simply in spreadsheets. And nobody knew who had them. There was no control over them. So that's a concrete example. But you need help to find it all, to classify it, and then to make sure you can only use it according to policy. There is so much of this data. It is so dynamic. It's no good trying to manage it later. You have to have policy that follows the data.
Then while it's actually being used, you need to be able to control who can access that data and what it's used for. And this takes you into authentication, authorization, identity management. And since cloud services are complex, your organization will have administrators, many administrators, and they may have access to that data. So you need to be able to limit the administrator privileges and to enforce segregation of duties so that they can only do what they have to do.
And it's really important to protect against account takeover because those administrative accounts are going to be the ones that the bad guys are going to look for. Then having controlled access to it, you need to be able to control unauthorized access. That is to say, what happens if a media containing your data is lost? Where does that data go under your control? How does it move between places? And effectively, this takes you into the questions of things like tokenization, masking, pseudonymization.
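To make those last three terms concrete, here is a minimal toy sketch of tokenization and masking; the class and function names are purely illustrative, not any product's API.

```python
import secrets

class TokenVault:
    """Toy token vault: swaps a sensitive value for a random surrogate.

    The real value stays inside the vault's owner; only the token
    circulates, and only the vault can map it back.
    """
    def __init__(self):
        self._forward = {}   # real value -> token
        self._reverse = {}   # token -> real value

    def tokenize(self, value: str) -> str:
        if value not in self._forward:
            token = "tok_" + secrets.token_hex(8)
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def detokenize(self, token: str) -> str:
        return self._reverse[token]

def mask_card(pan: str) -> str:
    """Masking: reveal only the last four digits of a card number."""
    return "*" * (len(pan) - 4) + pan[-4:]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
assert vault.detokenize(t) == "4111111111111111"
assert mask_card("4111111111111111") == "************1111"
```

A production system would add key management, access control, and audit around these operations; the point here is only that the surrogate carries no information about the original value.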
Are you able to be sure that you could continue if your service data was destroyed in some way? And remember that encryption is on a countdown to quantum, and so you need to be planning for quantum-safe encryption. Then it's not sufficient simply to protect it. You also need to be actively monitoring for attempts to misuse it. And there are lots of stories about how it's only when you start to examine who is apparently looking at data that you begin to realize that some of those attempts to access it are in fact from threat actors. And finally, you need to dispose of it.
And if you hold the data, then you need to dispose of it in a way which ensures it cannot be recovered from the media that hold it. And there are standards about how you can do this. If you've got it in the cloud, then the cloud storage is virtual and ephemeral.
One day, you may be using one storage device. Another day, you may be using another one. What happens to the storage areas that you were using? Make sure your cloud service provider has a plan for that. So basically, you need to have a proper way of protecting your data in the cloud across the whole of the multi-cloud hybrid environment that you have today. And so in summary, this means that data is your most valuable asset. What are the risks? And they are that if you don't have it, your business can't continue.
If your data is breached, then you will lose your reputation and potentially your intellectual property. If you don't look after your data according to the law, you may suffer monetary penalties for that. The challenges that come from this are that when you're using the cloud, you have a shared responsibility, and that many organizations don't realize what their complementary user entity access controls are. And finally, you must not forget about the final risk of quantum computing.
And so these complementary user entity controls that we have been talking about are around controlling where the data goes, controlling who can access it, making sure that it's secure when it's stored and when it's moving, and making sure that you have a way of recovering it if its primary place of residence is lost. And so in order to do that, you need a proper platform which has some kind of centralized policy with distributed controls that follow the data at rest, in transit, and during its processing. So thank you for your attention up to now. We now have a second poll.
And so perhaps, Cindy, you could set off the second poll, please. So I think we've had time for that poll. So let's now move on to the next session. I'd now like to welcome Alasdair Anderson, who's going to talk us through a modern approach to leveraging data for innovation and profit.
Thanks, Mike, and good afternoon, everyone. Thanks very much for setting that up; it gives me great context to discuss the next section. As Mike said, data is a responsibility that no business can delegate or outsource. And when you are active within the cloud environment, whether it be with an application provider or a platform service provider, you are still responsible for the information that you hold. And ultimately, you're typically holding that on behalf of your customers.
And that's where things like privacy become a very important aspect of how you control your cloud.
Now, as Mike mentioned, the law has been very firm recently, in terms of the clarifications that came from the European courts, and certainly with the fines that you've seen being handed out, typically by the Data Protection Commission in Ireland, because that's where a lot of the large US companies that have been fined are based. But nonetheless, across the whole of the European Union, and for people who work and do business within the European Union, technical protections are mandatory when you look at your data in the cloud and your responsibilities towards that data.
So that's really where I am going to start. I'm going to take you through what these clarifications are and how different European agencies are starting to describe this problem space.
And then I'll talk through some very practical examples of how we, as a company with a privacy platform, a data protection platform, are enabling different customers across different verticals in this journey towards a fully cloud-enabled business, as well as adequate data protection controls that really anticipate what is coming down the pipe, as well as the state of play today, because as I think we'll all acknowledge, the data protection world is fairly agile and fast moving in terms of new laws coming out.
And especially if you're a global business that doesn't tend to stay static for too long with different countries bringing out different rules all the time. So what I wanted to talk about is just put this in a sort of business or a commercial context. As much as technology is a change business and things change all the time, the way that we evaluate whether a technology is appropriate, applicable, useful or not hasn't really changed because that's a commercial judgment. And really, you have to look at things from a balance sheet perspective.
I'm an old programmer, some would say a very old programmer, but I spent a lot of time looking to adopt new technologies, but they always have to be justified through the lens of a financial person. So you're really looking at how you materially impact the balance sheet of the company. And that really brings me to point number one and point number two on this slide, which is the easiest way for new technology to make a meaningful impact is to reduce the costs.
And typically, when we look at data and privacy, there is a lot of discussion that goes on, there's a lot of human decisions that have to be made. And the ability to automate that and not just automate the protection and the controls, but also the audit of those controls and the demonstrability of those controls can lead to really significant cost saves. We've worked with clients who have brought down their time to market on data products from several months to several minutes.
And as you can imagine, when one of the largest costs for enterprises is human resource, savings like that very, very quickly add up to the tens of millions. On the revenue side of the fence, think of profit and loss: costs are your loss, because that's money going out of the company, and profits are your revenue coming in.
And really, the ability, and we'll talk about it later on in the presentation, for customers to start to monetize their information and monetize their data allows them to generate new revenue lines, allows them to increase market share or break out into new customers with new products. That really is the sort of, for me, the fantastic aspect of being in this business, because just as a control for compliance or a cost save, it's a very negative discussion.
But having discussions with clients about how they can bring their data to market in a compliant, safe, and privacy-enhanced way, that's a far more positive discussion. And even when you're not in the software world, when you're in the IT department within a large enterprise, that's a much more positive discussion to have with your business stakeholders. As Mike alluded to at the start of the presentation, there have been a number of large fines handed out recently. These numbers are only getting higher.
1.2 billion dollars or euros, I can't quite remember the currency, pretty much the same these days. That's an awful lot of cash. And to avoid those costs is another important aspect of the balance sheet. I would add two points to this. The first point is, although that is a large cost, it's one of these things that you only sort of become aware of it after it's happened. And therefore, it's not a materialised risk on a balance sheet, it's a cost that happens at a point in time after an event.
There's an old saying that no one really gets thanked for preventing things, and unfortunately this is true. But a far more important aspect than just the cost, because that fine was for Facebook. Facebook has a revenue, I think, upwards of 120 billion a year.
Therefore, a 1.2 billion amortised cost is a lot of money, but it's not that much money when you consider the size of the balance sheet. But then there is the loss of reputation, and the inability to grow the business when people cannot trust you to handle their personal information. Once you've lost trust, it's very difficult to get it back again.
So for me, that would be the more important lesson to take from that story. Avoiding costs is important, but losing your customer trust is a far greater loss, and far more difficult to win back.
So really, when people are looking at moving their data to the cloud, they're sort of stuck between a rock and a hard place. If you take a purely security view of information, you want to lock down data, you want to stop people from accessing it unnecessarily, you want to make sure that no one can see data, but that in turn makes it unusable.
On the demand side, demand is really off the chart, driven by the explosion in AI and the need for really enormous, historical datasets. Data that was perhaps considered not useful is now being used to feed AI models and get intelligent results, where human-based or declarative processes would be unable to cope with the volume of information. So you have this belief in businesses that if we lock down information, we'll be safe.
At the same time, you know that if you unlock information, there are potentially great benefits there, not just for revenue growth, but also for productivity, if you look at the automations that would be available. I noticed a recent public use case from Klarna, the fintech, who have made a 60% saving on customer support costs by rolling out chatbots. And it's more than that: customer satisfaction in interfacing with that process has actually doubled. Less wait time, more creative answers, more solutioning.
So there are definitely huge positive benefits if you can unlock that information. And we at Protegrity are very much in the business of protection, but for the benefit of unlocking information and allowing you to monetize it. So we talked about guidance earlier on. What are regulators saying about how you can protect data? GDPR is a pretty mature piece of legislation now, but really legislation is the start of the legal journey. You get clarifications through test cases.
So there's still strong guidance coming out that when you look at things like pseudonymization, where rather than full anonymization you protect the identifying aspects of the data, you take PII out of the equation, you completely replace that information, that is seen as an effective technical control. Now that has also been tested within the courts themselves: does this provide adequate protection for PII when you're looking at the example of moving to the cloud and stopping foreign governments from being able to access that information?
That's also been echoed across the Atlantic as well, as we try to find a sort of Atlantic bridge for information, if you like: how do you deal with PII? And why am I focused on PII? Well, really it's an acknowledgement that not all data is created equal.
So let's start from another piece of guidance, from the European Cyber Agency, and this is very recent, I think it was January 2024, definitely this year. Remember, the courts have been very strong that contractual controls are not enough and technical controls are mandatory, and that takes you to the question: what does a technical control look like? So the European Cyber Agency have split it into two problems, the input problem and the output problem.
But really it's: how can people safely move their data to a shared environment or a third-party environment? For shared environments, think of Salesforce.com, Workday, AWS, Google Cloud, Microsoft Azure, all of these cloud environments. How can you do that and be privacy compliant? Because you might be in the same company but across different regions: if you're in the UK and in the EU, those are completely different legal systems these days.
So how do you move information across these environments and make sure that you are compliant with legislation and you have adequate technical controls? And really the Cyber Agency are saying, you know, simply put in the input side of things, you've got to make sure that the data that's placed into the shared environment is not identifiable. And when information is extracted from a shared environment, the output problem, again that the users cannot identify individuals in the data sets.
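The input side of that guidance can be sketched in a few lines. This is a toy illustration, not any vendor's implementation: direct identifiers are replaced with keyed pseudonyms before a record leaves the premises, and the key never goes to the cloud, so nobody on the shared side can reverse the mapping. The field names and key handling here are assumptions for the example.

```python
import hashlib
import hmac

# In practice this key would come from a key manager and never leave
# the on-premise environment; hard-coding it here is purely for demo.
SECRET_KEY = b"keep-this-on-premise"

def pseudonymize(identifier: str) -> str:
    # A keyed HMAC gives a deterministic pseudonym: the same person
    # always maps to the same token, so analytics in the shared
    # environment can still join and count records, but the raw
    # identifier is never uploaded.
    digest = hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

def prepare_for_upload(record: dict, pii_fields: set) -> dict:
    """Replace the PII fields of a record before it enters the cloud."""
    return {k: pseudonymize(v) if k in pii_fields else v
            for k, v in record.items()}

record = {"name": "Ada Lovelace", "country": "UK", "reading": 7.2}
safe = prepare_for_upload(record, pii_fields={"name"})
# safe["name"] is now an opaque pseudonym; country and reading
# remain usable for analysis in the shared environment.
```

The design choice worth noting is determinism: unlike a random token, a keyed pseudonym lets two protected datasets still be joined on the same individual without either side ever seeing the real identifier.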
And that leads us to really distinct ways of looking at this problem set, and we'll get to some real-world examples of that later on. But why have we focused on PII? I think people who've been in the data world for quite some time will realize that not all data is created equal.
On any given data platform, it's not even 80-20: probably only two to three percent of your data is personally identifiable information or sensitive information. We've all seen these sorts of categorizations by the cybersecurity people, from public to highly sensitive information and steps in between.
Now if you can deploy adequate protections on your restricted data and allow that data to be mixed in with the usable data in a controlled and protected fashion, then that allows you to really unlock information, which is what we started the presentation off with, is how can you unlock that information to meet the demand of your analytics environments, your product development people, the people who are looking for efficiencies and productivities, trying to build AI models. How can you get all that information to them?
You have to unlock it and that really means you have to lock it first, but in a way that allows the information to still be utilized within the entire data set. So if we sort of drill down on input data, so what we're talking about here is that you want to take your information that is on-premise and you want to put it into, we've called that sharing environment, we're using their language, the cybersecurity agency's language, but really what we're talking about is third-party environments.
So these might be cloud hosting providers, they might be cloud application providers, they might be third parties that you work with, and it's how do you send that information to them and ensuring that you can achieve the outcomes you're looking for, increase your customer experience and look to gain a market share, wallet share with the customers that are out there, and look at, you know, sort of productivity efficiencies that you would gain by using platforms like this. And for us, this is about protection on the journey and to the shared environment.
If you take your information and build almost like a passport control, where people can put their protections on the data that's PII or otherwise sensitive, and then place that into the shared environment, you've broken the link that allows individuals to be identified, and that is the guidance that the cyber agency are giving.
On the output side of the fence, it's very, very similar: you want the protections on egress, as the cloud providers call it, or extract, to be of equal strength. Really the simplest way we would look at it is that you need a sort of gateway technology that allows you to communicate two ways between yourself, third parties, bilateral parties, and the cloud environments. You want to make sure that your protection can be enforced on the way out. And you want to make sure that happens not just to the data itself, but also to the outcomes from the data.
So you want to put protections on the analytical outputs that can occur in those cloud environments. We're just about to talk about one and demonstrate what that would look like.
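One common way to protect analytical outputs on the way out, sketched here as a toy example under my own assumptions (the threshold and field names are invented), is an output gate that suppresses any aggregate small enough to point at an individual:

```python
# Sketch of the "output problem": before aggregate results leave the
# shared environment, drop any group whose count is below a minimum
# size, since a group of one or two records can identify a person.
MIN_GROUP_SIZE = 5

def gate_output(group_counts: dict) -> dict:
    """group_counts maps a category to a record count.

    Returns only the groups large enough to release; small groups
    are suppressed rather than published.
    """
    return {category: n for category, n in group_counts.items()
            if n >= MIN_GROUP_SIZE}

counts = {"diabetes": 120, "rare-condition-x": 2, "hypertension": 340}
released = gate_output(counts)
# "rare-condition-x" is suppressed: two records could single someone out.
```

Real gateways layer more on top of this, such as noise addition or re-checking for raw identifiers in free text, but minimum group size is the simplest enforceable rule on extraction.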
So like I said earlier on, we've got a few use cases, three use cases from different verticals, and it's just an illustration of how we are working with customers, and how we have taken these patterns, which is real guidance from the European authorities, and then that's allowed us to implement it into really existing architectures, but architectures that have been on-premise and now are looking to become cloud enabled. So really two parts to it. One is the journey to the cloud, and the second part to it is really enabling growth within the cloud.
So I think one of the sort of biggest mistakes any IT person can make is to say to a business colleague, we're going to change everything that we do, and we're going to move to the cloud, and then after the end of that big project, it'll be absolutely the same. I think what you want to be thinking of when you're moving to the cloud is the benefits that you're going to get from that. So how are you going to be able to scale larger, do more, produce analytics that are not available to you within on-premise, drive an AI model that needs, you know, 2,000 GPUs instantaneously.
All of that stuff is only available in the cloud, and it all drives the sort of business outcomes that we spoke about earlier. So our first example is from the healthcare world.
This is representative of a business that operates globally, and actually a business that operates in some pretty tricky jurisdictions within the Middle East and the Far East, and in Latin America as well. Really the challenge they have is that they know they have some very deep data about their patients, and they provide a real premium service within chronic care, so something that doesn't really go away. They connect people to machines, and these machines produce a lot of data about the patients, and they want to achieve a few things.
They first and foremost want to improve the outcomes for their patients. In other businesses we'd be talking about customer experience, but in the medical world you're really talking about patient outcomes, which means: can we improve quality of life? Can we reduce medical overhead?
So can we, you know, make people's lives better? The way to do that is collaboration. Collaboration with pharmaceutical companies, collaboration with machine makers, collaboration with the clinicians that actually administer the treatment. The challenge is you're trying to do that globally. You've got multiple third parties, and really you have to have a way of securing your information. So what we've architected with this customer is the idea of a sort of an analytics platform that would be owned by them.
So if you think about the shared platform diagram from earlier on, they would put the information in there, that information would be protected, actually doubly protected, and then it would be available for people to come in and run analytics on. But they would not be able to take the data away; they would only be able to take the outcomes, and the outcomes are run on pseudonymized data and tightly controlled by that client. So they get those two sets of protections that the European Cyber Agency is really giving guidance on. So this is an example of both parts of the balance sheet.
So, you know, if you work with the pharma companies, or this company works with pharma companies and machine companies, you can really drive productivity and efficiencies within that world, which ultimately and very quickly results in cost saves. But at the same time, you're working to make sure that people don't go into clinics as much, and that is a very positive outcome on the patient population as well, and it allows them to provide that service for more people. The next example we've got is from the finance world.
Some of the most advanced analytics users within the commercial sector happen to inhabit this world, and finding the source of terrorist financing, or how criminals in the real world wash their ill-gotten gains, as we say where I'm from, is a particularly complex operation. You have to take financial information, signal information from media, from social media, and from company reference information, and you have to be able to produce the analytics that raise red alerts and support network analysis.
So is this one person acting as a group, or is it a group of people acting as one organisation, as you would see, you know, within things like drug cartels. Now this customer had very sophisticated on-premise capability, and they were actually looking to produce a new fintech, which they've done successfully, and launched that as a cloud-based activity.
So they've moved all of that information to the cloud, they're taking financially sensitive information, they're protecting that information in the cloud, and then they're selling the outputs only, again protected when they go out, to people who have to run these processes, because there is an economy of scale here: the amount of money that it takes to build this capability is significant. You know, it's no accident that banks will spend hundreds of millions a year developing these services.
At the same time, governments are asking people in the real estate business to do the same checks. People in the real estate business really don't have hundreds of millions to invest in technology, therefore buying it as a service allows that organisation to take the capability they've developed over nearly 30 years, take it to the cloud, and make it available as a service for third parties. They have the protection on the way in, the data is still available to be utilised, because we've unlocked it with our data protection technology, and then they can take the outputs, again protected.
Now this also allows us to leverage some of the capabilities we have, in terms of having privacy-enabled views, so different people see different things, because sometimes these alerts have to go to the authorities, whether it be the police, or whether it be the financial regulators. So you have that idea of a single piece of data with multiple viewers on it, based on the entitlements and authorisations that Mike spoke about earlier on. Last example for me, comes from a very traditional vendor in the finance space, but it's measuring the risk of commercial loans.
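The idea of a single piece of data with multiple privacy-enabled views can be sketched very simply. This is an illustrative toy only, assuming a hypothetical role-to-field policy; it is not Protegrity's actual policy model, just a demonstration of one record rendered differently per entitlement.

```python
# Illustrative sketch: one protected record, multiple privacy-enabled views.
# The roles, fields, and masking rules below are all hypothetical examples.

RECORD = {"name": "Jane Doe", "account": "GB29NWBK60161331926819", "alert": "possible layering"}

# A single central policy maps each role to the fields it may see in the clear.
POLICY = {
    "police":    {"name", "account", "alert"},   # full view for law enforcement
    "regulator": {"account", "alert"},           # identity stays masked
    "analyst":   {"alert"},                      # only the analytical outcome
}

def view(record: dict, role: str) -> dict:
    """Return the record with every field outside the role's entitlement masked."""
    allowed = POLICY[role]
    return {k: (v if k in allowed else "***") for k, v in record.items()}

print(view(RECORD, "analyst"))
# {'name': '***', 'account': '***', 'alert': 'possible layering'}
```

The point is that the data itself is stored once; what each viewer sees is decided at access time by the policy, which is the entitlement-and-authorisation idea discussed earlier.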
And really, tremendously complex math goes into this, but what you're really trying to measure is the probability that someone will default on a loan, and if they do default, how much is it going to cost you? And that might sound simple, but if you think about a loan book for the average bank as tens of thousands, hundreds of thousands, even millions of loans for the large global banks, it's a very difficult calculation, and it's also part of their core financial calculation when they look at how strong their balance sheet is.
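The core of the measure being described is the textbook expected-loss formula, expected loss = probability of default x loss given default x exposure at default, summed over the loan book. A minimal sketch, with invented sample loans (the real models behind PD and LGD are far more complex):

```python
# Illustrative sketch of the standard expected-loss calculation for a loan book.
# EL = PD * LGD * EAD is the textbook credit-risk formula; the loans are invented.

loans = [
    # (probability of default, loss given default, exposure at default in EUR)
    (0.02, 0.45, 1_000_000),
    (0.10, 0.60, 250_000),
    (0.01, 0.40, 5_000_000),
]

def expected_loss(pd: float, lgd: float, ead: float) -> float:
    """Probability of default x loss given default x exposure at default."""
    return pd * lgd * ead

# The difficulty at scale: this sum runs over every loan in the book.
portfolio_el = sum(expected_loss(*loan) for loan in loans)
print(f"Portfolio expected loss: EUR {portfolio_el:,.0f}")
```

At the scale of millions of loans, with PD and LGD themselves coming from heavy statistical models, this becomes the kind of compute-intensive workload that pushes vendors toward the cloud.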
So this is a traditional vendor, always deployed on-premise, now looking to move their clients onto the cloud. The clients don't want to deploy on-premise, because it's a pretty heavyweight analytics piece of kit. They want to leverage the economic benefits and the agility of the cloud, and the vendor is offering their product as a service.
So this means that they've been forced to think about all of the challenges that we've already spoken about, which really they haven't had to face before, because customers would install the software and then run it on their own hardware, and they didn't have any of those shared platform considerations that the European cyber people talked about.
So it's really a good example of how a very successful, well-known vendor has had to change their approach to how they protect information within their cloud in order to get clients migrated: everybody wants to move, but they're not sure how to do it without adequate technical controls, and that really is why Protegrity exists, to provide those adequate technical controls. So Mike, that's me, back to you. Thank you very much Alasdair for that very interesting presentation and set of slides.
We now have another poll before the Q&A, so if I could ask for this poll to be opened. And this is: how would you describe your organization's current approach to governing the data that you've migrated? Do you have tools and processes for all of the data, or just for some of the data, or are you just thinking about what to do, or are you relying on other security methods? So we'll give it a moment or two for people to fill in that poll, and then we can have some questions and answers.
So while I'm waiting for that poll to finish, one interesting thing, Alasdair, perhaps you could just make a comment on. I've kept talking about quantum computing. Now, given the way that Protegrity protects data with things like tokenization and pseudonymization, how does quantum computing impact on that?
Well, I think for the industry it's certainly a huge concern. If I was to compare ourselves to something like encryption, so encryption uses, say, a 256-bit key, the potential with quantum computing is that it makes that potentially crackable, whereas traditional computing cannot.
Our tokenization actually uses a mathematical model, a moving cipher, so it's not a straight equivalent, and it's good you're asking me to talk about this rather than produce a white paper, because I would be caught out here, but if you compare us against something like a 256-bit key, our number of permutations would be in the billions, just in the way that we work it. So we're quite bullish about how we would protect against quantum.
I think the proof is always in the attack vectors that come out and how we defend ourselves, but when we do a direct comparison, I think the math is in our favor, so we feel quite confident our customers are ahead of the curve in that respect. Yes, so in a sense your tokenization method doesn't rely upon a mathematical problem that is easily solvable using quantum computing, so it's inherently quantum resistant.
Yes, and you know, it's a sliding scale, but you know, in the simplest way I can put it, the longer the string, the more complex our protection becomes, and the less risk you would have. So you're looking at, you know, protection of things like names and addresses, and we're very confident that we would be, I don't want to say quantum proof, but certainly far more difficult to break than traditional encryption.
Yes, so it becomes more a question of user failures that are likely to lead to that data being disclosed. Absolutely, that's why we have what I describe as a pessimistic strategy, in that we tend to always have data protected unless it's absolutely necessary, and we only unprotect by exception.
So, you know, our advice is always keep the data protected, because human error is the root cause of a lot of breaches these days, and therefore if your protection is never off, then you sort of eliminate that risk. Yeah, so, you talked about this small percentage of sensitive data in this enormous pot, and your tools help people, first of all, to find this data. That's the first issue. So once you've found it, do you then recommend that people protect it by tokenizing it, or whatever, wherever it resides? How do you deal with this?
Yeah, we would certainly advocate that, is be holistic in your approach, so have a data-centric protection, which is to say if you have a customer's name in multiple systems, protect that name in multiple systems, don't ever allow it to be stored in the clear or used in the clear unless the process demands it. So our software does that in a consistent manner with a single policy, so that allows you to still run all your business processes on the data without ever having that data unprotected.
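The "same name, same protection, every system, one policy" idea can be illustrated with a toy deterministic tokenizer. To be clear, this is a stand-in using keyed HMAC-SHA256, not Protegrity's actual vaultless tokenization algorithm; the key name and token format are invented for the sketch.

```python
# Illustrative sketch: deterministic, keyed tokenization applied consistently,
# so the same customer name tokenizes identically in every system that shares
# the policy. HMAC-SHA256 here is a stand-in, NOT the vendor's real algorithm.
import hashlib
import hmac

POLICY_KEY = b"single-central-policy-key"  # hypothetical; centrally managed in practice

def tokenize(value: str) -> str:
    """Same input + same policy key -> same token, in any system applying the policy."""
    return hmac.new(POLICY_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

# CRM and billing both hold the customer's name, but both store only the token.
crm_token = tokenize("Jane Doe")
billing_token = tokenize("Jane Doe")
assert crm_token == billing_token  # joins and analytics still work across systems
```

Because the token is consistent everywhere, business processes and analytics can still join and count on the field without the clear-text name ever being stored or used outside the exceptional cases.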
A very simple example is the airline business, one of the world's heaviest analytics users: they analyze how much soft drinks cost as they travel around the world in terms of fuel, and things like that. But the only time you really need to see a passenger's first name and last name, the sensitive information, is when they're boarding, when they're in front of you at the ticket desk, and when the internet application authorizes that that person's biometrics are correct within their telephone.
So actually it's very much the exception that that information is needed within the normal sort of business process and analytical usage, and certainly something like an AI model would never need to see that information. So we would say default protection everywhere, and do that consistently in a single policy. And so by doing that, the protection follows the data. Absolutely. That's the key to the whole of this, that if it's stored in the cloud, if it's going through a network, it's still protected. Exactly.
But even if it's going to a third party, we have customers who will send protected data for analysis and that will come back with product recommendations. In the gambling world, we've actually seen real-time odds being generated, but that's never on personalized information. It's always on protected information and the personal information only goes to that customer themselves. Yeah. So that's a very powerful, powerful approach to this. So you've given a couple of case studies there.
So if some organization were considering moving to the cloud, what would your advice to them be about protecting data? Well, the first piece of advice would be, be consistent and you never want to be in a conversation with an auditor or a regulator and talk about, well, we do one thing here and we do another thing here. People have pretty mature sort of protection mechanisms or control mechanisms on premise. Those have to be reflected in the cloud. And then when you go to the cloud, you have to look at that sort of shared platform aspect.
And that keeps you on the right side of things like trends. So really, what do you have today? What do you have to build on top of what you have today in order to enable the cloud? And then I would then say everything then has to be consistent with that whole stack.
So for me, it's about protecting information the same way everywhere. If you don't, you're going to get into an awful lot of problems pretty quickly with the inconsistency and you will drop a ball.
And those, as we've seen, can be very painful and expensive. Good. And so another thing is, at the beginning of this, I was talking about these extra controls that you need to put in place, these customer entity controls. So how does your approach help organizations to demonstrate compliance, that they are safely using the service?
I mean, it's one thing to do it. It's another thing to be able to demonstrate it. So what do auditors come and typically ask you that your product can provide as evidence?
Well, the first thing you have to do is implement it. There's no point buying the software and not using it. Auditors are always looking for inconsistency, and the way they look for that is through what they would call demonstrability. So can you show how you protect, who's protected, and what happens? So we can either do the analysis ourselves, we have a component that does log analysis of access, or we can ship out to your log analysis world or your security world. And that can be done centrally. But really, we audit all accesses and all controls.
So it's integration with your existing framework, and then you show the utilization of those enhanced controls. And so it's really about making sure that you can demonstrate that you have implemented what you said you were going to do, and how that is managed on an ongoing basis. And that really is about, you know, transparency, really, and be consistent, because auditors will always pick up on inconsistencies, and then make sure that you can demonstrate that consistency through the protections that you've deployed in the systems that you cover.
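The demonstrability idea, audit every access so you can hand an auditor evidence, can be sketched in a few lines. This is an illustrative shape only, with invented field names and events, not the actual log format of any product:

```python
# Illustrative sketch: audit every protect/unprotect request so that denied
# attempts can be shown to an auditor or shipped to a SIEM. All names invented.
import datetime
import json

audit_log = []

def record_access(user: str, field: str, action: str, permitted: bool) -> None:
    """Append one audit event for every access decision, allowed or not."""
    audit_log.append({
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user, "field": field, "action": action, "permitted": permitted,
    })

record_access("analyst01", "customer_name", "unprotect", False)  # denied by policy
record_access("gate_agent", "customer_name", "unprotect", True)  # boarding exception

# Evidence for the auditor: every denied unprotect attempt, exportable downstream.
denied = [e for e in audit_log if e["action"] == "unprotect" and not e["permitted"]]
print(json.dumps(denied, indent=2))
```

The key point from the conversation is that the evidence is generated as a by-product of enforcement: if every access goes through the policy, the audit trail is complete by construction.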
Yeah, that's good. So you can see that the protection is there, and you can use the logs to see if there have been any attempts for that protection to fail or to be bypassed in any way.
That's really good. So I think we're now coming to the top of the hour, and so I'd like to thank you, Alasdair, for your very good contribution and the very interesting case studies that you gave here. I'd like to thank the participants for their interaction. And with that, I'll say thank you very much to everyone. There will be a recording of this, and it will be available shortly. So thank you, everyone. And good afternoon.