Increased global competition is driving businesses to adopt new technologies to enhance existing processes and enable new business models, new revenue streams, and greater interactions with partners and customers through Digital Transformation.
Hello, I'm Richard Hill, a senior analyst at KuppingerCole, and today we're having a webinar about enabling zero trust with dynamic authorization. This webinar is supported by Cloudentity, and joining me today is Nathanael Coffing, co-founder and chief strategy officer at Cloudentity. Before we start, I'd like to give some quick information and some housekeeping notes, and then we'll jump into the topic for today's webinar. As you may have already noticed, we have a series of upcoming virtual events, all in a very modern format with panels, presentations, keynotes, and much more.
The next upcoming virtual event is Soaring Towards an Enhanced EPP Strategy on March 25th, the KCLive event where you can learn about machine-aided human decision-making and best practices for implementing a robust endpoint strategy for your security organization. Balancing SAP Security: Access Protection and Authorization is on April 14th, in which you can hear from experts at SAP, along with other speakers, regarding key SAP security concepts and solutions.
And then on April 28th is Balancing SAP Security, where you can hear about the path from traditional SAP access controls towards their new solution for SAP, and much more. So there are a lot of virtual events, as well as other types of events throughout the year, so please take a look at our website, research, blog posts, and videos. And now for some more housekeeping: everyone is automatically muted, so there's no need to worry about muting yourself. We'll be recording the webinar, which will be available on the KuppingerCole website.
Also, we'll save time at the end for questions and answers. The GoToMeeting control panel has an area where you can type in your questions at any time, which we'll answer during the question-and-answer session at the end. And with that, let's take a look at today's agenda. I will start by talking about dynamic authorization management and policy-based authorization technologies, and then look at how they can be used to support a zero trust approach to security and help to address those challenges around API identity and authorization.
Once I'm done, I'll turn the webinar over to Nathanael, who will discuss the need for speed and scale when adopting new technologies and business models for the enterprise in order for them to remain competitive, and then he'll address how API identity and authorization challenges can be met. And then finally, we'll save some time at the end, as I mentioned, for that question-and-answer session. So what is the landscape for IT security? I thought maybe I would start off by helping the audience understand how we got where we are today.
So traditionally, the IT environment ran within the walls of the perimeter. IT solutions were more monolithic, and centralized identities were managed and stored on premise. Local access control systems were used to ensure that employees had access to just the resources they needed, through authentication and authorization, with the ability to audit their user access.
And then we started seeing federation hubs or bridges that extended the reach of where identities and access controls reside. Federation allowed for that secure exchange of user information, which could be between divisions within an organization or between organizations in the same industry sector, for example. Single sign-on systems gave users the ability to authenticate once, not only across multiple IT systems, but organizations too. And then cloud services gave organizations new options for IT, motivated by that business need to increase IT flexibility and scalability while reducing costs.
And then under that umbrella of IDaaS, there are a number of capabilities, not only IAM, but also capabilities ranging from SSO to full identity provisioning.
And then as organizations began reaching out to their customers, gathering information about the consumers who were using their products and services, they found that they needed to provide a better digital experience through the use of customers' mobile devices and social networks, for example, providing that easier onboarding experience for consumers. But they also needed to be concerned about privacy and compliance, such as GDPR and PSD2. And now we're beginning to see identity APIs becoming available, driven by that need to meet emerging IT requirements such as hybrid environments that span across on-prem, the cloud, and even multi-cloud environments, supporting all the different functionality of IAM, CIAM, and IDaaS, with a key aspect of identity APIs being developer-centric. So in a nutshell, IAM is continuing to evolve to meet that growing list of IT requirements. Now let's look at some of the shifts we're seeing in the market today. A trend that is going well beyond identity and access management is that everything is becoming a service.
So we see new solutions running as Software as a Service, or SaaS, and existing solutions being rearchitected into SaaS solutions. This digital transformation is changing enterprise IT, driving that as-a-service model where everything in IT can be provided or consumed from the cloud. So there's this clear shift being observed in the broader IAM space from traditional deployment models towards service models. This is one important trend.
Another trend that is closely aligned to the shift towards the service model is a shift towards a more modern software architecture using microservices, which is a software architectural style that is gaining momentum in IT organizations today. Each microservice is characteristically small and autonomous, making microservices fine-grained, using lightweight protocols, and utilizing APIs extensively, and most microservice application architectures use containers like Docker to implement their solution. Implementing that modern IT architecture will help transform IAM into a set of microservices that are available to everyone and everything in a way that is more secure, scalable, and manageable. And among these architectural trends is also a need to have a separation between identity, application, and data. Data really must not reside in the application; rather, applications should utilize identities and associate the data that belongs to them. And you can think of this in terms of using authorization and governance for that data. So the business environment is also changing.
Consumers are increasingly demanding privacy and control of their personal data.
Along with transparency into hidden fees and tailor-made services for customers, with a centralized location for those services, customers now use many more endpoint types and automation to get what they need. In many cases that includes access to financial services, such as online banking, online payment services that authorize and process payments between the customer and the merchant, lending services, or even services that provide a consolidated view of a customer's financial data. And all of these types of financial API-based services need to adhere to laws and regulations like GDPR, PSD2, or Open Banking, and other electronic identification and trust services. Other regions of the world outside of the UK and the rest of Europe may have similar laws and regulations to consider.
And now we have other overarching trends that are affecting IAM. We have this broader notion of identity and access management, and with it, in this connected world, IAM is expanding beyond people in a single organization to include a broad range of identity types, such as employees, partners, contractors, consumers, and even intelligent things or devices. Identities and IAM are converging, which enables everyone accessing every service via any device from any location in a more controlled manner.
And so zero trust has become required of IAM solutions, where trust is never granted implicitly but is continually evaluated, as well as using defense-in-depth tactics. Zero trust principles are designed to prevent those types of data breaches and limit internal lateral movement, and strong and adaptive authentication and authorization are key, as well as supporting a wide range of identities and identity services as they evolve and mature.
And as I talked about earlier, microservices, containerization, container orchestration like Kubernetes, for example, and APIs are becoming the architecture of choice for standard cloud computing. Application programming interfaces, or APIs, as I mentioned, are key to enabling a service-based approach to IAM. APIs provide the bridge between services, microservices, and containers, on-prem and in the cloud. So here's another shift that we have been seeing, where the future of IAM will really become API-based. And when it comes to things like identifying risks or identifying, for example, outliers in access rights, artificial intelligence will help us in that regard. So current and future API capabilities have the potential to really help organizations consume and drive value from big data and drive decision-making through powerful analytics.
Now let's move on to the topic of zero trust in more detail. So here, zero trust means to only trust once verified. And this is really more of a concept and architectural model that is applied by combining processes and technologies to bring about a more secure environment. When verifying access to resources, the identities and devices in the context of users should be considered, along with other indicators such as the network that they use or the data that's being accessed.
So the mindset of zero trust is to expect threats, put policies in place to restrict access, and then continuously monitor and verify users and their need to access resources.
And to do this, IAM systems must support best-practice characteristics such as the principle of least privilege (only giving you enough access to get the job done), centralizing and standardizing access policy management and attribute definitions, making decisions in real time, and having the ability to adapt or react to environmental changes, along with providing that continuous authentication and authorization at scale. So how does dynamic authorization fit in?
Well, dynamic authorization provides access to resources such as applications, data, or other sensitive assets dynamically, in real time, commonly using attribute-based rules and policy. Some dynamic runtime authorization patterns include API access patterns that provide a front-end authorization mechanism for managing and enforcing authorization policies for services exposed by the APIs. These policy enforcement points typically authenticate API calls via API keys or tokens.
And then there are API gateways that provide authentication and authorization for inbound and outbound requests. And then there are those external authorization managers that externalize all or parts of access decisions from applications, and cloud patterns like federated runtime access or OAuth access patterns that extend that basic web access or API access authorization to the cloud by passing security tokens about the users or the devices to the policy decision point to make those authorization decisions.
And when you're defining access policies, there are some basic components to consider, such as the subject, which is the user or the entity that is attempting access to a protected resource, or the action, which defines the type of operation that the subject would like to perform on a resource, such as a read, a write, or an action on a function within an application. The resource is the data, service, or system that the subject is trying to access.
And then the context attributes of the access control policy aid in those dynamics, making it context-aware, or potentially more risk-intelligent in a sense. For example, attributes about the environment, the time and day, geolocation, or current threat level really help to make more actionable decisions about access to resources. And with all these policy aspects, data quality is really key: when information from different data sources is tied to these attributes, its meaning and values should be the same across the different data sources. And sometimes that doesn't happen.
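To make those components concrete, here is a minimal sketch of an attribute-based check over subject, action, resource, and context. The attribute names and policy shape are invented for illustration; real engines (XACML PDPs, Open Policy Agent, commercial products) express this in their own policy languages.

```python
# Minimal ABAC-style policy check over the four components discussed
# above: subject, action, resource, and context. All names invented.

def evaluate(policy, subject, action, resource, context):
    """Return True only if every attribute rule in the policy holds."""
    return (
        subject.get("role") in policy["allowed_roles"]
        and action in policy["allowed_actions"]
        and resource.get("classification") in policy["allowed_classifications"]
        # Context attributes make the decision dynamic and risk-aware:
        # threat level, geolocation, time of day, and so on.
        and context.get("threat_level", "high") in policy["allowed_threat_levels"]
        and context.get("country") in policy["allowed_countries"]
    )

payroll_policy = {
    "allowed_roles": {"hr_manager"},
    "allowed_actions": {"read"},
    "allowed_classifications": {"internal", "confidential"},
    "allowed_threat_levels": {"low", "medium"},
    "allowed_countries": {"DE", "UK"},
}

decision = evaluate(
    payroll_policy,
    subject={"id": "alice", "role": "hr_manager"},
    action="read",
    resource={"id": "payroll-db", "classification": "confidential"},
    context={"threat_level": "low", "country": "DE"},
)
print(decision)  # True: every attribute rule is satisfied
```

Note that the same request would be denied if, say, the current threat level were "high", which is exactly the data-quality point: the policy is only as good as the attribute values fed into it.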
And this is where organizations really struggle and are challenged. So this graph is from the Dynamic Authorization Management research document from KuppingerCole, and what the graph is showing is that dynamic authorization is really a mature and established technology, and a driver for future use of this technology is its ability to handle those complex access control use cases, which include things like finance and privacy. And in terms of market direction, here are some areas that will help move dynamic authorization forward.
So first, you know, making it as easy as possible for organizations to define their access policies. I think we're well past the days of hard-coding policy logic, which requires developer support. Having the ability to define policies in as close to natural language as possible is good, so that business and security analysts can easily use the policy authoring tools. For common use cases, policy templates should be provided, and having the ability to simulate and test policies is also needed.
And there's a clear trend towards deployment in cloud-native types of environments. As I mentioned earlier, support for more modern system architectures that are using APIs, microservices, and container orchestration systems like Kubernetes is also needed, as is automation where automation makes sense for access control, to make it more efficient, such as putting authorization controls in place as soon as new applications are discovered on an organization's network. So I will stop here, and now I will turn the next portion over to our guest, Nathanael from Cloudentity.
Thank you, Richard. I appreciate the introduction and the great overview of where the world is today, right? Where we come from and where we're going. So as Richard mentioned, you know, this is about dynamic authorization, right? How we can bring more and better data into our authorization decisions and move down into a transactional aspect.
And we'll talk over and over again about hyperscale. What I mean by hyperscale is hundreds of thousands of token-mint transactions per second. And right at the core of that is bringing in this dynamic authorization, right? And then we're looking at all of these net-new types of services, all these net-new application infrastructure types, and all of these net-new, you know, almost protocols, right, that are communicating both in between the services, meaning east-west communication, and between the client and services, meaning north-south communication.
And so you've seen this advent of distributed computing, right? And that's really what the cloud has been bringing to bear. We've seen PaaS, FaaS, you know, SaaS all pop up, right? And now we're seeing things called service meshes also pop up. These are essentially automating, you know, what AWS is doing under the covers, right, but allowing you to deploy it anywhere. So it could be in AWS, it could be in your own data center, it could be in edge compute, it could be almost anywhere. And we're also seeing the advent of net-new networking, right?
This 5G concept where we have limitless networking, going down to our edge compute and connecting that back into our core. And what we're seeing today is that all of that is leveraging Kubernetes. And within Kubernetes, we're building microservices to help our development teams move faster, our businesses roll out new features and functions faster, and to give us the ability to interconnect, right, a service via an API with IoT devices, with users, with backend services, with partners, and with customers.
We're also seeing, as I mentioned, kind of the change in communication channels, right? So over the last, you know, 15 years, we've slowly moved to everything being HTTP. It started off with, you know, SOAP XML, and now it's moved almost entirely to REST for north-south traffic.
However, REST isn't quite fast enough for that east-west traffic, so we've adopted a new protocol called gRPC, which is a Google remote procedure call standard. In addition, we started to reorganize our data as we dumped it into data lakes and said, well, I need to build relationships between it.
And because of that, GraphQL is being adopted very, very quickly as a way to provide a singular entry into your data lake and then interconnect all of the different entities, the edges of that data lake, for reflection back out into your identity authorization, as well as the microservices that are consuming it. And one of the reasons that's happened is because of how we've built up our applications over the last, you know, 20-some-odd years, right?
With the advent of the internet, we've really seen this authorization and privacy sprawl that's caused our businesses to slow down. It's caused our customers to suffer because of, you know, not-great customer experiences, disjointed customer experiences, and it's caused our developers to move slower, right? So here's just a quick reflection of some of the things that we've seen. Meaning, you know, we have a mobile app built in 2018, right? And that's using its own type of authenticator, its own type of IdP.
It's bringing in, you know, some medium-grained authorization, but all of that has been inculcated, it's been ingrained and hard-coded into the application itself.
You know, we've got things that were kicked off from even earlier, right, a banking website in this example, where, you know, the authorization was built very coarse-grained, they used, you know, the modern IdP of the time, and then put it behind a, you know, OneTrust or privacy blanket, right, at the front door, saying, hey, we're going to use your data for processing transactions. And then on the far right-hand side, we even have, you know, what their developers are building now, what the business is focused on.
We have this Kubernetes service, or sets of services, I'm sorry. Well, they want to move into the modern technology, the modern identity stacks as well, meaning let's adopt FIDO, let's get very fine-grained in our permissions because we want to support that customer experience, we want a delightful customer experience, and let's do very fine-grained authorization. And that's fine-grained authorization not just between the Kubernetes or hosted services and the client apps, but also fine-grained authorization so they control how data flows in between the different Kubernetes pods and services. So the question starts to become, well, if I'm managing all of these, how do I provide governance, right?
How do I actually get back in there and say, well, I'm protecting PII according to GDPR, based upon these policies? And to Richard's earlier point, that was the concept of near-natural-language policy packs: externalizing authorization away from the services and then layering business value on top via these policy packs, and being able to understand exactly what data is flowing where, how the user authenticated, how the service authenticated, whether that service has actually asked the user if they consented to using their data, and munging that all together into a common framework over standards-based protocols: OAuth, OPA, OIDC, SPIFFE.
And so what we've seen is that, you know, that is the constant in life: that transactional change, what's going on in between those services, what's going on between the client and services, what additional context we can bring into it, and how we can meet that transactional token at a very, very high capacity, in conjunction with things like OAuth 2.1 and some of the specs coming out for rich authorization requests and pushed authorization requests.
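As a sketch of what a richer, transaction-level request can look like, OAuth's Rich Authorization Requests spec (RFC 9396) lets a client describe the exact transaction it wants authorized instead of asking for a coarse scope. The payload below follows the general shape of the spec's payment example; the field values and endpoint URL are invented.

```python
import json

# Rich Authorization Request (RAR) style payload: instead of a broad
# scope like "payments", the client states the specific transaction.
# Values are illustrative, not from any real bank's API.
authorization_details = [
    {
        "type": "payment_initiation",
        "actions": ["initiate"],
        "locations": ["https://api.example-bank.com/payments"],
        "instructedAmount": {"currency": "EUR", "amount": "100.00"},
        "creditorName": "Example Merchant",
    }
]

# The structure rides along in the authorization request as a JSON
# string, alongside the usual OAuth parameters.
request_params = {
    "response_type": "code",
    "client_id": "mobile-app",
    "authorization_details": json.dumps(authorization_details),
}
print(request_params["authorization_details"])
```

The point is that the authorization server can now make a decision (and record consent) about this one 100 EUR payment, rather than granting a standing "payments" capability.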
So we've got to bring in as much context as possible so we can make the best decisions at the service's edge, or at the service, at the API, that we can. So obviously, first and foremost, the requester identity: who did it, right? Where did they authenticate? What IdP did they come from? What data is that IdP providing to us? What security groups are they in? Things of that nature, right?
Very, very important, right? But that's really just the start, and we've treated identity (authentication, part of identity, sorry) as kind of this singular be-all and end-all in the past. Meaning, because we were trusting everything that was happening inside of our data center, we were authenticating at the edge and having a long-lived session token. That could be 10 minutes as the session length, and it could be 30 days, depending upon your corporate policies, of course.
And then we were reusing that token to either mint, you know, sub-tokens, or to control authorization, or just passing along these giant tokens that were almost too big for the headers in your HTTP traffic. Next, we want to bring in more data around that, right? So bringing in better transactional value. And that transactional value is, well, what is the actual value of the transaction? Is it a hundred dollars? Is it a thousand dollars? Is it sensitive data? Is it PII data? Is it PHI data? Right? We've got to know what's in that transaction.
So we know how to adequately protect it. What's the value of the service? Are there known bugs in that service?
You know, when was the last security scan? Being able to understand that cyber context also allows us to know how much we can trust what's in that service, the identity of that service, and the fact that that service is or is not breached. The where, right? Where did the user come from? Where is the user going? Where is the service located? Is a user authenticating over in the UK and trying to access a service in Italy or Germany?
Well, we have very rigorous requirements around what data can get passed along on behalf of the user, right? Meaning we have to know the context of the geolocation of both the service and the user, to be able to determine what a user can consent to and also what a service can share with other backend services. The why: intent-based authorization, very, very key to the future of OAuth.
Very, very key to things like Open Banking, Consumer Data Right protection in Australia, Financial Data Exchange in the US: being able to understand what a user is doing based upon what they're accessing and what they're passing along. All of that context also leads into the regulatory requirements.
You know, why are we allowed to pass this data? It's no longer kind of open season because everything's inside of a trusted perimeter. I actually have to have reasoning and auditability, right, for why I use that data or why I pass that data off to a partner, because there are regulatory demands as well as financial penalties when I do that inappropriately. Lastly, of course, the easy one: time of day. You know, when does it happen? Is this out of the normal user behavioral pattern? Is it in the normal user behavioral pattern? Is there something that's happening?
You know, that's, oh my goodness, are they doing impossible travel, you know, and accessing things at two different times of day? Being able to munge all that together: all of that becomes part of that intent-based authorization, all of that becomes part of the context that's requisite to process a transaction safely and securely. When we look at it in a little bit more real-world fashion, right: bringing in that requester IdP, that user info, the groups, the roles, the AMR records, and bringing that across different IdPs.
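One of those behavioral signals, impossible travel, can be sketched as a simple speed check between two authentication events. The 900 km/h threshold (roughly airliner speed), the event shapes, and the distance input are illustrative assumptions, not a production risk engine.

```python
from datetime import datetime, timezone

# Illustrative "impossible travel" check: if two authentication events
# are farther apart than any plausible journey in the elapsed time,
# flag the transaction for step-up authentication or denial.

def hours_between(t1, t2):
    return abs((t2 - t1).total_seconds()) / 3600.0

def impossible_travel(evt_a, evt_b, distance_km, max_speed_kmh=900.0):
    """Return True if covering distance_km between the two event
    timestamps would require more than max_speed_kmh."""
    elapsed = hours_between(evt_a["time"], evt_b["time"])
    if elapsed == 0:
        return distance_km > 0  # same instant, different place
    return distance_km / elapsed > max_speed_kmh

login_london = {"time": datetime(2021, 3, 1, 9, 0, tzinfo=timezone.utc)}
login_sydney = {"time": datetime(2021, 3, 1, 11, 0, tzinfo=timezone.utc)}

# ~17,000 km in 2 hours is far beyond airliner speed, so flag it.
flagged = impossible_travel(login_london, login_sydney, distance_km=17000)
print(flagged)  # True
```

A real engine would derive the distance from geolocated IP data and feed the flag into the broader context score rather than denying outright.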
You know, what we're seeing with the advent of cloud and the adoption of cloud, right, is that you're almost forced in some cases to use the cloud IdP, right? And that could be Azure B2C, that could be Cognito, that could be Google Identity, right? So you need to be able to munge different IdPs' data together, cleanse it, normalize it, make sure it's processed so it all looks the same to your underlying services. That's the only way you get a normalized authorization construct.
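Normalizing claims across IdPs can be pictured as a mapping onto one canonical schema. The claim names and mappings below are assumptions for the sketch; a real deployment would derive them from each provider's actual token claims.

```python
# Illustrative claim normalization: map each IdP's claim names onto one
# canonical schema so downstream services see identical attributes.
# The IdP names and mappings are assumptions for this sketch.

CLAIM_MAPS = {
    "cognito": {"email": "email", "family_name": "surname", "cognito:groups": "groups"},
    "azure_b2c": {"emails": "email", "surname": "surname", "groups": "groups"},
    "okta": {"email": "email", "lastName": "surname", "groups": "groups"},
}

def normalize_claims(idp, raw_claims):
    """Translate one IdP's raw claims into the canonical schema."""
    mapping = CLAIM_MAPS[idp]
    normalized = {}
    for source_name, canonical_name in mapping.items():
        if source_name in raw_claims:
            value = raw_claims[source_name]
            # Some IdPs return email as a list; take the first entry.
            if canonical_name == "email" and isinstance(value, list):
                value = value[0]
            normalized[canonical_name] = value
    return normalized

print(normalize_claims("okta", {"email": "a@b.com", "lastName": "Doe"}))
# {'email': 'a@b.com', 'surname': 'Doe'}
```

Doing this once, at token-mint time, is what lets every service rely on one attribute vocabulary instead of writing per-IdP shims.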
Second, again, that application identity: bringing in not just, you know, the RBAC- and ABAC-related data that's coming from the IdP, but also what else can we get out of the application? What entitlements are there? What type of SPIFFE-based PKI identifiers are identifying that individual workload?
What is that application using, you know, from SQL, NoSQL, from a consent perspective? You know, again, bringing all that together and then using the API gateway, your existing infrastructure, Envoy, things of that nature, to actually do that distributed policy decision point, to do that distributed policy enforcement. So you're not rewriting the world, right? You're actually taking what you have from an IdP perspective, taking what you have from an API gateway and Kubernetes cloud architectures perspective, and bringing it together using a common authorization fabric, right?
Using dynamic authorization to embed all of that data into your authorization, going down to the application. And so what we're really talking about there is how we can start to separate apart the identity piece, the OIDC, away from authorization, right, using the concepts of session and context.
And, you know, if we think back right to 2012, 2013, when
So think about it from a perspective of: I want to standardize what my identity context, my contextual inputs, look like. I want to make sure I understand what applications are out there and how I can bring them together by plugging into my existing API gateways, my existing Kubernetes infrastructure, my existing FaaS and PaaS infrastructure. I need to be able to dynamically build, update, and assign policies. And lastly, I want to wrap a full layer of governance on top, and I need to trust it, right? I need to make sure that I have, you know, FAPI certification, right?
Because APIs are passing PII data regularly, right? And that PII data has kind of become what I like to think of as the new CO2, right? So for a long time, that PII data was oil, right?
It was: if you're not paying for the product, you are the product. But now we're getting away from that. We're saying, okay, well, our customer data is sensitive, our customer data is the most valuable aspect of our business, so let's protect it. Let's protect it in a way that is financial-grade protection. So if we look at that from a slightly more technical detail perspective, right, we've got that user IdP coming in via OIDC, we've got token minting that could be occurring via that IdP or via the API gateway.
We've got privacy that's occurring, often embedded directly into the application. We've got enforcement that's happening at the API gateways, or the ingress controller to Kubernetes, or even the ingress controller to the pods. We've got service identity that's emerging so we can control how data's flowing and understand what workload we're talking to. And then we've got authorization policies, right.
That, you know, we've seen be crafted in XACML in the past, and now moving towards Open Policy Agent in the future. And so if we look at that, and kind of bring it all full circle, right, we need to start to bring all of those different pieces outside of OIDC together, right? So tying scopes to policies, making sure that a scope of read:email really means, well, I want to make sure that user authenticated with this AAL level, you know, from an MFA perspective, and that that user has consented to sharing that email.
And then lastly, that the application is able to see that user's email, right, bringing that all into a singular place for governance, right? Because now that I've got five, six different technologies and different standards working together, well, I need to wrap those together, and I need to process them in a way that the business can understand all of that.
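Tying a scope like read:email to an authentication level and a consent requirement could be sketched like this; the scope names, AAL values, and the policy table are invented for illustration.

```python
# Map each OAuth scope to the conditions that must hold before a token
# carrying that scope is minted. Names and values are illustrative.

SCOPE_POLICIES = {
    "read:email": {"min_aal": 2, "requires_consent": "email"},
    "read:profile": {"min_aal": 1, "requires_consent": None},
}

def scope_allowed(scope, session):
    """Decide whether the current session may be granted a scope."""
    policy = SCOPE_POLICIES.get(scope)
    if policy is None:
        return False  # unknown scopes are denied by default
    if session["aal"] < policy["min_aal"]:
        return False  # user must have authenticated strongly enough (MFA)
    required = policy["requires_consent"]
    if required is not None and required not in session["consents"]:
        return False  # user has not consented to sharing this data
    return True

session = {"aal": 2, "consents": {"email"}}
print(scope_allowed("read:email", session))                           # True
print(scope_allowed("read:email", {"aal": 1, "consents": {"email"}})) # False
```

Evaluating this table in one place, at the authorization fabric, is what gives the governance layer a single answer to "why did this application see this user's email?".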
Again, you know, leverage context-based authorization and build the connection, right, between the user, the service, the policies, and the underlying data that's going across. Okay, looking at it in a little bit more logical view, right, we see that digital identity fabric, and this could be a SaaS service, it could be a collection of SaaS services, right? But that's where the longer-term data lives, right? And that longer-term data is that user registration, that self-service, that strong authentication (what type of MFA are they going to use), delegated administration.
So the business can actually hand off control to the underlying developers, you know, to a partner, right. That gets married to the authorization fabric, and this is the OIDC, you know, plus OAuth connection, right. And that authorization fabric, again, needs to know what applications are out there, right? What APIs do I have? What services do I have? What API gateways are they on?
what Kubernetes clusters are out there, and what pods are actually in those clusters? How do I normalize the data coming from my different IdPs? I might have developers coming from Cognito and employees coming from Okta. So how do I normalize the data coming through all of those IdPs to ensure that I can reuse it at the service level, without writing shims to remap email to email_address, or last name to surname?
I want to normalize all of that right at the front door, as I'm getting ready to mint those access tokens down to my different APIs and services. You'll also see that NIST has released a standard for protecting microservices, and they call out a micro-proxy — they call it a micro-perimeter. That micro-perimeter has to act, essentially, as the new perimeter for each individual service.
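The claim-normalization step — remapping each IdP's attribute names onto one canonical schema at the front door, instead of writing shims per service — might look like this sketch (the IdP names come from the talk; the claim mappings are illustrative, not real IdP schemas):

```python
# Sketch of normalizing claims from multiple IdPs before tokens are
# minted, so downstream services never need per-IdP shims.

CLAIM_MAPS = {
    "cognito": {"email": "email", "family_name": "surname"},
    "okta":    {"email_address": "email", "last_name": "surname"},
}

def normalize(idp: str, raw_claims: dict) -> dict:
    """Rename IdP-specific claim names onto one canonical schema."""
    mapping = CLAIM_MAPS[idp]
    return {mapping[k]: v for k, v in raw_claims.items() if k in mapping}

print(normalize("okta", {"email_address": "a@b.com", "last_name": "Doe"}))
# {'email': 'a@b.com', 'surname': 'Doe'}
```

Every token minted afterwards then carries the same `email`/`surname` claims regardless of which IdP the subject came from.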
That micro-perimeter is your new edge: the edge of the service, the edge of the API. So I'm going to do all of my identity verification, all of my contextual authorization, all of my inspection of the API traffic going into that service, right in that micro-perimeter. And I'm going to use that to make sure everything in that service is authenticated. Did the user authenticate with the proper AMR values? Did the service authenticate and register itself appropriately?
Did the service register as an OAuth client appropriately? Making sure all of that coalesces into a common framework. There's also the emerging concept of API security gateways. How do we wrap this together? How does my micro-perimeter or micro-proxy plug into my existing API gateways? And how does my API security gateway reconcile that data to act as a distributed policy decision point and distributed policy enforcement point?
Lastly, we take all of this very rich data about what's transpiring in the enterprise and roll it back into an actionable analytics engine. That analytics engine really gets us to the crux of what is happening at the service during the processing of this data, and lets us build remediation loops. Meaning: oh, we're seeing an account takeover attack, or we're seeing session or token replay attacks.
Oh, we're seeing broken object-level authorization being exploited, or we're seeing business-level DDoS emerge. Well, let's quickly use that as a feedback loop to build better policies in real time and push them down to the micro-proxy — to build real-time protections against the threats we're seeing in the analytics engine.
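The remediation loop described here — analytics signals feeding straight back into tighter policies at the micro-proxy — can be sketched as a simple merge of tightening rules (the signal names and rule contents are made up for illustration):

```python
# Sketch of the feedback loop: the analytics engine flags threat
# signals, and the policy pushed to the micro-perimeter tightens in
# response. Signal names and actions are illustrative only.

TIGHTENING = {
    "token_replay": {"revoke_session": True, "require_mfa": True},
    "bola_probe":   {"rate_limit": 10, "require_mfa": True},
}

def updated_policy(base_policy: dict, signals: list) -> dict:
    """Merge the tightening rules for every observed threat signal."""
    policy = dict(base_policy)
    for s in signals:
        policy.update(TIGHTENING.get(s, {}))
    return policy

p = updated_policy({"rate_limit": 100}, ["bola_probe"])
print(p)  # {'rate_limit': 10, 'require_mfa': True}
```

The point of the sketch is the direction of flow: detections produce policy deltas that take effect at the enforcement point without a human redeploying anything.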
So again, taking it one step deeper: how does this all work together? This is just a very simple open-banking, intent-based authorization flow, where a user is using dynamic authorization to access a go-banking app. That user hits the go-banking URI and says, hey, I want to do a balance transfer. So that application has to go and actually request access. And what's going on is: is this user doing things appropriately? The authorization service needs to make sure it has the proper authentication, the context of the client, and the proper consent.
Did the user actually say that we could know their transaction ID as well as their account number and correlate those with their last name — yes or no? Bring that data in to corroborate the identity, then push it into a transactional token, send that back down to the go-banking app, and process the individual API transactions down to the API gateway.
Authorize it again, just to make sure everything looks good — meaning we've been able to redact or control the data flow that goes to the API gateway — but we're also now going to execute policies at the API gateway for getting down to the underlying protected services and APIs. And we bring all of that together into a common governance framework.
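The intent-based flow just walked through might be condensed into a sketch like this, where a narrow transactional token is minted only after authentication strength and consent are checked. The intent name, claim names, and checks are illustrative — this is not an open-banking profile implementation:

```python
# Sketch: the app requests access for one declared intent; the
# authorization service verifies authentication + consent and mints a
# token scoped to just that transaction. All names are illustrative.

def mint_transaction_token(intent: str, user: dict, consents: set) -> dict:
    if intent != "balance_transfer":
        raise ValueError("unknown intent")
    if "mfa" not in user.get("amr", []):
        raise PermissionError("step-up authentication required")
    needed = {"transaction_id", "account_number", "surname"}
    if not needed <= consents:
        raise PermissionError("missing consent")
    # The token carries only the data this one transaction needs.
    return {"sub": user["sub"], "intent": intent,
            "claims_released": sorted(needed)}

token = mint_transaction_token(
    "balance_transfer",
    {"sub": "user-1", "amr": ["pwd", "mfa"]},
    {"transaction_id", "account_number", "surname"},
)
print(token["intent"])  # balance_transfer
```

The token is per-transaction, not per-session, which is what lets the gateway re-authorize each hop as described.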
This looks really complicated; in reality it's quite simple, meaning you're doing more of the policy decision and policy management work, while a lot of the context and a lot of the underlying connections are built directly on top of open standards and pre-integrated for you. So you put it together and say: you know what, I do want to take on API security, I do want to build an authorization fabric, I do want to control how PII and other data flows between my different services.
Core to that is building immutable services, and building immutable services requires DevOps. If we want to do security appropriately, we want DevSecOps. And that builds out a couple of different things. First, automating service discovery: understanding what the landscape of underlying services looks like, and being able to bring all of that back.
So: these APIs are serving up PII data, these APIs are open banking, these APIs are PHI, these APIs are just open to the general public. Understanding what those APIs are, and then classifying them. The third part is assuring that the right policies are being applied to the right API.
And the key here, again, is building policies around that knowledge. I want to build zero trust, I want to build intent-based dynamic authorization, and I want to link those together. So I want to make sure that my policies incorporate the context — the geolocation of the user, the geolocation of the service, the data they're accessing, the value of the transaction, et cetera — and then push those policies down to be evaluated right at the API endpoint.
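A contextual policy weighing user and service geolocation against transaction value, evaluated at the API endpoint, could be sketched as follows (the thresholds and regions are invented for illustration):

```python
# Sketch of an intent-based, contextual rule evaluated at the API edge.
# The decision weighs user geolocation, service geolocation, and the
# value of the transaction. Thresholds and regions are made up.

def authorize(ctx: dict) -> str:
    if ctx["user_geo"] != ctx["service_geo"] and ctx["amount"] > 1000:
        return "deny"
    if ctx["amount"] > 1000:
        return "step_up"  # require fresh MFA before proceeding
    return "permit"

print(authorize({"user_geo": "US", "service_geo": "US", "amount": 50}))    # permit
print(authorize({"user_geo": "US", "service_geo": "EU", "amount": 5000}))  # deny
```

Because the rule is pure data-in/decision-out, it can be shipped to and evaluated at each endpoint rather than at a central server.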
And the beauty of that is it really streamlines your governance process, because it's all pre-integrated. It makes it very easy to know what scopes are tied to what policies and applied to what underlying API. I've got a change in my PII data structures? No problem: look back at your diagrams, look at your topology charts, look at your policy packs, and you'll see that it's all working appropriately.
The other beautiful part is that you get built-in OWASP API threat mitigation. If you look at the OWASP API Security Top 10, the vast majority of the items relate to identity and authorization: identity of the service, identity of the user, authorization of the function, authorization of the object. Being able to build policies in real time, process them in real time, and update them in real time automates your protection against those OWASP API threats. So, lastly:
again, I want to make sure we think about it appropriately. Dynamic authorization is really about decoupling the token service from the authentication services. How do we adhere to open standards and avoid vendor lock-in? If I'm using vendor libraries, I'm consigning myself to vendor lock-in — so make sure that everything I'm using is based on open standards and has licensing models built for my developers, because they're going to go build and integrate it regardless.
So how can I make sure I'm not building liability for my company? Then there's the ability to enhance your existing IdPs with fine-grained authorization and consent management — those are not core functions of an IdP, and neither is transactional authorization. So take what I have — a great authentication context — build great authorization, entitlements, permissions, and consent context on top of it, and then leverage those to make the authorization decisions right at the edge of the service itself. Then there's scale.
When we start thinking about transactional minting, our scale requirements go up several orders of magnitude — from tens or hundreds of OAuth access-token mints up to hundreds of thousands, and maybe even millions in the future, because I want to process a mint between each and every service. So I want to increase my scale and I want to reduce my latency, making sure that my policy decision points, token-minting capabilities, et cetera, are all right at the edge,
right next to my services — so I'm not hitting the wire going back to an IDaaS, or hitting the wire going back to a monolithic IdP on-prem; it's all actually happening in real time at the service, with policy governance and the appropriate constraints brought in. Lastly, bring support for new authorization standards. There's a lot coming down the pipe — RAR and PAR, FAPI 2.0. And I want to be able to say: well, I don't want to upgrade my IdP —
I want to upgrade my authorization fabric, because all of the net-new standards are really emerging in that space: how can I better protect the data that's flowing through my services? Being able to adopt those without upgrades, without even changing IdPs, becomes increasingly important as we adopt distributed services, as we adopt zero-trust models, and as we adopt, you know, the future. So thank you again — my name's Nathanael Coffing, and I appreciate you taking the time to listen to me this morning. If you'd like to get in touch, a cell phone number and an email address are on the screen, and we're open for questions.
Thank you, Nathanael. So we've reached the question-and-answer section of this webinar. As I mentioned, the recording of the webinar and the slides will be available on the website. If there are any questions on your end, don't hesitate to enter them now —
the GoToMeeting control panel has an area to type in your question at any time. I just need to share my screen, and then let's get on to the questions and answers. So, the first question that came in: authorization and permissions are not always only permit and deny — for effective UIs, authorizers must also answer questions like which records a user can access, or what a user can do with this account. How does Cloudentity deal with that? So — a great question. Which records can a user access?
That's often part of the entitlements or part of the policies. So it's evaluating that in real time: taking the policy, running it, and getting the feedback from the service — whether it's a data store or an underlying service —
that is accepting that policy as well as accepting that transaction. The second piece — and I thought this was a great question — is: what can a user do? I think that's a very employee-based way of looking at things. That's how we've thought about governance for the last twenty-some-odd years, and it works for the workforce. But when we look at distributed services, as well as the consumer perspective, authorization has changed a lot.
Meaning, we had day one of authorization, where we had RBAC, ABAC, PBAC — that's where authorization has come from. Authorization then took a step forward in 2016–2017 with GDPR, which asks: what can a service know about me? And then it took another step forward with open banking and dynamic data-sharing agreements, asking: what can a service share about me — with a third party, or even with another business unit within the same organization?
And so the authorization construct has really changed quite dramatically. I'm not so concerned about my overall rights as a consumer, because my rights are usually fairly limited. I am very concerned about what a service can know about me and how it can share data about me — and then how I bring data lineage and a reconstruction of exactly what transpired in the transaction to prove, or disprove, that service A saw my full user record and service B only saw the subset it was supposed to see.
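The service-A/service-B distinction — each service's token releasing only the claims it is entitled to, with lineage recording what actually went where — might be sketched as follows (the policy table and claim names are illustrative):

```python
# Sketch of per-service data minimization with a lineage log: service A
# may see the full user record while service B only gets a subset, and
# the log records exactly what each service received.

RELEASE_POLICY = {
    "portal-service":   {"name", "surname", "email", "account_number"},
    "shipping-service": {"surname", "transaction_id"},
}

def release(service: str, record: dict, lineage: list) -> dict:
    allowed = RELEASE_POLICY.get(service, set())
    released = {k: v for k, v in record.items() if k in allowed}
    lineage.append((service, sorted(released)))  # audit what went where
    return released

record = {"name": "A", "surname": "B", "email": "a@b.c", "transaction_id": "t-9"}
log = []
print(release("shipping-service", record, log))
# {'surname': 'B', 'transaction_id': 't-9'}
```

The lineage list is what lets you later prove (or disprove) that a given service only ever saw its permitted subset.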
Great — great answer. The next question that came up: can contextual authentication policies be extended towards authorization?
Oh, contextual authentication — sorry, I misheard the question. Absolutely. Any context that's coming over in the token — meaning you can publish a claim in the token that says risk equals X, Y, Z — can be extended and enhanced. In addition, we have a few callbacks out to different fraud engines, and the ability to build webhooks out to other engines to go and aggregate additional information.
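Combining a token-borne risk claim with an external fraud-engine callback, as just described, might look like this sketch, where the webhook is stood in for by a plain lookup function (all names and scores are illustrative):

```python
# Sketch of enriching an authorization decision with external context:
# a risk claim carried in the token plus a pluggable callback (a
# stand-in for a fraud-engine webhook) combined into one score.

def combined_risk(token_claims: dict, fraud_lookup) -> float:
    token_risk = float(token_claims.get("risk", 0.0))
    external_risk = float(fraud_lookup(token_claims["sub"]))
    return max(token_risk, external_risk)  # take the worst signal

# A dict's .get method serves as a fake fraud engine here.
fake_fraud_engine = {"user-1": 0.2, "user-2": 0.9}.get
print(combined_risk({"sub": "user-2", "risk": 0.1}, fake_fraud_engine))  # 0.9
```

Swapping `fake_fraud_engine` for a real webhook call is the extension point; the policy consuming the combined score doesn't change.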
So let's say tomorrow Okta releases the ability to query their authentication fraud engine — their authentication context engine, I'm sorry. You can aggregate that data, bring it back, and make it part of your future access and authorization tokens. Okta is just an example; it could be Azure AD B2C. Okay, the third question that popped up: how does decentralized identity — the example given was DIF/VCs — fit into your vision? Could you repeat the question? How does decentralized identity fit into my vision?
And the example they gave was DIF/VCs. So I think the question is really about decentralized identity. Decentralized identity is really a data source, and it gives you another layer of control over what data I, as a user, am going to release. Whether it's a blockchain-based decentralized identity or something else, I have much finer-grained control over how I'm releasing data to an organization.
But I still run into the same problems. Once the organization has that data — once a service has data about me — how do I control how that service shares it? How do we make sure the developer doesn't take it and dump it into syslog, or out into a SIEM somewhere? What we need to avoid is that data propagation.
Again, thinking particularly about PII data: it used to be the oil, now it's the CO2. So how do I protect against the propagation of that CO2 by controlling how that data flows within the access tokens, particularly around PII? Great. Another question that came up: what is the importance of service identity in zero trust? That's a really good question. If we think about zero trust, it actually started as zero-trust networking: can service one talk to service two? What data can it pass?
How did those services authenticate? And then it got diluted a bit — I guess that's a fair way of saying it — when we had people saying: well, I want to position my product as zero trust, but my product's really only a perimeter-based product. So they start saying: okay, authentication constructs at the edge of my data center are zero trust.
Okay, the user is authenticated — that's great. But then I realize not everything is in my data center. I actually need to start rolling out distributed services. I want to roll out net-new edge services to enhance my customer experience.
Well, now I've got services that aren't behind my traditional firewall, IDS, IPS, and cybersecurity infrastructure. Really my only protection for those is the CDN, watching what's going across the wire.
Okay, so I've got good data there — what data is flowing across the wire, the value of the transaction — but I need to authenticate that service before I even consider giving it any of that data, because an unauthenticated service doesn't allow me to have zero trust. An unauthenticated service is just a service out there that could be shipping data off to China, Russia, or wherever the boogeyman of the day is.
Or it could be a legitimate service talking back to backend services in my data center or in my VPC. So, to do zero trust from a holistic — or even pragmatic — perspective, I've got to authenticate the user, I've got to authenticate the service, I've got to authenticate the workload, and I've got to authenticate the data going across the wire. All of those have to work together.
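One way to picture the workload-identity link the speaker turns to next — tying a SPIFFE-style workload ID to a registered OAuth client — is an illustrative registry lookup. SPIFFE IDs genuinely take the form `spiffe://<trust-domain>/<path>`; the registry contents here are made up:

```python
# Sketch: map an authenticated workload's SPIFFE identity to the OAuth
# client it is allowed to act as. Unregistered workloads are rejected.

SPIFFE_TO_CLIENT = {
    "spiffe://example.org/payments/api": "client-payments",
}

def oauth_client_for(spiffe_id: str) -> str:
    if not spiffe_id.startswith("spiffe://"):
        raise ValueError("not a SPIFFE ID")
    try:
        return SPIFFE_TO_CLIENT[spiffe_id]
    except KeyError:
        raise PermissionError("unregistered workload") from None

print(oauth_client_for("spiffe://example.org/payments/api"))  # client-payments
```

The connection means authorization policies written against OAuth clients automatically apply to the attested workload behind them.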
SPIFFE is an emerging standard, accepted by the CNCF, that's really seeking to build that workload identity. We've connected that SPIFFE identity to your OAuth client identity, so now we have a connection there. In addition, we've connected it to the policies that are protecting the underlying services and APIs. So it builds the holistic infrastructure that zero trust was originally meant to be. Okay, and we have a follow-up to the question about how decentralized identity and VCs fit into your vision.
The follow-up question says: beyond decentralized identity, couldn't VCs be used to convey fine-grained consent on a self-sovereign user basis? Yes, they could.
This is probably a deeper discussion, as we talk about what I am authorizing a user to see — not just what I'm consenting to, but what I'm pushing out. So please feel free to hit us up directly; I can share my email address, or use info@cloudentity.com, and I'd love to have a deeper discussion on this. Great. So let's see — another question: in your presentation you mentioned integrations with web application firewalls. What does that integration look like, and what information can you glean from it? So, another good question.
Web application firewalls are really the front door to your distributed services, and they've done a great job of looking at HTTP traffic, starting to pick up what's in it — API traffic — and giving at least a general overview of its safety. Does it come from a Tor network? Is it on a whitelist or a blacklist? Is there something obviously nefarious in it — cross-site scripting, cross-site request forgery, or even DDoS attacks?
All of that data is very valuable, and every one of the major CDNs can publish it, either via API or as bits of data in the headers as traffic comes across the wire. We're able to pick up that data and integrate it as part of a risk profile — using both validators around what's in the header data and a statement that the risk factor of this traffic is X, Y, Z. And that then becomes part of the contextual authorization that happens at the underlying service.
If you have a high risk factor, I'm not going to let you reach a money-transfer app. If you have a low risk factor, I will. With a high risk factor I might allow you to reach the wiki, but it will probably stop you there, prompt you for MFA, and try to do things that reduce your overarching risk. If you're just accessing dummy data, I don't care. That's one of the reasons why the identity of the user, the identity of the service, and knowing what data is going across the wire are so critical.
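The risk-tiered behavior described here — deny high-value targets to risky callers, step up with MFA for mid-value ones, and wave through dummy data — can be sketched as follows (the sensitivity tiers and thresholds are illustrative):

```python
# Sketch of risk-tiered routing: the combined risk score (e.g. from the
# CDN/WAF headers) gates access according to resource sensitivity.

def decide(resource_sensitivity: str, risk: float) -> str:
    if resource_sensitivity == "high":       # e.g. money-transfer API
        return "permit" if risk < 0.3 else "deny"
    if resource_sensitivity == "medium":     # e.g. the wiki
        return "permit" if risk < 0.3 else "step_up_mfa"
    return "permit"  # dummy/low-value data: risk doesn't matter

print(decide("high", 0.8))    # deny
print(decide("medium", 0.8))  # step_up_mfa
print(decide("low", 0.8))     # permit
```

The step-up path is the interesting one: instead of a hard deny, the caller is given a way to lower their effective risk via MFA.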
Because what's coming over in the headers, or via the API calls from those CDNs, becomes increasingly important as I'm doing the policy decision and the policy evaluation. Okay. Another question that came up: how can contextual authorization meet privacy-compliance types of regulations? Privacy compliance is part of that context — part of the authorization context at your first service, as well as at the second, third, and fourth services.
So one of the pieces is being able to build an aggregation of what the privacy entitlements, permissions, and grants all look like, for minting into your individual tokens. What we're doing is: one, aggregating that data; two, doing per-transaction token mints. So I can say service A can see your full user record, because you're at the portal — your financial portal — while service B should only see a small subset, maybe a last name and a transaction ID.
Being able to control that data by minting access tokens that are specific to the data going across the wire is the whole reason for doing transactional authorization: looking at it on a transactional basis, not a long-term session basis. Minting tokens on the fly, very rapidly, with sub-millisecond response times, ensures that only the data that's required for the transaction goes across the wire. And then you can build policies
related to GDPR and CCPA, or use out-of-the-box policy packs that help you conform to those regulatory demands and control what data should be flowing. Meaning: is this API, or this service, serving up PII?
Well, then I need to make sure I have consent — that's part of your policy. Great. The next question is: how would you use OPA versus the XACML model? And I'm assuming OPA is the Open Policy Agent that you spoke about.
I think it's maybe a "why" instead of a "how". You know, the death of XACML has been proclaimed for, I don't know, the last 15 years — it's been quite a while. XACML has been going through a few updates with its 3.0 release.
What I really like about OPA — not OPA so much, but Rego, the underlying policy language of OPA — is the ability to do more programmatic, declarative authorization as part of the contextual, dynamic authorization happening at the service. XACML becomes very chatty, meaning a lot of back and forth between your policy decision, policy enforcement, and policy information points — your PIP, your PEP, and your PDP. With OPA and transactional token mints —
with transactional tokens — we're able to actually refine that data before it gets down there, so we can do a policy evaluation during the token-minting process as well as another policy evaluation at the PDP itself. Okay. A related question that came up: is your product based on the XACML standard for authorization? Okay, that clarifies that. Another question that came up: are you providing this solution also as a managed service? That's correct, yeah.
You can consume the solution as SaaS, or as a managed service — meaning you'll have token-minting capability as well as policy decision points in your own VPC, as close to your services as possible, for scalability, performance, and latency reduction. Right. Next question: do you specialize in serving specific industry verticals?
That's really a question about how fast different industry verticals are adopting an API-centric framework, and the vanguard of that would really be open banking, where there are now defined APIs for intercommunication between banks, third-party payment providers, and, obviously, consumer client apps. Healthcare is hot on their heels, so we do have a number of healthcare customers.
And then finally, anybody that's SaaS- or retail-based and building the next generation of their applications — external partner- and/or customer-facing — those are critical aspects of our go-to-market. Okay. We are now coming to the end of our time for today. Thank you to everyone who attended the webinar, and we hope to see you soon at one of our upcoming events.
Thank you, Nathanael, for your presentation, and thanks to the audience — I hope this was interesting for everyone. Thank you.