I'm going to talk today about two projects that we were involved in last year, both about interworking between verifiable credential products. We'll look at the projects, the protocols that were tested, and the results. The first project we were involved in was from JFF in the USA. This is Jobs for the Future, and they want to get verifiable credentials into the workforce so that people can apply for jobs without fear or favour, without necessarily releasing their name or their age or their gender.
They can just release their educational credentials and then get job offers based on what they know and not who they particularly are. This has been funded partially by the Walmart Foundation, and three plugfests were planned. The first one ran in the summer of last year.
The task there was very simple: just display an Open Badge v3 credential, which would say who the person was and what qualification they had obtained.
Now, that might seem like a very simple task, but in fact it caused quite a number of problems for implementers. First of all, there were logos inside the credentials, so they had to be displayed properly on the user's smartphone.
The issuer was an object, whereas many people had expected the issuer to be a simple string. The standard says the issuer can be a string or an object, and many people had only implemented it as a string, so when it came as an object that caused them problems. There were also new @context values in there and multiple nested properties, and all of these had to be displayed correctly on the smartphone. In the end, 21 organisations successfully completed Plugfest 1, and the results are at that URL, so you can go and see them yourself.
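To make that concrete, here is a rough sketch of the kind of structure involved. It is not the actual JFF badge; the contexts, names and values are illustrative.

```typescript
// A rough sketch of the structure that caught wallets out: the issuer is an
// object rather than a string, there are extra @context values, and the
// display data (names, logo) sits several levels deep. All values illustrative.
const credential = {
  "@context": [
    "https://www.w3.org/2018/credentials/v1",
    "https://example.org/open-badges-v3/context.json"   // illustrative extra context
  ],
  type: ["VerifiableCredential", "OpenBadgeCredential"],
  issuer: {                                   // an object, where many expected a plain string
    id: "did:example:issuer123",
    name: "Example Issuing Organisation",
    image: "https://issuer.example.org/logo.png"         // logo the wallet must render
  },
  credentialSubject: {
    id: "did:example:holder456",
    achievement: {                            // nested properties the wallet must display
      name: "Example Achievement",
      description: "Awarded for completing the example course"
    }
  }
};
```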
Plugfest 2 ran from August to November last year, and now you had to obtain the JFF Plugfest credential from two different issuers into your wallet.
If you were an issuer, you had to issue it to two different wallets.
Now we're talking about protocols, not just the content of the credential. Three different protocols were attempted: OpenID for Verifiable Credential Issuance, VC-API with CHAPI, and DIDComm. Well, DIDComm failed; there weren't two implementations that could interwork with it. So the only successful results were for OpenID for Verifiable Credential Issuance and VC-API with CHAPI, and we'll look at the results for those in a minute. For Plugfest 3 the details have not been finalized. This is actually more complicated: it's about verification.
The problem with verification is that there are so many different protocols and so many different options that they cannot decide which ones they want to go with at the moment. Are they going to choose DIF Presentation Exchange?
Are they going to use the CCG Verifiable Presentation Request, for example? This is still to be determined, but we're hoping to find out later this month the precise details of the Plugfest 3 interworking project.
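For context, a DIF Presentation Exchange definition looks roughly like the following. This is purely illustrative, since Plugfest 3 has not settled on anything yet.

```typescript
// A minimal, illustrative DIF Presentation Exchange definition: it asks the
// wallet for any credential whose type array contains "OpenBadgeCredential".
// The filter is an ordinary JSON Schema applied to the value at the path.
const presentationDefinition = {
  id: "plugfest-3-example",
  input_descriptors: [{
    id: "open-badge-v3",
    constraints: {
      fields: [{
        path: ["$.type"],
        filter: { type: "array", contains: { const: "OpenBadgeCredential" } }
      }]
    }
  }]
};
```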
The next project we were involved in was the NGI Atlantic Next Generation Standards project. This involved Fraunhofer in Germany, Crossword Cybersecurity in the UK, which I led, and Spruce Inc in the USA. We decided we were only going to use the OpenID for Verifiable Credentials protocols; we were not going to use CHAPI and we were not going to use the VC-API. But even that proved quite difficult, because there are so many different options within the standard that we had to decide what we were going to do, and we had to define profiles of the protocols.
The other thing we said we were going to do was to put a trust infrastructure in there, because the whole of verifiable credentials is based on trust. How do you know that an entity is trustworthy? How do you know the issuer is trustworthy?
So we built it on what is known as the TRAIN infrastructure, which came from Fraunhofer. TRAIN builds a trust infrastructure based on the DNS, where you can go and look up the trust lists, and the trust lists are held on the web in ETSI's standard trust-list format, which is part of the eIDAS standard. So we built that into our system, and there are some references there you can look at to get further details yourself. There are three specifications for OpenID for VCs, and we only looked at the first two.
We didn't need SIOP v2; we found it was not actually needed for issuing or presenting credentials. So we did a profile for OpenID for Verifiable Credential Issuance and a profile for OpenID for Verifiable Presentations. We also raised something like a dozen change requests on the standards, which got merged, in order to enable our interworking.
The problem is, as I said, that there are many different options. It's built on OAuth 2.0. There are multiple DID methods; more than a hundred have been registered.
And you can't expect to interwork if you're using different DID methods. There are multiple credential formats, JWT proofs or Linked Data proofs, and the standard also supports the mobile driving licence, which is a different format again.
There are multiple encryption algorithms, multiple OAuth flows, et cetera. So interop is absolutely impossible without making choices. So we produced a profile, which we've made publicly available.
It uses did:jwk, it uses the JWT format and the ES256 algorithm, and the pre-authorized code flow. It supports the wallet on the same device as the browser; it supports the cross-device flow between a browser on one computer and your wallet on a smartphone; and it also supports web wallets, where your wallet is held by some third party on the internet.
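For example, a credential offer under the pre-authorized code flow looks roughly like the sketch below. Field names follow the OID4VCI draft versions current at the time and have changed in later drafts; all values are illustrative.

```typescript
// A rough sketch of an OID4VCI credential offer using the pre-authorized code
// flow. Exact field names differ between draft versions of the spec, and all
// values here are illustrative.
const credentialOffer = {
  credential_issuer: "https://issuer.example.com",
  credentials: ["OpenBadgeCredential"],
  grants: {
    "urn:ietf:params:oauth:grant-type:pre-authorized_code": {
      "pre-authorized_code": "eyJhbGciOiJFUzI1NiJ9.example",  // placeholder
      user_pin_required: false
    }
  }
};
// The wallet exchanges the pre-authorized code at the issuer's token endpoint
// for an access token, then calls the credential endpoint to receive the
// ES256-signed JWT credential bound to the holder's did:jwk.
```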
We had to add a terms of use to it to support the TRAIN trust infrastructure.
This is where the issuer inserts into its credential a termsOfUse property saying "I am a member of a TRAIN trust infrastructure", and then the verifier, when it gets a credential, can go and consult the TRAIN trust infrastructure and ask: is this issuer really in the trust list? If it is, then the verifier knows it is trusted. We also added an evidence property to support eIDAS levels of assurance.
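A minimal sketch of that idea is shown below. The property and field names here are illustrative assumptions; the published profile and paper define the exact structure used with TRAIN.

```typescript
// A minimal sketch, with illustrative (assumed) field names, of the two
// additions described above: a termsOfUse entry naming the trust scheme the
// issuer claims membership of (which the verifier resolves via TRAIN and the
// DNS), and an evidence entry carrying an eIDAS level of assurance.
const credentialFragment = {
  termsOfUse: [{
    type: "TrustSchemeMembership",                    // illustrative type name
    trustScheme: ["example-scheme.trust.example"]     // DNS name of the claimed trust scheme
  }],
  evidence: [{
    type: "LevelOfAssurance",                         // illustrative type name
    level: "substantial"                              // eIDAS level of assurance
  }]
};
```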
These are documented in the profile if you want to go and look. We also published a paper on the trust infrastructure, which is available at that URL. So that covers issuing. Now, here are the interop results for issuing. You can see there were five wallets and six issuers, and for most of them the interworking worked.
There are a few that didn't finish within the timeframe, but you can see we now have a number of products from different organisations that are able to interwork using that particular profile.
Also, some people decided they didn't want to use JWT proofs; they wanted to use Linked Data proofs. So they used the same profile, with the same combination of parameters, but with LD proofs instead of JWT proofs.
Those companies are shown there, and they all managed to interwork as well. And of course, the previous companies that were using JWT proofs couldn't interwork with the LD-proof ones unless they implemented both proof formats in their implementation.
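For illustration, here is the same kind of credential secured in the two different ways; all values are placeholders, and the proof type shown is just one of several LD proof suites.

```typescript
// Illustrative only: a credential secured with an embedded Linked Data proof.
// A wallet built for JWT credentials cannot consume this unless it also
// implements LD proofs, and vice versa. All values are placeholders.
const ldProofCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential"],
  issuer: "did:example:issuer",
  credentialSubject: { id: "did:example:holder" },
  proof: {
    type: "Ed25519Signature2020",                  // one example LD proof suite
    verificationMethod: "did:example:issuer#key-1",
    proofValue: "z3FXQ...placeholder"
  }
};

// The JWT form instead carries the credential as the payload of a compact JWS,
// e.g. "eyJhbGciOiJFUzI1NiJ9.eyJ2YyI6...placeholder...signature".
```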
Now, when we come on to verifiable presentations, again we had to make choices. So we used did:jwk, we used JWTs, we used the ES256 algorithm, and we also used an open-source policy server. The policy server was there so that the different verifiers could all refer to the same policy, because the actual standard, OpenID for Verifiable Presentations, allows the policy describing which verifiable credentials you want to be either sent inline or sent as a reference to a policy server.
By choosing to send the reference in the protocol, the policy server could hold the common set of requirements for all the different verifiers that were participating. The other thing we determined is that there are two types of wallet, strict wallets and permissive wallets, which depend on the trust infrastructure.
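As a rough sketch, the by-reference option in the request looks something like this; the URLs and values are illustrative.

```typescript
// A rough sketch of an OID4VP authorization request using the by-reference
// option: rather than embedding the DIF Presentation Exchange definition
// inline, every verifier points the wallet at the shared policy server.
const authorizationRequestParams = {
  response_type: "vp_token",
  client_id: "https://verifier.example.com",
  nonce: "n-0S6_WzA2Mj",
  presentation_definition_uri:
    "https://policy-server.example.com/definitions/ngi-atlantic-profile"
  // inline alternative: presentation_definition: { ...policy embedded here... }
};
```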
Our definitions were as follows. If the wallet decides that the verifier is trustworthy from the user's perspective, meaning the verifier will keep the user's personal information secure and private and will abide by the user's requirements on what to do with that personal information, then a strict wallet will allow the user to send their personal information to that verifier. But if the wallet determines that the verifier is not trustworthy, it will not allow the user to send their personal information.
It will actually forbid it. A permissive wallet, on the other hand, will allow the user to send their personal information to untrustworthy verifiers, with a warning message: we cannot verify that this verifier is actually trustworthy; they may use or abuse your personal information, but it's up to you if you want to send it.
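In code terms, the difference is roughly the following. This is a minimal sketch; the trust-list lookup function and all names are assumptions for illustration, not our actual implementation.

```typescript
// A minimal sketch of the strict vs. permissive behaviour just described.
// isVerifierTrusted() stands in for a TRAIN trust-list lookup.
declare function isVerifierTrusted(verifierId: string): Promise<boolean>;

type WalletMode = "strict" | "permissive";

async function maybeReleasePresentation(
  mode: WalletMode,
  verifierId: string,
  sendPresentation: () => Promise<void>,
  askUser: (warning: string) => Promise<boolean>
): Promise<void> {
  if (await isVerifierTrusted(verifierId)) {
    await sendPresentation();          // both wallet types release to trusted verifiers
    return;
  }
  if (mode === "strict") {
    // strict wallet: release to an untrusted verifier is forbidden outright
    throw new Error("Verifier is not on a trust list; release forbidden");
  }
  // permissive wallet: warn the user and let them decide
  const proceed = await askUser(
    "We cannot verify that this verifier is trustworthy. " +
    "They may misuse your personal information. Send anyway?");
  if (proceed) await sendPresentation();
}
```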
So those are the two wallet types that we decided we needed to implement. There's a full description of this in the paper, which we're presenting later this year at the Open Identity Summit, and you can download the paper from there. So, looking at the interop results: the Crossword permissive wallet successfully transferred a VP to the Spruce verifier and vice versa, but we didn't have the chance to finish with the strict wallet.
So that was not tested in time. Now let's look at the other protocol suite.
VC-API, CHAPI and VPR. This works in a different way to the OpenID suite; it's not based on OAuth. It's based on CHAPI, the Credential Handler API, which was built on top of the Credential Management API that Google specified for browsers, and which most browsers have implemented. CHAPI extends the credential API to allow any type of credential, not just usernames and passwords but in this case verifiable credentials, to be passed between a wallet, a browser and a website.
It also allows the user to choose where the credentials should be stored, so the user can choose to store them in a wallet on the same device as the browser, or even in a remote wallet that might be hosted in the cloud somewhere.
They've also specified a presentation request protocol, the Verifiable Presentation Request, which competes with the DIF Presentation Exchange protocol, for saying which verifiable credentials you want to receive from the wallet. So here's a picture showing CHAPI.
You've got CHAPI implemented in JavaScript inside the browser, and that allows the user to say: I want to store this particular credential in either a web-based wallet or in the wallet app on my smartphone. The VC-API specifies the protocol messages that are sent across CHAPI, because CHAPI essentially just opens up a link and allows you to transfer anything across it; the VC-API defines the messages that go between the various components.
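For a flavour of what that looks like from a web page, here is a rough sketch based on the publicly available credential-handler polyfill; the query shapes and option names vary between wallet and verifier implementations.

```typescript
// A rough sketch of the browser side of CHAPI, following the pattern of the
// credential-handler-polyfill package. Query shapes are assumptions.
import * as polyfill from "credential-handler-polyfill";

declare const WebCredential: any; // provided as a global by the polyfill

async function chapiRoundTrip(verifiablePresentation: object) {
  await polyfill.loadOnce(); // loads the credential handler mediator

  // Issuer site: hand the credential to the browser; the user chooses whether
  // it goes to a wallet app on this device or to a web wallet.
  await navigator.credentials.store(
    new WebCredential("VerifiablePresentation", verifiablePresentation));

  // Verifier site: ask the browser (and, via it, the chosen wallet) for a
  // presentation matching a query; "QueryByExample" is one illustrative type.
  const result = await navigator.credentials.get({
    web: { VerifiablePresentation: { query: { type: "QueryByExample" } } }
  } as any);
  return result;
}
```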
What I've done in this diagram is show you the elements of the VC-API that have been specified to date and the elements that have not yet been specified. Originally they were hoping to define APIs for all these different components, but at the moment, as I say, they've only done the ones shown with a green tick.
So, the interop results: you can see here that there are a lot more components, products from companies, implementing CHAPI, VPR and the VC-API than were doing the JWT profile with OpenID Connect.
Now, the reason for that is, if we go back to the previous diagram, we can see that we've got things called a holder coordinator, an issuer coordinator and a verifier coordinator. The issuer coordinator was actually implemented by a company that was participating in the plugfest, and they allowed everybody to use their issuer coordinator. So in order to be an issuer service with this, all you needed to do was implement the issuer service, which is the purple box on the left-hand side, and issue a credential to the issuer coordinator.
So it was much easier to become an issuer using the VC-API/CHAPI model than it was using the OpenID Connect model.
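As a rough sketch of how little that is, the VC-API issuing call at the heart of it looks something like the following. The endpoint path follows the public VC-API drafts; the base URL, and how the work is split between an issuer service and the shared coordinator, are illustrative.

```typescript
// A rough sketch of VC-API's credential issuing call (POST /credentials/issue
// in the public drafts). The URL and credential content are illustrative.
async function issueCredential(unsignedCredential: object): Promise<object> {
  const res = await fetch("https://issuer.example.com/credentials/issue", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ credential: unsignedCredential })
  });
  if (!res.ok) {
    throw new Error(`Issuance failed with HTTP ${res.status}`);
  }
  const { verifiableCredential } = await res.json(); // the signed credential
  return verifiableCredential;
}
```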
With the OpenID Connect model, as an issuer you had to implement everything yourself. That's why a lot more companies participated in the CHAPI, VPR and VC-API interop: they didn't need to do that much work themselves. Okay. Now, what are the key takeaways from this? First of all, there are too many protocols, options and configurations to ensure interworking.
It's just not possible for any two random VC products to interwork. Even if the same protocol suite is being used by two products, you still need to agree on parameters such as DID methods and encryption algorithms before interworking will succeed. And this is before trust is even considered. Trust has to be layered on top of that, and that still hasn't been done in a standard way. So profiles are definitely needed, trust infrastructures are needed, and you need to profile the trust infrastructure.
Until we get to that state, my assertion is that VC ecosystems are really at the same level of development as the World Wide Web was in the 1990s, when there were several browsers that weren't interworking, you couldn't trust websites, you couldn't trust browsers, et cetera. That, I think, is where we are today with verifiable credential products. And at that point, I'll hand over for questions.
Thank you very much.
David, the first question we have here is: you said DIDComm failed. It's a two-part question: why did it fail, and what would have to be true for it to succeed?
Okay. So why it failed: out of the 30 or so companies that were participating in the JFF plugfest, there were only two, or possibly three, that were even attempting DIDComm. And because we're talking about six months ago, their implementations were too immature, and therefore they didn't manage, within the timeframe of the plugfest, to get their implementations up to the level where they could actually interwork successfully. I believe that's not the case today; I believe there are DIDComm implementations that are interoperating now.
So it was really a question of timescales and the immaturity of the implementations.
Okay, great. Rather than...
Rather than anything specifically wrong with DIDComm as such.
Okay, great. Thanks for clearing that up. Another question here is: to what extent, and in what ways, have the projects achieved the goal of fostering adoption of verifiable credentials?
That's a good question.
I mean, that's a separate issue, isn't it? Because whether someone decides to adopt verifiable credentials or not is a business question, not really a technical question. It's about: where do I get my business benefit, where do I get the monetary gain from? Whereas we were talking about technical interoperability: can we get products to interoperate in order to build an ecosystem? So I really cannot answer that question, because I can't actually speak to people's business motivations and so on for adoption.
I do know that there are some live adoptions going on. One that probably everybody's aware of is Microsoft and LinkedIn.
But again, it's not clear to me what the business model for that is, because it seems to be free for most people to participate. So I'm sorry, I cannot answer the question directly.
Okay. Thanks very much for your presentation. A round of applause, everyone, for Dr. David Chadwick.