So hello everybody. I'm from ITOCHU Techno-Solutions, one of the largest systems integrators in Japan, and I also serve as chair of the OpenID Foundation Japan. We folks in the decentralized identity world have been struggling to find good use cases for verifiable credentials, but we haven't found them yet. I think there are several reasons for this. For example, there is no strong requirement from the verifier side.
Also, there are existing systems, and almost all people are satisfied using them. So today I would like to introduce our project with the National Institute of Informatics, which we call NII, in Japan: how they integrated a VC ecosystem into an existing SAML-based system, and also what they want to do with verifiable credentials. I have invited a guest speaker from NII. So first, please introduce yourself and start your presentation.
Thank you for the introduction. Can you hear my voice? Yes. Yep. Okay. Can I share my screen?
Yes.
Okay, can you see my presentation slide?
Yes, I can see it.
Then I'd like to start. Okay. I'm Chika Ko from the Research Center for Open Science and Data Platform of NII. I'd like to introduce how we are trying to use verifiable credentials converted from open badges to personalize functions of other services.
The National Institute of Informatics in Japan, called NII, is an institute that builds platforms to support educational and research institutions in Japan, and our research center develops and operates research data infrastructure in response to the global need for open science. Our mission is facilitating research activities in Japan using the developed research infrastructure.
As a platform for research data management, we provide a research data cloud called NII RDC. GakuNin RDM is a research data management platform for researchers to manage and share their research data and created materials.
It enables researchers to start sharing data with collaborators in a closed environment. The publication platform is for researchers to publish research data and created materials they wish to make publicly available; it has only the functions necessary for registering and publishing data, including papers. CiNii Research is a discovery platform to search for research data published on the publication platform. This is the largest academic information search service in Japan and allows users to search for research data and articles registered in the public infrastructure.
Using these three main platforms, we encourage researchers in universities and research institutions in Japan to manage and utilize research data in NII RDC. We have seven functions to support the three NII RDC platforms, and in particular the data governance function navigates research data management by automatically orchestrating the NII RDC platforms along with the data management plan in NII RDC. I am a team member of the capacity building platform, and we support NII RDC users by providing educational services on using NII RDC services as a core platform.
To provide educational services, we provide GakuNin LMS. This is a learning management system based on Moodle, used by almost 100 universities and research institutions in Japan. Through GakuNin LMS, we provide learning courses related to research data management and NII RDC, and learners can get open badges when they complete learning courses. Open Badges is a standard format for digital badges which certify learning completion, and many educational institutions issue such badges to their students.
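To make the badge format concrete, here is a minimal sketch of an Open Badges v2 assertion like the one GakuNin LMS would issue on course completion. The field names follow the Open Badges v2 specification, but the URLs, badge class, and issue date are hypothetical placeholders, not NII's actual payload.

```python
import json

def make_badge_assertion(recipient_email: str, badge_class_url: str) -> dict:
    """Build a minimal Open Badges v2 assertion for a course-completion badge."""
    return {
        "@context": "https://w3id.org/openbadges/v2",
        "type": "Assertion",
        # Hypothetical hosted-assertion URL; a verifier fetches it to check validity.
        "id": "https://lms.example.ac.jp/badges/assertion/1234",
        "recipient": {
            "type": "email",
            "identity": recipient_email,
            "hashed": False,
        },
        "badge": badge_class_url,  # points at the BadgeClass definition
        "issuedOn": "2024-04-01T00:00:00Z",
        "verification": {"type": "hosted"},
    }

assertion = make_badge_assertion(
    "researcher@example.ac.jp",
    "https://lms.example.ac.jp/badges/class/rdm-basics",  # hypothetical course badge
)
print(json.dumps(assertion, indent=2))
```

The `verification` field is what later lets any consumer re-check the badge directly against the LMS, independent of the VC wrapping described below.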
Now, NII RDC provides services to each user; however, the backgrounds of researchers in research data management vary significantly. For example, positions differ, such as investigators and students, and experience and knowledge also differ depending on the user. But NII RDC cannot personalize its services to each user. So we would like to describe personal attributes as open badges by checking users' abilities and issuing open badges on GakuNin LMS.
NII RDC can use the issued open badges as personal attributes and optimize its services depending on the kinds of open badges. This is the system architecture for how to pass open badges issued from GakuNin LMS to NII RDC services. We developed the wallet and the Wallet-to-SP Connector, and Fujie-san will explain this system architecture in detail later.
Next, I'd like to explain the use cases of how we integrate open badges into NII RDC. These use cases are ongoing projects. The first use case is GakuNin RDM, the research data management platform. In the laboratory use case, researchers and students learn how to use GakuNin RDM in laboratories on GakuNin LMS and get an open badge issued.
The open badge can be shown on GakuNin RDM, so project members can check that laboratory members have completed learning and have enough knowledge to use GakuNin RDM in the laboratory.
The second use case is the repository, the research data publishing platform. We provide repositories, and access to some research data is restricted, so users are required to apply manually to each organization which administers the access-restricted research data. Now we are planning to automate this process.
Researchers learn the knowledge needed to access research data, such as how to write acknowledgements and the licenses for using research data, on GakuNin LMS and get an open badge. By submitting the open badge they acquired to the repository, researchers can access restricted data without applying to each organization. The final use case is the data governance function. As I mentioned, the data governance function navigates research data management by automatically orchestrating the NII RDC platforms along with the data management plan of the user of this function.
Users learn research data management on GakuNin LMS and get open badges depending on their proficiency, and the research data management environment and procedures are personalized based on the kinds of the users' open badges. Thank you for listening, and if you are interested in our institution and center, please access here. Thank you.
So thank you very much. From this part, I will explain the details from the system perspective. Please change the slides to mine.
No, not this one, we have another one. Sorry for that. Not this one. Do we have another slide for me?
Yes, this one. Okay, thank you. From this part, I will talk about the project challenges and the system architecture itself. First, the project challenges. As just explained, user qualification with portable digital badges is required; that is the first one. The second one is how to adopt the VC ecosystem onto the existing SAML-based federation. For the first one, GakuNin RDM requires user qualification to access highly confidential data, and the LMS is built on Moodle, which can only issue learning credentials in Open Badges v2 format.
So there's no standardized way to store, carry, or present open badges to applications. This is one challenge for us. For the second one, in the GakuNin federation there are a lot of SAML SPs, so all of them are candidate consumers of learning credentials, but they do not want to make changes to their systems only to consume credentials.
And this is an overview of the GakuNin ecosystem. GakuNin is a trust framework, like InCommon in the United States, and the GakuNin trust framework itself is operated under the umbrella of NII.
GakuNin includes certified identity providers, which are operated by universities and research institutes to authenticate users like students or researchers. It also includes certified service providers like GakuNin LMS, GakuNin RDM, and other service providers.
So, back to the challenges. Our approach to the challenges is twofold: first, embed open badges into verifiable credentials; and second, embed verifiable credentials into SAML assertions.
For the first one, we built a gateway service between Moodle and the wallet to issue verifiable credentials to holders, and the verifiable credential includes the open badge issued on GakuNin LMS. The gateway service acts as an issuer: it invokes the Moodle API to get the user's open badges, embeds them into a verifiable credential, and issues it to the user's wallet. For the second one, we also built a Wallet-to-SP Connector service which converts presented VCs into SAML assertions.
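A minimal sketch of the gateway's issuance step, under stated assumptions: `fetch_user_badges` is a placeholder for the actual Moodle web-service call, and the DIDs, credential type name, and field layout are illustrative, not the production implementation.

```python
import json

def fetch_user_badges(user_id: int) -> list:
    """Placeholder for the Moodle API call the gateway makes to retrieve
    the user's Open Badges assertions (hypothetical data for illustration)."""
    return [{
        "@context": "https://w3id.org/openbadges/v2",
        "type": "Assertion",
        "badge": "https://lms.example.ac.jp/badges/class/rdm-basics",
    }]

def badge_to_vc(badge: dict, subject_did: str, issuer_did: str, issued: str) -> dict:
    """Wrap one Open Badges assertion in a W3C VC data-model payload,
    carrying the badge verbatim inside credentialSubject."""
    return {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential", "OpenBadgeCredential"],  # type name is an assumption
        "issuer": issuer_did,
        "issuanceDate": issued,
        "credentialSubject": {"id": subject_did, "openBadge": badge},
    }

vc = badge_to_vc(fetch_user_badges(42)[0],
                 "did:example:holder", "did:example:gateway",
                 "2024-04-01T00:00:00Z")
print(json.dumps(vc, indent=2))
```

Embedding the badge unmodified is what later allows relying parties to re-verify the badge against the LMS directly, independent of trusting the gateway.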
This time the service acts as a verifier: it verifies the VP and VCs, extracts the embedded open badges from the VC, and embeds them into a SAML assertion. GakuNin SPs can now consume SAML assertions and get qualification information from the attributes of the SAML assertion. And of course, SPs can verify the open badges themselves using the verification URL in the open badges.
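The mapping step can be sketched as follows. This is an assumed shape, not the production connector: the attribute name `openBadge` and carrying the badge JSON as a string-valued SAML attribute are illustrative choices.

```python
import json
import xml.etree.ElementTree as ET

SAML_NS = "urn:oasis:names:tc:SAML:2.0:assertion"

def vc_to_saml_attribute(vc: dict) -> ET.Element:
    """Extract the embedded open badge from an already-verified VC and
    express it as a SAML 2.0 Attribute element for the posted assertion."""
    badge = vc["credentialSubject"]["openBadge"]
    attr = ET.Element(f"{{{SAML_NS}}}Attribute", {"Name": "openBadge"})
    value = ET.SubElement(attr, f"{{{SAML_NS}}}AttributeValue")
    value.text = json.dumps(badge)  # badge JSON carried as a string value
    return attr

sample_vc = {"credentialSubject": {"openBadge": {
    "type": "Assertion",
    "badge": "https://lms.example.ac.jp/badges/class/rdm-basics",  # hypothetical
}}}
attr = vc_to_saml_attribute(sample_vc)
print(ET.tostring(attr, encoding="unicode"))
```

Because the SP only sees an ordinary SAML attribute, no VC-specific code is needed on the SP side, which is the point of the second challenge above.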
So this is the system architecture in detail. There are three steps: first, issue the open badge on GakuNin LMS; second, issue the open badge as a verifiable credential to the holder's wallet; and the final one is to present the verifiable credential to GakuNin RDM.
Sorry, there is a typo on the slide; it should read Wallet-to-SP Connector. For the first step, it's quite simple: the user logs into GakuNin LMS, which is configured as a SAML SP, is redirected to the identity provider where they have an account, is authenticated, comes back to the LMS, and gets badges. For the second step, the user has to access the gateway. We also configured the gateway as a SAML SP, so the gateway redirects users to the identity provider, and the authenticated user comes back to the gateway.
After that, the gateway invokes the Moodle API to get the open badges for the user, and then the gateway issues a verifiable credential including the open badge to the user's wallet. The final step is the presentation, when the user accesses GakuNin RDM, the research data management platform.
GakuNin RDM is also configured as a SAML SP, so it redirects the user to the identity provider, and the authenticated user comes back to GakuNin RDM.
And we prepared some special pages on GakuNin RDM which require qualification information from open badges. If a user accesses such a page, GakuNin RDM redirects to the Wallet-to-SP Connector using the SAML protocol, because we built the Wallet-to-SP Connector as a SAML identity provider, and GakuNin RDM itself is configured as a SAML SP, as I mentioned earlier. So GakuNin RDM can easily redirect users to the Wallet-to-SP Connector over the SAML protocol. After that, the Wallet-to-SP Connector acts as a verifier.
The connector requests the user to present a verifiable presentation including the credential with their badges, and the user presents the verifiable credential from their wallet to the connector. After that, the Wallet-to-SP Connector verifies the verifiable credential itself, extracts the data from the verifiable credential, embeds it into the SAML assertion, and posts back to GakuNin RDM. So finally, GakuNin RDM can get the open badge data as qualification data in the application. This is an overview and flow of our system, but there are still remaining challenges.
For example, as many of you are aware, how to trust the gateway or the Wallet-to-SP Connector is quite a difficult problem, but there is an existing trust framework, or trust circle, in some manner. So we think it's a good way to involve the gateway service and the Wallet-to-SP Connector in the existing GakuNin federation.
By doing that, other entities in the federation can trust the gateway and the connector themselves. The second one is from a different perspective: some of the relying parties have requested us to use credentials not from GakuNin LMS. For example, they want to use credentials from IT vendors like Microsoft or Oracle.
To do that, we need to expand the trust framework to accept external issuers. A trusted issuer list on the gateway service would be suitable for this use case, I guess, but we have to discuss it more deeply. The last one is about interoperability, or compatibility. As many of you know, the standardization landscape is quite confusing, so we have to follow the movement: which credential format is suitable for this use case, which protocol is good, which security profile, and so on.
So that was all from me. If you have interest in our project, please reach out to me via LinkedIn or here in this venue. Thank you very much.
Thank you very much for the practical solutions here and your insights; we very much appreciate it. A very quick question: is there anything that was surprising to either of you throughout this experience? Anything you did not expect?
Okay. Actually, for now there has been no involvement of end users, so we will get some feedback, or more feedback, from users after this is prepared. Yes.
So I'm expecting good feedback from them.
Yes, absolutely, user feedback will give us lots of information. Thank you once again to both of you.
Thank you. Thank you very much.