So my name is Henk Marsman. I have to rectify one thing right from the start: I do not have a PhD. So in that sense I'm not a doctor, but the "drs." that I have in front of my name is the Dutch way of saying you've done a master's. On the other hand, I did spend a year and a half at university doing actual research. So I'm somewhat of a researcher, but I don't have the title yet.
However, if there is any professor in the room with promotion rights, do come talk to me afterwards.
And I actually wanted to start by saying that I'm very happy to be here, because I get feedback this time. I assume that most of you know what happens if you have a system with a broken feedback loop: that system will become unstable and go off the road. Last year I was at this beautiful conference, I joined remotely in the hybrid setup, giving my presentation online, and I found out what the effect is of not having a feedback loop.
All of a sudden you get a very hard stop because you exceeded the time. So I'm glad to be here getting feedback, being able to look at the clock, which is not doing anything at the moment. So I still have 20 minutes, and I'm gonna talk about ethics in security design for digital identity.
And before that, one thing to start with. I see I left out a slide, so the slides were not updated, too bad. What I wanted to start with is that over the past couple of days I've been listening to a lot of presentations, and there were a couple of quotes that rang a bell with me.
There was Emma Lindley on the first day, talking about the Women in Identity research, and she made the statement: I have bias, you have bias. So let's get over the fact that we should not or would not like to have bias. We have bias and we need to deal with it. And then Vittorio came up with a presentation, and somewhere along his great presentation there was this brief line: identification is a civil liberty. Now, that's hardly a technological statement.
Then we go into the realm of political science. And it connected for me with Ian and his boxer story, saying we need to fail consistently. Well, I'm gonna show you a failure that we don't want to make consistently, but that we want to be able to address consistently.
And the last quote I had was by Mike Kiser, saying somewhere in the Q&A: should we give individuals this power? Should we give that to them? And he said, yeah, I think individuals will act in the interest of the group.
Well, I don't know if I can find a lot of historic evidence for that, but that's not a technology claim either. And I think with those observations, that kind of feedback from the conference during the conference, you can already see that we're out of the technology domain with our topic. So what did I do?
I made a small puzzle mainly to make clear what I'm not gonna talk about. I want to talk about two things related to ethics and digital identity.
And the first question I want to answer is: why should we consider ethics a topic in the deeply technical cybersecurity domain of identity and access management? And then, if we should do something with that, how can we do something with it? Because I can imagine that if I ask the question, is ethics important, everybody will raise their hand. And if you ask the follow-up question, do you think ethics is difficult, maybe also everybody will raise their hand.
So that's where I want to help: out of the research that I did, give examples and show the harm that happens, then give a moral perspective on it, an ethical perspective that you can use, and then make it really practical in how it's applied. And actually we're currently doing that in the Netherlands.
And to start off, this is the article that my research resulted in, because three years ago I started at Leiden University and I said, I want to prepare for a PhD, I want to investigate this digital identity, because I'd read the claims by McKinsey, Accenture, the World Bank: give everybody a digital identity and we'll solve a lot of problems. And on the other hand, I'd read a couple of cases where we see that these things go off the rails. So I started at university and said I want to figure out what a good digital identity is.
And they said, well, that's interesting, because "good" as a word, as a concept, comes out of moral philosophy. So we're gonna send you off that way, and we're also gonna send you off into the digital identity track and how nations, how states, are handling these types of solutions.
And one case that I came across and that I wanna highlight here is that of Uganda. Uganda is a country in Africa that is working on a digital identity solution, and their national identification or digital identity solution is called Ndaga Muntu.
And they've been rolling that out over the past couple of years. You probably know of Aadhaar; there has been a lot of research in that area, but also in the African countries there are a lot of initiatives in this area to see what they can do with biometrics and digital identity. So in this case in Uganda, the national identity card is a physical card with a machine-readable zone, and it has your fingerprints and your facial biometrics stored. And that is connected to access to both physical services and online services.
And by the time the government thought it was a good idea to close down services like healthcare and make such a card a prerequisite, not everybody had a card yet.
And that led to a lot of problems. There are two specific cases. The one on the left-hand side is a young lady, 20 years old. During the whole program where the government was rolling out this digital identity solution, or national identity solution, she was not there during the registration.
She was young, she was pregnant, she dropped out of school, and then she realized: I need to have this identification card. So she went to the local district to register, in a kind of second attempt. And they told her, you can do that, but it's gonna cost you a little bit of money. Of course the first time was free, but now you need to pay. So there we have the first exclusionary factor.
And at the time she was still not 18, so she couldn't register at the central office. And indeed, I think a year later, she needed medical treatment. She went to the hospital and they said, well, if you have this NIC card, we will help you. She didn't have the card, and the literal description was: they threw a book at me and said I should just go away, because in order to be helped you needed to have that identity card.
Luckily she did receive help. The researcher talked with a lot of these people, and he says what they did is they tried to find the young nurses who had a little bit of compassion for the individual and kind of bypassed this weird requirement. And the other case is a man who was 88 years old, and by the time this program came along to register everybody, he couldn't prove his age.
He didn't have a birth certificate. And what then happens is that they start asking questions like, okay, so who was the ruler when you were young? What historic events do you recollect?
And then they tried to build up this frame, and they said, well, we think you're 70. For this man, that meant he had to wait another five years before he would get his senior citizen grant again, because at 88 he was already receiving that pension. So by the registration making him ten years younger, he lost it. And he's frail; traveling to that district office to correct it is difficult for him, and hearing stories about how cumbersome that process is makes it even more unattractive for him to do so.
So here we see two cases, and there are a couple more described in that research, where we see what the impact of identification projects, and digital identity as well, can be when they negatively affect human life and individuals. You can of course wonder: is this really about digital identity? In the non-digital world people are harmed too, so should we really care about this case?
And it's true that also in the non-digital world people are left out and we have exclusion, but there we have a couple of hundred years of experience in correcting that, and we have our ways for that.
And the other aspect is: is this an isolated case or not? It is not. In the keynote on Tuesday, Emma shared with us the Women in Identity report; Kevin from the UK is one of the cases they have, someone struggling to get registered for an identity. Then in India they have Aadhaar, and there is a situation from 2019 where they mixed it up with the citizen register and said, well, actually all these people that came across the border are not really citizens, they're immigrants. And by connecting that citizenship status to Aadhaar, losing Aadhaar means no identification, means no bank account, means no mobile phone subscription, means no support, no welfare.
And actually the US also got a mention here, because states and the IRS outsourced identification services to ID.me.
And there too they found out that the registration was very difficult. Earlier on this stage somebody said the most costly part of identity and access management is registration. It's also one of the most challenging parts. So I think as IAM practitioners, the question is: do we care or not? I decided I care. So I started digging into this, and basically my question was: how can we make it better? Because this quote by Kranzberg stuck with me: technology in itself is not good or bad, but it's also never neutral.
You will use it in a certain way, take the biases that were mentioned earlier, and that means it will have impact. And if you do that in a system without a proper feedback loop, it can go on for a long time before it gets corrected.
And in the Netherlands we have our own experiences with poorly designed artificial intelligence and algorithmic solutions. The other thing that I learned comes from a quote by Marc Steen, a researcher at TNO.
He said, we are technologists, we are engineers and actually engineering is a moral activity. I was like, what do you mean?
He said, yeah, well, if you're an engineer, you have a problem and you wanna make it better. "Better" and "good" are moral philosophical concepts. So actually, as engineers, we are also ethicists, because we wanna make this world a little bit better. We're solving a lot of problems.
So then the question is: how are we gonna do that? My slides are a bit messed up, so my storyline is also a bit messed up, but what I found out in my research is that we need an idea of what mental model, what moral concept, we can apply, and then how we can apply it.
And as you could already have taken from the title of my publication, I came across the capabilities approach. If you dive into the philosophical archives and the world of Plato and Aristotle, you find a lot of lines of thought, like utilitarianism (if the end is good, then the means to that end are good), virtue ethics, and Kantian duty ethics. But one stuck out, and that was the capability approach. The capability approach was developed by Amartya Sen and further developed by Martha Nussbaum.
Amartya Sen is a development economist and a philosopher. He worked in India, and there they said, well, we see that the GDP of this province, GDP per capita, has increased. So we're improving, we're getting better, connecting that to welfare and wellbeing. And then they dug deeper and saw that education was actually going backward; it was not getting better. What happened was that the division of the funds was not equal: they had looked at possession in the sense of money, but they hadn't looked at what people could do with it.
And that's when they came up with the capabilities approach, because they said: if we want to go from A to B and we give everybody a thousand euros, is that fair, equal, just? Well, it depends, because if I'm in a wheelchair, I need to get a cab that can carry the wheelchair and me to this location, and that's gonna cost me more money.
So we should not focus on an equal distribution, which is kind of a libertarian concept. We should look at what people can actually do. Do we have equal capabilities? Do we have agency, can we choose freely?
And then they also introduce another aspect, and that is context, or conversion factors. Because say, for mobility, we give everybody a bike, but if I live in the desert, it's not really gonna help me. So you also need to look at what situation this person with these capabilities is in. If I'm in a culture where it's not allowed as a female to drive a car or ride a bike, then these things won't help me. So that's a way of looking at: are we improving things?
Is this better for wellbeing and welfare? By looking at the individual, the human, the capabilities he or she has, and also how much effort they need to make to use and apply those capabilities: the conversion factors in their context.
So with that in mind, I started looking at digital identity solutions on a national scale. That's a different presentation, though, because what I want to do now is give you some really practical ideas on how we then do that.
So say I'm engaged in a project at a customer, at a government: how do I then make it practical? Out of academia there are a lot of models, but I find a common thread there, so I'm not picking on a specific one; in the notes on the agenda you can look them up as well, value sensitive design for example. I'm gonna focus on three aspects: it's always about the context of the technology and its use, it's about stakeholders and users, and it's about impact and values. And there are many ways to do that.
There's one that I want to call out as an example, and that is the guidance ethics approach, where they say ethics is not a judge, but actually more like a coach. It will guide you, and you should start off by describing the technology in its context, because identity management in the Netherlands is different than in the UK, and those are both in Europe (well, not completely anymore), and even the Netherlands and Germany differ. In Africa it's different, in Asia it's different, in the US it's different.
So technology operates in context. Then you're gonna have a discussion with everybody involved about what the effects of the technology are. And it's also important to make space in those conversations for feelings, because feelings are signals for values. It doesn't feel right; it feels yuck. In privacy we have that with the creepy line: things can exceed the creepy line even though they're legally okay. We had that in the Netherlands with ING and a large supermarket combining forces in a loyalty card; legally it was okay, but the customers said it doesn't feel right.
Something is happening here with my privacy, with my autonomy. In those conversations you come up with improvements for your design, and you should also incorporate them in the operation. That can be in the solution itself, so you make a better identity management solution. It can be in the environment: there is no privacy law, well, let's get a privacy law in place, get a policy in place. And you can also do something towards the individual: training, raising awareness. Now, in my updated presentation I had a very nice case, but there's not that much time.
But the case was about a care robot in a senior citizens' home, where they did this and found out that there were 18 actors with an interest, because it's not just the senior citizen and the nurse and the doctor; it's also the parents and the children of the senior citizen.
It is the supplier of the care robot. And, for example, this thing can read out your medication, but the senior citizen is sitting next to somebody else. So a privacy screen was one of the things they came up with to incorporate this value of privacy. There are two links there.
The interesting thing is that I'm involved with the Dutch government activities around eIDAS and the European digital identity wallet, and they just kicked off with four of these workshops for digital identity wallets in the Netherlands in the context of eIDAS. So I'm really curious what will happen there, and it's at least a deliberate effort to discuss values in a quite technological and complex solution.
Also, there's already a lot of material available once you start digging into this. So this is more for you to have a look at on the plane or the train back home, if you think this is interesting. What I found especially interesting is that we can bridge and learn a lot from academics here, who've been thinking about this for years already, and apply that to our practice, because the domain of identity and access management is now really touching society as a whole.
And that's where we see that it's like what they say with AI: we build a car and release it, and we haven't really test-driven it yet. So that is something to do perhaps a little bit slower, and have those conversations about what the impact is, what will happen when we do this digital identity thing. The other aspect that struck me today was that when designing, we usually go for the happy flow and the majority, and there are always those edge cases.
I think this is one of the change points, the inflection points, where we find out that our edge cases may be at the edge of the design, but they're central to our humanity. So if we design for the edge cases, whatever solution we get will be able to handle those edge cases, and it will also be able to handle the general audience. With that thought I'd like to conclude. If you wanna know more, if you have questions, there might be a little bit of space for questions now; otherwise, tap me on the shoulder later today or tomorrow. So thank you very much.
Yes, thank you very much indeed. There don't seem to be any questions online, but I was reflecting, as I listened to what you talked about, that the cases you brought up were an intersection between law and technology, and not always technology. For example, the problems in the UK with the so-called Windrush scandal didn't involve any technology; it was simply that the paper records of what had happened to these people, or where they'd come from, didn't exist.
And so technology can be a convenience, but if the law is wrong, then the law will cause the problem. So is it that you are saying you have to always design the system in the wider context, or have I got something wrong there?
I think you're on the right track there. If we do digital identity on a national level, we should expand that scope, because then we're not just dealing with employees or third parties or customers; we're dealing with everybody in society.
That's where we have governments who play a specific part, and where the jurisdiction and the legal framework do as well. So you need to include that in thinking about how it will run, also on that level. On the national scale, you cannot just drop in digital identity and let it float there. There's another case where people did that: they focused on the use case of voting, so everybody from 16 and up got this digital identity, and by doing that they neglected birth registration.
So what happened 16 years later? People would come for a voting ID, but they didn't have a birth certificate. That kind of breaks the chain. So on a government level, you really need to take one or two steps back to consider the whole.
And yeah, on another occasion, in Kenya, the digital identity project was paused because of the lack of a data protection impact assessment. So yeah, it's a big thing that we have here.
Yes.
And it is a major problem for the people. The examples you gave are only a few; there are many other cases where you can just be born in the wrong place or at the wrong time, and it can be almost impossible to get an identity. So are there any further questions? No. Okay.
Well, so I think we've got to say thank you very much. Thank you, you're welcome. And thank you for your thoughtful presentation. Thank you.
Yes, bye.