Well, good afternoon everybody. Thank you for spending the next 20 minutes with me on a topic that is close to my heart. Not that I want to be manipulative, but I want to explain a little bit about manipulation and why I think it's a relevant topic for the European Digital Identity Wallet and all the wallets that member states will either build or recognize in their nation. Here's a little bit of background on myself: I've been in the identity industry for nearly 20 years now, in various roles.
If you want to know more, you'll probably get the slides, and you can click on the LinkedIn button to find out more about me. What is interesting for this talk is that I combine advisory work with academic research and publications. In the talk today I want to do four things. I want to share a concern with you that I know some of you will recognize.
I want to look at the link between eIDAS and manipulation, look a little deeper into that topic of manipulation and online manipulation, and then talk a little bit about the risk of oversharing data. So it starts with my concern.
And my concern is not that we will get a digital wallet. I think that's a good thing because the EU rightfully recognizes the risk that citizens in Europe have shared data everywhere, knowingly and unknowingly with all kinds of companies. And once these companies, once they have that data, they will use it for whatever purpose they see fit. So the notion is that we wanna give that data to the citizen and the citizen is in control. So all the citizens are waiting expectantly on what is coming.
And the solution that they will get is a wallet in which they can gather all their personal data, but also all kinds of other data.
They can store it, it's safe, secure in the wallet and they should be very happy with that because it's a good thing. And with that wallet online, you can share your data in exchange for subscriptions. You can share that data when you buy something so it gets delivered to your home address. And that enables the European citizen to have more control over where their data is. Unfortunately, the internet is not a friendly place.
So what will this citizen encounter there? The really bad guys, of course, who will hunt you down with phishing attacks, but also other things you will encounter online. And those tend towards manipulation, nudging, trying to get your data: "What if we give you a 10% discount on what you just bought, if you share your date of birth?" Or, "Would you be able to share your shopping history from the last month with us, so we can better assess your health insurance? Maybe you can get a discount if you eat healthy."
And this is my concern: that we don't spend enough time thinking through this part of the equation, where we give the user, the citizen, a very powerful tool, or actually a big bag of gold, because data is the new gold and the new oil, but that gold is difficult to carry safely. So we will give them a wallet, and they go out with that wallet, and there are friendly people out there.
But also robbers lurking in the bushes, trying to get the gold out of your wallet. Can you protect yourself? And that is of course the next question: what type of protection would the citizen need online?
I think that we still want to get the citizen engaged and excited about this development because there are very good aspects to it. But we also need to see if they need a helmet to make sure that if they fall they are properly protected. One of the threats they encounter is this manipulation.
And it always sounds a bit scary and a bit dark; it has a bad vibe if you talk about manipulation. So I wanted to give you 10 seconds, and ask you to use those 10 seconds to come up with an example in your life where you have been manipulated.
So who could come up with an example that they were manipulated?
Ah, you're a manipulation-free population. No, the reason for doing that is that manipulation is mentioned in recital 4 of the new eIDAS regulation. But if you start thinking about it, you quite quickly run into the question: what is it actually? Because we are all influenced, and we influence others.
On one end of the spectrum is persuasion, where I influence you by giving you good arguments, you think them through, and you make a conscious choice. On the other end of the spectrum is coercion, where I put a gun to your head and ask you, not so kindly, to do what I tell you to do. And somewhere between those two extremes is manipulation. So what is that manipulation that is mentioned here?
And I'm going to take you along on that journey, because I thought: if it's in the regulation, then there must be an explanation.
Isn't there an appendix or a dictionary where it states clearly what it is? Well, not really. I went through a couple of acts, and I'll quickly show them to you. It starts with the observation that there are already three types of use for the word. The first: if I have an image or a video, I can manipulate it; that means just changing the video, making a deepfake. The second is specific to AI, because I also looked at the AI Act, and that is about manipulating training data, giving different results in the AI's responses to the questions that you ask it.
But the last one is the one of interest to me: the manipulation of humans.
So I went through, for example, the Digital Markets Act, and in the Digital Markets Act there is also a piece on manipulation. It says it distorts or impairs the ability of end users to freely give consent; so you're not completely free anymore if you are manipulated. In the Digital Services Act, there is talk of manipulative techniques that impair, again, the ability to make free and informed decisions. So this gives us a bit of an understanding already. Now, the AI Act is really strong on it.
It connects manipulation directly to the European Union's values. AI is a tool that can be used for good, but it can also be used for bad purposes, including manipulative purposes, and those contradict Union values like human dignity, freedom, et cetera.
And there's more detail in the AI Act that helps us understand what exactly this manipulation is. It has to do with an impairment of your autonomy: giving consent in freedom, but also being aware of the fact that you're being influenced. And that is discussed here as well: there's a part of manipulation that also involves deception.
So I'm not going to read the rest of it, because I'm sure you can get the slides. But it helps me to think about manipulation and to think it through a bit, because in the end I want to protect the European citizen from manipulation, so I need to know what it is. In the end, there is a party, the manipulator, that wants the patient, or manipulatee, to display certain behavior. That behavior can be buying something, acting in a certain way, reposting a tweet, but it can also be sharing data.
And the results are usually harmful.
Those can be harmful for myself, or harmful for others. One aspect of manipulation is that there is an element of deception, so you're not being told the true story, or the whole story. You're maybe not even aware in all cases that you are being manipulated, and it impacts your autonomy and your free decision making. It can be that you don't see all options, so your decision making is impaired. It can be that you don't feel free to choose all the options. And lastly, it can also exploit your vulnerabilities.
And these come from the legal texts, the regulations, that the EU pushes out. Now, this tells us a little about the effects of manipulation and why it's bad: because we want free people, we want autonomous people, we want people that are informed.
But if you dig a little deeper, and I won't spend a lot of time on this topic, in the area of moral philosophy there are philosophers that have been thinking for a long time about how people are influenced.
There's a branch that goes into behavioral economics, which is more applied, where you see the nudging techniques that are also common online. But there's one thing I want to pick out of that information, and that is: when you thought during those 10 seconds about when you were manipulated, I think there's a fair chance that each of you had something in mind that was hidden, something that was not completely clear.
And one criterion you can use for that is that you look back on choices that you made and think: why did I make that choice? I would have made a different choice now. And if that is because somebody was influencing you but didn't give you the whole story, that is typically when people talk about manipulation.
But there are a few problems with this theory, because if manipulation must be hidden, we also find situations where I can manipulate somebody completely openly.
If I don't want my daughter to travel to an unsafe country, I might make remarks to her saying: mommy will be very anxious if you go away, I will be very scared. I'm not telling her not to go, I'm just telling her my feelings. But that is a kind of manipulation, and it also shows that the focus is not only on the manipulator when we ask whether manipulation is hidden or not.
When we focus more on the patient: did they see it or not? Was it actually hidden from this person? Then you need to look at the persons that are manipulated and ask: could they have known? And that gives rise to another problem: if it is open, is it then still manipulation, and does it put a burden on the person to discover it?
So what does that do, exactly, with accountability?
Well, there is a lot of academic thought there; I included the article. One solution out of that is from Professor Michael Klenk, who states that perhaps it's better to define manipulation like this: persuasion is when I inform you and you make your own choice based on my arguments, so I have persuaded you; coercion is, as I said, the gun to your head. And in between, if I want to get you moving in a certain direction, I use techniques and ways to do that without caring whether you understand why. So with persuasion I'm convincing you; here I'm just getting you to move in this direction.
You may understand, you may not understand, but that carelessness is the defining aspect of manipulation. It solves a lot of these problems about hidden and unhidden influence, if you're even interested in that; I think a lot of technology practitioners may not be, but it helps in defining when we are talking about manipulation.
And it also takes off the moral burden, because there are a lot of situations where I am influenced and I don't have the feeling that the influencer really cares whether I understand; they just want an end result.
And that leads to the question: is this type of influence towards a desired action morally okay? Linking that to digital identity: the digital identity comes into play when I am moved to share data about my digital identity, data that reveals information about me.
But it can also be that this data is used to connect the dots across other data. With the development and introduction of a digital identity wallet, this becomes a very exciting space, because the digital identity wallet all of a sudden is this bag of gold with high-quality data that a lot of relying parties online may want to have, and that they can use to get to know you more intimately. And the better I know you, the better I know your pressure points, the better I know your vulnerabilities, which again enables me to better influence you, or manipulate you.
And then again, we need to ask the question if that is okay or not, was it a good influence and a good manipulation or not? So in that sense, digital identity can be an aggravating factor for what all the other EU regulations have said about protecting people so that we can live autonomously and make free and informed decisions online.
So that brings me back to the question, if this citizen goes online with their wallets and they start sharing data and they get data sharing requests, are they able to determine freely and autonomously where they want to share that data?
Do they understand what data they are actually sharing, what it means, and how it is different from the online account with their fake date of birth? Can they issue a fake date of birth out of their wallet? This is part of ongoing research that I'm involved in at the moment. I want to see how this introduction of a wallet, and giving it to a citizen, changes the dynamics, if at all, of the oversharing of data online. So in that research we are asking six questions. We've asked 16 experts in the Netherlands to answer them: how do you see this risk of oversharing?
What is it?
Are there specific scenarios in which a user could overshare data from their wallet? Typically you have the holder of the wallet who is sharing data with the relying party: are there specific aspects that we can allocate to one of these actors that make it better or worse in terms of sharing and oversharing data? And then, going back to that citizen: what can we do to help this citizen, the user, to improve their capabilities and become resistant to manipulation, or at least really conscious of where they share data? And finally, what measures can we take?
And the first part of that research is due in September; there's a brief publication forthcoming for a conference on electronic government. Apologies, my immune system is fighting with the climate control in my hotel room, and the climate control has better persistence over time than my immune system.
But this is one interesting area: to see the risk of oversharing data and, of course, how that will impact the privacy of the citizen, and their being manipulated or perhaps even exploited. On that risk, we already see that there are roughly 10 aspects that increase or decrease it. The first one, for example, is quite obvious: I don't have this wallet because I want to share my identity. I want to buy a bike online, or I want to get a subscription online. That is my goal, and if I'm oriented towards that goal, having to share data out of my wallet is a stumbling block.
So I want to get over it as soon as possible, and that may lead to citizens not seriously considering what data is actually requested. The third one on the left side, for example: is a citizen able to assess whether they really need to share their medical data to subscribe to this magazine, or can they figure out that it's not proportional?
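To make that proportionality question concrete, here is a minimal sketch of how a wallet could warn a user before they overshare. Everything in it is hypothetical: the purpose profiles, the attribute names, and the idea of a built-in allowlist are illustrative assumptions, not part of any eIDAS specification.

```python
# Hypothetical sketch: flag potential oversharing in a wallet data request.
# Purpose profiles and attribute names are invented for illustration only.

TYPICAL_ATTRIBUTES = {
    "magazine_subscription": {"name", "email", "postal_address"},
    "online_purchase": {"name", "postal_address", "payment_reference"},
}

def excessive_attributes(purpose: str, requested: set[str]) -> set[str]:
    """Return the requested attributes that exceed what the purpose typically needs."""
    typical = TYPICAL_ATTRIBUTES.get(purpose, set())
    return requested - typical

# A relying party asks for medical history to subscribe to a magazine.
request = {"name", "email", "medical_history"}
excess = excessive_attributes("magazine_subscription", request)
if excess:
    print(f"Warning: request asks for more than is typical: {sorted(excess)}")
```

A real wallet would of course need far richer context than a static allowlist, but even a crude check like this could prompt the goal-oriented citizen to pause before clicking through.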
So there are a number of these aspects related to this risk of oversharing data. And it's interesting; we're still working on analyzing the data on the actors and the possible measures.
But you already see here that there can be measures on all three of the actors. And actually, what we found out during the research is that we forgot to include the supervisory bodies: what roles they have and what measures they should take. So in the end, this user will get a wallet from the wallet provider and fill that wallet with data. Well, you've seen this picture a hundred times over the past four days: they share this data with relying parties for all kinds of services and products, and a supervisory body makes sure that the whole trust scheme is intact.
And from that, the next part of the research will look at the characteristics of the holder, the wallets, and the relying parties, and then at the user capabilities: what we think the user needs, what the citizen should be able to do, and what citizens typically have.
And I can already give a spoiler there: we don't think they have them, because there is not one citizen, there are 400 million citizens, and they're quite unique, which makes it difficult to provide a general view of them. And then, of course, the measures.
And then, when the citizen goes out with their wallet, we want to make sure that it's clear what risks they're running, so that governments, but also relying parties and wallet providers, can take appropriate measures to make sure that this citizen is protected online: has sufficient bodyguards, and can enjoy the online and digital economy, enabled with high-quality data in their wallet that he or she can share online, but not without the proper protection.
That is the concern that I'm working on. I've seen a lot on eIDAS here this week.
I'm really grateful to KuppingerCole for putting a whole stream in the program on this topic. I think it's essential to keep conversations and discussions going, not just on the technicalities or the trust frameworks or the data schemes, but also on the user testing and the UX research that has also been mentioned earlier this week, to make sure that in the end we're not building a car without brakes because the user didn't ask for them.
We may build a car that does not have a very strong brake, and we will see some accidents with the wallets, as we had with the car, because the first car also didn't have an airbag. But we need to build these things in as soon as possible. If you want to contribute to this research, reach out to me. If you want to know more, you can also reach out to me. And I want to thank you for your time and attention for this topic.
If you'd like to continue the conversation, feel free to get in touch and ask your questions over lunch or virtually.
Thank you very much.