So our panel for today is focusing on "privacy equals data protection plus X". Hmm. So what could the X be? I have some thoughts, but maybe David, since you've joined us remotely, the world is wide open where you are. Would you like to take the first thought, David, on privacy equals data protection plus X?
I disagree with that mathematical statement actually. Perfect.
I think they're actually intersecting sets; it's not that one is a subset of the other. Data protection applies to all data, which can be about legal entities, or about IoT devices. Whereas privacy only applies to living humans. It doesn't even apply to dead humans. It doesn't apply to things; it doesn't apply to organizations. So data protection covers more than what privacy covers, but then privacy for people covers more than what data protection covers.
So I disagree with the mathematics of that.
Hmm. Okay. A challenge to the math. Good, fun answer.
Katrina, what do you think?
We were chatting about this a little beforehand. Anyone that knows me knows that I've been a massive privacy advocate for, you know, twenty-something years. Those long-held hopes have not gone away, and neither has my sense of commitment to them. But if I look at where we're going in a spatial web, I think our current concept of privacy and rights will be really difficult to hold the line on.
And I think the conversation we need to be having right now is that the X is probably governance and right to redress and context. Because of the convenience of being spatially connected to this digital extension of ourselves. And we can all go, "Oh yeah, you know, I'm not gonna do that, I'm not even on Facebook," but we're talking about a generation of children that will be born into a spatial digital world and won't have choices. There won't be an analog version.
And so the idea of being able to opt out, close the door, restrict, the idea I've grown up with in my life, that shaped my idea about privacy, will be hard to even explain to a child in a few years. So I think the X needs to be: what's the governance model that we need for a spatial web, what's our right of redress, and how quickly can we get those tracks laid down?
Yeah. Yeah.
I have known you for a while, and that's where I thought you would go; you've been working in this space for a very long time, so that's right in line with what I would expect. So, Adam, you're up.
So, just to continue on that theme of governance for a while: I think one of the big problems is that things about redress have been almost forgotten. They don't exist.
I mean, if I asked any of you in this room what you would do if your national identity was stolen, would you even know who to talk to? No one. No hands.
See, this is the problem, and it's just gonna get worse. And there are new things happening now. My own country, the UK, is working on a trust framework: the Department for Digital, Culture, Media and Sport is producing this trust framework for digital identity and attributes, which is fine.
I mean, that's a good thing to do in my opinion, but the governance function for that, despite what many experts have told them, is gonna be within government. So they're checking their own homework.
Who watches the watchmen?
Yeah.
That can't be a good thing. And challenging those norms is what needs to happen, so that governance is actually a real thing, but also so that the ability for individuals to gain redress is not only there but understood. Because we don't really know.
I mean, many of us in this room are experts, but I would challenge that even us experts don't know exactly what the consequences would be if our identities were compromised. We'd have a pretty good guess, but how do we fix it? Who's gonna help us? That's all missing, and that's wrong, because the vast majority of people have no idea. They have no idea about what data they're giving up, or what's gonna happen to it, or where it's gonna be. And as we accelerate towards the metaverse, it's just gonna get worse.
And for me, that's why I think the privacy thing is so difficult: with the convenience and the ease by which lots of things are gonna connect up, people will just opt in, opt in, opt in without understanding the consequences first. Redress will only happen when there is a consequence, but they won't have understood the consequence going in. Yeah. So you think it's compounding that issue.
If I've spoken with you at the conference, you may know this; if I haven't, you may not. But my laptop was switched in the bins at the security entry. So I have a laptop that looks exactly like my laptop, but it's not my laptop.
And that laptop has my immigration papers on it, and everything that one would need to recreate an identity that would not be mine. And so on the topic of redress: we don't have our way fully through that.
And I think when you were talking about understanding the consequences, Katrina, that's so accurate. So when I was thinking about this question of privacy equals data protection plus X, I was thinking control is one of the things that should be in there. How can I have transparency and control of that data?
I was thinking about accountability, which I think gets back to governance: who is responsible if that laptop gets switched, or if I leave it on the chair and say "please take me", or take pictures of my credentials and put them on a pole? Who ultimately is responsible, and what does that look like? And then I think further education is needed around that accountability: who's accountable, and where are you accountable? Because I agree, we don't really have good advice today to give to people around these topics.
And I don't know if anybody is a fan of Kara Swisher of The New York Times, but I was listening to her last night, and she was saying how she burnt an hour last night listening to some ASMR on TikTok, which she does over a burner phone. She has her own burner phone so that she doesn't get into the TikTok algorithm but can still enjoy the experience. And I think that's way too much to ask of people. We've constructed a world where it's too easy to fall into those traps, right? Yeah.
Yeah. I like it.
I'd like that experience.
I just need a burner phone. I know. Yeah.
Sorry, David.
Let me bring up an analogy from the physical world.
I'm driving down the road in my car, and someone pulls out of a side street and hits the side of my car. Then they're responsible, and they pay the damages.
Now, when we move into the virtual world: I'm giving my personal information to someone for a specific purpose, and then they give it away to someone else, who happens to use it and abuse it and ends up defrauding me of money. At the moment, I can't actually get the damages back from the person who did the damage to me, because there's no mechanism for that. And this is a whole new infrastructure that actually needs to be constructed in the digital world.
There needs to be a complete audit trail of who gave information to whom and how it was passed on, so that there's accountability and then damages can be claimed.
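A minimal sketch of what such an audit trail might look like, assuming a hash-chained log in which every onward transfer of personal data records the sender, the recipient, and the purpose the subject originally consented to. All names and fields here are illustrative, not any real scheme:

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class TransferRecord:
    """One link in a hash-chained audit trail of personal-data transfers."""
    subject_id: str   # whose data was passed on
    sender: str       # who disclosed it
    recipient: str    # who received it
    purpose: str      # the purpose attached to this disclosure
    timestamp: str
    prev_hash: str    # digest of the previous record, making tampering evident

    def digest(self) -> str:
        return hashlib.sha256(
            json.dumps(asdict(self), sort_keys=True).encode()
        ).hexdigest()

# The subject gives data to a retailer for order fulfilment...
genesis = TransferRecord(
    subject_id="subject-123", sender="subject-123",
    recipient="retailer.example", purpose="order fulfilment",
    timestamp=datetime.now(timezone.utc).isoformat(), prev_hash="0" * 64,
)
# ...who passes it on to a marketer, under a purpose never consented to.
onward = TransferRecord(
    subject_id="subject-123", sender="retailer.example",
    recipient="marketer.example", purpose="marketing",
    timestamp=datetime.now(timezone.utc).isoformat(),
    prev_hash=genesis.digest(),
)

# Walking the chain shows exactly who passed the data on and under what
# purpose, which is the accountability a damages claim would need.
for record in (genesis, onward):
    print(record.sender, "->", record.recipient, "|", record.purpose)
```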
And I guess eventually people are gonna get insurance.
You know, once the insurance world gets into things, things start to happen. If you can actually get insurance to protect you, so that as a company I can get insurance to protect me against any claim of damages where I might have done something wrong with people's data, then that'll really start to tighten things up, because the premium that the insurance company charges me will make sure that I do the right thing with people's data. So there's a whole new world of business that's gonna have to be built up over the next decade, actually.
Yeah, that insurance thing is really interesting. I sat down with our insurers a couple of months ago to take them through our product roadmap, you know, to review our cybersecurity policy, all those things. And the underwriter is a deeply experienced insurance person, decades and decades. And by the end of the meeting, he just said, "I don't even understand half of what you've explained. I want to get somebody else from the insurance side, another underwriter, on the call."
And as we started to work through some of these points that you've just raised, they just said, "We need completely new infrastructure. We need different policies, now that this could happen and this could happen." And one of the things they were wanting to do was exclude something the minute they heard "crypto". I had to explain the difference between crypto and cryptography, because it was like, "Oh, it has the word crypto in it.
No, we just won't underwrite it." And I said, okay, well, that cryptography is the whole means of securing this. And so I think before we can even get to designing the products, there is a level of professional education that needs to happen as well.
Yeah, I'd agree. It's a big shift in terms of understanding.
Can I bring this up? This is actually a true story from the 1990s, when we started our first PKI company, TrueTrust, which is the name of the company I'm still using. This was in the 1990s, before e-signatures really got going, and we were working with Manchester Chamber of Commerce and a bank. We wanted to build up a PKI in the UK, so we created the company, and we went out for insurance, because we wanted to give protection to people: if they got a PKI certificate from us, then they would have a certain assurance.
And the Post Office at the same time were actually building something, and they built assurance into their system: they would pay damages to people if they issued a certificate wrongly. And it turned out, when we talked to the insurance company, that because this was so new and they didn't understand it, the premium they wanted was gonna be greater than our annual turnover. So we just couldn't get the insurance. It wasn't sensible.
I was speaking with Joseph Carson last night, who, if you know him, was with the X-Road project in Estonia. And he was sharing a story about the Estonian eID: there had been a group of people outside of the European ecosystem, in, I think, maybe Indonesia, don't quote me exactly,
because I might have the location wrong, but basically they were somewhere in that area, and they were going to banks and doing the full ID-proofing process, then handing that token off and getting paid small dollars, a hundred dollars or something, handing those real, proofed identities off to criminals, who were then going out and doing lots of crimes with them. Eventually an audit was done, and the outcome was that the bank lost its insurance coverage.
The insurance coverage was going to go astronomical for that bank and its customers. And so the bank in question actually had to cancel every account that went through that process.
But David said something that I liked, and I do also like "privacy equals data protection plus assurance". Maybe the concept of assurance actually has multiple pillars within it, which covers us on things like integrity and security and transparency and confidentiality. And redress. Redress.
Yeah, yeah, yeah,
Yeah, no, I think redress is a really important one, which is just not in the picture at the moment. There's a whole new business gonna develop around that. Exactly.
And that's gonna stop identity theft, a lot of it, if the way that information is leaking out can actually be stopped, because people realize that they will be liable if the data has leaked out from them. I think, you know, it's gonna be really important.
I think there's also a big opportunity in the redress area as we look at everything being connected. I remember, a few years ago, pre-COVID, I flew British Airways. I was going to fly to Florence, and it was summertime. I was with a friend that had come from Canada.
So she'd already been at the airport for hours, and it was like, every hour, another delay, and blah, blah, blah. We finally took off, it turns out, five hours and three minutes after our scheduled time. And once we were up in the air, there was this American guy who went up and down the aisles, who spends his summer on aircraft, paying like a hundred euros, whatever, to fly on routes that typically don't take off on time. He gave out his card and said, "It's three minutes past five hours. At five hours, you're entitled to this, this, this, this, this, and this."
Here's the form. It's prefilled, email it to me.
"I'll just take 50% of what you get back from the airline." So that kind of stuff is a smart contract in the future. The thing is, it's too hard in the physical world.
You know, he has to get on an airplane, all of that. But imagine, when you go to book an airline ticket: do you want to download the five-hour-delay smart contract that will automatically trace these things? So I also think, I mean, I'm horrified and I'm worried about lots of things, but I also think that we humans are really clever at coming up with all kinds of things, and I think redress, smart-contract redress, will be a whole new area of innovation. Yeah.
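As a rough sketch of the payout logic such a delay-redress contract might encode, here in plain Python rather than an on-chain language: the thresholds and amounts below are illustrative assumptions (loosely modelled on EU-style delay compensation), and the source of the actual departure time, in practice a flight-data oracle, is taken as given.

```python
from dataclasses import dataclass

@dataclass
class DelayRedressContract:
    """Toy model of a parametric flight-delay redress contract.

    A real version would live on-chain and read departure times from a
    trusted flight-data oracle; the figures here are illustrative only.
    """
    scheduled_departure_min: int  # scheduled departure, minutes since midnight
    payout_schedule: dict         # delay threshold (minutes) -> payout (EUR)

    def payout(self, actual_departure_min: int) -> int:
        delay = actual_departure_min - self.scheduled_departure_min
        due = 0
        for threshold, amount in sorted(self.payout_schedule.items()):
            if delay >= threshold:
                due = amount  # keep the highest threshold crossed
        return due

# The passenger "downloads" the contract when booking; settlement is automatic.
contract = DelayRedressContract(
    scheduled_departure_min=14 * 60,       # 14:00 scheduled
    payout_schedule={180: 250, 300: 600},  # 3-hour and 5-hour thresholds
)
print(contract.payout(actual_departure_min=14 * 60 + 303))  # 5h03m late -> 600
```

Once the delay data is machine-readable, the claim that man made by hand up and down the aisles becomes an automatic settlement.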
This is interesting, cuz I think about the redress part of the equation. When I have my talk later with John Wunderlich, some of the things we'll be talking about are around access to information. In Canada right now, we don't have clear policy around citizen and resident access to public-sector verified information. And I think privacy equals data protection plus sharing: if you can't share, then what's the point of privacy? You just have something locked into a space. And I would posit
that part of the reason we don't have very clear access to that information is because the accountability framework and the redress framework have not been well defined. So there's a fear of actually moving forward in that space yet. Yeah. Yeah. How are we doing on time?
Pretty much out, I think that was it. Yeah. Thanks for joining us, David. Thanks, David. Good to see you.