Kim Cameron, Microsoft
Scott David, University of Washington (Seattle) - School of Law
Ladar Levison, Lavabit
Nat Sakimura, Nomura Research Institute
My first question would be just to say who you are, for the folks here. Most people know you already anyhow, but give a short sentence on what you do. To the microphones — thank you. — So my name is Scott David. I'm with the University of Washington School of Law, and I'm here responding to the invitation to be here. More particularly, I worked in private practice with a number of organizations, foundations, and economic organizations, looking into the issues of how to normalize and standardize behavior in the identity and data sectors.
So this is a continuation of that foundational work. — So I'm Kim Cameron. I'm the architect of identity for Microsoft Azure and Windows Server, and since I've created many of the identity problems of the world in the past, I feel it's my duty to come here and help move things forward a bit. — My name is Ladar Levison. I'm the founder of Lavabit, and I'm something of a misfit, currently working on protecting our email into the future.
Well, my name is Peter, and I'm a partner at a law firm in charge of cyber, here in Germany — and you're all well known. The first question is the topic of this session: what is the next generation of privacy tools? What do we foresee will come after all the discussions we have had? — Okay, let's put it this way. I think that the work done by the privacy-by-design committee we were talking about last night ended up producing — is in the process of producing — a set of tools that allow people who aren't experienced with privacy, who don't have a very strong gut feeling for what privacy is about, and who aren't experienced with a rigorous methodology for analyzing the privacy attack surface, to come in and start to understand that they have to review everything they're doing. So I think that's a hugely important tool. It's not an end-user tool.
It's a tool for the people who are developing systems that might otherwise end up failing because of privacy problems, or being controversial because of them. Then I'm also involved in a bunch of work to actually take privacy-enhancing technologies — what we call PETs — and ship them. We've had privacy-enhancing technologies for a long time in academic circles and so on, but the problem is they never got into any mainstream products, and they are incredibly hard to deploy in the form that they're in.
You didn't say "incredibly" enough times. — I did.
I'll say it again: incredibly. No, but very difficult. As you were saying in your presentation, these aren't easy problems. So getting to the point where somebody can put a privacy-enhanced system into production just as easily as they could put in a privacy-invasive system — that is the challenge I'm really working on. — I have a question. What do you mean?
I mean, as a group: what do you mean by privacy? — Privacy means privacy just for myself, right? Privacy to me means control: control over what information others learn about you. So, user control. Scott did a really good job yesterday giving his very academic, theoretical definition of self-view and self-persona, but it's the ability to control how others perceive you, and the way you do that is by controlling what information they can learn about you. When you lose control of that, you lose your privacy.
Well, I think privacy actually has many dimensions to it, because privacy clearly goes beyond the digital realm. It's a question of, as they say, the entrenched protection of private life. Part of that is being able, in the digital world, to control the way that you project yourself, and so on.
So I think it's broader than simply that. It can also be things like the ability to be seen in places — for example, whether people can have a view of your movements is, I think, a deep question of privacy. The other thing I would disagree with is that it's only about individuals. I think that's a mistake we've made in having this argument with people who say, well, privacy is dead. There's also organizational privacy, which is important too.
Think about organizations: do they want the internal traffic of their organization to be visible externally? That's another form of privacy. And I don't think we should be saying individual privacy is one thing and organizational privacy is something else. I think we should be saying, look, it's all one thing — get into protecting privacy of all kinds, individual and organizational. — I was gonna say, we should ask Scott.
I'm sure he'd be happy to tell us that corporations are people too. — Corporations are people, yeah.
To answer the original question: I think what the future holds is moving the encryption beyond the wrapper, down to the object, where that is an appropriate model for security. What we're discovering is that by wrapping everything inside of, say, TLS, we're leaving ourselves exposed because there's a single point of attack: if you compromise the one key, you compromise all the communications coming in and out of the server. By pushing the encryption down to the object — to the individual message, to the individual user — we make it much more difficult for an attacker to compromise everyone.
Attackers, by the very nature of the system, need to become targeted. That's what I was hoping we'd talk about, because to me, that seems obvious.
Now, I tend to be one of those people who think things are obvious that other people don't, so I could be wrong there, but I think pushing the encryption down to the client is obvious — and there's what we will need to focus on after we solve that problem. — Which client do you mean?
The technology is there; like you talked about, the real burden right now is just making it usable, and that will come over the next two to three years. The real problem we have when you're doing the encryption on the client — on the device, whether it's a desktop, a cell phone, a tablet, a hearing aid, or a refrigerator — is that endpoint security right now is absolutely horrific.
If we're doing the encryption on those devices and we can't keep them secure, what we'll see is people just breaking into those devices and stealing the keys directly off the endpoint. We're seeing that already with people breaking into machines and stealing Bitcoin wallets, because there's a financial incentive to do so: break into the machine, steal the little wallet file — if it's not encrypted — and all of a sudden you've quite literally stolen somebody's bank account. So I think that is certainly the future.
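A minimal sketch of the object-level idea being described here — each message sealed on the client under a key only the endpoints hold, so a compromised server TLS key exposes nothing. The key-distribution step (the hard part) is assumed away, and this is an illustration, not any panelist's actual design:

```python
# Illustrative only: per-object encryption done on the client, so a
# server never holds plaintext and a stolen transport key reveals nothing.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def seal(plaintext: bytes, key: bytes) -> dict:
    """Encrypt one message under its own fresh nonce."""
    nonce = os.urandom(12)   # 96-bit nonce, unique per message
    return {"nonce": nonce,
            "ct": AESGCM(key).encrypt(nonce, plaintext, None)}

def unseal(blob: dict, key: bytes) -> bytes:
    return AESGCM(key).decrypt(blob["nonce"], blob["ct"], None)

# The key lives only on the two endpoints; how it gets there (key
# exchange, a directory, a QR code...) is the part the panel calls hard.
key = AESGCM.generate_key(bit_length=256)
blob = seal(b"meet at noon", key)   # this opaque blob is all a server sees
assert unseal(blob, key) == b"meet at noon"
```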
And like I mentioned in my original keynote, I think the only way to truly solve that problem is with hardware: by isolating the cryptographic functions on the device into a sequestered component that can't be compromised at the operating-system level. With that, I'll turn it over to Scott, because I'm sure he's running out of room on that paper. — Well, I think in the future we're gonna see privacy as a service. I might even call it privacy as a product, and that would be really provocative in Europe.
I know people would really get upset about that, but when we talk about privacy, it's not just fundamental rights; it also involves other kinds of rights. That's really the distinction we need to make. There are a lot of fundamental rights: we have the First, Fourth, Fifth, and Fourteenth Amendments in the United States; we've got EU law that's comprehensive; we've got EC directives on fundamental rights; the 1983 census case in Germany established the fundamental right to informational self-determination. But it's not all fundamental rights. Privacy also involves non-fundamental rights.
One thing to start thinking about is the ability to focus on those rights that are not fundamental and therefore not as culturally bound — for example, an economic interest, something that's not protected by a constitution. Now the problem, or the challenge, is that the manner in which the rights are defined differs between jurisdictions. In the European model, my understanding is that it's defined with respect to the data itself.
So it becomes more comprehensive. In the US, it's defined with regard to the right that the data relates to — intrusion is a subset of what might happen. So there, as a generic matter for these purposes, I think we can unpack the nature of rights, because a right is not just one kind of right; we know that. If we treat it as one kind of right, we're losing dimensionality of movement that we could have, and in fact markets have got it. — I think you're being too theoretical again. Could you give just an example of an assignable right?
Maybe you could explain it using the recent decision of the European Union court around the right to be forgotten, and the resulting requirement for Google to remove results. — I definitely can't explain it via that case, because I haven't read the case, so I'm not familiar enough with it.
Oh, okay. But let me not answer the question directly — that's what lawyers do: when you ask me a question, I'll tell you something I know instead of answering the question. So I'll tell you something now. It's a longer discussion, but there's a variety of assignable rights: contract rights, et cetera.
Now again, the challenge is: if you treat any data about any person's relationships as protected, then contract rights become fundamental rights. If we want to treat my buying of a book — forget a book, because that carries too much information — my buying of a glass of water as a fundamental right, we're gonna have to think about whether that's on par with things that are potentially more intrusive. The question of intrusion and control, I think, is what we're really talking about.
Let me give an example of what we don't want to have happen: HIPAA. There are four privacy torts in the United States under common law, and under defamation there are libel and slander. It used to be, under common law, that if I said someone had a venereal disease, that was slander per se — I didn't have to show any intent, because it's so terrible to say that. But if I said somebody had a broken leg, that was not slander per se, because it's not so terrible; it's not insulting to mention a broken leg. Under HIPAA, a broken leg and a venereal disease are identical in terms of protection.
That's my point. If it's said by your doctor — if it's HIPAA information, and I go to my doctor for each of those — they're both protected. That's the point: should they both be protected equally under HIPAA?
European law is established based on what we did before; the question is whether that's the baggage we want to drag forward. So, just fifteen seconds on where I see this going. Right now we have very subjective views on privacy — we're looking at what a person feels and all that. We're gonna move from subjective to objective, and I can elaborate on that later. We're gonna have an emergence of metrics that will go along with that, so we'll have concrete things to measure. What gets measured gets done, right?
What gets measured gets deployed. The metrics are gonna involve both technical metrics and tools, and then legal metrics. Those two sets are going to allow privacy as a service to emerge, and that's gonna allow global markets to emerge. Right now we have isolation because of the differences between jurisdictions. We're gonna move from isolation to venue shopping and arbitrage — because there will still be differences — but with movement toward harmonization. So that's what I see: privacy as a service.
That's one answer to the original question about new tools for privacy: what would be the tools that allow you to implement the level of privacy we are looking for? — Let me confine privacy here to information privacy; that's the kind of thing we're talking about.
When it comes to tools, we often think of very technical things, but I think we should also think about what normal users actually do. I view information privacy as very closely related to informational self-determination, right? It has been exercised through informed consent — explicit, or implicit through action. But the problem there is that in many cases, people make stupid decisions. There are a lot of incidents in Japan where people send money to criminals who call saying, "It's your son — I'm in trouble, I need money."
And they send it — they actually send it to the criminal's account, and that's the wrong decision. People make these kinds of wrong decisions all the time. You also see it in the provision of personal data: when asked to subscribe to some kind of service, people usually provide the data without thinking about it. So I think there needs to be some kind of privacy service — call it a fourth party — who can actually help consumers, help people, make more informed, saner decisions.
So it's a personal agent who assists the person. And that agent really needs to be trusted.
Yeah, it's a trusted agent. One of the questions I've been asked a number of times is: why is it so wrong for the government to collect all my information when Google is doing the exact same thing? I think you made the same point in your discussion: well, I have a choice about whether or not I want to work with Google, whether or not I want to use Google's search engine — but no choice about whether the government collects all my information.
What you touched on is the fact that we're moving into an era where even in the private space you're losing that control, because of all of the backroom deals. Now, I don't know what the solution to that is, because the free-market fella inside me is jumping up and down on my shoulder over here, saying that the solution isn't putting new laws in place preventing companies from sharing data.
But at the same time, the one on the other shoulder is saying, well, it's not right for all of these direct-marketing firms to build entire archives of everything I've ever bought and done online and then sell it off to the highest bidder. So as for the solution — I think you're right, it's gonna take some time, and eventually the laws are gonna harmonize. I will say this: the one thing that I really like about the online world, and I hope we keep it, is that it's really easy to change your identity online.
You just clear your cookies and register a new email account, and all of a sudden you've got a new identity. — One way to think about it, when you're preventing harm — it's an interesting exercise, and I've said this before in another session: we don't make hammers soft so that they can't be used to hit people on the head. We make hammers hard so that they're functional, and we say, don't hit people on the head with a hammer, right? We don't make cars go five miles an hour so that they can't be used for bank getaways.
We make cars go a hundred miles an hour — or in Germany, that would be the slow lane; I'd be insulting you. Kilometers, I mean.
We say, don't rob banks, right? So one of the questions here is to think about data technology — these systems are dual-use technologies. Let's start thinking about that. What we're faced with is: do we want to not deploy the benefit because there are bad guys? What we need to do — the way we do it in society generally — is all gang up as the good guys and say, let's keep an eye on the bad guys. That's what we need to do here, and that's what law does, right? The standardization of law is what society is: decisions we've made together about how we weren't going to use certain objects.
The question is whether we want that codified as law, or whether we want a common practice created by something like a multi-stakeholder process for the industry, and self-regulation — and that can vary. I was speaking with Robert Madelin about some work we're doing with MIT, and I was showing him the contractual approach we're taking in the United States.
After we had the discussion with him and some of the administrators he works with, he said, well, we're gonna do it by legislation. And I said, I understand, that's fine, but I wanted to expose you to the type of duties we're talking about, because there are two ways that duties are created in the world: voluntary and compulsory. Compulsory is laws; voluntary is contracts in the legal sense, or policies, norms, whatever. It doesn't matter what the source of the duty is, as long as the duty is the same duty.
So if legislation says you must use, say, XBRL — like an SEC regulation — and then somebody voluntarily self-binds to the same thing, it doesn't matter. That's how you do it trans-jurisdictionally: I don't care how the duty gets developed, I just want the same duties. Yeah.
But one of the other aspects — something I always encounter when I'm speaking with government people, I mean people involved with legislation — is that, first of all, the actors who create the digital environment are essentially corporations: entities that are selling things and selling services and so on.
As we found out with Information Cards, you can create something that is for the consumer, but it doesn't matter unless it's adopted by the people who are actually selling the products.
Similarly, people in government who want to influence these things are unable to pass legislation requiring things that aren't possible. We could say, okay, we're gonna pass a law mandating things which are patently impossible — but it's clear there's no technology that will do it.
So I think what really has to be done is to make it absolutely credible that enterprises — and governments too, to the extent that they are themselves operating systems — have the ability to operate them in privacy-enhancing ways.
At that point it becomes possible to say, yes, these are the requirements — potentially governmentally, through law, or potentially through increasing competition, there would be better practices. But I don't think you can have them without that technical infrastructure: data protection, being able to use anonymized and privacy-enhanced identifiers, and basically some way to get around this problem of metadata that became so evident during the Snowden affair.
You touched on the topic of best practices. One best practice that I've seen, whose use I'd like to see increase, is privacy policies — using them to detail very explicitly exactly what information a particular site or service collects and records about your visit, how that information is used, and what other organizations it is shared with. I did that with my own service, and I know a number of other companies have done it, but it's still kind of a minority practice, probably in large part because for the larger sites — take a Facebook or a Google — the privacy policy is probably larger than the United States criminal code. — Not quite, not quite. — But I think that might be a best practice we can use going forward, because, as Nat touched on, we're not informed as consumers about what's going to happen to our information after we leave the site, right?
No, I agree with that. And what I propose is what I call data reciprocity. The basic thing is: if any entity collects data about you and makes it available to anyone else, they must explicitly make it available to you — and make it available through RESTful APIs, in other words, actually available to developers. You can then have developers who are able to provide services that access your own data. At that point, you don't have to read a privacy policy. You just look at your data and you'll go, "oh my God," or, "oh, I'm so happy this is being shared — there's a benefit to me here."
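As a sketch of what such a reciprocity endpoint might look like from the data subject's side — the URL, token, and response shape here are entirely hypothetical, invented for illustration:

```python
# Hypothetical client for a "data reciprocity" API: the same records an
# entity shares with third parties, exposed to you over a RESTful API.
import requests

API = "https://api.example-service.com/v1"   # invented for illustration
TOKEN = "user-scoped-access-token"           # e.g. from a standard OAuth flow

def my_disclosures():
    """Fetch every record about me that the service has shared onward."""
    r = requests.get(f"{API}/me/disclosures",
                     headers={"Authorization": f"Bearer {TOKEN}"},
                     timeout=10)
    r.raise_for_status()
    return r.json()   # e.g. [{"record": ..., "shared_with": [...]}]

for item in my_disclosures():
    print(item["shared_with"], "received:", item["record"])
```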
Coming back to privacy for a moment, though: doesn't having all of my data available behind an API mean that anybody who manages to get something onto my computer can now access it?
Well, it's already there, available to people who break into any of those systems. The point is that today you don't know it's there, and so you can't have it taken out.
That's why I find this EU development around the right to be forgotten to be probably one of the most important things going on. Just as important as encryption is erasure.
The problem we're having with privacy online is the fact that computers don't forget. Traditionally, if you went to the corner store and bought a Coke, that shopkeeper would forget about your visit in a day or two. You don't have that with Amazon: they remember everything I have ever bought, going back about twelve years. That's the issue we face — how do we reconcile the two worlds? I think right now the stat is that every four years we create the same amount of information as in all of previously recorded history, and that interval is probably gonna continue to shrink.
We're going to continue to record more and more information. — The best description I've heard of it is that we now live in a world where just walking down the street creates a trail of digital information behind you, just waiting for somebody to pick it up. — A shadow, yeah. I've written about this using Les Misérables: it's a story about a man trying to have his past forgotten, but there is a supernaturally capable policeman, Inspector Javert, who always finds him out and exposes his past. In the nineteenth century that capability was a supernatural privilege; right now, everybody has it. My article's title was about that kind of society and how to live in it. — Our moderator wants to challenge us. — Yeah, I want to challenge you. This all sounds too nice and sweet. I'm not sure we can still trust the lawmakers of this world to make the right laws. In Germany, I must say, ten years ago I said the NSA is not our friend — we now have proof. I'm not sure the German services are any fairer; I don't know. I don't trust them.
I don't trust anybody — that's my nature. In the country I come from, Austria, we don't trust any government.
So why do you really believe that lawmakers — American lawmakers, who don't actually acknowledge Europeans as a different society — will help? Your laws don't apply to me; I'm not protected here in Germany. Even speaking here in Germany, under American law I'm nobody. So why do you think that your privacy law will help? — Number one: trust, but verify.
To me, that is why we can't have secret courts anymore. What we're seeing isn't that the laws were wrong; it's that the courts were interpreting them in ways that even the lawmakers who wrote them didn't anticipate. Companies like mine, Microsoft, Google, Facebook, have all been fighting those orders and losing — in secret — and not being able to do anything about it, because they're gagged. Even people within their own organizations don't know what's going on.
So to me, that's the problem: the courts — the branch of government responsible for interpreting those laws — need to be subject to the same level of transparency as the rest of government. And speaking of transparency, we could probably use a little bit more of that in the executive branch of the United States, too. — I'd like to get back to a point you made earlier. You said the way we solve that problem is to team up as the good guys and keep an eye on the bad guys.
I think the problem with technology, for one, is that it doesn't forget anything; but the other observation is that keeping an eye on someone on the internet is much more difficult than it was in the pre-internet world. — There are a couple of fundamental challenges there. Let's take trust first.
On trust: we hear a lot of assertions of trust. If somebody comes up to you on the street and says, "trust me," and you don't know him, the first thing you think is, "I don't trust you."
So the affirmation of trust itself speaks to a weakness, or a perceived weakness. I've always approached trust in a slightly different, very incremental kind of way. As a lawyer, I've been not trusted for my entire career, so I've had plenty of practice trying to figure out how to get trusted.
And I haven't been successful at it — that's why I'm still working on it. But the way I treat it is this: if I said to you, "I trust that when I step on the brake of my car, it'll stop my car," you would not say that was an odd sentence.
You'd say, oh, that's a good use of the word — but there's no person involved; that's a mechanism. So one thing we can focus on is mechanistic trust. Think of metrics and technical standards: you can test conformance against a standard with a metric. What metrics allow is for us to start establishing mechanistic trust first. Recall that when Toyota had a brake problem, their stock price went down — well, trust is transferable, right?
So the notion is that you can start to earn trust back through reliability and metrics and transparency and accountability and auditability — all that good stuff. Now, what you're being trusted on, what the metrics are, is kind of arbitrary — not entirely, but the reason I say that is: think about a stoplight. A stoplight could be red, or it could be purple. It's not the color of the light that lowers risk; it's the agreement on the color of the light. Green means go; red means stop. Okay?
Well, I heard that when they once changed that in China, they had a lot of car accidents — red is associated with joy, and people didn't want to stop. — And here, "slow down" is treated as a recommendation, and yellow means go faster.
But the point is for us to start to unpack this and see what's possible, right? As for earning back trust: the institutions we have now are doing the best they can. I've said in another session that our institutions are artifacts of our earlier problems, because they were solutions to earlier problems. There's a gentleman, Shiping Tang — a Chinese scholar, I think he was in Baltimore at the time — who wrote a book on the general theory of institutional change. What he said was that institutional change is about taking ideas and converting them into institutions.
If I had said to you in 1970, "I work in software," you'd have said, "what are you talking about?" If I'd said, "I work in derivatives" — "what are you talking about?" They simply weren't words; they weren't institutions. Now at a party people say, "oh really? What do you do?" Ten years from now, you'll say, "I work in data identity," and people will say, "really? What do you do?" — but right now they'd say, "what are you talking about?" So the ideas we talk about become institutions when metrics are established and normalized behaviors happen. And that's why I advocate for markets — not just because I'm from the United States and we think markets do so many wonderful things. I'm doing it because I'm trying to figure out a way to pay for privacy. It doesn't pay for itself. Does anybody think it's not expensive to do privacy? That's why we're here: everyone is hurting.
We're trying to figure out how to do this, but the reality is, with all of these things, you can't do it alone, because of the nature of the internet. There are certain things we just don't have choices about: the internet was built on the laws of physics, which are universal, and we're dealing with nation-state laws here. That's the problem, the challenge. If we think in terms of global markets, then the EU and the US and China all get to be market participants, putting their solutions out there, and we get a chance to choose.
If we have interoperable, modular systems, you can swap out chunks from different places. That can be done most effectively with regard to the operations in the middle — not the rulemaking part, which is culturally bound, and not the enforcement part, which is also culturally bound, but the operations, because every single system wants reliability, predictability, accountability, auditability, transparency, ease of use. There's a lot in common. If we as developers — technical and legal developers — can't figure out how to serve that up, we don't deserve to benefit from those markets.
Going all the way back to the beginning: for me, trust is hard to earn and easy to lose. It takes a long time to earn trust, and if we look at the most recent revelations, I think it's gonna take a long time for us to trust nation-state actors to act responsibly with information tools. What that means is hard to say. We've known for a long time — I mean, I knew back circa 2000, when Google was first taking off, that, hey, if Google can index the entire web, somebody else could build a parallel system to monitor all of our telephone calls.
The only thing I think they didn't have back then was voice recognition — and it never occurred to me that they would go after the metadata. Well, now they have voice recognition, and they have grid computing. So what does that tell you? Back then I trusted them not to have a computer listening to every one of my telephone calls. I felt safe being anonymous. I felt like I didn't have to defend myself against a nation-state actor, only against criminal hackers — and that's a much easier problem.
Well, for example, for myself, working in the international data center environment — Mike can tell you — from day one we just looked at this and said, we're gonna have every spy agency on the planet in our data center. They're probably already there, even though we have much better screening of our employees than the NSA apparently had — or at least than its outsourced screening. You just know they're there; that's a given. So what happens next? The question is: how can you design your system to be resilient to that?
That means, once again, it comes back to: do you trust the people, or do you trust the system? The system has to be automated in such a way that the automation can be audited, and then the auditing has to be — well, at the Digital Enlightenment Forum we actually came up with this concept of crowdsourced auditing. In other words, the systems should be able to be audited by anybody, to find out whether they are in fact the systems that were audited by professionals to ensure they function in certain ways.
So where can I download the Windows source? I wanna audit it.
You can actually get it — if you would like access to it, just come and get it. They do have an open transparency setup: every government and every academic institution in the world that wants it can get access. You can go to a Microsoft office, sit down at one of the computers, and examine it. — No, that's not true. — Or you can be one of the technical partners. So: academics, governments.
The companies have a proprietary interest in their intellectual property, and that's recognized under current law — that's what they do. As for governments: the Securities and Exchange Commission has a requirement in the United States. I once asked the SEC's head of automated reporting, David Balaganski: the SEC had 10,000 attorneys — could they monitor the financial markets?
He said no. So what they do is have all the reporting done in XBRL, which allows there to be a cottage industry in financial analysis. It's crowdsourcing. One thing to think about: there are three ways that harms happen in the world, and two are of interest to attorneys. One is accidents — not interesting to attorneys; that's harm not due to a human, like a tree falling. Then there's negligence, and then there's intention. Intention is the bad guys; let's put them aside for a second and talk about negligence. We don't even have the standards right now for the people who want to do the right thing. So fingers should really be pointing at the people in this room and rooms like it: get your act together. We're all worried about the bad guys. — I think that's what the privacy-by-design stuff is for. Totally agree.
But the first thing is: let's get the rules in place for the people who want to do the right thing. Let's say: if you want to do the right thing, do it this way. When you have that, you have a common platform with normalization of behavior among the people who want to do the right thing, and then you can do a neighborhood watch, right? It's crowdsourced enforcement, because then my mom can say, "this doesn't look right."
That's what organizations are doing now. At the University of Washington, I get these phishing emails, but most of them now are from my IT people trying to train all of us to look out for phishing. If we respond, they go, "oh no, no, no" — they'll call and say, "you shouldn't have done that; we fooled you again." What's going on is you're training people — a neighborhood watch, right?
So this stuff is happening, but we need to think about how to scale it, because all governments and all commercial entities are built as hierarchical structures. Hierarchical structures use, from a network perspective, centralized and decentralized architectures. Now we're all on the internet, and the internet is a distributed architecture. Hierarchical entities that used to rely on centralized or decentralized information flows are rendered entirely blind.
That's the reason people are messing up so badly. The reason the NSA — and this is not a defense of the NSA — hoovers everything up is that, arguably, it's their only choice if they're to do their job, because you can't do real-time querying in the stream when it's distributed. Until you know the question, there's no way to find an answer without already having the data set.
And I'm not defending what they did in this mess, but I'm saying we all have that same problem, because information is now vastly distributed and the flows are distributed. That's the bring-your-own-device issue. If all of us work at the same company and we're talking about an issue over company email, it goes through the centralized server and the company can see it. If all of us — same company, same offices, same issue — are talking about it on Facebook or by texting, the company is rendered blind, because it's decentralized.
It's distributed information — same issue, same people. That's where we all are: the institutions we have are blinded, and so they're struggling to do the best they can. I'm not defending it; I'm saying all of us are feeling that same pinch. They're not bad people. — The one thing I've told others is that I don't think the FBI agents I worked with were evil. I think they were given a mandate by their higher-ups, and they were doing everything in their power to fulfill that mandate.
They realized that in order to do that, they needed the encryption keys and the ability to monitor all the traffic on my network. Where the disconnect lay is that they didn't see that as wrong; they didn't see a problem with it. The issue we have is that when these groups are allowed to operate in secret, there's no external basis for morality.
I mean, just myself: when I was going through this process of trying to decide — is this right, or wrong? — it was actually illegal for me to consult a member of my family, a friend, or a peer and ask their advice. I had to make that decision in a vacuum.
Well, that's why the fair information practice principles are a really nice place to start with this, right? Right now we have a situation where we're all talking about individuals, and we are all individuals, but each of us is not here in an individual capacity — we're each here representing an entity. Is everyone here affiliated with something? Right. So each person here is wearing two hats. We have one project right now with the UN High Commissioner for Human Rights.
The challenge they're facing is their information channel: how does data get to Geneva to do human rights work? When I walk out of my house in the morning with my cell phone charged up, and I see a policeman beating the hell out of somebody, and I take a picture of it — all of a sudden I'm a human rights reporter. I didn't think I was one that morning, but all of a sudden I am. Now the state is adverse to me in that situation, because the policeman is a state actor. How do you firm up the channel when the state is adverse?
The situation we have now is that individuals leverage their activity through their institutions. Think about it: how much authority, power, efficacy would we have if we didn't have our institutions? We'd be like cave men and women, right? Our institutions give us leverage. We have to use our institutions, and the only institutional groups we have now are governments and companies — so we have to play them off against each other, in a way. That's what's going on now. We're saying, oh, Google's doing it, and Facebook's doing it, and the NSA's doing it,
and the German government's doing it — everyone's doing it. But are they doing it on behalf of individuals, or to individuals? A little of both. So the issue — and this gets even more theoretical — is that all these institutions didn't come from God or the king; we were done with that as the root of government — excuse me, I mean in secular government; I don't mean to offend. The notion is that sovereignty is returning to the people, and the key people, the people doing these distributed systems, are trying to work out: what does sovereignty look like at scale? And when you talk about fair information practice principles, they're in some way adopted in a variety of jurisdictions. It's a nice place to start looking, because there's a lot of commonality there, so we can start to develop that normalization, those markets.
But you know, what I actually see out there is people trying to do things — build systems, get something to happen technologically with a new idea. And as they do it, they start to put together some pretty preposterous ideas.
I won't tell you about some of the horrors that I've seen come across my desk from a privacy point of view. And it's not only inside my company that things like that emerge — it's everywhere, all the time. Then what happens is, as people start to work on the idea, you'll have other people who say: that's goofy, that idea will never fly, because people won't accept it; it's too invasive. And then you have this process of trying to figure out how to accomplish the same thing without going through the privacy-invasive approach.
That's happening everywhere. It's become much more common for technologists to actually go through that process of figuring out: okay, I still want to know what the market is, but do I have to do it the way I was doing it before, or are there new ways I can do it that are more benevolent and will have wider acceptance, and therefore more applications? I think it's that dynamic that drives us toward better privacy, rather than some highfalutin dynamic.
It's more down in the trenches that the thing will really happen. — So I was gonna ask: you mentioned the fair information practice principles, or the OECD principles, and these are tangible things you can check against — whether you're doing good or bad.
I think it's about getting the word out to developers and people like that, so that they always come back to those principles and check. — Right. For example, at a place like Microsoft there's a process: we have this department — considered a bit like some kind of police department by some of the developers — and when you come out with a concept or some kind of design, you have to bring in the privacy people, who then review what you've done specifically in light of these principles.
And I think more and more companies — automotive companies, all kinds of companies — who want to do things in a way that won't bring harm upon themselves, who want to do basic risk management, realize that they have to have these kinds of processes in place. — A question on that: is there a new leadership role on privacy for companies like Microsoft?
Well, not only the technology suppliers, but also the industrial players — say, Mercedes and so on. Mercedes-Benz, Daimler, just announced, as Barbara was saying, these connected systems for consumers, whatever they're called. When you're dealing with people's cars, the privacy of those people is a real thing. So the question is: are they able to do it in a way that respects these policies? I think you will actually find — just as people embraced the green movement and that kind of thing — there will be an embracing of this. The question is whether it's feasible. — But then I have a question on that, because Google, Microsoft, and whoever were said to be willingly sharing data with the bad guys. — That's what I was gonna say. Okay.
We were never given the opportunity — and you know what, you've been through this too — we were never given the opportunity to defend ourselves. And I can tell you, for example, that there is no vacuum cleaner connected to the Microsoft data centers scooping up all of this stuff, okay?
They have to come to us through legal mechanisms, and there is no direct connection. We have actually stated that in public — the head of our legal department has made it completely clear. Whether he was actually allowed to do so is another issue.
He did it. The point is that, to me, there's actually something very bizarre about the way all of that played out. It's almost as though there was an attempt to convince people that there was a much higher degree of visibility than there really was. So the whole thing is in need of, as you said, a lot more transparency, and of being able to see what really is going on. — Checking in on that: looking back shows what errors we have made.
What is the future, then, for companies? How should they act? You mentioned standards — what does the future look like? — Just to continue from my last comment: if we actually built our systems properly, which we're trying to do, then they would be immune to what we call insider attacks. To me, the NSA is an insider attack, okay? That's just as a technologist; I don't see it as any more or any less than that, because there is no vacuum cleaner.
So if there is anything happening other than what goes through the legal process, it has to be through insiders attacking. — If the company has access to it, then an insider can get it. — Yes, if the company has access to it, then an insider can get it. So the question is: how can we keep the company and its employees from getting access to the information? We should be building our systems in such a way that that is impossible, and that a dragnet across those systems isn't possible either.
Then that becomes a question of: what are the auditing mechanisms? And you come back to this crowd auditing and all of the things that are necessary. I mean, we're talking probably a twenty-year process before there could be enough understanding among the population to know what I'm talking about here, because the idea of automated, self-administering systems that function according to auditable schemes is not what you talk about at a cocktail party.
I tried to think through building a system where even I couldn't get access to the data. And they found a way to get access to it by ripping it open — by changing the system, so to speak, changing the rules of the system. It's like having the ability to modify gravity. You rely on certain assumptions: they can't put a gun to your head; they can't force you to change your source code. — They can put a gun to your head. They can force you to change your source code.
The solution to that is encryption — moving it down to the clients, so that the service provider doesn't have a choice; there's nothing within their realm of possible options. — I agree with that too.
I'm not saying otherwise, but I don't think it's an either-or thing. Data protection should also be there. In other words, the whole notion of security is that you have multiple rings of protection, and one of them should be this protection at the data level.
I totally agree with that. Another should be protection at the operational level: building your system so that there are no longer individuals running the system, but rather the systems are run by the system logic, and the role of an individual is simply to be constrained by it, audited, and reported on — with all of that auditing and reporting built into the system itself.
More importantly, not every system in the world can be re-engineered to push the privacy functions down to the client. We're always going to have services where we have to trust a service provider with some critical function, so establishing laws and procedures and operational best practices is certainly important. Going back to your comment about the role of companies in this debate — companies versus governments — I wanted to point out that a big problem we have is that in many cases, some of these companies' best customer happens to be, say, the NSA.
I can think of at least two companies represented in the exhibit hall that have that problem. The question you have to ask yourself is: can we trust their products if there's that much economic pressure at the top of those organizations? — On that note, I have a final question. We've discussed societal and legal ways to manage a solution, but the technology we rely on — apart from this transparency point — is encryption. Over the last couple of months we've seen a number of research results that — and I'm a cryptographer by education — decrease trust in cryptographic engineering considerably.
That's at least I a photographer by I'm a photographer by education decreases a lot of trust in the cryptography engineering ly. So what, what do you embrace Here For this is encryption is encryption.
I mean, all, all the assumptions of protecting information on the devices, relying on the fact that we have actual result form of reliable encryption. How do you see this? Is that working?
Is this, is it open? Can we trust This?
Well, it's a question of economics. The cost of decryption is still astronomical, and it will be astronomical for some time. It's not that decryption is impossible.
The mathematics is still holding up, okay? But yes, there are two things. One is the implementation: you have things like Heartbleed — whatever it was called — which was just a stupid implementation mistake. So there are two things, and that's one of them.
And the only way to get around that is through proper engineering reviews. If they had had proper engineering reviews, they would never have had those problems.
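For context — assuming the garbled reference is indeed to Heartbleed — the bug was a peer-supplied length echoed back without a bounds check, exactly the kind of thing a review, human or automated, is meant to catch. A toy illustration, not OpenSSL's actual code:

```python
# Toy model of the Heartbleed bug class: a heartbeat reply built from a
# length the peer *claimed*, not the length of what the peer actually sent.
buffer = bytearray(b"ping!" + b"...SECRET KEY MATERIAL...")  # adjacent memory

def heartbeat_vulnerable(payload: bytes, claimed_len: int) -> bytes:
    return bytes(buffer[:claimed_len])        # trusts the claimed length

def heartbeat_fixed(payload: bytes, claimed_len: int) -> bytes:
    if claimed_len > len(payload):            # the missing bounds check
        raise ValueError("claimed length exceeds payload")
    return payload[:claimed_len]

leak = heartbeat_vulnerable(b"ping!", 30)     # reads far past the payload
assert b"SECRET" in leak                      # memory disclosed to the peer
```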
No — they did review it. That code was reviewed after it was checked in, and the person who reviewed it missed the bug.
Well, that's what I'm saying: they should have had automated reviews, so it would have been caught automatically. Yes.
At a place like Microsoft, for example, that problem would have been caught by an automated review — though this one was definitely missed. — So, a couple of things here. When quantum encryption comes online, it will be interesting, because it allows the detection of interception, right? But you don't want to put all your eggs in one basket; that's not a good idea. Data is increasing exponentially, so the value potential of it is absolutely unprecedented, right?
Privacy used to be an artifact of secrecy: paper files at the county office were open, but who ever saw them, right? Secrecy is dead, but privacy doesn't have to die with it. Privacy is a social contract, and we can treat it that way. That doesn't mean we shouldn't keep trying to keep things secret — keep trying encryption, quantum techniques, go for it, because we need all the help we can get. But ultimately it's a social contract, enforced by social norms and social means. Things are illegal only because we decided they're illegal, right? We make decisions as a society. I know — I'm a lawyer,
so I see every problem as a legal problem. But if we're using technology to try to make people reliable, we're rowing with one oar in the water. — No, but we can use technology to make things expensive. — Absolutely, and making things expensive makes people behave in certain ways. I think that's part of what you're saying when you say we should use as many keys as possible: the cost of penetrating each key is X. — They wanted me to automate the process of fulfilling warrants, and I wanted to keep it manual.
I wanted to keep it difficult, because that increased the amount of time it took me to respond and delayed the benefit of my turning over the information. The whole point was to ensure that there was a review and that the power wasn't abused.
And I think the more we automate systems, the more we lose that human element. Going back to cryptography, though: I think what was very disturbing about some of the revelations was how the NSA was trying to affect the standards process, and going into the companies that were implementing cryptography and getting them to surreptitiously weaken it. — That's as old as the hills. They were doing that in the 1980s. Look, when I started selling cryptographic products, cryptography was considered a munition. I didn't have to deal with the NSA; I had to deal with the Canadian CSEC.
I had to deal with the, with the Canadian CSEC. They were, they were so charming that one day I asked them, how, how can you guys be so charming? And they said, it's because we were so mean spirited, that we were all sent on training courses by the auditor general.
So anyway, we used to have to go every week for a meeting with our handler so that we could get the export licenses to sell our software to those mortal enemies, the Dutch. — One of the challenges you have is channel security. There was a Dutch study in 2009 that suggested that data about an average person was on 2,750 servers. So the problem, as you said, is deployment.
Let's say you've got great encryption on one of those servers: in the other 2,749 situations you lose, because if my social security number is on a thousand servers, you can't secure the whole system equally. I'm not saying you shouldn't try, but part of the challenge is: you've got to get the standards, and you've got to make it easy to deploy the protection, right? Because otherwise you leave opportunities. — Systems have gotten too complex for any individual — and in most cases, any organization — to audit every component. As a result, we have to rely on reputation.
There are certain open-source projects that I trust more than others, even though I haven't necessarily audited them myself. There are other projects where, as a developer, I've worked with the code directly, and therefore I've looked at it and know exactly how it works. That's the issue going forward.
The problem is we still live in a world where half of it is closed: the servers are closed, we have a lot of closed-source software, and more importantly, we have closed hardware.
How do we know the Intel random number generator isn't tainted? We can't check. How do we know there isn't a back door in the BIOS of my servers that captures the decryption key when I type it in at boot? We have no way of checking for those. Openness is one solution — I don't know of another, but maybe somebody else does. And I think the same goes for cryptography: keeping those standards and that process open.
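One standard hedge against the hardware-RNG worry, sketched here with an invented read_hw_rng stand-in: treat no single generator as trustworthy, and combine independent sources through a hash, so that a tainted source alone cannot control the result:

```python
# Sketch: mix entropy sources so no single (possibly back-doored) RNG
# is a single point of failure. read_hw_rng is a hypothetical stand-in.
import hashlib
import os

def read_hw_rng(n: int) -> bytes:
    """Stand-in for a hardware RNG read (e.g. RDRAND via a driver).
    Stubbed with os.urandom here purely so the sketch runs."""
    return os.urandom(n)

def mixed_key(n: int = 32) -> bytes:
    hw = read_hw_rng(64)    # the source we're unsure about
    sw = os.urandom(64)     # an independent pool
    # Hash as combiner: predicting the output requires controlling or
    # knowing *both* inputs, not just one.
    return hashlib.sha512(hw + sw).digest()[:n]

key = mixed_key()
```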
Look at the elliptic-curve pseudo-random number generator, Dual_EC_DRBG, which made headlines. That was a standard largely ignored by most of the cryptographic community. The few people who did look at it said it was ridiculous, but because there were a few people pushing it, it made it through — and then companies adopted it without realizing it was a weaker standard. And even then, if you look at the vulnerability very closely, you'll realize that the math was actually sound.
The issue was that the spec suggested some constant values that were, in effect, the public half of a public/private key pair — and in an appendix they actually told you how to generate your own new set of constants, which, if you had done that, would have made the random number generator far more secure. Subtle things like that at the implementation level are what are gonna make the difference, and the only way you can catch them is by getting enough eyeballs on the problem.
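The shape of that concern can be shown with a toy discrete-log analogue — this is not the actual elliptic-curve construction, and the numbers are invented: if the published constant Q is secretly P raised to a known exponent, one output is enough to recover the generator's next internal state.

```python
# Toy analogue (multiplicative group, NOT the real EC construction) of
# why the suggested Dual_EC constants mattered. P is public; Q was a
# published "constant". If Q = P^e and someone knows e, one output
# leaks the next internal state, making all future output predictable.
p = 2**61 - 1                    # prime modulus for the toy group
P = 3
e = 0xDEADBEEF                   # the trapdoor only Q's author would know
Q = pow(P, e, p)                 # what gets published as a constant

def step(state: int):
    """One generator step: next state, plus the output the world sees."""
    return pow(P, state, p), pow(Q, state, p)

state1, out1 = step(123456789)   # from a secret seed

# Attacker with the trapdoor: d = e^-1 mod (p-1) turns output into state.
d = pow(e, -1, p - 1)            # modular inverse (Python 3.8+)
assert pow(out1, d, p) == state1 # next state recovered from one output
# Generating your own Q, as the spec's appendix allowed, removes the trapdoor.
```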
You look like you were about to say something. — It's so interesting. — Would you like to finish, maybe, with each of you giving a short statement on your prediction: what's the most likely thing to happen in this area in the next five years? It's a deliberately limited perspective — so what's your guess about what's going to happen? — Sure. Thanks a lot.
So again — we're seeing it already — I think companies are gonna step forward, because they're under pressure and because they can act more nimbly than governments. We're gonna start to see incremental competitive differentiation based on privacy. That's gonna lead to privacy-as-a-service differentiation, which will then lead to those privacy elements being outsourced, which will create the fourth-party system. I think that's how the market's gonna start.
It's gonna be incrementally offered by companies, and then companies are gonna come in as full privacy companies, which will allow that separation and a reduced potential for conflicts of interest. So I think that's where it goes.
Well, you said much of what I wanted to say, so I'll say something else. I think in five years' time, privacy impact assessments are gonna be much more popular, and I think the industry will start to create codes of practice, and we'll start to find out who is adhering to that kind of code of practice — and that's going to improve our lives. — Okay, I agree with both of those prognoses to a certain extent, except that I don't think the future is privacy separated from the rest of security.
I believe privacy is integral to the rest of security, and it's just a matter of making the tools simple enough and cost-effective enough for people to use, so that this competitive environment we're talking about can actually happen — and I would actually say massive uptake, especially given the other wave that is arriving, which is the Internet of Things.
I think in the short term, from the bottom up, we're going to see increased use of object-level, message-level, information-level encryption: encrypted telephone services, instant-message services, encrypted email. And the service providers, the companies, are going to become more of a commodity, because they won't be able to peek inside the package. Think of an Android phone that encrypts all of your data, much like Firefox Sync, before it leaves your device; you can pick from any of a hundred companies to store that encrypted data in the cloud — the cloud becomes a commodity.
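A sketch of that commodity-cloud pattern, assuming a passphrase-derived key that never leaves the device; the KDF choice and parameters are illustrative, not a vetted profile:

```python
# Sketch: derive the key on the device, upload only opaque blobs, and any
# storage provider becomes interchangeable.
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def device_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2 stands in for whatever KDF a real product would choose.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)

salt, nonce = os.urandom(16), os.urandom(12)
key = device_key("correct horse battery staple", salt)
blob = AESGCM(key).encrypt(nonce, b"bookmarks, contacts, history", None)
# upload(salt + nonce + blob) to whichever provider is cheapest today;
# none of them can peek inside the package.
```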
From the other end, I think there will be data that we will never be able to protect, just because of the nature of it: numbers dialed, purchases, bank account information, places visited. From that end, we're going to see changes on the legal, political, and operational fronts. Personally, I would certainly like to see an end to secrecy.
I think a government shouldn't be able to investigate you for more than a certain number of days — say 60 — without it becoming a public record. If they haven't found enough information on you in 60 days to charge you, then you should at least know that you're under investigation so that you can do something about it. This idea of being stuck on a no-fly list for ten years without ever knowing why — that needs to end. Whether or not we have enough impetus in the community of people who understand these issues to do something about it is yet to be seen.
Some countries are doing a better job than others; as an American, I can't necessarily say that too proudly, but I'm hoping that will change. I think five years from now we're probably going to see a bigger shift away from a focus on the software and the protocols and more onto the devices and the hardware — like I said, building those systems in a way where we can detect trojaned firmware, detect when our Cisco router was interdicted and a little extra device was added to its motherboard. We don't have the tools, or necessarily the information, we need to do that right now.
Hopefully, over the next five years, people smarter than myself will come up with ways of doing that which fit into the business models of today. Okay.