Scott David, University of Washington (Seattle) - School of Law
Dr. Michael B. Jones, Microsoft
Dr. Karsten Kinast LL.M., KuppingerCole
Ladar Levison, Lavabit
Amar Singh, KuppingerCole
A warm welcome to a very interesting cast of panelists here today. And I think we have an excellent subject with privacy and compliance; the whole day is going to be on that issue. As you know, we are going to start with privacy in communication, probably one of the most challenging issues at the moment. We want to focus in this round on how strong privacy in internet communication could be, taking for granted that it's not perfectly strong right now.

And of course I have a legal background and would like to contribute on the legal barriers, more than just being the moderator; I have a double function here today. But I think we should introduce ourselves. Maybe we want to start with you, Ladar, and give some short information on
what makes you be here today? Yeah. Is this when you want me to give my intro? Yes, please. Okay. My name is Ladar Levison. I am the founder of Lavabit, a company that made headlines last year because I shut it down rather than turn over the encryption keys that basically would have given the FBI unfettered access to all of the communications coming in and out of the network. I just felt it was an egregious violation of privacy and contradictory to current law.

Since that happens to be our topic today, privacy and compliance, and some of the barriers to, I would assume, protecting privacy in various parts of the world, I thought I'd set the stage by talking a little bit about the United States. I guess I'll start with the 800-pound gorilla. Her name is CALEA. Now CALEA effectively makes it illegal to operate a telecom company in the United States that does not give access to United States law enforcement agencies, so that they can listen in on and monitor any of the phone calls going across that network.
Now, what's important to know about CALEA is that it doesn't apply to internet service providers. At least that's what we thought. There were discussions about extending the law enforcement assistance provisions of CALEA to internet service providers, and they were rejected by Congress. Which made my particular case somewhat interesting, because it was a situation where the U.S. Attorney's office came along and decided that they would take the technical assistance provision of the pen register and trap-and-trace statutes and say that I was required to provide technical assistance to facilitate their capture of metadata, and that in order for them to capture this information, they needed the encryption keys. Now, to me, technical assistance is manuals, protocol specifications, pointing out which wire a person's communications are going over, things of that nature.
And I just felt that trying to lump encryption keys into that category was a little bit overbroad, an overreach. Now, I'm not a lawyer, and I'm sitting on a panel with lawyers, so I'm sure they can give you a far more precise legal definition of some of the issues I'm talking about. But I did grow up in California, and we have a long tradition in California of not exactly going along with what the government thinks, and in high school that involved learning a little bit about our Constitution. In particular, I recall doing a project on random locker searches and how they were illegal.

So when I first encountered the FBI, and they said that they wanted these keys so that they could decrypt every connection, parse it, figure out which ones belonged to a suspect, and then, in theory, only record those particular connections, I started thinking: random locker searches all over again. How can this be legal? Not to mention the fact that I had never heard of another company turning over its SSL keys, at least at that point. So I took them to court. I lost.

And when I lost, I decided, since there was nothing I could do about it... the gag order effectively prevented me from telling anybody I had lost, even members of Congress. There's quite an interesting exchange in court where our motion to unseal the case, or at least the SSL key demand, is denied, and my attorney asks the judge for at least permission to inform members of Congress how their laws are being interpreted. And the judge says no, that would harm the investigation. Democracy in action. So I just felt like I had no choice but to shutter the business.
And I started looking at moving it abroad, looking for other countries that had friendlier privacy laws. And I ran into two problems.

One, if I continued to live in the United States and controlled the system, I would probably still be subject to U.S. jurisdiction, which means I could literally be put in a position where I would have to break either U.S. law or the laws of the country where the servers were hosted, which I'm sure is a problem we'll talk about a little bit more later on. Coupled with the fact that if I went outside the United States, I'd spend the rest of my life defending my network from a large intelligence agency that would be determined to break into it, because they knew that if they asked, I'd just turn them down.

So here I am, effectively trying to solve it with technology, trying to create a world in which we don't need to have explicit trust in our provider. But that's the future. And the problem is, even if I fix it for email, I can't fix it for everything. I'm only one man. So this question of how much access our governments are warranted or entitled to get, and what procedures and safeguards they must go through to get it, is still a very important one. And it's also a very complex one, because every country has a different standard and a different set of rules.

Here in Europe, privacy is a basic human right. In the United States, we're supposed to at least have some modicum of privacy, as guaranteed by our Fourth Amendment. But of course that has eroded to the point where the only place you can consider yourself private is in your own bedroom. Everywhere else is fair game. Well, maybe your kitchen too. But the bottom line is, once you leave your house, once you leave your castle, there's nothing left to protect you. And what's very scary about that is that it applies to the bits we generate: when they leave our house, they become fair game for inspection as well.
So I think I've used up my five minutes, so I'll hand it off to the next speaker to my right. Thank you; so far, very good. So, I'm Mike Jones. I'm a standards architect at Microsoft, working primarily on digital identity issues. But along the way to identity, one of the journeys I've taken has to do with digital signatures and cryptography, and in particular trying to make those easy enough to use that normal internet programmers will decide to use them, as opposed to being in a world where, in order to do internet cryptography, you had to get arcane things like XML canonicalization to work.
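As a minimal sketch of the ease of use being described here, this is roughly what signing and verifying a JSON-based token looks like for an ordinary programmer today. The example assumes the third-party PyJWT library and a shared secret; it's an illustration of the general idea, not a depiction of any particular product.

    import jwt  # pip install PyJWT

    claims = {"sub": "alice@example.com", "iss": "https://issuer.example"}

    # Sign the claims with a shared secret, then verify and decode them.
    token = jwt.encode(claims, "shared-secret", algorithm="HS256")
    decoded = jwt.decode(token, "shared-secret", algorithms=["HS256"])

    assert decoded == claims  # round-trips in a few readable lines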
So I'm coming at this primarily from a technology, enabling point of view. I will say that, while I'm not in the center of this, one of the working groups I sort of audit at the IETF is the TLS working group. TLS is Transport Layer Security, what most people know of as the S for secure in the HTTPS in front of URLs. As well as the HTTP working group.

And particularly in light of recent developments, there's a lot of impetus inside the IETF to try to do what they call opportunistic encryption: encryption of network traffic by default, even if it wasn't directly requested. Which might have the effect of taking bits which are legally observable and at least making them quite a bit more work to get anything out of, with perfect forward secrecy. So with that, I will pass it on, but I look forward to the discussion.
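To make the idea concrete: opportunistic encryption means upgrading a connection to an encrypted one whenever the other side supports it, without the user asking for it. A rough sketch using SMTP's STARTTLS and only Python's standard library (the host name is a placeholder):

    import smtplib

    # Opportunistic upgrade: connect in the clear, then encrypt if the
    # server advertises STARTTLS; otherwise carry on unencrypted.
    server = smtplib.SMTP("mail.example.com", 25, timeout=10)
    server.ehlo()
    if server.has_extn("starttls"):
        server.starttls()  # the channel is now encrypted
        server.ehlo()      # re-identify over the encrypted channel
    # ...send mail as usual; a passive observer sees ciphertext
    # whenever the upgrade succeeded.
    server.quit()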
Thank you. Thank you. My name is Amar. I'm a Senior Analyst at KuppingerCole. Although I have a technical background, my main perspective is also from end users and management, and I also fulfill a kind of interim CISO role. So I see it from the end client point of view, you know, the end users. And privacy and communications, the topic here, is very, very relevant to me, because it's fascinating... there's one word I've kind of put down here, which is ignorance. People know they want privacy, but they really don't understand what privacy sometimes actually means. There is a lot of conflicting advice.

One story that I'll share... it completely, and I'm gonna use the word, confused me when I first heard about it. I joined an organization some time ago, and the first thing I was told was: oh, we have expert advice on how to communicate secretly. And I said: all right, you know, that's something I should take note of; I'd love to know what it is. And they said: we have an expert advisor who speaks on BBC and Sky, no names, it's not me, who said that if you want to communicate secretly by email, what you should do is share one password amongst all your team, right?

And only save your emails in the drafts folder. Straight out of a movie. And they were actually doing it, right? They were doing it. And it was common practice. And it was considered, at that point in time, the most secure operation. Apart from falling on the floor laughing, with my turban falling off and all that kind of really funny stuff,

you know, I wanted to bang my head on something, because they really, truly believed it was secure, and that their communications were private, and they were sharing one password with twenty people. I just couldn't believe it. And that part of it kind of epitomizes the whole problem with privacy.

There are many other examples, I'm sure, that we all could share, but that is the end user perspective: we are really, really secure with our communications; there's no way they can read them, because we haven't sent that email. So that's it for me. Thank you. So, my name is Scott David. I'm from the University of Washington School of Law. Until two years ago I practiced law for 27 years: with a firm in New York, doing financial products, and then I moved out to Seattle, where I started working with Preston Gates & Ellis, and that's Bill Gates's dad's firm.
So I was his partner for about 18 years before he retired. We helped to form the Gates Foundation. I did Microsoft's second stock option plan; I worked on it as a tax lawyer. So, a long history with Microsoft. I also worked with the OpenID Foundation, I was their counsel, and also formed the Open Identity Exchange, I was their counsel. So that's my background. I worked with the World Economic Forum on their data initiative, and with the UN High Commissioner for Human Rights, securing their IT channel, which is a state action kind of issue there.

Among the projects out there is a Wellcome Trust project we work on, with a global genetics database. There are a lot of projects working on trying to understand some legal structure for data. I have a number of things I wanted to talk about; I don't know if you want me to do them now.

I actually don't usually work off notes, but there was kind of a convoluted piece; it'll take about five minutes, but I don't know if I should do it now. Okay, that's great. So let me just go through it. Let me ask you to roll up the sleeves of your mind for a minute. I'm a very theoretical guy, so I wanna go theoretical here; I'm gonna dip into some theory. So just bear with me here a bit. I looked at the title, and I was thinking: okay, I'm doing five different sessions here.
So I wanted to distinguish each of the sessions, and for this one I said: okay, let's define privacy, let's define communication. And one of the things I think is really important here is that, when we're talking about privacy, for a while now I've talked about it as channel integrity. And what I mean by that is: there was a gentleman, Erving Goffman, in the sixties, a social theorist, who had an idea of a social theory of self. And the idea was that people perceive themselves as they believe other people perceive them. So if I want to be perceived as a mountain climber, I'll act like a mountain climber.

So people believe I'm a mountain climber, and then there's my belief that they think I'm a mountain climber, you know, recursively, back and forth. Okay. There's a guy, Douglas Hofstadter, who wrote a book, Gödel, Escher, Bach, and I tried to read it. I read two chapters.

I said: I don't even know what this guy's talking about. He wrote a second book after that book, I Am a Strange Loop, and in it he explains what he was talking about. Both Erving Goffman and Douglas Hofstadter were talking about the idea of a social theory of self.

The idea, in both of their concepts, is that the me, the I, the sense of self, the internal identity, not reputational identity, the internal identity, is derived entirely, as an emergent phenomenon, from this series of your life's expressions and perceptions. So when you're a child, you're in your high chair, you throw your bottle on the floor, your parent picks it up, and you think: oh, efficacy, I can cause things to happen by what I do. Okay. So, that concept: the reason I've been focusing on it in my practice for the last five years is the notion of how we increase privacy.
I think it's something that you can't increase directly. I think it's an emergent phenomenon; it has to do with identity. The way you do it, I think, is to give greater channel integrity on the expressive channel and the perceptive channel. Okay. So, we can do that. So let's talk a little bit about what you need for those channels. You say: well, how do you give integrity to any system? You measure things, and you check conformity to your measures, right? So what I've talked about for a number of years is the idea of tools and rules for making systems reliable.

And I know people have heard me say this many times in these meetings, but on the tools side: when you have a reliable system of technology, you create a specification, and then you check conformity to that specification. That's tools reliability. So if you and I are gonna build a system, and we both had the same specification available to us, and if the specification is sufficiently comprehensive and we build to it, then even if we never met each other, we just met each other, our systems would be interoperable, because they would conform to the spec. So you can do scale with standards: the

spec would allow you to do scale for the technology. The problem is that there are also people involved in the system. When I was in private practice, I worked on about 50 data breaches, 50 different data breach events. Some of them would curl your hair. They never made the press; I'm sure some of you are aware, some of them never made the press. But in each of those, I never saw one that was a technical issue. It was always, always a people issue. It was either somebody's boyfriend had a drug problem, so she stole credit card numbers, or we left a laptop on a bus. Always people issues.

People are not made reliable by conformity to specifications. They're made reliable by conformity to rules. Engineers don't draft rules; they draft specs. Lawyers draft rules. That's where we are. The problem is, the internet was built on the laws of physics, which are universal. The laws of people are not universal, and that's the fundamental problem. We could have interoperability across the world on the technical side tomorrow. That's me simplifying, I know; the engineers will say: wait a second.
But the laws, that's where the problem is. So the problem is, we need to think about mechanistic trust, I think. Because how do we create something... you know, people talk about trust.

Well, trust is a very slippery word. But if I say to you: I trust that when I step on the brake of my car it will stop the car, you wouldn't say: what a weird sentence, what a weird use of the word trust. You'd say: oh yeah, that makes sense. You're trusting a mechanistic system. It was designed by a person, but ultimately it's a mechanistic system. And so one of the things I believe is that we really need to focus on the mechanistic elements of trust first. And so what would that be?

Well, if we wanted to figure out what type of things you could start with, you start with the things that are most scalable. And in each country, in each system, we have rulemaking, operations, and enforcement. In any such system, the operations are the most neutral; they're in the center. Rulemaking really comes down to the different cultural phenomena, et cetera. So does enforcement. But if you think about operations, what is in common across all systems? What does everyone want in their system?

Everyone, whether you're the Russian mafia, whether you're the US government, whether you're an Argentine business, it doesn't matter: everyone wants reliability, predictability, ease of use, auditability, accountability, transparency. Those are common. So if we start to think about reliable systems and mechanistic systems, what I've been focusing on is: what can we do with those mechanistic elements, for trust and for reliability?
So, when we talk about standards and those common metrics, technical standards and legal standards, we should start to think about that. Well, legal standards, what does that mean? What that means is standard duties. The way law works is that rights are null without duties. If I say I have the right to free speech and no one respects it, it's words on paper. Meaning: duties are where the action is. Rights are an artifact of duties being performed adequately. So when you look at these, you say: okay, well, there are different kinds of rights involved here.

We've got fundamental rights, we've got economic rights, all sorts of different rights. And this is one of the ways I think we can help bring the US and the EU together. There are a lot of challenges there; we focus on the differences, but there are obviously fundamental rights in both jurisdictions. Now, the fundamental rights typically are compulsory; they're not really negotiable. In the US: the First Amendment, the Fourth Amendment, due process under the Fifth and Fourteenth Amendments. Those are all elements of fundamental constitutional rights. You can't contract away constitutional rights.

And in the EU, obviously, there are notions of fundamental rights also, manifested in German law and elsewhere, et cetera. So one of the things I've been thinking about is to go through those commonalities in terms of fundamental rights, but then to identify, on the operational side, those economic rights, and we can standardize those. So the duties come from two sources; all duties come from one of two sources: compulsory or voluntary. Compulsory is legislation: I don't have a choice; if my country has a law, I've gotta follow it.

And so fundamental rights might lend themselves to compulsory elements. So in the United States, you can't contract away the Constitution; in Germany, you can't contract away fundamental rights.
But you can also get rights from contractual arrangements, voluntary arrangements. Excuse me, let me back up: duties can come from compulsory or voluntary sources.

So duties can be assigned either by self-binding to a contract or by legislation. Sorry, I mixed up the duties and the rights.

So the point is to look across jurisdictions. There's no international legislature, so we're not gonna have compulsory international laws on this stuff. It's just not gonna happen in the near term.

Look, international law is the law of contracts. Treaties too, but we're not ready for treaties here; you need norms for a long time before treaties happen. So if international law is the law of contracts: contracts are always voluntary. If you enter a contract with a gun at your head, it's compulsory, it's a nullity; it doesn't work. So the question here is, and I'm almost done with this: can we take the idea of contracts, of voluntary self-binding to duties, and make that operational centerpiece reliable? That's where I think we are in this.

And so, on the question of whether we can cause there to be a meeting of the ideas of law and privacy internationally, it's really a matter of establishing those metrics and then understanding what we need to do, in almost a large-scale negotiation. And really, that's what's happening in this room. We're negotiating those contractual terms for international agreements, dealing with that operational center that can then be volunteered into by players, governments, companies, individuals, in different countries.

I know it was a bit convoluted there, sorry, but I wanted to convey the idea that there's efficacy here. We feel like this is really out of our control, but again, contracts allow us to do things at scale. Just a quick example: you have different stock exchanges in Tokyo, London, and New York. Each one is conformant with local law, but they knit together with contracts: swaps, forwards, futures, et cetera. That's what we're talking about here. You can have local, independent laws created by legislation, but then define that contract layer that knits them together.

And I think ultimately that'll be the source of privacy, because it lends integrity to the channels on the operational side. Sorry about the length. Thank you. Thank you for your thoughts.
No, that's fine. I'm wondering... you stated right at the beginning, before you introduced us to your thoughts and your ideas here, you were stating: I'm a theoretical person. I know the feeling. But how are we ever going to achieve anything like that? Taking for granted for a second that that's exactly our roadmap, that's what we should do; in case the panel doesn't agree, please show us where we don't agree, but I had the impression that you were pretty much with him, with the ideas he just gave to us. So how are we ever going to get there?

I see different stakeholders, the ones that you mentioned before, and I see so much to do for all of them. It's the citizen in the end, but it's not always the citizen; as we can see with your example, there was just another player, which is the authorities, asking you to do certain things and to support their action, probably things that you didn't agree to. So it's not up only to the citizen, or to a company, to achieve all that we want to achieve.

So what do the different stakeholders, in the near future and the further future, need to do in order to get the homework done? Whether it's exactly what you pointed out, or a derivative of that, whatever: what do we all need to do? What do companies need to do, what do authorities in the US and in Europe need to do, and what, in the end, does the citizen need to do? That's what I would really like to discuss, based on your thoughts. So, any opinions on that? Sure. So in the technology community, there's a phrase which is sometimes used
that goes something like: code is policy. Meaning that if, by default, you enable something in the software that people use every day, sort of irrespective of what the law says, that becomes the reality, because most people do not change the defaults. Most people aren't interested in it; they don't know how. And so, in some of the discussions about email or data at rest: if the default is that stuff is moving around unencrypted and stored unencrypted, well, that's what people will do.

And without regard to, in most cases, law or public policy: if software were in people's hands that by default encrypted things, and by default communicated things in a way that only the intended recipients could read them, then whether the law has changed or not, you've changed the privacy landscape of how communication and data operate. Now, I'm fully aware that to do that, and to make it easy enough that people will normally do it, there are fundamental technical challenges which the engineering and the public policy people do need to work out, things like key distribution.

My wife doesn't really know what a cryptographic key is, and she's not interested in it. She knows she doesn't want to type any more passwords, and asks me: when are you gonna get on that? But if we got to a place where I could, in an opportunistic way, decide to send any of the panelists something encrypted, and know they would be able to read it and others wouldn't, we would be in a different de facto policy world than where we are now. We haven't achieved that.
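For readers who want the mechanics: the property described here, that only the intended recipient can read the message, is typically built from a key exchange plus authenticated encryption. A minimal sketch using the third-party Python cryptography package; the part that is hand-waved in the first comment, how the sender discovers the recipient's public key, is exactly the unsolved usability problem being discussed.

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import x25519
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # The recipient publishes this public key somehow; key discovery and
    # distribution is the hard part mentioned above.
    recipient_key = x25519.X25519PrivateKey.generate()
    recipient_pub = recipient_key.public_key()

    # Sender: ephemeral Diffie-Hellman, then authenticated encryption.
    ephemeral = x25519.X25519PrivateKey.generate()
    secret = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                  info=b"panel-demo").derive(ephemeral.exchange(recipient_pub))
    nonce = os.urandom(12)
    ciphertext = AESGCM(secret).encrypt(nonce, b"for the panelists only", None)

    # Recipient: derives the same secret from the ephemeral public key.
    secret2 = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"panel-demo").derive(
                       recipient_key.exchange(ephemeral.public_key()))
    assert AESGCM(secret2).decrypt(nonce, ciphertext, None) == \
        b"for the panelists only"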
No, that's fine. I'll go back to the end user perspective. I've been thinking about this, and I was in a discussion on it a few months ago. And the question I asked is... in fact, I don't know if anyone has... hello?

Yes, it is on, right? Sorry, I have a great loud voice to follow, so I'll keep it away.

What if Google turned around today and said: folks, I'm gonna give you 1%, 2% extra on your bank account (if Google ever did become a bank, and maybe they probably would in the future) if you give away a certain amount of your privacy? I really don't have an answer to that. And the other interesting thing: you know, I have a 16-year-old daughter and a 12-year-old boy, and their concept of privacy is completely different from what I think a lot of people in this room have as their concept of privacy.

And I think that's the problem with privacy: for one, generational concepts. You know, it's a very subjective topic. People appear, to me, to be happy to give away privacy depending on what they get in return. And that is actually very fascinating in itself, because what if, 50 years from now, what you and I, or all of us here, think of as privacy has changed? We are really passionate about it; the United Kingdom is very passionate about not having an identity card.

Well, Singapore, where I originally come from, has had an identity card for donkey's years now, almost 20 years: every citizen has to get one ID, that's it. And that is part of the interesting challenge that I see when I deal with the end user community. On a technical level it becomes a very different discussion, but from the end user perspective, they are really not bothered about, you know, encryption; they want a bit of privacy, but they're happy to compromise it in certain situations.

And that's when it really becomes interesting, because when you go and deal with this from a corporate perspective, and you start dealing with, you know, the Amazons, the folks who are operating in almost every jurisdiction, trying to talk to them on a contractual basis becomes a very, very difficult task, especially from a CIO perspective. You know, I've been a chief information security officer, and they throw a hundred different contracts at you and say: choose the one you like best.

And then you start getting legal involved, and the legal folks themselves actually find it very challenging: do you pick the most restrictive policy? I know Germany has very, very interesting privacy laws. Or do you pick the most common baseline that can be operated across almost every country? And that's where, again, it's of real interest to me: the end user having a totally different idea of privacy compared to maybe all of us panelists, who are looking at it from a very different perspective.
This is fantastic, what Mike and Amar just said. Mike talked about defaults, and you talked about generational change, so I want to link those together for a second. Defaults aren't neutral, right? And legal defaults are policy; defaults are policy. Legal defaults are called institutions, think about it. And those institutions are defaults that were artifacts of prior solutions, right? Because institutions take a while to build; they're built in the context of old stuff, and they're legacy. The default setting sometimes remains past its time, right?

Past when the problem is there. So what we do is redeploy them for new problems: pick some, modify them, shift them, like with defaults, right? So if you look at the legal side: think legal programming, right?

So the change challenge that we have, and this goes to the generational issue: if you look at social theories of self, we define ourselves by our institutions, by our default settings. So we ourselves are resistant to change, because changing our institutions means changing, fundamentally, who we are. So generationally, we in this room, probably two generations in this room, maybe one and a half, three, I don't know, let's see, some gray hair, four, I don't know. But each of us has identified ourselves with a certain set of institutions. That's our persona, right?

So now our institutions, because they're performing poorly in some contexts: some governmental institutions have performed poorly

recently, and some commercial institutions have performed poorly. What they're doing is essentially what my teenagers did before they moved out of the house: they dirty the nest a little bit. They're making it easy for us to abandon those default settings, because they're saying: we're demonstrating to you that we're really not ready for this set of problems; we're not up to the task. And so what I think is happening, really, is that a different generation is gonna have different institutions.

And the thing that I wrote down here is: we're all very serious about these issues, but in the future, we're gonna be called quaint, right? We're gonna be like: oh look, they were trying to help. They were trying to help.

How sweet of them to try to help. But we just simply don't understand some of these things; the institutional constructs that we bring to bear, our paradigms, are just not ready for prime time on some of these issues. Ladar, what do you think?
I figured I'd start with this concept of industry norms, because I think that's where we can have the biggest change. Most of this room is filled with engineers, and engineers think very differently from lawyers and politicians. And as engineers, we look for technical solutions. That technical solution could be a privacy policy, it could be a EULA, it could be a contract, especially if Scott is your lawyer; it could be the underlying code and how it functions, if Mike is your best friend.

But the bottom line is that these norms that we're setting are going to define, for the next several generations, how we view and react to the issue of privacy. Now, one exercise that might be useful later is to have each of us define what privacy is. But if we operate under the assumption that it's about giving users control over their information, how do we codify that, both with code and with policy? And I thought I'd start by talking about what my original policy was when I would get a search warrant from law enforcement. Now, I'm not anti-law-enforcement.

I just happen to be pro-customer and anti-crime. So, to reconcile those differing views: when I would get a search warrant, I would freeze the account, lock it with an administrative lock, and then turn over the data. And if the user contacted me, I would let them know that that was done because there was an indication that the account was being used for criminal purposes, because it was the subject of an investigation, and I'd give them a copy of the warrant. And our EULA said that, you know, we reserved the right to lock accounts if they were being used for criminal purposes.

Well, starting a few years back, I started getting warrants that had a little extra tidbit of language in them, which said something to the effect of: if fulfilling the requirements of this request would in any way indicate to the user that they're under investigation, we would advise you to please contact us first. That conflicted with my existing policy. But for a while I went along with it. I didn't know how to react; I didn't know what to do.

I didn't like the fact that criminals were continuing to use my system, even though I knew they were criminals... well, here they were suspects. Yes, thank you: suspected of criminal activity. But I had to go along with it. I didn't really think too much of it until last summer, when we started hearing more and more about the FISA court, and then I kind of smacked up against it firsthand.
And what scared me is that what I had seen a few years earlier was, in fact, part of a much larger trend of court-enforced secrecy, whereby there is binding secrecy upon a third party that effectively infringes upon their right to free speech. So I see something wrong, but I do not have the ability to say that it's wrong, because the court has ordered me to remain silent. I thought your comment about how you can't contract away your constitutional rights was also interesting. In the United States,

when you're about to be exposed to classified information, they actually make you sign a contract stipulating that you will not divulge the national secret you're about to be let in on. And that's their way of reconciling the constitutional First Amendment right to free speech with the need for secrecy. So it may have been illegal for Snowden to divulge the documents, because he was violating his contract; but at the same time, anyone he gave them to could speak very freely about them.

So when you think about how law enforcement reacts, and you think about it in that paradigm, it starts to make a lot more sense. They couldn't do anything once the documents were out there and had been transferred to the journalists. So they were quite intensely focused on preventing that from happening in the first place: interdiction, interception. But the question that we come to is: what should the norms be going forward? And I think my case, along with several others, illustrates just how important it is that our courts remain open and transparent.

And for me, going forward, I don't think I'm going to accept any future warrants or court orders that are under seal. I just feel like the court should not be able to gag a third party. In other words, I'm a business, I'm a service provider; I am not a party to the litigation. It's the United States attempting to prosecute a suspect. And effectively, and Scott can probably speak to this more eloquently than I can, but as a third party, as a service provider, the current view is that we have no rights. I have no right to counsel. I have no right to free speech.

I have no right to protect my passwords or encryption keys or other source code. And I'm not okay with that. So my reaction to that is: if the government isn't willing to stand behind its request in full public view, then why should I be willing to abide by it? So what I would like to see is for that to become the industry norm, a corporate policy, an international standard, so to speak.
And it starts to make sense, because if you were the person who was under investigation, wouldn't you want to know? Wouldn't you want to know that your internet service provider had turned over a log of the URLs you had visited over the last 60 days? Maybe there's something in there that could be interpreted the wrong way, and it's time for you to go looking for a lawyer who can help with that explanation, help you prepare a defense.

More importantly, what disturbs me is the idea that that information could sit around in a database and lead to unintended consequences, without you ever knowing that it was turned over, or ever being given the opportunity to correct the official record. Think in terms of a no-fly list. Or think in terms of that random IRS audit you got because an FBI system picked up the fact that every month you got an email from a Swiss bank; never mind that you've got a third cousin who works at that Swiss bank who happens to email you monthly pictures.

All the system sees is the fact that you get that email once a month, and it plays into this secret IRS formula that flags your social security number for an audit. I'm sorry to interrupt you, but actually this raises a very interesting question, right? LinkedIn, recently, or over the last year... I'm not sure how many of you have actually used that app; if you have, kind of don't use it... introduced an app, I can't remember what it's called, weird name, the one that harvests your contacts, correct? Connected, or something like that, right?
It is a LinkedIn functionality that they promised to every single user, regardless of whether you pay them the wonderful premium money per month. And they promised you unbelievable collaboration and all of that. And what surprised me was how many of my contacts, people who I thought, you know, have previously been in security and everything around this whole topic, were actually using it in the month that they released it. And then, if I may, and I'm sorry to interrupt you, the question comes: some of them, when I contacted them

and I said: so what? And that really got me thinking: really, you have no issue with LinkedIn? I mean... People don't understand. Exactly. And the Facebook app is another one. Another one. In the end, the problem seems to always be that one word I wrote down: ignorance. The level of ignorance in the end user is so surprising. They were happy; they were saying: oh, it actually gives me a list of all my collaboration, you know, without realizing that it's probably the NSA's dream come true, what LinkedIn had introduced. I think they still have it,

and I'm not sure if they've pulled it off. And that really, to me, is very fascinating and concerning at the same time, because they're actually giving away every supposed right that they would have had if they didn't use it, you know, across every country. So even if they were in Germany, I think, if they were using that particular LinkedIn functionality, everything that they communicated through it was going to LinkedIn's servers.
Well, it's an interesting debate, and one that I can relate to personally, right? Because I actually worked as a consultant on a project for another mobile app that worked very similarly. And I remember getting into an argument with the CEO about how the app would work, right? It had to upload your phone book to the server to process your relationships with other people. And he wanted to kick off that upload as soon as the app launched, to speed up the process. And I was like: no, you can't do that.

You need to go through these screens where you explain to users what you're about to do, and give them the opportunity to abort, before you automatically harvest their entire phone book and upload it to a server. It's fascinating.

I mean, it came to a point where I was actually threatening to walk away from the project before he got it into his head that: oh yeah, you know, maybe I don't want my contacts slurped by some other app unexpectedly. Because there's always the scenario where somebody downloads an app to find out what it does, without realizing the fact that it's harvesting all of their most sensitive secrets. Totally, yeah.
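The design Ladar describes arguing for reduces to a very small pattern: no data leaves the device until the user has seen a plain-language explanation and explicitly agreed. A hypothetical sketch (all names here are illustrative, not from any real app):

    def ask_consent() -> bool:
        # Plain-language explanation shown before anything is transmitted.
        print("This app wants to upload your entire address book to our")
        print("servers to find people you may know. Nothing is sent until")
        print("you agree, and you can decline and keep using the app.")
        return input("Upload contacts? [y/N] ").strip().lower() == "y"

    def upload(contacts: list[str]) -> None:
        print(f"Uploading {len(contacts)} contacts...")  # stand-in for the network call

    def sync_contacts(contacts: list[str]) -> None:
        if not ask_consent():
            print("Skipped: no data left the device.")
            return
        upload(contacts)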
I mean, that is certainly another industry issue: how much data do we collect, and how long do we retain it? And that might actually be something that could be addressed by laws.

Another question?

We're running low on time on the panel, but you've got about six different, really important threads going on in this conversation. Because one of them is legitimate law enforcement access, or illegitimate law enforcement access, but another one is baseline operational... you know, let's just say for now, encrypted communications. I mean, there are all these other angles here, but I think Mike hit the real issue, and Mr. David makes some points that are valid, but very theoretical. The truth is, Safe Harbor was an agreement between the European Union and the Clinton administration. It had a lot of provisions about compatibility of privacy obligations, but did that ever generate agreements on operational constructs, code, standards, et cetera, that would enable some of these agreements?

Of course it didn't, because governments and businesses... our customers, when I worked for CA, our customers, and any of your customers, don't want it. They want the free flow of data, as much as possible, to monetize the data. Governments want the free flow of data for their own purposes; that could be legitimate tax-related investigations, et cetera. So I guess the question I have for the panel, as we connect all these threads, is: what is the path? You know, it's yet another...

and I do it too, it's yet another privacy session, like the ones we've had for years and years and years. What is the path to move to the point that Mike talked about, where the default condition on communications, which could obviously be, you know, disruptive, is as reasonably private as feasible? How do we get there?

And isn't something like the new EU data protection regulation, for example, discussed at this conference, the right to be forgotten, maybe a path to get there? Because it's not gonna come voluntarily. One other comment: 12-year-old kids, all they care about is connectivity. So having them care about anything other than hormones and connectivity isn't gonna happen. It's up to the adults to deal with these issues.

So, you know, those are my comments. Maybe you want to give a short answer? We're almost running out of time. Oh, really?
We've just run out of it, yes. I would just touch upon... you brought up Safe Harbor, and Safe Harbor can actually mean two things. It can refer to the agreement between the European Union and the US. It can also refer to the legislation in the US Congress that basically granted immunity to US corporations for cooperating in potentially illegal ways with law enforcement. And we touched earlier upon data breach laws.

You know, one issue I brought up with the prosecutors in my case is that what they wanted me to do was actually a violation of Texas data breach laws. And they brought up safe harbor in response to that: oh, you've been given immunity. Sure, I've been given immunity from civil lawsuits and criminal prosecution. What about my ethical responsibilities? And I think what we are seeing now is the gap between the ethical and the practical, the moral and the legal, and how we reconcile those going forward is going to be very interesting.

But for me, I think we need to return to, like I said yesterday, customer-centric security, customer control: designing systems in such a way that the customer retains control over their information and has to unlock it in ways that they're comfortable with. We're starting to see that: the LinkedIn app we talked about would have to ask for permission, when you installed it, to access your contacts. But then you get to the issue that end users don't necessarily always understand what that permission means and what it entails.

So that goes back to corporate and moral responsibility, to ensure we don't abuse this trusted position that we're in to collect information we're not supposed to, because it could always be used nefariously. In my view, one of the short-term solutions might be, and I was thinking about this just a few minutes ago: if a company retains any information it collects as a result of my activities for more than 30 days, I should at least be able to get a record of that information and/or request that it be purged.
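As a back-of-the-envelope illustration of that 30-day idea, a retention policy reduces to two small operations: export everything held about a user on request, and routinely purge anything older than the window. A hypothetical sketch (the table and column names are made up for illustration):

    import sqlite3
    from datetime import datetime, timedelta, timezone

    RETENTION = timedelta(days=30)  # the proposed retention window

    def export_user_records(db: sqlite3.Connection, user_id: str) -> list:
        """Give a user a record of everything held about them."""
        return db.execute(
            "SELECT collected_at, payload FROM activity WHERE user_id = ?",
            (user_id,)).fetchall()

    def purge_expired(db: sqlite3.Connection) -> int:
        """Delete everything older than the retention window."""
        cutoff = (datetime.now(timezone.utc) - RETENTION).isoformat()
        deleted = db.execute(
            "DELETE FROM activity WHERE collected_at < ?", (cutoff,)).rowcount
        db.commit()
        return deleted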
I think one of the biggest problems we have in this industry is that we tend to hold onto things forever, and forever is a long time. If we started coming up with somewhat more sane standards for how long we retain personally identifiable information, it would at least give this problem somewhat of a boundary. Let's see. Closing remarks: if we're going to get from the place where the defaults are that everything's pretty much transmitted in the clear, with a few exceptions, to defaults where things are encrypted, I'm gonna apply my own worldview.
But part of what we need are standards for doing that. What is the standard way for a person or an entity to publish a set of public keys by which anyone can reliably discover them, and then use those to encrypt content to them? And I get that we do need people working on these kinds of standards. One of my friends at PayPal was talking about what fraction of the spam in the world was claiming to be PayPal at one point. And it was large. How did they solve that? Originally,

they went to the major email providers and said: PayPal is going to sign all of their outgoing mail in this particular way, and they got Comcast and whatnot to do something one-off for them first, which solved it for PayPal. But they were good enough citizens to say: this shouldn't just be for us; we should develop standards that let anybody determine whether a message came from an authoritative sender. That doesn't mean that it was sent with good intent, but you can know from what origin it came, including what account. And that's a different world than we were in before those standards existed.
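The standards that grew up around this problem include DKIM (RFC 6376), in which the sending domain publishes a public key in DNS and signs its outgoing mail. A short sketch of the receiving side, assuming the third-party dkimpy package and a raw message saved locally:

    import dkim  # pip install dkimpy

    with open("message.eml", "rb") as f:
        raw = f.read()

    # dkim.verify fetches the signer's public key from DNS and checks the
    # signature; success means the signing domain vouches for the message's
    # origin, not that the message was sent with good intent.
    if dkim.verify(raw):
        print("Valid DKIM signature: the origin domain is authoritative.")
    else:
        print("No valid DKIM signature: origin cannot be confirmed.")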
I was just gonna add, since we're talking about standards: I know that the W3C has a working group investigating this very issue. And the last update I heard is that they were effectively trying to codify some of these operational rules and requirements into defined standards that could be expressed with nothing more than an icon. So when you go to a website, you see an image, and it indicates to the user how their information is going to be used: how much information is collected, how long it is retained, things like that. But that's the distant future.

And I think it's the distant future because we don't yet understand the problem. So how can we come up with a solution to it? On my side,

I mean, what really worries me: there was a 16-year-old girl in Kent, in the south of England, who signed up for a youth commissioner initiative, something like that. She lost the job because, at the age of 14, she had tweeted some stuff. That kind of puts a story around this whole concept of privacy, you know, purging information, the right to be forgotten; those have been some of the words that have been circulating recently, you know, in the legislation.

So I am quite passionate about this. I guess there is the concept of fair game, but maybe corporations definitely have some kind of a moral, and possibly legal, obligation to allow a certain age group, the one that is all about, as was mentioned, the hormones, you know, the right to be forgotten after a certain age. Because where you and I could do whatever we wanted, our space was restricted locally; today, any 12-year-old can tweet or like an Instagram photo when he or she may not know what it means.

You know, the concept is: if I then want to employ that 12-year-old, who's now 18 or 20, and I see that he or she liked an image that is wrong, do I then employ that individual? And that's really interesting, and I'm really passionate about that, because I think there must be some kind of a concept of, you know, purging, because everything is now digital. So, Scott, would you like to give some last words? We're 10 minutes over time, so just very quickly.
I think, first of all, the idea of corporations and morality is really interesting. Look at the articles, look at the bylaws, look at the contracts, and show me where there's any moral invocation. It's not there; there's a marketing function in corporations. Also, we have an exponential increase in information. So: privacy was an artifact of secrecy. Secrecy is dead. Does privacy have to die with it? That's a decision we need to make. It's not an accident anymore. It used to be an accident; privacy was an accident, something that happened because of the way things functioned.

And so now, I think... we don't make hammers soft so that they can't be used to hit people in the head; we make them hard and say: don't hit people in the head with a hammer. We don't make cars go five miles an hour so they can't be used for bank getaways; we make cars go a hundred miles an hour and say: don't rob banks. Data is a dual-use technology. We just have to make some decisions about our morality, and morality is not something we can outsource to corporations and governments. That's it.
I'll take 30 seconds: the law of unintended consequences. A hundred years ago, you could change your identity by moving to a different town and telling people a different name when you introduced yourself. And then we had the Social Security Administration come along, and now, all of a sudden, we've got these identities that we can't change, that follow us throughout our entire lives. GPS chips were introduced in cell phones so that when you called 911, the operator would know where to send the ambulance.

And now, all of a sudden, that very same device is being used as a portable locator that follows you around. We need to be careful of what the unintended consequences of our actions are. At least, if you ask me, I'd like to return to a hundred years ago, when I could change my name and identity whenever I wanted. It's the only way I can guarantee that nothing follows me. Thank you. Let me just add one or two sentences. It's a very difficult subject, and to be honest, I had believed that we would find a bit more of a tiny idea of a solution. There are so many facets to this problem.
There are so many perspectives; you have brought up loads of them, maybe not all, but loads of them. And it's a very complex thing. So it probably was a bit naive of me to believe that we could get a little glance at a possible solution. Actually, I'd like to have this discussion every year from now on, and I think we'd need to discuss it for longer than we discussed it today. So it's probably going to take a long time. Maybe people will change earlier than the surroundings and the other stakeholders change; that's what we discussed.

So that gave me the impression that we are a bit short of time to find a solution here. But as we do have to run out of time here, I really need to cut this off. I thank you deeply; that was a really great discussion, and I didn't want to interrupt you too much. I hope you enjoyed yourselves, and I hope everyone else enjoyed it too. Thanks.