Keynote at the European Identity & Cloud Conference 2013
May 14-17, 2013 at Munich, Germany
Our next speaker, Laurent Liscia, is the executive director of OASIS, one of the, if not the, most influential standards organizations in the world. I fondly remember sitting in an OASIS meeting about 15 years ago for a discussion of the Directory Services Markup Language, DSML, a protocol which I can pronounce dead and which has been dead for many, many years now. It was supposed to be the XML-ization of LDAP and never caught on in widespread use. But Laurent, if you'll come up here, you can tell us what's going to be happening. So, I think I'm on.
Yes, I am. Yeah, but you need that for your slides. That's right. It's all yours. Thank you. So I guess I don't need to introduce myself anymore. I will do it with a French accent: I am Laurent Liscia, and yes, I'm executive director and CEO of OASIS Open, which is a global standards consortium. I can't resist a quick dig at Craig based on the previous conversation: it looks like SAML is not dead at all. Just wanted to say that. I know, I know. So what is this going to be about? I'm going to make three points today.
The first point is that unintended privacy consequences can arise from seemingly harmless big data applications: unintended consequences. My second point is going to be that knowledge is power. As we saw from the wonderful regulatory framework presentation on the EU initiatives, we need to decide as consumers what kind of deal we're getting into with big social and other big data applications, and we need to know that we're giving personal information away in exchange for a value proposition we actually understand. So that's my second point.
And my third point will be that there is a toolbox already out there to address privacy issues, and it comprises standards. But because it's 6:15 and we're all tired, I am not going to do an in-depth presentation about standards. Instead, I'm going to take us on an adventure through the internet as I've discovered it across the world: some of the wonderful things I've seen and some of the darker things I've encountered, and that maybe some of you have encountered. So what is this brave new world? The first case I want to bring to your attention is the curious case of the Westchester gun map.
And if it sounds like a Sherlock Holmes story, that's because it very much is. So what happened here? Advocates for gun control in Westchester County decided that they were going to create a map of all the gun owners in that county, to show people who their neighbors were. This is perfectly legal: if you buy a gun in Westchester County, you have to record that transaction, and that transaction is actually publicly available. So now, all of a sudden, you have all of these red dots.
You can look that up on Google Maps and you can see that almost everyone around you has a gun. Totally legal: the juxtaposition of two data sets, the power of Google Maps and perfectly legal, publicly available data about guns. Totally harmless, right?
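To make the juxtaposition point concrete, here is a minimal Python sketch of that kind of mash-up: read a CSV of public records, geocode each address, and emit KML placemarks that a map viewer can overlay. This is purely illustrative, not the actual pipeline behind the Westchester map; the file layout, the column names and the geocode() helper are hypothetical placeholders.

    # Illustrative only: joining a public-records CSV with a mapping layer.
    # Column names and the geocode() helper are hypothetical placeholders.
    import csv
    import xml.etree.ElementTree as ET

    def geocode(address):
        """Placeholder: a real geocoding service would turn an address into (lat, lon)."""
        raise NotImplementedError

    def records_to_kml(csv_path, kml_path):
        kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
        doc = ET.SubElement(kml, "Document")
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):           # one row per public permit record
                lat, lon = geocode(row["address"])  # the "juxtaposition": public record -> map coordinate
                mark = ET.SubElement(doc, "Placemark")
                ET.SubElement(mark, "name").text = row["name"]
                point = ET.SubElement(mark, "Point")
                ET.SubElement(point, "coordinates").text = f"{lon},{lat}"
        ET.ElementTree(kml).write(kml_path, xml_declaration=True, encoding="UTF-8")

A few lines of code, two perfectly legal data sets, and every red dot is somebody's home.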
Nope, not harmless. Here's one thing that happened.
Inmates, people who were in jail, found out where their law enforcement officers lived and started sending them death threats. The journalists who helped put this map together started getting death threats. Wives were being stalked by their ex-husbands, and so on and so forth. The unintended consequences were numerous. So: juxtaposition of data. All right. This next one I found and thought, ah, harmless for sure, right? This is a science study. The scientists were trying to trace French, Italian and German speakers in Switzerland back to their original villages.
If you think about it, that's completely mind-boggling, right? You can actually tell from somebody's genetics where they came from a long, long time ago. And I thought, wow, this for sure is, first of all, useful, powerful, great science, and harmless.
Maybe, maybe not. The guy who was covering the story for Discover magazine just took this leap of faith and said, wouldn't it be great if we could find out where the ancestors of a criminal lived? And I thought, how did we get to that from a genetic study? So there's also a social commentary around these applications that can create unintended consequences. Here's one of my favorites. If any of you have used Google Maps, you've probably downloaded those .KMZ files where you can fly through cities. There are two of them that I prefer: one is Hong Kong and the other is Vancouver. So, right.
You can approach the buildings. You can zoom in, you can zoom out. It's like a video game, it's a lot of fun, and I encourage you to try it. And I thought: harmless and fun. Nope. Juxtaposition of data sets. Now I know that the CEO of a security company, which I won't name, is, let's say, in the twenty-first-floor corner office. I can get this information, and now it's mapped onto this building. You can imagine the kind of security risks you come up with with this kind of application.
So again, the law of unintended consequences. This one is easy. I don't know if the animation is going to work on it. It's a wind map. It's absolutely amazing.
What happened here? It's animated. Okay, but it doesn't work.
All right, let's go back to the wind map. You can zoom into it and you can see the wind patterns in your neighborhood. I actually look at it every day. It's totally soothing, it's meditative, and it's a beautiful map. That's got to be harmless. It is. That's the kind of visualization where there's no other data set, no personal information that comes in. It's pure science. It's awesome. So that's our first takeaway: because our personal cloud has stretched so wide,
the regulatory framework may be behind the times, and we may have unintended consequences from perfectly legal things. All right, let's look at that through the lens of big social and some of the discoveries we're making there. You all know this picture: this is Tahrir Square in Egypt. So we know that big social fosters change and makes change happen. That's one of the great things about it. This is not so good. I don't know if you know this story; I actually still can't get over it. I found out about it three or four weeks ago. I was at home.
I saw this in the paper. These are two young girls. They committed suicide a few weeks apart. Here's what happened. Do you know the term "roofie"? Have you heard of that? It's a drug that you put in somebody's drink to put them to sleep, and then you rape them. So this was a party rape. The guys who did it took pictures of these girls as they were doing this and posted them on a site that will remain unnamed, but you can guess what it is.
What happened was that the communities where this took place actually sided with the assailants, not the girls, and started calling the girls sluts; the pressure was too much, and they committed suicide. So obviously I'm not saying social media is responsible for this. What I am saying is that it provides an amplifier for dark human behavior, and we need to take that into consideration. Much less dramatic, but very annoying: you've all had to do this.
You know, somebody sends you a card or a gift or invites you to play a game. Now you have to give more information. You're always entering extra information for this application in the ecosystem. And I've grown wary of these, suspicious, and actually, frankly, bored. So I deny all of these requests. Don't send me a "play CastleVille" invite, because I'm not going to do it. All right.
Pop quiz: who said "all these concerns about privacy tend to be old people's issues"? Who? Zuckerberg? Zuckerberg? Mm, nope. Another guess? Who? Yes, who said that? Raise your hand. That guy knows his stuff: Reid Hoffman. He actually retracted that statement afterwards, but I think it's significant that he said it. It's not really in the business model of big social to consider privacy, right? They're paying fines as a consequence. It's not part of their philosophy. And here's why: look at what Eric Schmidt said. He's absolutely right.
If you put your data out on the network, it's going to be very difficult to reel it back in. But the second part of the message is: well, then we shouldn't care. Let it go, it's done, and you're an old person if you don't believe that. Well, regulators seem to think otherwise. And I don't feel that old, and I feel otherwise. So that's the second takeaway. It's okay for us to be the product, I'm okay with that, but I really need to know what's being done with my personal information, especially if the deal is changing on me and you haven't told me. How am I doing on time? Good.
So what can you do about this? You can be reactive, as most of us are, or you can be proactive, and there are several ways to do that. We just got a wonderful presentation about the EU regulatory framework, which is coming next year. I don't know if there's going to be anything on the US framework.
I haven't seen it in the list of workshops. What I can tell you is that the approach is going to be a lot more fragmented, as always in the US. The states are going to get a measure of decision-making in the privacy debate, there are going to be some pieces of federal legislation, but most of all, I think the government is trying to work on a voluntary basis with industry to get something going. The most memorable quote, in my mind, is the one at the bottom. I can't repeat it back to you for obvious reasons, but I would invite you to read it. I'll give you the context for what this guy is saying.
He's an influential blogger in the UK. He was using Instagram before it was bought by Facebook. And then when it was bought by Facebook, the terms and conditions changed. And then all of a sudden he had no clue what was happening to the metadata around his photographs. And he said exactly what he said here and decided not to use Instagram anymore.
So that speaks to my point: it's okay to be the product, but we have to know what's being done with our data. So, let's do the right thing. We can do this together. There are a number of fora, including this excellent one, where you can go and participate. I want to draw your attention to two aspects of this slide. The first is the one at the top: big data and privacy at the OECD level. There's a fantastic working group that we're part of, and I would invite you to follow its discussions. You can write down this link, or photograph the slide, or whatever. This is all going to be available online anyway.
And then I would invite you to do what we've been doing, which is to listen to all sides, including the sides you may not want to listen to: the consumer advocates. There's a really cool project called VRM, Vendor Relationship Management, which is hosted at Harvard, and the goal of that project is to put the power back in our hands as consumers and let us decide, on a granular level, whether we want to opt in or not, and trace our information, our personal information, as it goes into the big data sphere. VRM: very cool. There are two other projects that are very cool.
Drummond Reed's Respect Network. Drummond, are you in the room? No? Okay, well, you should check that out. And then there's Kaliya Hamlin's Personal Data Ecosystem, and both of these projects are doing the same thing. Is Kaliya in the room? No? Okay. What's that? Hi. So you know all about it. I would really strongly encourage you to check these out. It's about re-empowering us to control our personal information. And then there's this very good book by Kord Davis. And finally, my plug for OASIS. I promised it wasn't going to be a plug for OASIS; there's only one slide on that.
And my plug is: you need to help us make and implement these open privacy standards. We have two excellent standards underway, and at that table over there, there are three or four people you can talk to, Dawn Jutla, John Sabo, Gershon Janssen and others, who will tell you all you need to know about our Privacy Management Reference Model, PMRM, which allows you to map all the personal information touchpoints in your IT chain and then automate privacy policy.
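To give a rough sense of what mapping personal information touchpoints and automating privacy policy can look like, here is a small Python sketch of the general idea. It is only an illustration under assumed names, not the PMRM specification itself: each touchpoint records where a category of personal data is handled, for what purpose and with what consent, and an audit pass flags anything outside the declared policy.

    # Sketch of the general idea only (not the actual PMRM spec); all names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Touchpoint:
        system: str          # where the data is handled, e.g. "crm", "analytics"
        data_category: str   # e.g. "email", "location"
        purpose: str         # declared purpose for processing
        consented: bool      # whether the data subject opted in

    POLICY = {
        "email": {"marketing", "account-service"},   # allowed purposes per data category
        "location": {"service-delivery"},
    }

    def audit(touchpoints):
        """Flag touchpoints that lack consent or fall outside the declared purposes."""
        for tp in touchpoints:
            allowed = POLICY.get(tp.data_category, set())
            if not tp.consented or tp.purpose not in allowed:
                yield tp

    violations = list(audit([
        Touchpoint("crm", "email", "marketing", consented=True),
        Touchpoint("analytics", "location", "ad-targeting", consented=False),
    ]))
    # -> only the second touchpoint is flagged: no consent, and an undeclared purpose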
And, very interestingly as well, PbD-SE, Privacy by Design for Software Engineers, which does the opposite of what we have done as a developer community: it actually bakes privacy into our work, not as an afterthought, which unfortunately is pretty much what's been going on. As for big data standards, those are still coalescing, and you need to help us figure out what they're going to be. And my three points were: first, the law of unintended consequences in the big data sphere.
Two, there are standards out there to help you do your job. And three, knowledge is power: you can be empowered to decide where your personal information goes. This is "thank you" in Chinese, xièxie. And the reason I put it there is because we're not talking about privacy in Asia, and we should. Thank you.