KuppingerCole Webinar recording
Good morning. I'm Dave Kearns, Senior Analyst with KuppingerCole. Welcome to our webinar on Privacy by Design. Good afternoon and good evening to those of you in time zones different from mine, and welcome to this presentation as well. Joining me for this discussion are Dr. Ann Cavoukian, the Information and Privacy Commissioner for Ontario, Canada, and Michelle Dennedy, the Chief Privacy Officer at McAfee. For those of you who are not familiar with KuppingerCole, we are Europe's leading identity analyst firm.
We cover enterprise IT research, advisory, decision support, and networking for IT professionals through subscription services, advisory services, and events. Our reach is worldwide, but our headquarters is in Europe. One of the events we do annually is the European Identity and Cloud Conference, coming up in April this year in Munich, Germany. This is, I believe, the seventh EIC conference, and it is always well received.
Wonderful discussions, and a wonderful networking opportunity for you, whether you are in North America, Europe, Africa, or the Far East. Now, a few guidelines for today's session. Everyone is muted centrally; you don't have to mute or unmute yourself, as we control that feature. The webinar is being recorded, and the podcast recording will be available tomorrow, perhaps even later today depending on your time zone. Towards the end we will take Q&A, although any questions of a very pressing nature might be interjected during the conversation.
You can ask questions at any time using the Q&A tool in the little panel on your screen; we'll pick them up and feed them to the appropriate person at the appropriate time. As for our agenda today: we're going to start off by asking Dr. Cavoukian, who came up with the concept of privacy by design, to explain it and show us what it's all about. Then Michelle Dennedy, who is a chief privacy officer and also the head of something called the Identity Project, will talk about privacy in general and how that relates.
I'll add some notes about the programmer's, or software designer's, perspective on designing for privacy, and then we'll go on to Q&A. But before we do that, I'd first like to get a little bit of introduction from the people who are here. So I'd like to ask Dr. Cavoukian to tell us exactly what a privacy commissioner is.
Okay, so I'll turn it over to her right now. Ladies and gentlemen, at least it's morning in my time zone, and it's a pleasure to be here. As to what a commissioner does:
we enforce compliance with privacy laws. We're regulators.
In my jurisdiction, we are independent of the government of the day. I'm an officer of the legislature, which gives me the independence I need to criticize the government, if necessary. During the course of an investigation I have order-making power, and I have a mandate to educate the public on issues relating to privacy and access.
And, of course, I uphold the privacy laws in my jurisdiction, which extend to all aspects of provincial (state-level) government operations and municipal government operations, and also PHIPA, the Personal Health Information Protection Act. PHIPA is a wonderful health information privacy law that applies to health privacy everywhere, public or private sector. The most important thing commissioners do, though, I think, is elevate the understanding of privacy. I always tell people that privacy is not just a personal right, though of course it is that.
The protection of your personal information is very important at an individual level, but privacy also represents a societal right. It is the basis of freedom everywhere, and so it is a vitally important right. We just have to ensure that it continues, and hopefully gets stronger, not the reverse. That, in a nutshell, is what we do. Thank you. In other words, as you said, you get paid by the government to criticize them. Sounds like a wonderful thing.
Now, Michelle, you're in the commercial sector, so what does a chief privacy officer do? So I just gave you the secret of the universe, and the answer is 42. What a privacy officer does can vary by jurisdiction and by vertical, but most of us, in a nutshell, are the information asset managers for the company. We speak in a voice that is for the consumer, for the employee, for the individual.
And we also look at what the enterprise at large is doing to maintain the integrity of private data, to make sure that products, if we are selling products that will impact privacy or security, are built with privacy by design in mind. So it's a rather horizontal type of dedicated resource, if you will, to really look after the informational asset and make sure that it is treated with the utmost integrity.
Well, that sounds interesting. Of course, this isn't your first crack at being a CPO; you did the same thing for Sun Microsystems back in the day. Now we'd like to start with the main part of our presentation, which will be led off by Dr. Cavoukian, who came up with the concept of privacy by design, a concept which within the past year or so has become an overnight success after being around for 10 or 15 years, just like a rock band.
So I'm going to turn it over to you so you can lead us through the explanation and show us what's going on. Thank you very much, Dave. I'm just going to get into my talk here. Okay, so I'm going to advance the slides, and hopefully this will all work well. I want to thank you very much for inviting me here.
It's kind of ironic, in a way, that you have a regulator who is suggesting that what you should really do is not turn to regulation to protect privacy, but try to embed privacy into the design of all that you do, including technology, business practices, and networked infrastructure, so that you can have privacy protected by design, meaning not by chance but in a very intentional, methodical way. That is the essence of this. And in order to do it, we need a paradigm shift.
We have to change the paradigm from the existing, largely zero-sum model of looking at the world, where everyone looks at privacy versus something else: privacy versus security, privacy versus connectivity, privacy versus business practices. It's always this dichotomy, which I will suggest to you is a false dichotomy.
It's a false dichotomy because what privacy by design is all about is multiple functionality. So let me talk about privacy and security, just for a moment. I always tell people privacy does not equal security; they are not one and the same.
However, security is absolutely vital to privacy. You cannot have privacy without strong security. So while they are not one and the same, security is essential to privacy. But you can have the reverse: you can have security without privacy. You see, the term privacy subsumes a much broader set of protections than security alone.
If you think of the fair information practices, which populate all privacy laws and privacy policies, those practices represent, depending on your jurisdiction, eight to ten principles that relate to use limitation, purpose specification, and a whole host of other interests. So the term privacy subsumes a much broader set of protections than security alone. If you look at the OECD guidelines, security is one of the eight principles covered under that set of fair information practices.
So the way I like to look at the world is not one versus the other. I want you to change the paradigm from zero-sum to positive-sum. As for the term positive-sum:
I apologize for using it, but I always say that when I was in graduate school many years ago, game theory was really big, and we were doing all these experiments in game theory where positive-sum and zero-sum were the terms used. Of course, zero-sum took off and everybody knows about it, but not that many people relate to the term positive-sum, which I prefer, because it essentially means that you can have two positive interests, two functionalities, increasing in a positive manner at the same time, in unison, not one to the exclusion of the other.
And so when you apply this to privacy and change the paradigm from zero-sum to positive-sum, it creates a win-win scenario. You get away from the either-or scenario, which essentially involves unnecessary trade-offs and false dichotomies. It can be as simple as replacing the "versus" with an "and": instead of privacy versus security, it's privacy and security; it's privacy and biometrics; it's privacy and business functionality. You can do this, and the sky's the limit.
When you take this approach, it just invites innovation, new ways of thinking, and new ways of doing things. And that's what privacy by design is all about. As I think Dave mentioned at the beginning, I developed privacy by design many years ago, back in the nineties, but it really took off in the last couple of years. Two years ago, in 2010, at the annual conference of privacy commissioners and data protection authorities, which usually takes place in Europe but that year took place in Jerusalem,
we introduced a resolution recognizing privacy by design as an international standard. The good news is that it was passed unanimously by the entire assembly of privacy regulators at the closed session at the end of the conference. And just to make it clear, the passage of these resolutions is not a slam dunk; most of them don't go through.
So when this was passed unanimously as the framework for privacy from that point on, it was so gratifying that privacy by design was now being treated as an international standard. Regulators around the world were saying: yes, we want this proactive approach to trying to prevent the harm, and give us some help on how to do it, because you've been doing it in your jurisdiction. Which we have, so we can point to multiple examples. So essentially we've started applying privacy by design in what I call the trilogy of applications, starting with information technology.
That's the obvious place to start, because if you can embed privacy in the code and the actual design and infrastructure of what you're doing, you're laughing: it's part of the system, and it can become a default. But we've extended it to apply to accountable business practices and also to physical design and networked infrastructure. Think cloud computing; think mobile devices. So the sky's the limit, as I mentioned. Let me briefly run through what exactly we mean by privacy by design.
These are the seven foundational principles, and you can find all this material on my website; we have, in fact, a website called privacybydesign.ca. I've already covered a number of these. It's proactive in nature: it attempts to prevent the harm from arising, as opposed to offering remedial systems of redress after the fact. Most existing laws offer the latter; they are remedial in nature.
So if a harm arises, you can seek some system of redress. I'm not suggesting we not have that.
Of course we still need it, but wouldn't it be great if we could prevent the harm from arising in the first place? An essential component of this is striving to make privacy the default setting. When privacy is the default, it's embedded in your code; it's embedded automatically as part of your system, so someone doesn't have to ask for it. They don't have to opt out of a non-privacy-protective system; they're already in a privacy-protective one. If privacy is the default condition, it's available automatically. People don't have to ask for it; it's embedded in the design.
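To make the "privacy as the default setting" principle concrete, here is a minimal, hypothetical sketch; the class and field names are invented for illustration, not taken from any real product. Every disclosure-related setting starts in its most privacy-protective state, so a user must opt in to sharing rather than hunt for an opt-out:

```python
from dataclasses import dataclass

# Hypothetical illustration of privacy by default: all sharing-related
# settings begin in their most protective state, so disclosure requires
# an explicit opt-in. Names are invented for this sketch.
@dataclass
class AccountSettings:
    share_profile_publicly: bool = False  # nothing is shared by default
    allow_analytics: bool = False         # no tracking by default
    allow_marketing_email: bool = False   # no contact by default
    retention_days: int = 0               # keep no extra history by default

# A brand-new account is privacy-protective without the user doing anything.
settings = AccountSettings()
assert not settings.share_profile_publicly
assert settings.retention_days == 0

# Disclosure happens only when the user explicitly opts in.
settings.allow_analytics = True
```

The point of the sketch is that the user never has to ask for privacy: the system is already in the protective state, which is exactly what the principle calls for.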
Full functionality is absolutely essential to privacy by design. The positive-sum, multiple-functionality model also has a much greater appeal than a zero-sum model.
When I was asked to join the board of the European Biometrics Forum many years ago, I was surprised that they were reaching out to a Canadian privacy commissioner; there were hundreds of excellent European commissioners. And on that one, they said it was simple:
they said, you didn't say no to biometrics. Many years ago, we came up with a very privacy-protective system of biometrics called biometric encryption. I didn't say no to biometrics; I said yes to biometrics and yes to privacy, working in unison. That's full functionality. It's very appealing, and it allows much more to take place, as opposed to shutting the door on some particular interest. End-to-end security is absolutely essential.
So we talk about this as spanning the entire life cycle of data: full life-cycle protection from start to finish, including secure destruction of records. Visibility and transparency of your practices, of your information collection practices, is absolutely key, because if you can't be open with your data subjects, your consumers, whoever your users are, then they're not going to trust your systems; they're not going to have confidence in them. So openness and transparency are absolutely key, along with respect for user privacy.
We could have started with that as the first principle. If you keep it user-centric, you are far more likely to have the support of your users, their comfort in terms of their understanding of your practices, and their confidence. So this is the essence of privacy by design. It can be applied in many different ways, and I just want to be clear: the essence of this has to be reflected in reality, on the ground. This is not an academic construct or some theoretical formulation. This is the essence of it, but we're embedding it in code.
Last year I called it the year of the engineer. I spoke to engineers around the world, including in Munich (I know you're having a conference there), in Palo Alto and San Jose in California, and of course in my jurisdiction in Canada, because it's all about getting engineers, software designers, and systems operators to understand how you take these principles and translate them into real, live functionalities. This was the resolution I mentioned that was passed in Jerusalem in 2010, and it really was a landmark resolution.
We're delighted that privacy by design is now viewed as the framework by which regulators around the world are looking at how to advance privacy interests, and that they consider it an international standard. I just want to switch gears to identity management today. I'm not going to speak on this a great deal, because I know Michelle will be speaking on it, but identity management is absolutely critical to all that we do, wherever we go online, through your IP address or your MAC address on mobile devices.
You leave these digital breadcrumbs, a trail that can be very revealing in terms of identifying where you've been, what websites you've visited, what activities you're interested in, what your interests are. It is extremely revealing, and we have to ensure that it is very strongly protected. We have to be, again, very flexible and user-centric in the identity management schemes that we develop.
I'm not going to focus on this a great deal, because I do want to leave time for Michelle, but I think the user-centric nature of identity management means keeping the user at heart and ensuring that they are empowered, that they have control over their personal information. I would say that privacy is not about secrecy. Privacy is about control, about freedom of choice: the individual, the data subject, retains control over the use and disclosure of their personal information. This is absolutely key and has to be at the heart of identity management systems.
How we create this user-centric identity management infrastructure is absolutely critical. We have to ensure that we have adequate tools to manage our personal information on the variety of devices that we now have in our possession. More and more of these devices are mobile in nature; they may connect with an online device, a device we use at the office, things we use at home.
I mean, the range of devices is just growing, and the nature of the policies that attach to them, the sticky policies, is very, very important: that the information, and our wishes with respect to our preferences on who has access to our data and who uses it, accords with the policies in place, and that this sticky policy travels, essentially, with the information on whatever devices are used. That is very, very important. I know we're not going to have time, Dave, to dwell on this.
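The sticky-policy idea can be sketched in a few lines. This is a hypothetical illustration, not a real protocol: the class names and fields are invented, and a production system would also need encryption and enforcement at every endpoint. The point is simply that the owner's preferences travel bound to the data and are checked on every access:

```python
from dataclasses import dataclass

# Hypothetical sketch of a "sticky policy": the data owner's preferences
# are bound to the data itself and checked at every access, wherever the
# record travels. Class and field names are invented for illustration.
@dataclass(frozen=True)
class StickyPolicy:
    allowed_purposes: frozenset
    allowed_parties: frozenset

@dataclass(frozen=True)
class ProtectedRecord:
    payload: str
    policy: StickyPolicy

    def access(self, party: str, purpose: str) -> str:
        # The policy travels with the record, so the check happens
        # at the point of use, not only at the point of collection.
        if party not in self.policy.allowed_parties:
            raise PermissionError(f"{party} is not an allowed party")
        if purpose not in self.policy.allowed_purposes:
            raise PermissionError(f"{purpose} is not an allowed purpose")
        return self.payload

record = ProtectedRecord(
    payload="jane@example.com",
    policy=StickyPolicy(
        allowed_purposes=frozenset({"billing"}),
        allowed_parties=frozenset({"utility-co"}),
    ),
)
record.access("utility-co", "billing")      # permitted by the owner's policy
# record.access("ad-network", "marketing")  # would raise PermissionError
```

Because the record is frozen, the policy cannot be silently stripped off in transit within this toy model; a real system would enforce that cryptographically.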
But I do want to mention that we're holding an international symposium on something called smart data in May, May 14th to 16th, in Toronto, Canada, at the University of Toronto. It's all about how you empower data, if you will, from an AI perspective, to ensure that your preferences and wishes attach to the data in a way that goes well beyond sticky policies.
Anyway, I wanted to mention that for your interest; you can take a look at our website for more information. And I should tell you, we have a privacy by design website.
Of course, we have a Facebook page. We call it the Engineers' Corner. It certainly isn't for the exclusive use of engineers, but we're trying to interest people who actually are technicians: people writing code, systems engineers, software designers, the techies of the world. We want to interest you and get you involved in all these different areas where privacy by design plays a large and increasing role.
And again, it's all about embedding privacy and other functionalities operating together, hand in hand, in unison, each to the benefit of the other, not one to the exclusion of the other. Let me just end by talking about why you benefit if you can lead with privacy by design, if you can embed it in your code, in your business practices, and in all that you do, in this positive-sum manner, which is doubly enabling. Think of it as a win-win solution that will deliver many business interests.
I would say there's a positive payoff, a privacy payoff, to embedding privacy in design, and it allows this multiple-functionality, win-win paradigm. And I should just tell you, whenever I talk to the techies, the brilliant security minds like Bruce Schneier and others, they have told me that on the few occasions when they've actually been required to design systems with both privacy and security as part of the code, it has invariably elevated the entire level of protection that the system offers. So it's rarely the case that it's one versus the other.
That is essentially a false dichotomy consisting of unnecessary trade-offs. Get rid of it; it's a dated, flawed model. If you go with embedding privacy, then you will have the future of both privacy and freedom supported and growing well into the future. Thank you very much.
Well, thank you very much, Ann. That was a wonderful exposition of what we're doing, and I'm sure some members of the audience might have questions about what you were saying. So let me remind them that they can type them into the question part of the little panel on their screen,
and we'll bring them up as they come along. At this point, I'd like to turn over the microphone and the screen to Michelle Dennedy, who, besides being the Chief Privacy Officer for McAfee, is also the head of something called the Identity Project, an intriguing term, of course. Perhaps, Michelle, you can enlighten us as to what that project is all about and how it relates to this whole privacy issue. Michelle.
Yeah, I would love to do that. I just want to riff quickly off of something that Dr. Cavoukian said; it's always so wonderful to talk with her.
I was honored to share the stage with her in Jerusalem as we were talking about identity management and some of the other kinds of technologies that have been viewed, I think, as either anathema to privacy, unrelated to privacy, or something that couldn't be retrofitted for privacy. And I think we've shown that almost every technology that processes, touches, or manages information can benefit from privacy by design principles.
But the definitional concept of privacy as secrecy, if that is your definitional concept of privacy, is, I think, a misconception. If you design for secrecy, you're basically designing to fail. You won't have a flexible system that is capable of audit, review, management, and real leverage. If you design for privacy as Ann is suggesting, privacy as control, as respect, as care of information, then you're really designing value into your systems.
And, you know, I agree with Bruce Schneier on that: when I've seen that done, and done well, what it does is open up the information to the right people at the right time, people who can make wise decisions with that information. It opens up that data so that if you do need to get rid of it, to prevent liability to your organization or to prevent harm to the users and the true owners of that information, you can do that. So with those principles in mind, first I want to define what the Identity Project was and what it has morphed into.
So, I left Oracle. I actually was put into sales after Oracle bought Sun Microsystems, so I was in sales for a year.
It was a role that I was terrified to take on and thrilled that I did, coming in as a compliance wonk and actually going out and having to walk into offices. The reason I took that challenge was that for all those years I had built products at Sun Microsystems. I am terribly, terribly proud of the work that we did there, designing privacy into systems, deep into the system, all the way down from the chip level up.
And I was always amazed that when that gear, if you will, the software and the hardware, was sold into the marketplace, we really didn't applaud ourselves for our security and privacy benefits. We talked speeds and feeds and scalability and all of these kinds of industrial words. And yet the real power behind these tools was their ability to manage one of the most important assets that any individual, any country, any organization has: its informational assets.
So when I left that organization, having gained this new insight from talking to thousands of customers about why they do and don't spend money on, or engage in, privacy by design, I decided to start something called the Identity Project. It really was intended to be, and it still is, an advisory service, if you will, for startup companies and privacy-intensive companies, not just tiny ones, sometimes big ones, that really have information as their core asset.
The idea was to get in very early, particularly with my very-early-stage startups, and say: suppose, for example, you want to do smart grid. Let's really discuss what happens if you're successful. What happens if you actually achieve the capability you're sketching on your whiteboard? If you get funding from this venture capitalist, what is the data asset that you'll be commanding? How do you intend to respect geographical boundaries and differences in the law? How will you control that information? What are you going to do, exactly, to promote yourself as a privacy by design company?
So that was a piece of it. Obviously, now that I've joined McAfee, the advisory side has dropped off; I'm doing that work for McAfee as its chief privacy officer. In addition to my compliance and governance responsibilities here, I'm looking at all of our products and services for privacy by design. And it doesn't have to be a brand-new concept:
we're drafting off of some of the principles of agile development, for example, to look at how you actually examine code all the way down to the source level and say that privacy is not just a matter of policy or a legal notion that has to go on top of what you're building; it really is essential to quality. So where a product can't fit itself into the framework of fair information and governance practices, where the provenance of data is in question, for example, or where you can't delete strategically when you need to delete, that's a quality failure.
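That "can't delete strategically" point can be expressed as an ordinary test. The store below is a hypothetical toy, with invented names, but it shows the idea: deletability becomes an assertable quality property rather than a legal afterthought.

```python
# Hypothetical sketch: treating deletability as a testable quality property.
# The store and its API are invented toys; the point is that "can we delete
# strategically?" becomes an assertable requirement in the build, not a
# legal notion bolted on afterwards.

class UserStore:
    def __init__(self) -> None:
        self._records: dict = {}

    def put(self, user_id: str, data: dict) -> None:
        self._records[user_id] = data

    def delete(self, user_id: str) -> None:
        # Strategic deletion: remove the record entirely, not just mark it.
        self._records.pop(user_id, None)

    def contains(self, user_id: str) -> bool:
        return user_id in self._records

# A privacy-by-design quality test: after deletion, no trace remains.
store = UserStore()
store.put("u1", {"email": "a@example.com"})
store.delete("u1")
assert not store.contains("u1"), "quality failure: data survived deletion"
```

In a real system the same assertion would also have to cover backups, logs, and caches; that is what makes deletion a design problem rather than a one-line fix.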
And so it goes back into the design and the redesign. I think the beautiful thing about the term privacy by design is that, as anyone who's been in technology for very long knows, there's never, ever a time where you sketch it out on paper and the final product looks exactly like what's on paper. The other side, too, that I don't want to miss, particularly because it is International Data Privacy Day: one of the companies that I was working with, a group called AllClear ID, was working on an authentication technology and service for children.
As part of my services to them, I said I would actually use their technology. What they had developed was a way to scan children's identities.
If you try to get a credit report on your child (and this is not necessarily true in Europe, so I apologize to some of our international listeners; I'm not sure exactly how the credit bureaus work over there, but in the US it is true,
and to a certain extent in Canada), what they do is look for a match of a name, a birthdate, and a social insurance number; in our case, a social security number.
What happens is that only 1% of the time will that search come up with fraud against children. 99% of the time, what comes back is a record that says "no record found," and a parent will feel, if you've taken the time to check your child's identity when you do your own yearly credit review, a sense of comfort. And what I discovered by using this new technology and this new search methodology concerns my daughter, whom I call "Ms. Sang" online.
If you met my daughter, within 10 seconds you would probably realize why we call her that. She is a wild one, and the world had better get ready; she's an innovator from day one. But she had had her identity stolen twice, once 11 years before her birth.
And I think that's the ultimate insult. Here I am: I've traveled literally around the world talking about data privacy, trying to bake in engineering for privacy, privacy protections, respect for information, human rights protection. And here my most important thing, my children, had been exposed, and I was completely unaware.
So, her identity had been stolen 11 years before she was born because it was an uncurated piece of data. It was a string, a nine-digit string of information, that could be leveraged to purchase credit. In her case it was used to get a store card, and to use that store card repeatedly. Later, her identity somehow was available again, and I still don't know why, which is a flaw in our systems, right?
Somebody who was in the country illegally needed to have utilities turned on in an apartment, and so purchased her social security number and was able to give that number to the utility and to their landlord to get the utilities turned on, and then of course skipped town and left my daughter with a damaged credit record. Now, in my daughter's case, I found out while she's still quite young.
We were able to fix it. But what I have found since that time, and this is where the Identity Project really took off in a different direction than I had anticipated, is that there are vulnerable populations: older people who are not as savvy about technology, and very young people, who have what I call uncurated identities. There are people walking around with assets that are very valuable to crooks, and then they find that they are disqualified from getting college scholarship money, or they are kicked out of jobs because they cannot pass the background check.
They can be wrongfully arrested because their identities are attached to criminal wrongdoing. They can be met at the hospital, as one of my friends was, by social services asking, why are you dosing your child with psychotropic drugs? Because someone had stolen the child's identity at age six and used it to get their own psychiatric needs met.
So part of the Identity Project is an educational one: to let people know that your children and your grandparents are in jeopardy, and that we need to do something. We need to talk to policy makers about how we can make an uncurated identity less valuable to wrongdoers.
And the other side of that, and this is why I came to McAfee, is really focusing on cyber education, because we're going to give more and more users choice.
I firmly believe that the money, the business, is in building privacy by design, which I think is very good news for all of us. The harder news is that where we have true privacy by design, as Ann suggests, with the user in control, we have got to get users ready. We have to start very young; we have to educate them systemically. And we have to make sure that, rather than just turning over the car keys to someone and saying, learn the rules of the road while you're driving, we teach our new populations coming up exactly what it is to be a cyber citizen and how to be safe online.
So that's a big part of our task here at McAfee as well: in addition to providing all of the protective services and the predictive services to actually secure your IT environment, it's also to provide you with the information you need to be a safe driver in the cyber society. Let me get off my soapbox now so we can have more of a chat about it.
Well, thank you very much, Michelle. That was very good. I'd like to take a couple of minutes to look at this issue from the point of view of the software designer, the software engineer, something I've done for far longer, I think, than either of you have been around. Back in the 1980s, I was the IT manager, software designer, and software engineer for a string of retail stores. They were actually video rental stores.
And as you know, over the past number of years, maybe not so much right now, but ten years ago, video rental records were being targeted by people trying to find out what famous people were watching on their TVs or their tape devices. At the time, when I was writing the software, I had really no concern whatsoever about privacy or security. I cared about database integrity, and I did a lot of work to make sure that the database didn't become corrupted.
And of course, in those days we weren't connected over a TCP/IP network from store to store; it would have been necessary for someone to actually dial in, using say pcAnywhere, to take over control of one of the systems. But as we found out this week from Symantec's announcement, pcAnywhere is now vulnerable because the source code to an earlier version has been stolen.
And so some of the security surrounding today's pcAnywhere can be overcome. Back in the day, the security amounted to a username and password combination that was sent in the clear, easy enough for anyone to capture, easy enough for anyone to get in. And if they understood a database, they could quickly download all of the rental records of any of our users. And as I say, I never gave it another thought, because for a software designer, security, privacy, and so on are secondary to the actual purpose of the application or service.
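The weakness Dave describes, credentials stored and sent in the clear, is exactly what modern credential handling avoids. As a hedged sketch (the function names are ours, not from any product discussed here), a system can store only a salted, slow hash of each password, so that even a stolen database does not reveal the credentials themselves:

```python
import hashlib
import hmac
import os
from typing import Optional, Tuple

def hash_password(password: str, salt: Optional[bytes] = None) -> Tuple[bytes, bytes]:
    """Derive a salted, slow hash; only (salt, digest) is stored, never the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the derivation and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```

This addresses only the storage half of the problem; transport security such as TLS is what fixes the "sent in the clear" half.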
And that's of course the hard part here for both of you: drumming into software engineers and software designers that not only do they have to read and write the database efficiently, they also have to do it in a way that protects the privacy of the data, secures the company's intellectual property, and secures the privacy of the users, customers, suppliers, or whoever's information is kept in there. I don't envy the task of convincing those people to do that. I really don't.
I'll remind everyone that they can ask questions by putting them into the little box on the screen. And we have one here I wanted to ask you about, because it's a question that comes up frequently when we're talking about ownership of personally identifiable information. Essentially, the person is pointing out that when you do a transaction online, there are two parties to that transaction.
So how do you determine who owns that data, who can release it, and who can decide that it should not be released into the wild? Michelle, Ann, do either of you have a comment? I'll just start and then I'll transfer to Michelle. I prefer not to use the language of ownership. Instead, I talk about custody and control of the data, because ownership involves property rights, and it gets very thorny; who owns the data is not clear. But when you talk about who has custody and control of the data, it's usually the company you're dealing with.
They're the ones collecting some information about you. For example, let's say you've bought something from Amazon or somewhere else online, and you give them your credit card number in order to pay for the product, so they have that information. It's not a matter of ownership when they have that information.
Yes, they have custody and control over it; it's in their possession, on their servers, whatever. But along with that comes a duty of care. They have to protect that data. They have to keep it secure. They have to abide by their privacy policy, which hopefully says something like: they are only permitted to use that data for the purpose for which it was collected, the primary purpose, meaning paying for the product, and without the consent of the data subject they shouldn't use it for other, secondary uses.
So I always use the language of custody and control, and I attach to it a very strong duty of care on the company, or firm, or whoever it is collecting the data. That's the way I speak about it. Michelle, over to you. Yeah.
You took the words out of my mouth. That's why I like to refer to curated data and fiduciary responsibility. I think ownership is an interesting concept. There are a million reasons why, for example, when I do a transaction with some online retailer, they need to keep a record of my payment vehicle, whether I paid, and, if I didn't pay, how to contact me so that they can give me my stuff. They need records to say that they sold that stuff.
And they need it for all sorts of reasons, right? They have to recognize the income coming in; if there's a dispute, they know what to do; they know how to give you better service, et cetera, et cetera. That doesn't mean the information loses its value to me as the person it represents, and at the same time it has value to the organization you shared it with. So I think you look at it, as Ann suggests, as custody and control. Even control is a concept I worry about, because a lot of people falsely believe that if they hand data to a third-party vendor, or someone else who is going to manipulate that information on their behalf, they have somehow lost their own control. You still retain that fiduciary responsibility.
So it's a matter of, and I like the concept of, duty of care. It's almost like a controlled substance: once it has passed through your portal and become your fiduciary responsibility, there is no right to forget. We talk about the right to be forgotten as kind of the latest and hottest item in European parlance, but the corporation does not have such a right until that information is completely out of its custody.
That means completely deleted, gone, or permanently encrypted with the encryption keys forever lost. That's kind of how I would address that issue of ownership.
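Michelle's phrase "permanently encrypted with the encryption keys forever lost" describes what is often called crypto-shredding: encrypt each record under its own key, and destroy the key when the record must be forgotten. A minimal sketch under our own assumptions (the record layout and key store are invented, and the toy XOR one-time pad stands in for a real authenticated cipher such as AES-GCM):

```python
import secrets

# Toy XOR "cipher" purely to illustrate the pattern; a real system would use
# an authenticated cipher such as AES-GCM, never raw XOR.
def xor(key: bytes, data: bytes) -> bytes:
    assert len(key) >= len(data)
    return bytes(k ^ b for k, b in zip(key, data))

key_store = {}  # per-record keys, held separately from the data at rest
records = {}    # the encrypted records themselves

def store(record_id: str, plaintext: bytes) -> None:
    key = secrets.token_bytes(len(plaintext))
    key_store[record_id] = key
    records[record_id] = xor(key, plaintext)

def read(record_id: str) -> bytes:
    return xor(key_store[record_id], records[record_id])

def shred(record_id: str) -> None:
    """'Forget' the record: the ciphertext may linger in backups,
    but without its key it is unrecoverable."""
    del key_store[record_id]
```

The ciphertext can sit in backups and replicas indefinitely; as long as the per-record key is truly destroyed, the record is effectively deleted.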
I think you do get into a very dangerous place if, as a user, you hand over your own ownership to a third party, point at them, and say: now that's my data, you take care of it; or, that's my data, I own it, and you can't manipulate it.
You've lost the conversation that is data protection for yourself. So you're actually harming yourself more than doing yourself any good by insisting that it is yours and yours alone. It is a shared fiduciary responsibility. Okay.
Oh, okay. Now we have a very good question here for you, Michelle. Okay.
One of our attendees wants to know whether McAfee has privacy by design tools that can be sold to developers. Oh, interesting. What a great idea. Why don't you develop it? Yeah. If we don't have them now, we will; coming to a portal near you. I have to just hold out my hands and say, I've only been here three months. I don't know if we do, but if we do, I'm going to be very loud about them. I absolutely agree.
I think what we do have, and this is kind of emblematic of the security community: we are the freaks of the geeks, right?
I mean, the security people within the IT industry are the geekiest of the geeks. And you know, I loved what Ann said about the year of the engineer; I've had, like, the decade of the engineer.
I just love them. I love your crazy black t-shirts with the messages that don't make sense, and your unwashed long hair, and your crazy, perfect Docs.
I love, love, love you. And I think the nice thing about this community is, as you suggest, that as much as we need to make a profit and work as a business, providing that kind of tool and toolkit is definitely not a secret. And I think this is the push for the security industry: a greater push to transparency. Transparency, for a security person, is an unnatural act.
However, I think it's the only way forward for our industry. We need to be a lot more public, not necessarily with the secret sauce, but, for example, about the fact that it takes data to protect data. It scares me to death to hear us talking about the right to be forgotten in an advertising context or a social networking context and then broad-brushing it over the security industry, because protection takes a lot of data. It's like the weather: you need a ton of data to have predictive protections.
So those tools, if we don't sell them now... if you're the product owner of that McAfee tool and you're listening right now, I so apologize. Come and find me; I'm on the eighth floor in my office. I love you too.
If it doesn't yet exist, it will soon, and it will be presented at Ann's conference next year. That would be fabulous. We would welcome it, we would sing it from the rooftops, and it would sell big time. So, all of our competitors, close your ears: it's a terrible idea. We'll see if Microsoft can come up with something. Now, you mentioned the right to be forgotten, which is part of the proposed new EU privacy directive, promulgated within the past week or so.
I don't know whether either of you has had a chance to look at what the consequences of this new directive would be. But as I understand it, it means that, as far as privacy goes, organizations would only have to answer to the privacy authority in their headquarters country, rather than to all 27 EU privacy authorities, which seems to me to be a good thing, at least streamlining the process.
On the other hand, there would be major penalties for violating the policy, which could be up to 2% of an organization's worldwide sales, which strikes me as an awful lot of money. Do either of you have an opinion on the new directive? Let me just comment on the right to be forgotten.
You know, it's essentially an opt-in model that they're proposing, and I will watch this with great interest. Far be it from me to be in any way critical of it; in fact, asking that there be positive consent before someone's data is used for another purpose is the embodiment of privacy as the default condition.
It basically says you will be assured of privacy unless the individual, the data subject, opts out of it, as opposed to having him or her opt out of a practice that is privacy-invasive. So it's an opt-in model. I think the US, and businesses, will have a great deal of difficulty with this, but I will watch with great interest.
We're delighted that privacy will be the default. On the right to be forgotten, it's interesting. Again, far be it from me to criticize this, but, and I don't know this historically, I don't know that this actually exists as a right. And to Michelle's point, I don't know truly whether it is desirable that all data be expunged, because there's also some value to having data.
But I don't want to cast this in a negative light. I think what was driving it is that in this online age, with everyone being on Facebook and every aspect of one's life seemingly being noted somewhere, do you really want to carry all of that with you?
Well into the future. You can imagine: kids start at age 13, they become students, they go to university, and they've got all this stuff on there. Do you really want this information following you when you're trying to get your first job? And we know that the majority of employers now look at applicants' Facebook profiles, and a significant number of candidates, something like 70%, are rejected categorically based on the impressions formed just from looking at their Facebook page.
There are companies now, such as reputation.com, that parents will hire to clean up their kids' act and give them a chance to get a job. So I certainly understand where the notion of the right to be forgotten is coming from, but I think we need to look at it a little more deeply and really look at the potentially unintended consequences that may arise from it. I'd love to hear Michelle's thoughts on this.
Yeah, I feel very similarly. And the one thing I am applauding in the new conversations we're having in Europe is this hope, and I don't know that it's going to happen, this hope for harmonization. Because the worst thing you can do for real privacy, for things that actually protect individual human beings, is to have a fractured landscape where companies are forced by cost into one IT infrastructure and then have 75 or 85 different compliance structures on top.
So one jurisdiction says, save everything for ten years. The next one says, get rid of it in six months. Another one says something else.
What that does is force companies to be uniformly in non-compliance, and it forces a privacy officer, who has to be accountable internally, or even an external consultant or independent auditing firm, to somehow sell the notion of protection. It's such an unnatural act, right?
So all of these rules together mean there is effectively no rule, because there's a pragmatic reality: you have to build one IT system. You can't have one per jurisdiction; it's not just cost-prohibitive, it's business-prohibitive. You have to have that diversity of thinking to be a competitive entity in this day and age.
And so it lets you choose; it lets you race to the bottom. The hope for harmonization is the hope for privacy going forward. It does make things easier on business, which doesn't mean it's a bad thing for the consumer. So I am very hopeful. Efforts like this, efforts like APEC, efforts to somehow get the US to adopt some comprehensive federal-level legislation, are very positive steps. The unintended consequences, though, make me very, very nervous.
So I totally agree: there's a social good in allowing human beings to mature and have that kind of new image; you can kind of re-image yourself. You know, when Facebook first came out... the funny thing about privacy and privacy people is that everyone on my Facebook page is a privacy person. It's really funny. People are like, oh, you do this? And I'm like, yes, I do. I do all the social networking. I'm looking at Pinterest. I do Twitter.
You can't be a privacy professional and not dig in and do your best to stay abreast of where the culture is going. I've heard so many people say that young people don't care about their privacy. Those people don't talk to young people. That's right. When you talk to young people, they care very much about their privacy. I've always suspected that there's a company waiting to be born out there called facelift.com, kind of the next version of reputation.com.
So hey, if anyone has that out there... And I know I'm getting into a very mushy, intellectual place with this right to be forgotten, but it fascinates me so much, because the power in identity, your identity, my identity, my children's identity, is the power to tell your story, your real and authentic self-story, and to tell it on your own.
That's my one great work, and my great hope in ending child identity theft: not that Ms. Thing and Sweet Cheeks will be perfect when they grow up, but that they can tell their own stories. I am not a perfect person. I didn't go to all the Ivy League schools.
I got, you know, a decent high-B average at Ohio State University. I partied like a rock star when it was age-appropriate to do so. I drive a minivan now. The power of my identity is all of that.
And so I really do get nervous when we say we want to wholesale delete people's prior convictions, or anything like that, because it loses a sense of cohesion, culture, and identity in a way that I think is very destructive to privacy itself. So that's my soapbox on that issue; it's fascinating to me. Michelle, you mentioned that one of the problems for businesses and enterprises is the various jurisdictions they operate in and the conflicting rules and regulations they have to put up with.
So isn't it almost as important for us to try to educate government and the judiciary as it is to educate users and software engineers? A hundred million thousand percent. I know that's mathematically wrong, so to any engineers listening, I apologize; it's a hundred percent correct.
I have certainly talked to many, many privacy officers, or people who call themselves privacy officers, who never, ever want to be anywhere near Ottawa or DC, or God forbid Munich or Brussels. And I think that's a huge mistake.
I recently heard a staffer, and I will leave the jurisdiction out so the person can remain nameless and blameless, who was talking about the IP address, the Internet Protocol, as some newfound technology, like cookies, that could be used for spyware. And it was a jaw-dropping statement, because of course the IP address is what the internet is.
It's the addressing system of the internet, and it's been around since, you know, before the internet was even a gleam in someone's eye in the sixties. Right.
I think Grace Hopper knew about IP addresses, so I don't think it's a new concept. And to your point, we can differ on who owns what, or where things should be built, or whether we want clouds to be local in order to create jobs.
We can argue about that; that's fine. But when we tell ourselves that basic technology works in a way it does not, then we're building policy on top of fiction, fantasy, and false assumption, rather than saying, you know what, reality is complex enough. So in the year of the engineer, I think we should have another year of the engineer; even if last year was it, let's have another one. Let's get the engineers to talk to all the regulators and teach them what they know.
And if I can just add: I don't actually like to appear as a regulator when I'm talking to engineers. I just want to appear as a person advancing privacy by design. When I do this kind of talking to different groups, especially engineering groups, but also businesses and politicians, the reason privacy by design appeals to them is that you can do it anywhere. It's not specific to a particular jurisdiction. It's not the legalese of, you know, section 31, subsection 2, part D. It appeals to people everywhere.
And last year, when I talked to engineers, I talked everywhere. As I said, I was in Munich, Palo Alto, San Jose, and Ottawa; all over the world, in Mexico City and Jerusalem. Engineers have an appreciation of the principles of protecting privacy without getting into an exact formulation, so they can adapt them to their particular situation: what makes sense, what is meaningful, how does it apply in an engineering context?
I mean, I love engineers. They have been so welcoming of these concepts.
No one has ever said, I can't do it, forget it. It's always, oh, let's explore; how do we translate this into code? How do we do it? So it's very exciting, and I love doing this educational role.
It's wonderful. I learn so much. I think it's mutually beneficial, and I think we need to do it more. You mentioned one other group, the judiciary. We're just expanding our role in an effort to reach the judiciary, because you're absolutely right: they need to be alerted to the nuances, like data linkages; a number isn't just a number, because it can be linked, through the process of data linkage, sometimes with just one key, to identifiable data. These nuances have to be imparted to them. We've just about run out of time here.
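Ann's point about data linkage, that a single shared key can connect an ostensibly anonymous record back to a person, can be sketched with a toy join. The datasets and field names below are invented for illustration; the pattern mirrors classic re-identification work that matched "anonymous" records to public rolls on quasi-identifiers:

```python
# "Anonymized" medical records: no names, but quasi-identifiers remain.
medical = [
    {"zip": "90210", "birth": "1980-05-01", "diagnosis": "asthma"},
    {"zip": "10001", "birth": "1975-11-20", "diagnosis": "diabetes"},
]

# A separate public dataset (say, a voter roll) that does carry names.
voters = [
    {"name": "Alice Smith", "zip": "90210", "birth": "1980-05-01"},
    {"name": "Bob Jones", "zip": "10001", "birth": "1975-11-20"},
]

def link(medical, voters):
    """Join the two datasets on the shared quasi-identifier (zip, birth date)."""
    by_key = {(v["zip"], v["birth"]): v["name"] for v in voters}
    return [
        {"name": by_key[(m["zip"], m["birth"])], **m}
        for m in medical
        if (m["zip"], m["birth"]) in by_key
    ]

reidentified = link(medical, voters)
```

Neither dataset alone names a patient's diagnosis; the join does, which is exactly why "a number isn't just a number."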
As a matter of fact, we may even be running a little bit over, and we did not even get into a discussion of the announcements Google made this week about changes to their privacy policy, something I want to address; if you subscribe to my newsletter, you'll probably see something about that early next week. But I want to thank Michelle Dennedy, chief privacy officer at McAfee, and Ann Cavoukian, Privacy Commissioner of Ontario, Canada, for being with us today for this wonderful, informative session on privacy by design.
This session has been recorded, and the podcast of the webinar will be available later today for some of you, or early tomorrow for the rest of you, depending on what time zone you're in. With that, again: thank you, Michelle; thank you, Ann; and thank you, everyone. Have a wonderful day. Thank you, Dave. Thank you.