Do people really care about data privacy?
So just a short introduction to Meco, in case you're wondering who we are. We provide the infrastructure for trusted personal data ecosystems. So what does that really mean?
Well, we give businesses the tools to empower their customers to access, control and create mutual value from their personal data. That comes out of our mission, which is that we believe everyone should get equity and value in exchange for the data that they share. As for my role, I'm the Chief Commercial Officer. I've been involved in data and AI for the past 10 years, both in my own startups and in other roles prior to joining Meco.
I worked for a global data broker, where I led a global project around data privacy. So the question, and I've been asked this many times when presenting, is: does anyone actually really care about data privacy? What I wanted to do is start with some headlines.
We've all read these headlines. These are newspapers from around the world, all talking about data privacy, doing the right thing for consumers, and why executives need to be afraid. So is this just media hype around data privacy, or do people actually care about it? I wanted to start by looking at some recent stats, and one of the things that intrigued me was the Edelman Trust Barometer. They've been running the Trust Barometer for many years now.
This one really jumped out at me. We've all been aware of the global pandemic, and we've probably all tried to do our bit in relation to it. You would think that part of doing our bit would mean being glad to share data to help governments and organisations get on top of the pandemic. But I'm afraid what has actually happened is that trust has broken down.
The percentage of people who agreed with the statement "I'm willing to give up more of my personal health and location tracking information" has gone down. And that's probably not a surprise. I can point to at least one example here in the UK, where I'm based, where the UK government actually broke the law in rolling out its test programme: it didn't do a full assessment of the privacy implications involved in rolling out that app.
There have been other examples where data has been shared in breach of privacy policies and so on, which meant that trust in sharing this vital data has gone down. And this needs to be put in the context of a wider picture: what we're looking at generally is trust in technology declining, and I think there are a number of reasons for that.
There are some really important issues that we all need to tackle. Right at the bottom left-hand corner we talk about data responsibility and data privacy, but there's a whole host of other things. We know there is a growing rise of machine learning and artificial intelligence, and some people are fearful that it may remove some jobs from the economy.
Again, we've all seen the headlines associated with that, but we're starting to see it actually filter through in a number of roles across the economy. At the same time, we have all probably experienced, or know somebody who has experienced, issues around fairness or explainability in the use of that technology. The most famous example is Apple's credit scoring, which managed to produce quite a disparity in credit limits between a husband and wife, and they couldn't really explain why that should be the case.
We've also seen things like facial recognition technology that hasn't served certain communities very well. And we're starting to see that there's a lack of diversity within our tech ecosystems, and that is causing problems.
Lastly, I think what we're starting to realise now is that there needs to be some kind of sustainability, some kind of ethics underlying all of this, and that is probably driving some of the regulation. So here's another example of whether or not people care about privacy. Many of you in the audience will know that Apple decided to change its terms.
For the first time, it allowed users to decide for themselves whether they wanted to opt in or opt out of apps tracking their location and various other things. Here are some stats on the share of users who have actually allowed apps to track them: it's below 20%, and that's a pretty consistent figure that is staying flat. So that, again, I think is another indicator that people do care about their privacy.
They wouldn't be opting out if they didn't care about their privacy. Another example, staying on the app piece: we will all have heard about WhatsApp changing their terms and conditions and what that did to the market.
When that was first announced in January of this year, the number of downloads of WhatsApp fell 16%, whereas installs of the more privacy-conscious app Signal grew on the order of four thousand percent compared to the previous month.
So again, I think that's another telltale sign that what we're seeing in the market is people caring about their privacy. And it's probably no surprise that we're seeing a whole host of data privacy regulations emerging globally. We've all experienced the GDPR in Europe, and now obviously the new Data Governance Act will extend some of those GDPR regulations, but you've also got GDPR-like regulations springing up all around the world.
There are some countries I've left off here that have recently introduced regulation, such as New Zealand. Even China has now started to look at the principles of consent-based data sharing; its PI protection law has gone into action, and we're starting to see it taking action against some of the tech companies around privacy as well. I guess the big gap would be the US, but we've seen states move, and I suspect we'll probably see movement at the federal level, particularly with the change of US government.
So, setting the context: we've got growing regulation, and we have consumer sentiment really indicating that people do care about their privacy. So I thought I'd delve into some stats that are hot off the press. This was a survey done in the US by KPMG in August, just last month, and I think some of the stats here are incredibly telling. The two that I really wanted to pick out on the top line are the stats from business leaders: over a third of business leaders said that consumers should be concerned about how their personal data is used by their company. I'll just pause there.
A third of business leaders are saying that consumers should be concerned about how their company is using personal data. That's an astonishing stat. Almost a third also admit that their company's data collection methods are not always ethical, so these business leaders are admitting that they're perhaps not doing things the way they should be. And then on the bottom line you've got consumers: on the left-hand side, 86% say data privacy is a growing concern.
Hopefully all the stuff I've just shown you demonstrates that people do actually care about their privacy. And then you've got growing numbers who are concerned about the level of data being collected; they don't trust companies to use their data ethically, and so on. So what we've got here is what I'm calling a data privacy chasm. People are generally concerned about their data. They're concerned about how and why it's being used, who's using it, who it's being shared with, and whether it's going to be safe.
Is it going to be subject to some kind of data hack? And what are you doing with that data?
And yet, on the other hand, you've got businesses that are simply failing to meet those consumer expectations, with a third of those business leaders actually admitting as much. Why is this important?
Well, I think it's really important because there is a real danger that consumers could actually cut off the supply of personal data to some of these businesses. And we know that businesses are concerned about not getting access to new behavioural data sets. Certainly my previous employer was doing work around how they could get access to those data sets. They felt they were being shut out, and that fear was largely because they thought the tech giants were harvesting this data behind their own walled garden.
But what we're actually starting to see is that those tech giants are losing some of this data. If you look at the browser and search space alone, players like DuckDuckGo and Brave, which have seen massive growth in users in 2021, are starting to take share from the tech giants. And this is really important for businesses, because businesses want to understand consumers in order to provide better products and services, and a lot of the automation we're seeing in the marketplace with machine learning applications relies on data.
At the moment we're not doing anything to bridge this data privacy chasm, but on the positive side there is a massive opportunity for businesses here, and really the way forward for them is to build trust. So how can businesses build trust and help bridge this chasm that has opened up? I think the first thing is around analysing their own business ethics when it comes to data collection.
If a third of business leaders are saying that they're not doing things in an ethical way, it's time for businesses to take a step back and say, okay, we need to start looking at data collection and data use in a much more ethical way than we are currently, and that really means reform. One of the other key areas, and the previous speaker talked about this as well, is transparency.
I think people wouldn't mind sharing some data if they actually knew what was going on, but all too often things like privacy policies and terms and conditions are buried away in small print. They're very legalistic documents, very hard to understand, sometimes split across various documents, and it's just not at all clear what's happening with your data, who it's being shared with, and so on. That, again, is a real problem.
If you look at the open banking space, particularly in the UK and the US, when people are asked to share their open banking data for various purposes there is genuine concern, because there isn't an awful lot of transparency around what is going to be done with that banking data. Is it going to be used to my detriment? So being more open and transparent is critical.
And we do have technologies now that can help here: privacy-enhancing technologies, or PETs as they're called, for anonymising data and utilising things like synthetic data. I think the previous speaker talked about synthetic data. These are data sets that don't hold any PII but that mimic the real data sets, so you can do the analysis on them and still preserve privacy, because you're not using real data.
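To make that idea a little more concrete, here is a minimal sketch, my own illustrative example rather than anything the speaker or Meco describes, of the basic principle behind synthetic data: fit simple statistics to a real table and sample brand-new rows from them, so analysis can run without touching any real individual's record. Real PET toolkits also model correlations between columns and add formal privacy guarantees; the column names and numbers here are invented purely for illustration.

```python
import numpy as np

# Hypothetical "real" data set: ages and monthly spend for 1,000 customers.
# In practice this would be the PII-bearing table you want to protect.
rng = np.random.default_rng(seed=42)
real_age = rng.normal(loc=45, scale=12, size=1_000).clip(18, 90)
real_spend = rng.lognormal(mean=5.5, sigma=0.4, size=1_000)

# Fit simple per-column distributions to the real data.
age_mu, age_sigma = real_age.mean(), real_age.std()
log_spend = np.log(real_spend)
spend_mu, spend_sigma = log_spend.mean(), log_spend.std()

# Draw brand-new synthetic records: statistically similar,
# but no row corresponds to any real individual.
synthetic_age = rng.normal(age_mu, age_sigma, size=1_000).clip(18, 90)
synthetic_spend = np.exp(rng.normal(spend_mu, spend_sigma, size=1_000))

# Analysis runs on the synthetic table instead of the real one.
print("real mean spend:     ", round(real_spend.mean(), 2))
print("synthetic mean spend:", round(synthetic_spend.mean(), 2))
```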
These privacy-enhancing technologies just aren't being used enough in the marketplace, and we need to do more in that space. So what I'm saying here is that corporates have a great opportunity, and they need to establish data responsibilities, because what we're also seeing is that whilst consumers are concerned about their data privacy, they don't really know how to protect their personal data. They're not aware of what can happen once that data is handed over, and they're putting the responsibility back on companies.
So it's incumbent on us as businesses to do more, to ensure not only that we are behaving responsibly, but that we give consumers much more direct control over their personal data, let them take some responsibility, and guide and help them to do that. And I wanted to leave you with two things, a kind of call to action.
We're all employees and we're all consumers here, as well as being business leaders. We have the power to support change here if we think this is really important.
It's incumbent on us as employees and consumers to make some active decisions and take steps to change this, so that the data privacy chasm actually gets reduced and we give people some empowerment around their personal data. I just wanted to leave you with a great example of how companies are starting to do this. This is a company called Vela. They provide workplace credentialing in the Australian market, and they've just started doing exactly this.
Essentially what this does is allow workers to create what are called verifiable credentials, which hold their information and get stored in their own app. They're in complete control of these credentials, and they can then share them when they are obtaining new employment. It's great for the contingent worker supply: they don't have to keep providing photocopies of their documents every time they apply for new work or take on new contract work; they can simply share their credentials.
And from an employer perspective it's great, because you don't necessarily have to store that PII. You can check the credential, confirm that the person is who they say they are and that they are a qualified nurse or electrician or whatever their trade might be, and as long as you've checked that and it's correct, you don't have to store it. So it enables data minimisation instead of people oversharing data, and it reduces the burden on the employer, all while people are actually in control of their own data.
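To illustrate the verification pattern being described, here is a minimal sketch of how an employer could check a digitally signed credential and keep only the outcome of the check rather than the underlying PII. This is my own simplified example, not Vela's actual product or API; the identifiers, claim values, and the use of Ed25519 signing via the Python cryptography package are assumptions made purely for illustration.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# --- Issuer side (e.g. a nursing registration body): sign the credential. ---
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

credential = {
    "subject": "did:example:worker-123",   # hypothetical identifier
    "claim": "Registered Nurse",
    "issued": "2021-09-01",
}
payload = json.dumps(credential, sort_keys=True).encode()
signature = issuer_key.sign(payload)

# The worker keeps (credential, signature) in their own wallet app and
# presents both to a prospective employer.

# --- Employer side: verify the credential, store no PII. ---
def verify_credential(cred: dict, sig: bytes) -> bool:
    data = json.dumps(cred, sort_keys=True).encode()
    try:
        issuer_public_key.verify(sig, data)  # raises if tampered with
        return True
    except InvalidSignature:
        return False

if verify_credential(credential, signature):
    # Data minimisation: record only that the check passed and what was claimed.
    hiring_record = {"credential_check": "passed", "claim": credential["claim"]}
    print(hiring_record)
```

The point of the pattern is the last step: the employer's record holds the result of the check, not the worker's personal documents.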