Keynote at the European Identity & Cloud Conference 2013
May 14-17, 2013 in Munich, Germany
The last keynote this morning will reflect on something we discussed a number of times yesterday, also in the talks: there were some people saying, oh, there's NFC, and this will replace all the different cards and things that we have.
Actually, what most people mean by NFC is using a smart device, like an iPhone or an Android device, as an alternative for strong authentication. The problem, though, is that, as we all know, these devices are not as trustworthy as we actually think, or at least expect, they should be compared to smart cards. So Professor Rannenberg from Goethe University Frankfurt will explain more details on that. Okay. Thanks a lot, Sasha.
Thanks a lot for coming this early in the morning, and let's see what we can offer you on what smartphones nowadays can actually do with regard to the challenge that Sasha has already set up. This comes from a German professor, as you see, from Goethe University in Frankfurt, so it's kind of classic. It has a classic agenda, starting with some introduction about identifiers and credentials, and then we discuss a little bit what smartphones can do and cannot do.
And I will try to bring in some approaches for how we think some of the problems can be solved, without claiming that this covers all of the problems. And, again, a German professor is standing in front of you, so you get a summary and an outlook out of all of this. And it starts with the introduction.
What we've seen already from Peter and from Ralph is that obviously we have identifiers, identities and credentials. We have quite a few of them; maybe we have more than we actually would like to have, but there are good reasons to have not only one but several. And if you look at all of these, I suppose you can easily see the SIM card, and you see a U-Prove credential somewhere in there.
You see one of the login masks of one of our most recent projects, which is ABC4Trust, about which I reported a bit last year regarding the general development. I can report that we're moving forward strongly towards some of the trials, but it's too early to talk about that at this point in time. And you see one of the most fundamental ideas of all of that, which is that basically part of your identifiers, or part of your identities, are actually attributes, and very often, as was already said, only some of your attributes should be shown.
That means some information may be in a credential, or maybe somewhere else, but you don't want to show it. So from that point of view, you need technology that actually helps to limit data flows. And that is just one of the typical requirements that you have on such a type of assurance token that is actually delivering your credentials. So in principle you, as the holder of that token, should be able to decide: what do I want to present?
Like, for example, that I'm over 18, but not where I live; in another situation maybe I want to present that, but I really want to control that kind of data flow. And that means basically that some communication needs to be happening between the assurance token and, ideally, the user, the owner of that assurance token.
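To make that selective-disclosure idea a bit more concrete, here is a minimal sketch in Python; the credential structure and the presentation helper are hypothetical illustrations, not ABC4Trust or U-Prove code:

```python
# Minimal sketch (hypothetical, not ABC4Trust code): a credential holder
# deciding which attributes to disclose to a relying party.

from dataclasses import dataclass

@dataclass
class Credential:
    attributes: dict  # e.g. {"over_18": True, "home_town": "Frankfurt"}

def presentation(credential: Credential, requested: list[str], approved: list[str]) -> dict:
    """Build a presentation containing only the attributes the holder approved."""
    disclosed = {}
    for name in requested:
        if name in approved and name in credential.attributes:
            disclosed[name] = credential.attributes[name]
    return disclosed

cred = Credential({"over_18": True, "home_town": "Frankfurt"})
# The verifier asks for both, but the holder only approves the age claim.
print(presentation(cred, requested=["over_18", "home_town"], approved=["over_18"]))
# -> {'over_18': True}
```

The point is simply that the token holder, not the verifier, decides which attributes flow.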
So from that point of view, of course, everything that is larger than a smart card could actually be useful, and smartphones come into play easily. The next thing is that very often it's useful that an assurance token is able to protect itself. A smart card is usually nicely protected against any direct hack. But, for example, does a smart card know what time it is? Does a smart card know whether the certificate of a reader, a certificate that says "I'm a legitimate reader, I can read you, I should be able to get data from you", is actually outdated or not, or whether it is simply coming out of a totally different time? Usually a smart card cannot do that, because it doesn't have any other channel of communication except through the reader; the world is presented to the smart card through the reader. So from that point of view, some extra communication mechanisms could also be useful, and communication always comes down to something which would be a communication device.
So from that point of view, there are lots of nice reasons why smartphones should be much better credential carriers than smart cards or any other closed devices. You can do credential selection, you can have some advisor there, anything that software offers you could actually work. You have more processing power and communication channels already. Now the question is: is a smartphone a secure storage device? Is it a trusted environment in the sense that the user interface can be trusted?
Is it a trusted platform on which the software is going to run in the way that I as a user think it does? Well, this idea of having something like a secure environment on which you can store something nicely is actually not new. This slide is, let's say, historic; it looks like I dug it out of a very old heap of slides, and indeed this is true, this is from the 20th century.
Already there we had this idea of a wallet, or a personal security assistant, with a private key and some signature function. At that point in time the idea was to say: we need that because we are not really sure that we can trust this old PC. If you remember that grey device, that is a typical PC as it was seen in the early days of the internet.
And that may not be secure enough, because any kind of software and Trojan horse could happen on that PC. So the idea was, well, let's have this wallet, this secure device, that can protect all of these important things, like addresses and contacts, even money, and can perform sensitive actions like signatures and so on.
So from that point of view, if you look at this wallet and if you look at today's or older smartphones, they look quite similar. Even James Bond has been using one of them, as you can see there; actually that was an early PDA, but indeed you see things working. So what are the problems, and why don't we simply use all of that? There are lots of reasons to do so, actually: we have lots of smartphones coming into the market, and they're not too expensive anymore,
like they used to be. Because there was a discussion about these credential carriers and about any kind of these assistants, and one said, well, will people be willing to pay 20 or 50 euros for them? And we know what people pay for smartphones, or what operators are paying for smartphones. What we also had was a discussion about what the typical challenges would be once we had such a nice device, and usually usability was on top of the list. Like, it should be easily portable; that works with smartphones.
Important content should be easily visible: say you're entering a new network and your trust environment may be different; having visibility of that should work. Adequate representation of functionality: smartphones seem to get ever smarter and nicer to work with, so all of that was there, and maybe they're now actually getting close.
But then we come to points like protection from unauthorized access to stored data, protection against manipulation of functionality and against denial-of-service attacks, and also trust in a way that non-experts would be able to establish, so that they can say, well, simply, this is a good device, I can use it. We had already this morning from Peter the discussion that brand is obviously very important because it reduces complexity for people, so they can say, okay, this is a good device, I can trust it. So where are we with regard to that?
And these are just some of the headlines: mobile apps take data without permission, charges over collecting kids' data, smartphone apps are spyware, privacy invaded by smartphones. These are only some of the things; you could find quite a few more. So this wonderful, powerful device is problematic, to some degree partially due to the weakness of the technology and the weakness of the operating systems, which to some degree went a similar way as the early days of the PCs.
And on the other side, it is also sometimes due to the business model of those who are actually financing the smartphone, like Google financing an operating system because they like to get data out of it, financing or co-financing quite a bit of the wonderful developments of some of these smart devices via some of the data that they get out of them and some of the money they're making from developers who want to go into the app store. There is quite a bit of data going out of it.
From that point of view, especially in the app ecosystem, where one says, okay, money is not only made from selling the smartphone, money is made from selling apps or from organizing the app market, the smartphone is unfortunately not as secure as we probably would wish it to be. And here you find a few of the analyses that exist. There is, for instance, a study done by a ministry.
And actually one of the unusual ministries: usually we don't have studies on smartphones being done by the ministry of food, agriculture and consumer protection; usually you would have had the research or economy ministry here. But the point here is actually consumer protection, and the smartphone has come far enough that the ministry for consumer protection, the ministry that usually tries to make sure that the milk we buy can be trusted, considers it maybe something like milk also, so one should take a look into it.
So from that point of view, when we're talking about milk, maybe it's useful to take a look into the shop. But first let's see what issues they have listed. They say, well, there's not really an appropriate level of transparency regarding smartphone applications' sensitive information processing activities.
Users are usually not aware of what is going to happen. There is usually a biased conception of what impacts users' privacy, so some things may actually be totally overvalued, other things may be undervalued, and people really don't have a proper concept of that.
Obviously a smartphone, well, is it a more complex product than milk?
Maybe, maybe not, but clearly consumers don't understand the smartphone as well as they understand milk. Some principles are clearly violated.
I mean, we have that with milk also, but we know what to do with it, and we understand it. And then you see that actually 37% of people who don't use a smartphone say, well, we don't really trust it. So what could one do?
Again, what I'm leaving out here are all of the hardware protection issues, trying to build basically a new smart card into the bigger phones. That's also an important topic and one can do some things there, but I suppose you've heard about that before. What I thought I'd try to bring into this talk, as something which is hopefully a bit more new, is to see what one can do with smartphone app markets. But in any case, what you already have, of course, with smartphones is sandboxing, and the permission granting happens at install time.
And WhatsApp is obviously one of those that is most aggressive in trying to get information out of a smartphone. It asks, well, I would like to have all of this; but in effect it asks, either you give me all of this or you're not going to use WhatsApp. And we can see how many of you are using WhatsApp nowadays on your smartphone. Okay, so it's quite a few of you. How many of you have considered not using WhatsApp because of that? That is even more people, because we're at an expert conference.
And I think I've even seen some hands going up twice. So that means some people have considered it, but then they have maybe given up. So from that point of view, it seems to be a problem, unless we say the market can easily solve everything just by itself. The other thing that we have, obviously, is one-time permissions. And I have tried, or actually my colleagues, who support me very much with this presentation based on the research project they're doing, have tried to spread the examples nicely over all of the platforms.
So you see here the one-time permission that Safari is asking for regarding the current location. You have some on-demand settings; that is basically what you have. But usually privacy information, or information about what happens to your privacy and to your data flow, is static, it's quite coarse-grained, and it comes at the wrong time. Usually it comes when you actually want to get your job done and don't want to consider privacy. From that point of view it's largely ignored; even here among the experts, some are giving up.
So this is often not a really effective measure to inform users about the specific risks that we have. What could one do? That is basically the set of research questions that we have derived from all of this.
Well, how could we try to enable people to make more informed choices? How could we better indicate the level of trustworthiness of apps? And how could users maybe even understand this in the end? If you try to make that goal a little bit more detailed, you can ask: how could we actually try to develop concepts to integrate trust indicators into the app ecosystem? That is, let's say, the app store, for practical purposes, for the moment; of course it can be more, but basically it's the first point.
It is the app store. And how can we inform systems and users about what an app is actually doing, and how can that be communicated? To do that, there is basically a proof of concept for Android and the Google Play store that we are developing now, not because we think that Android is so much better than all the other operating systems, but because it's the easiest one for a university to do research on if they want to develop something. And we're trying to find out what could be trust indicators here.
For that, we developed this privacy-friendly app market project, and it started basically with a process overview. What is actually happening when you're trying to find an app that is useful? What do you do as a user? You go via app search to the app market. You get some list of apps that hopefully match your requirements; usually they're ordered by relevance, whatever relevance means.
But it seems that there's some kind of relation between what you want and the apps that are there, and then you can select an app. You base that decision usually on the ratings that you find, so some kind of rating is happening, on the features described, and maybe on the look and feel that the catalogue entry of the app has. Then you download and install, and at that point in time the permission requests come.
So once you've actually made your decision, now the permission requests come, and you have to make decisions without much of an understanding before you can finally use the app. You have limited information, which is usually not good in a market situation for making a decision. From that point of view, some trust elements would be useful to be added. And that's what we're trying to do: to integrate something on privacy behaviour already into the list of apps.
So the privacy behaviour and the data transfer behaviour should already come into the ranking to some degree, and of course we're trying out how interesting that is for people and how much it actually works out. Then a privacy-related information profile of apps should also be part of the description, obviously, because you may want to see why something has got a good ranking or not such a good ranking. And then the installation process happens.
The idea is basically to have the installation based on this privacy profile, with a minimal delay, and actually showing alternatives; that would be what you want to see. From that point of view, people can then make decisions based on this privacy information, and they can give some feedback, which is important here, we shouldn't overlook that. So basically, once you really install it, the idea is that, as an option, you can give feedback. Obviously we hope that some kind of educational effect on developers can come from that.
So if somebody has positioned themselves nicely in the ranking and written themselves a wonderful description, and then after the installation something happens that actually does not match the description, that should create user disappointment. And if that user disappointment can be channelled to inform other users, that should put more transparency into the market. Basically, there is one main concept and five, let's say, sub-concepts that we are trying to work with here.
The main concept is to say: well, what is the privacy-impacting behaviour, and what are the patterns of this behaviour? Then one thing one can do, of course, is look at run-time privacy based on app behaviour, so what is really happening when the whole thing is working, and even there we need a feedback loop. Then a privacy score, which is useful because all of these rankings seem to be very popular with app markets, and also some trust-enhanced app discovery. So the score is actually going to be used not only for rating, but also for ranking.
You will see that in a minute. Then the installation is trust-enhanced in some way, and finally a privacy policy summary will be based on what is really happening. So here we have the whole play.
Again, we have a local service on the device, and you see we didn't want to do too much advertising for Android devices, so this is not an Android device at this point in time but another one; probably one could also try to get this one going, and this is obviously a Windows one. So the first concept is run-time information about the running app, which could be collected at any point in time. Then the privacy score goes into the market; concept three is the discovery on the app market, and concept four is the privacy policy.
And all of that is fed back into this policy thing. So there are things which are actually influencing the privacy score manager, and there are things that are influenced by the privacy score manager, that is, by the privacy score. The score is something which is also changing, like many of these scores; it's going up or going down depending on how things are behaving, and we are trying to make sure that it gets close to what the real behaviour of things is. So let's have a look at how it is actually going to happen.
This is again the idea of having the monitoring in place and seeing the privacy-impacting behaviour patterns. That could, for example, relate to the sphere of elements that we have thought about, where movement profiling is something which is important, communication profiling is something which people take very seriously, and activity profiling is taken very seriously as well. Yesterday I heard a presentation from somebody from Nokia, and he said that all of those sensor devices, like a humidity meter or an accelerometer or whatever, which these days we still discuss as being too expensive or not too expensive for a smartphone, he expects them to be in the market in a year already, especially now as companies like Nokia are obviously fighting hard for market share and putting extra stuff into the phone to get an advantage in the market.
So we can expect the smartphone of the near future to be much more of a collecting device for personal information than it used to be in the past. Local run-time privacy information could, for example, mean that while we're monitoring and analyzing the app behaviour, as you see in picture one, a notification appears when an app matches a certain pattern, like the disclosure of private information, together with some information about that app's behaviour. So in this case, the app is actually asking about your home town.
And when it asks about your home town, obviously there's a certain probability that this kind of location information that the app is asking for would actually lead to the system really knowing what your home town is. If you get information about that, you can at least make a decision.
And this happens at run time, that is the idea. It happens when you actually give away that information, and then you can get some details about all of that. Then there's information about the app behaviour in more detail, because here it's basically about what it actually means for what the app would know about your home town, so basically down to a map and some information in terms of geo coordinates, how much is actually going to be given away, and then some possibility to give feedback.
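As a rough illustration of the run-time monitoring just described, the following sketch assumes a hypothetical monitoring hook and made-up pattern names; it is not the project's actual implementation:

```python
# Minimal sketch (hypothetical, not the prototype's code): a local service that
# watches which data an app requests and raises a notification when the
# request matches a privacy-impacting pattern such as location profiling.

PRIVACY_PATTERNS = {
    "fine_location": "movement profiling (e.g. your home town could be derived)",
    "contacts":      "communication profiling",
    "app_usage":     "activity profiling",
}

def on_data_request(app_name: str, data_type: str, notify) -> None:
    """Called by the (hypothetical) monitoring hook whenever an app reads data."""
    if data_type in PRIVACY_PATTERNS:
        notify(
            f"{app_name} is accessing {data_type}: "
            f"{PRIVACY_PATTERNS[data_type]}. Allow, deny, or give feedback?"
        )

# Example: the monitored app asks for fine-grained location at run time.
on_data_request("ExampleApp", "fine_location", notify=print)
```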
And again, all of that is obviously a prototype for the trial, so that you don't ask us: when can you buy this software, can you buy it tomorrow night?
No, not yet tomorrow night. But we have some designs and we're running user trials with these designs now. And this is the moment where the user comes in: now you can rate the behaviour just at the moment when it is actually becoming interesting for you, or not interesting for you. You can say, okay, we are putting more burden on the customer now, not only to decide whether they want to give something away, but also to give feedback.
But that's actually why we try to make giving feedback easy, and that's why we're doing these usability tests, to see how the feedback mechanism would work. The second concept is the score for apps in app markets. This was a very early visualization of a privacy score, like with red and green.
That was, let's say, our naive assumption of how you could maybe do it. This is the current state, a bit more complex. So now we have to see whether people want that, or whether we go back to the very simple thing. And I see an immediate reaction from Sasha here, so he likes the simple concept much better than the more complex concept, which obviously was developed by a computer science student. So we will probably go through some cycles and see what comes out in the end.
Still, I'd like to advertise it a little bit, because you see it's actually differentiated among some of the typical use cases. And what we're also trying to find out in the trial is whether people like and understand this concept of what is privacy-invasive behaviour or privacy-invasive transfer of information.
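As an illustration of such a use-case-differentiated score, here is a minimal sketch; the categories, weights and scoring rule are assumptions for illustration, not the model used in the trial:

```python
# Rough sketch (assumed categories and weights, not the project's actual model):
# a privacy score differentiated among typical use cases and then aggregated
# into one overall value for display in the market.

OBSERVATIONS = {
    # category: privacy-impacting transfers observed per week (hypothetical)
    "location": 12,
    "contacts": 0,
    "usage":    3,
}

WEIGHTS = {"location": 0.5, "contacts": 0.3, "usage": 0.2}

def category_score(transfers_per_week: int) -> float:
    """Map observed transfers to a 0..5 score; fewer transfers means a better score."""
    return max(0.0, 5.0 - transfers_per_week * 0.5)

def overall_score(observations: dict, weights: dict) -> float:
    return sum(weights[c] * category_score(n) for c, n in observations.items())

for cat, n in OBSERVATIONS.items():
    print(f"{cat}: {category_score(n):.1f} / 5")
print(f"overall: {overall_score(OBSERVATIONS, WEIGHTS):.1f} / 5")
```

Whether users prefer one aggregate value or the differentiated view is exactly what the usability trial is meant to find out.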
The third concept is basically this trust-enhanced app discovery. Here you see a modified app store, and the privacy score comes in at the side. This again is an experimental screenshot somewhere in the middle, because obviously if you are very much a privacy person, you would say the apps with a very good privacy score should come on top in all cases. And the question is whether people take privacy seriously enough that they want to see that on top, or whether they just want to see it as one of the factors.
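To illustrate privacy as "one of the factors" in discovery, here is a small sketch with an assumed weighting between conventional relevance and the privacy score; it is not the trial's actual ranking formula:

```python
# Minimal sketch (hypothetical weighting): ordering search results by a mix of
# conventional relevance and the privacy score, so privacy is one factor among
# several rather than the only ordering criterion.

from dataclasses import dataclass

@dataclass
class AppListing:
    name: str
    relevance: float      # 0..1, from the usual market search
    privacy_score: float  # 0..5, from the privacy score manager

def discovery_rank(apps: list[AppListing], privacy_weight: float = 0.3) -> list[AppListing]:
    """Higher combined value ranks first; privacy_weight = 1.0 would put privacy on top."""
    def combined(app: AppListing) -> float:
        return (1 - privacy_weight) * app.relevance + privacy_weight * (app.privacy_score / 5)
    return sorted(apps, key=combined, reverse=True)

apps = [
    AppListing("ChattyApp", relevance=0.9, privacy_score=1.0),
    AppListing("QuietApp",  relevance=0.7, privacy_score=4.5),
]
print([a.name for a in discovery_rank(apps)])  # QuietApp ranks first despite lower relevance
```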
So we're trying out a bit to see what people actually understand and what they like when you try to put these two things in relation to each other. The next thing is the app installation, obviously, and there could be at least two cases, judging from the apps that we have seen. The app could be a little bit privacy-invasive, which means a minimal delay of the installation, and the privacy information about the given app could be presented in that time.
And if you have very privacy-invasive apps, then you can actually say, okay, there should be a delay of the installation and a one-click offering of privacy alternatives. The idea of that, of course, is to try to put the privacy decision at a point where people hopefully have some time. It would not work nicely for people who want to install an app on the spot and use it immediately, because they don't have the time for that. But fortunately not all apps are installed for immediate use; that happens, but it is not always happening.
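A minimal sketch of this two-case installation flow follows; the thresholds and the helper name are made up for illustration and are not taken from the prototype:

```python
# Minimal sketch (assumed thresholds): choosing the installation flow based on
# how privacy-invasive an app's profile is, with a delayed install plus
# one-click alternatives for the very invasive case.

def install_flow(app_name: str, privacy_score: float, alternatives: list[str]) -> str:
    """privacy_score: 0 (very invasive) .. 5 (very privacy-friendly); thresholds are made up."""
    if privacy_score >= 3.5:
        return f"Install {app_name} immediately; show the privacy profile briefly."
    if privacy_score >= 2.0:
        return f"Install {app_name} after a short delay while the privacy profile is shown."
    # Very invasive: delay the installation and offer privacy-friendlier alternatives.
    alts = ", ".join(alternatives) or "none found"
    return f"Delay installation of {app_name}; offer one-click alternatives: {alts}."

print(install_flow("ChattyApp", privacy_score=1.0, alternatives=["QuietApp"]))
```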
So at least in that case, we could have some way to intervene. Of course, the most important point is actually to present privacy alternatives, alternatives that may do the same trick for you and could be more privacy-friendly. And then the idea is to have a privacy policy summary. Maybe you have heard some talks from Kim Cameron where he likes to present the typical list of terms and conditions that you get; Kim usually uses the iPhone one, because it is the longest list, and he tries to scroll all the way through it.
And I think he has just about succeeded. Of course, the idea here is to get that a little bit shorter. For some of you it looks encrypted; it's actually German, and that's something that we're going to overcome, but obviously for the user study with Germans we need to do it in German, otherwise we would have already built up the first hurdle. So, all in all, as I promised you, a summary and an outlook, as befits a good German professor: we think that smartphones in principle are something that we should use more in future,
and that they are a great platform for credentials in principle. Obviously one needs to overcome a number of challenges.
We also don't think that the challenges will be overcome by next year, but if we work hard on it, it can probably work. The issues that we are concentrating on here are on the user side of the problem: enhancing user trust, trying to provide useful privacy information for people in several phases, app discovery, installation and usage. App discovery is especially something that we're trying to get into, because not much has happened there. We are also trying to get this feedback loop working better, and if you want another buzzword here, that's crowdsourcing.
So basically, to rate the privacy invasiveness of apps. We hope that this kind of crowdsourced rating of privacy invasiveness, when it is actually happening, would have some influence on the ratings and eventually on the rankings, and maybe even lead to some discussion, with people saying, well, this is a feature I was really surprised is there. And that may also influence developers.
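As a sketch of how such crowdsourced feedback could be folded back into the score, here is a simple weighted update rule; it is an assumption for illustration, not the project's actual mechanism:

```python
# Minimal sketch (assumed update rule): folding crowdsourced user feedback on
# observed behaviour back into an app's privacy score, so the score keeps
# moving towards what apps really do.

def update_score(current_score: float, feedback: list[int], weight: float = 0.2) -> float:
    """feedback: user ratings of observed behaviour, 0 (invasive) .. 5 (fine).
    The new score moves part of the way towards the average feedback."""
    if not feedback:
        return current_score
    observed = sum(feedback) / len(feedback)
    new_score = (1 - weight) * current_score + weight * observed
    return round(min(5.0, max(0.0, new_score)), 2)

# An app with a good self-presented score collects disappointing feedback:
score = 4.0
for batch in ([1, 2, 1], [0, 1], [2, 2, 1]):
    score = update_score(score, batch)
    print(score)   # drifts downwards towards the reported behaviour
```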
Now the question is who can make an effective contribution to something like that, to, let's say, establish it, or to actually implement it beyond a trial coming from a university. What you will see is that obviously app markets could do something on that, because they would have to integrate all of this.
This could be an area where, depending on how the Google ecosystem is going to develop, in the Android situation we may be in a better position than with Apple, because in Android, in principle, several app markets are possible, and they could compete. Some of them could come from trustworthy providers with good brands, which may organize all of this very nicely.
In the case of Apple, you would have to decide whether it's Apple or not; in the case of Microsoft, Microsoft could actually make a decision to get something like that going. Obviously, that extends also to the smartphone providers, and in a way you can also say, to some degree, the tablet providers. And obviously, once you have such a concept, you can of course also ask regulation to consider it, because very often we have seen that people say, well, it isn't possible, so we cannot ask for it, because it doesn't exist.
Now, this will actually be a case study to show that in principle this can exist. I'm happy to acknowledge the support from ABC4Trust and the European Union for some of this work, and from Trust in Digital Life, actually, for this case study in the app market. I hope this wasn't too boring for you; I'm still between you and the morning break or whatever drinks, but thanks a lot for staying that long. It's always good to have some insight into the technology and what's possible and what's not. So, for me, the question has still not been answered entirely.
So this is kind of an ongoing theme for this morning: people claim "I'm going to tell you something very interesting", and then they leave the question open at the end. So, assuming that these devices are used for storing credentials, will there be apps, or are there apps, that will manage this credential stuff, or will the devices themselves be the credential-handling device? Thanks. Thanks a lot for that; actually, that is something that maybe I shouldn't have jumped over so quickly.
I assume that there will be apps that will manage credentials. I think you've already seen a few early ones of them, and in principle this is not so difficult. You see it actually already, for example, as part of the ABC4Trust research project, where we simply have some trials, so we need to have credentials being managed, so we have some credential management software. But plainly, credential management software has been discussed already in the past.
You have seen some examples, for example with Microsoft Passport, and also, now I'm wondering, there was this Microsoft thing with cards whose name just now escapes my mind. CardSpace? Pardon? CardSpace. Yes, CardSpace, exactly.
And there you could see that one metaphor could be cards, because they work; some people say it should be other metaphors.
I think that's an issue that is not trivial, but it can be solved. So the idea of having a selection of your attributes presented on a device, from which you can then choose, I think that is moving forward; we have seen some good examples of that. And whether this in the end comes as an app or whether it comes as an HTML5 application going to the device is, I think, from the development point of view, not difficult.
Again, it can be an issue with regard to the trustworthiness of the device. So from that point of view, my recommendation would actually be: once we have better operating systems, it should come from there. And if we say it happens in the apps, because people don't want to trust the operating system version, then it could actually be one of those apps there, and obviously protecting the information flow out of the credential properly could also be ranked and rated there. Okay.
Still, if you stay at the app level, we are basically at the same point where we were when we tried to build secure soft certificates on PCs, right? This is nothing different. And from a technological point of view, my assumption was always that if I use the smartphone in an NFC environment, for example as an alternative to smart cards, it would actually also use the crypto functionalities of the device, that is, the hardware crypto functionality.
That is what I would like from a development point of view. And for quite some time, quite a few people, including me, have argued, back in 2004 and 2005, that the smartphone, or the upcoming smartphone, is a wonderful platform if we just develop it a bit better than in the early days of the PC.
If we integrate all of the hardware so that we could have that properly in there, then we could make a nice smartphone operating system using all of these nice credentials. And then you see what happened in the market: it basically didn't really come up. What is successful now is not the Nokia phones that actually had quite a few of these things done; Symbian had a better operating system model and a better security model than what you have in the iPhone, and than what you have in Android, clearly.
And so the market went, because of usability, into another direction. Now the question is: how do you react to that? One thing is, you can say, well, we need the money. And that is also something that I'm arguing: the European Union should invest more money into a really trustworthy smartphone platform, a smartphone platform that Neelie Kroes could use to surf, and her grandchildren could also use to surf. That could be a useful investment, but at this point in time this hasn't yet come to Neelie Kroes in a way that we see the investment.
So from that point of view, this is an alternative strategy, which hopefully will also raise a discussion about all of this. I mean, once it's happening in the app market, where the people are, a discussion will happen, people will ask questions. And then I can imagine that quite a few of the security solutions that exist would actually be asked for, and there would be a discussion: can it happen on the app market, or can it happen at the app level? And then more people would listen to what you just said.
And then maybe we end up with a more secure ecosystem, or a more privacy-friendly ecosystem. That's the idea behind why we're doing this: also to try to follow the people where they are on some of these things, not claiming that this is the best security solution. Thank you very much.