Yes, my name is Tomas. I come from Stockholm, Sweden, and I'm Chief PKI Officer at Keyfactor, maybe better known as PrimeKey in Europe. And probably my biggest claim to fame is that I founded the EJBCA open source PKI project some 20 years ago. So, quantum computing: the bad and the good. Before we go into the bad, I wanted to put up a slide on how quantum computing is supposed to be for good.
You know, give us endless green energy by allowing us to make photovoltaic solar panels as efficient as photosynthesis, cures for Alzheimer's and cancer, and all this stuff. And the good thing is that, supposedly, the power of quantum computers will reach the level where they can do some of this before it reaches the level where it can do the bad stuff, or what we think is bad.
Some others may think it's good. But if we get a quantum computer that's capable enough to be what we call a cryptographically relevant quantum computer, it threatens, well, not all, but some of the cryptographic algorithms that we depend on today. Luckily, symmetric algorithms like AES and hash algorithms like SHA-2 are largely unaffected, so we don't have to redesign those; we can continue using them. Asymmetric encryption algorithms, or public key algorithms, on the other hand, will have basically zero strength, which is totally broken. That means RSA, elliptic curve cryptography, EdDSA and other things. Everything that we use for everything today: TLS connections, SSH connections, all digital signatures on contracts, the code signing we do for over-the-air updates, S/MIME secure email, even blockchains, where your wallet typically has an elliptic curve key to sign your transactions.
All of that will be totally unsalvageable. So that's the basic story: quantum is coming.
There's a debate about when a cryptographically relevant quantum computer might be here. Some predict, in the worst case, or best case, depending on who you are, five years; some say they will never come, or not for a hundred years. But there is a risk that they will come, and that risk is unacceptable. That's why we need new algorithms to manage this risk.
You know, IAM is a lot about risk management, and this is a risk that has to be managed. The new algorithms are here, which we'll talk more about, and it's time to prepare, because things like this take a long time. Migrating from something we've been using for 50 years, like RSA, to something new is going to take a long time.
A short timeline: NIST has been running the post-quantum cryptography competition, which probably a lot of you will know about. That started already in 2017.
If we fast forward a couple of years, it reached what they call NIST draft standards for public comment in August last year, which is the last step before we have final standards. And the good thing about these standards is that they are actually a really good international collaboration. Even though NIST ran the competition and will publish the standards for these algorithms, it's a development effort by researchers from Europe, from the US and from Asia. So all parts of the world think these new algorithms are secure and have helped to develop them. Three new algorithms will be standardized: two digital signature algorithms and one key encapsulation mechanism, which is for key transport, or encryption, in short.
Currently there are multiple versions out there, because this has been in draft and development, and things have changed; the parameters of these algorithms have changed over time.
So if you want to do a test run today, you have to know exactly which version of which software to use in order for implementations to be interoperable. But this will all be done in July. In only a month or two, the final standards will be published for ML-DSA, previously known as Dilithium, the digital signature algorithm that will be the generic one and will probably start to replace RSA and EC; SLH-DSA, the second digital signature algorithm, which is hash-based and was previously known as SPHINCS+; and ML-KEM, previously, or currently, known as Kyber, which is the key encapsulation mechanism. More about that later. This is really where the migration and crypto-agility work starts. It's not the end game; this is where it starts and where we have to be ready.
So, let's look at the details.
Now, are these algorithms different? We have these new algorithms, and yes, they are different, so we're going to dig into that here. Keys are bigger, both public and private keys, more details to come. Signatures are bigger as well, more details to come. And KEMs, key encapsulation mechanisms, work a little bit differently. I'll start with the last one, and then we'll dig into the number crunching. Key encapsulation mechanisms are what's going to replace Diffie-Hellman key agreement or RSA key transport, which is what we use in all TLS connections.
There, key agreement is used to negotiate a symmetric encryption key, typically for AES. KEMs work a little bit differently. The end game is the same, you end up with a symmetric key, but technically they work differently, which means protocols have to be redesigned a little bit.
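To make the difference concrete, here is a minimal sketch of the KEM round trip, assuming the liboqs-python bindings and the pre-standard Kyber512 mechanism name (the final standard calls it ML-KEM-512); one side encapsulates against the other's public key, with no interactive agreement step.

```
# A minimal KEM round trip, assuming liboqs-python ("pip install liboqs-python")
# and the pre-standard "Kyber512" mechanism name.
import oqs

KEM_ALG = "Kyber512"  # standardized as ML-KEM-512

with oqs.KeyEncapsulation(KEM_ALG) as server, oqs.KeyEncapsulation(KEM_ALG) as client:
    # The server publishes a public key (in TLS, this rides in a key share).
    server_public_key = server.generate_keypair()

    # The client encapsulates: it derives its copy of the shared secret
    # plus a ciphertext to send back. No Diffie-Hellman-style exchange.
    ciphertext, client_secret = client.encap_secret(server_public_key)

    # The server decapsulates the ciphertext with its private key.
    server_secret = server.decap_secret(ciphertext)

    assert client_secret == server_secret  # both sides now hold AES key material
```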
So TLS has to be updated.
CMS, which underpins S/MIME and, typically, eIDAS document signing, will have to be, well, document signing has no encryption, so that was a bad example, but encrypted email will have to be updated with these new things. So the algorithms are coming, and everything we ship and use today will change.
Some things a lot, some things a little. As I mentioned, protocols and formats like TLS, or PKCS#11 when you talk to hardware security modules; the hardware security modules themselves, of course; TPMs; the secure elements in our phones; everything will have to change. For some of it, it might only be a firmware update, for example for an HSM to be able to support the new algorithms; other, more constrained hardware elements, like SIM cards, will probably need a complete physical replacement with newer, updated versions.
And so will any protocol or application that communicates securely over a network, or stores data, using any of these algorithms. Of course it starts with the cryptographic libraries, be it OpenSSL or Bouncy Castle or whatever is used by your operating system or application; they all have to be updated. And in order to do this, the typical advice is that you need to know what you have: inventory your data and your current encryption algorithms, so you can prioritize what you migrate first based on the criticality of the data you're migrating.
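As a tiny taste of what that inventory work looks like, here is a sketch, assuming the Python cryptography package and a hypothetical server.pem file name, that reports which algorithms a single certificate depends on.

```
# Sketch: one small piece of a crypto inventory. The file name is
# hypothetical; the "cryptography" package is assumed to be installed.
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

with open("server.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

print("Subject:    ", cert.subject.rfc4514_string())
print("Signed with:", cert.signature_algorithm_oid)

key = cert.public_key()
if isinstance(key, rsa.RSAPublicKey):
    print("Public key:  RSA", key.key_size, "bits")  # quantum-vulnerable
elif isinstance(key, ec.EllipticCurvePublicKey):
    print("Public key:  EC", key.curve.name)         # quantum-vulnerable
else:
    print("Public key: ", type(key).__name__)
```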
So let's get into some numbers, now that we've covered the basics, which some of you probably already know, though it was new for some people. First, let's look at the sizes; as I said, keys and signatures are different.
First I'm going to set the baseline, so to say. NIST has defined three security levels, called 128-bit, 192-bit and 256-bit security, which roughly means the same effort as it takes to break AES-128 or AES-256, something like that. Worth noting is that RSA-2048, which is probably the most ubiquitously used asymmetric encryption key out there today, is only 112-bit security, so it doesn't even qualify under the new rules. RSA-3072 is 128-bit security, and that is what we should use if we're comparing with, for example, Dilithium2, which is a quantum-safe algorithm with 128-bit security. That's just to level the playing field, and it's why I chose RSA-3072 as the RSA comparison in these slides.
As you see here, for public key sizes, what's common today is EC P-256, which has really small public keys, or RSA-3072, which has exactly a 3072-bit key, a number a couple of hundred bytes long.
But Dilithium2, which is going to be the generic replacement for these algorithms, is a bit more than 1,400 bytes. So the difference is huge. Now, this might not matter much in itself. I mean, on my phone I have 256 gigabytes, right? So I can store millions of these keys on my phone regardless.
But if you have a constrained device with a limited amount of flash memory where you want to store public keys or root certificates and things like that, where today you can store a number of them, you will be able to store a lot less. So this might be a problem for some applications, some devices, et cetera. And it's similar, or exactly the same, with the private key sizes: EC P-256 keys are really small, RSA-3072 is, well, 3072 bits, and with Dilithium2 we're up to over 3,000 bytes, almost four kilobytes, for the private key.
That means if you have a smart card in the IAM system, or a token, or a secure element, or whatever, where today you can store, say, 20 elliptic curve keys or 10 RSA keys, you might only be able to store one or two Dilithium2 keys. So again, for some applications not a problem at all; for some applications a huge problem. Worth noting here is that the hash-based algorithms, like SPHINCS+ up there, which is the SLH-DSA I mentioned, have a really small public key and private key.
But that doesn't really help, because if we look at the signature sizes, SPHINCS+ has a huge digital signature. And again we see ECDSA P-256 at only 64 bytes per signature, and RSA, well, here the slide actually says RSA-2048, but Dilithium has over two kilobytes of signature.
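If you want to check these numbers on your own machine, here is a small sketch, again assuming liboqs-python and its pre-standard mechanism names, that prints the sizes the library reports.

```
# Sketch: key and signature sizes as reported by liboqs.
# "Dilithium2" is the pre-standard name for ML-DSA-44,
# "Kyber512" for ML-KEM-512; both names are assumptions about your build.
import oqs

with oqs.Signature("Dilithium2") as sig:
    d = sig.details
    print("Dilithium2 public key :", d["length_public_key"], "bytes")
    print("Dilithium2 private key:", d["length_secret_key"], "bytes")
    print("Dilithium2 signature  :", d["length_signature"], "bytes")

with oqs.KeyEncapsulation("Kyber512") as kem:
    k = kem.details
    print("Kyber512 public key   :", k["length_public_key"], "bytes")
    print("Kyber512 ciphertext   :", k["length_ciphertext"], "bytes")
```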
And this of course affects things like certificates, which is my home turf, so of course I have to have a slide with certificate sizes.
Again we see P-256, really small, just a couple of hundred bytes per certificate, while if we go up to Dilithium2, a typical digital certificate is over four kilobytes. Again, this affects some use cases a lot and others not; we'll see. On this slide I tried to summarize one of the places where this is discussed intensely: the impact on a typical certificate chain. If you do a TLS handshake, you transfer a couple of certificates in the handshake in order to authenticate the server and/or the client.
Typically you have a root CA, an issuing CA and the leaf certificate, and in a typical handshake you only transfer the issuing CA and the leaf certificate, because the root certificate is stored in your browser's trust store, or some other trust store.
So look at the public key and signature sizes of a typical certificate chain: for elliptic curve it's only 224 bytes in a typical handshake, excluding the root, while for a typical Dilithium chain it would be over eight kilobytes.
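Here is a back-of-the-envelope version of that arithmetic, counting only the raw keys and signatures in the two transmitted certificates, using reference sizes in bytes; certificate encoding overhead is left out, which is why the totals land near, not exactly on, the slide's figures.

```
# Two transmitted certificates (issuing CA + leaf), each carrying one
# public key and one signature. Reference sizes in bytes; DER/X.509
# encoding overhead is ignored, so totals are approximate.
SIZES = {
    "ECDSA P-256": {"pub": 64, "sig": 64},
    "Dilithium2":  {"pub": 1312, "sig": 2420},
}

for name, s in SIZES.items():
    total = 2 * (s["pub"] + s["sig"])
    print(f"{name}: ~{total} bytes of crypto material per handshake")

# ECDSA P-256: ~256 bytes; Dilithium2: ~7464 bytes. Add certificate
# encoding overhead and the Dilithium chain passes eight kilobytes.
```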
So again, if you're on a fast, low-latency local network, with servers with a lot of CPU, you're not going to notice any difference at all. But if you're on a high-latency network, or if you open a hundred different connections to the other side of the world, with high latency, like a typical web browser does, then this adds to each of those connections, and latency might be affected. This is why browser vendors, for example, are terrified about the impact on users. So it all depends.
When I get asked what the impact is going to be, I hate to say "it depends", but that's unfortunately the truth here. It depends on your use case. You might not see anything at all, or it might be a total disaster.
So that's one thing. The second thing is speed, of course. The new algorithms are quantum-safe because they are based on much more complicated mathematics, so they typically require more RAM and more CPU to process. So how does that affect speed?
I tested on a bunch of hardware security modules, and here we see some, well, interesting things at least. On the left side there are two different hardware security modules, with two different software implementations in the HSM firmware, used over a network: fairly high latency, but multi-threaded. For RSA and EC we see about 30 to 50 signatures per second, which is fairly good for most normal use cases.
If we look at the bottom, for Dilithium, the quantum-safe algorithm, we see that one of the implementations achieves almost the same, 46 signatures per second, roughly on par with RSA and EC for that HSM, while the other one drops to six, which is practically unusable for a lot of use cases.
One way to look at this is that implementations will probably mature and gradually get to approximately the same speed as RSA and EC, which is good news.
We can also see the impact of hardware acceleration, which seems to help the new algorithms as well. On the right here is actually OpenSSL running on my laptop, which is back in the corner there, and which has an Intel CPU with some hardware-accelerated cryptographic functions. That really helps for elliptic curve cryptography, and it seems to help Dilithium as well, which is faster than RSA in this case. So, good news.
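If you want to get a rough number like that on your own hardware, a crude single-threaded sketch, again assuming liboqs-python and the pre-standard Dilithium2 name, looks like this; real HSM figures will of course differ a lot.

```
# Crude single-threaded signing throughput, software only.
# Assumes liboqs-python and the pre-standard "Dilithium2" name.
import time
import oqs

message = b"\x00" * 32  # a hash-sized payload

with oqs.Signature("Dilithium2") as signer:
    signer.generate_keypair()
    n = 1000
    start = time.perf_counter()
    for _ in range(n):
        signer.sign(message)
    elapsed = time.perf_counter() - start
    print(f"Dilithium2: {n / elapsed:,.0f} signatures/second")
```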
At a more application level, of course, I tried issuing certificates using the EJBCA software, which uses Bouncy Castle, another open source cryptographic API, in the background. I can issue 800 certificates per second with RSA-3072, and with Dilithium it's just slightly slower, but still 700 certificates per second, which is again good news, because it satisfies 99.999% of use cases.
It's perfectly fine.
So with that I can conclude what it means for me, or for you, as a user. It means that signing and verification will probably not be horribly slow for normal IT systems, which are fairly powerful. Signing and verification might be horribly slow, or not work at all, for constrained devices. And some TLS connections may break, there's even a site to test it, because the server and client hello messages are larger, and some TLS implementations are not coded to handle that. Database sizes will probably increase.
If you're storing a lot of signed transactions or signed logs, for example, and you're signing a small piece of data, say a bank transfer, where the signature becomes much, much larger than the actual transaction, then you can expect your database size to grow a lot. So you have to plan for that. There will be a lot of upgrades, of course, and there are still many measurements to be done.
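To put that growth in concrete terms, here is a toy calculation; the 200-byte transaction record is an assumption purely for illustration.

```
# Toy calculation: how much of a signed record is the signature?
# The 200-byte record size is an assumed example value.
record = 200
for name, sig in [("ECDSA P-256", 64), ("Dilithium2", 2420)]:
    total = record + sig
    print(f"{name}: {total} bytes per row, signature is {sig / total:.0%}")

# ECDSA P-256: 264 bytes per row, signature is 24%.
# Dilithium2: 2620 bytes per row, signature is 92%; roughly a 10x larger row.
```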
I'm going to skip this and jump to here, where I try to answer: okay, this is an IAM conference, so what's the relevance for IAM? It ties into IAM in many places.
Every TLS or mTLS connection used for machine identity and machine-to-machine communication; things like code signing for over-the-air updates, of course; SPIFFE, since SPIFFE identities typically use X.509 for cluster identities or workload identities in the cloud, in Kubernetes; things like smart cards, USB tokens and secure elements, which will need at least a firmware update, if not physical replacement; wallets, most of which use elliptic curve cryptography today, will have to migrate; and for encrypting private information,
PII, you have to look at how you will protect that in the future. And of course things like eIDs: trust service providers under eIDAS, the QTSPs, will have to look at this as well.
And there's a lot more coming. I didn't go through it in this presentation, but there's a lot of regulation being worked on, and the EU Commission has directed all the countries in the EU to agree on a common plan for this.
So if I were to give one piece of advice: make sure you have a budget for next year to start planning, or to start executing, your migration. Because today most organizations haven't budgeted for this. Some absolutely have, but most have not. And of course there are free-to-use tools available online.
We have one we call PQC Lab. I would also like to highlight something from the IETF, the Internet Engineering Task Force. They have a great resource called Post-Quantum Cryptography for Engineers, which goes through a lot of this in great detail, and it's super helpful, a great document. There's a lot of good information out there that will help everyone in the migration. Thank you.
Thank you very much for walking us through this and giving us a view of the different steps, the work that's being done, and where it's hopefully going to keep going.
We do have a question from the audience I'd like to ask you quickly. In your opinion, how long will it take for trusted public CAs to adopt the new PQC, post-quantum cryptography, algorithms?
I think they will start next year, 2025, to look seriously into it and plan it. Standards-wise, be it the web PKI, where of course the CA/Browser Forum has to go through a process, or in Europe the TSPs and eIDAS, right? So they're going to start next year, I think, and then it will take a couple of years before this rolls out.
So my best guess is five years before it starts to be deployed in something that can be production-ready, so to say.
Yeah. Thank you very much. Okay.