In this final part, we talk about the candidates for a new standard Digital Signature Algorithm (DSA).
The core component of a secure digital signature is the hash function. The concept of hashing, in its most simplified form, is to take some data (the input), perform some math and/or shuffling of that data, and get a reliable output. When you put the same data into the hash function, you should always get the same output. This gives us a crucial property for verifying the integrity of data: if the hash output matches what is expected, we can verify that the data is exactly the same.
This explanation may sound simplistic, and it is. But all of the complexity comes in the details of the math and transformations that get us to a working digital signature scheme.
A secure hash function needs two essential properties:

- To always produce the same output given the same input.
- To not produce the same output given different inputs.
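Both properties can be demonstrated with a real hash function. Here is a minimal sketch using SHA-256 from Python's standard hashlib (the messages are made up for illustration):

```python
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 hash of the input as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Property 1: the same input always yields the same output.
assert digest(b"attack at dawn") == digest(b"attack at dawn")

# Property 2: different inputs should yield different outputs.
assert digest(b"attack at dawn") != digest(b"attack at dusk")

# Integrity check: recompute the hash and compare to the expected value.
expected = digest(b"attack at dawn")
received = b"attack at dawn"
print("intact" if digest(received) == expected else "tampered")  # prints "intact"
```

This is exactly the integrity check described above: the verifier never needs the hash function to be secret, only for the two properties to hold.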
If you are able to easily find two different inputs that produce the same output, this is known as a collision. Finding collisions is a very active branch of research in cryptography, because being able to reliably generate collisions would allow an attacker to forge all kinds of secure data, and therefore gain entry to authenticated systems, exfiltrate data, or cause damage. A breakdown of a secure hash function equates to a breakdown of nearly all secure systems on the web.
Signatures and hashing are interesting in this regard, because every hashing scheme has infinitely many collisions; the security of a hash function lies in making them too difficult to find.
To understand why every hash function has infinitely many collisions, you have to consider how many possible inputs there are versus how many possible outputs. To get us there, we can consider a very simple hashing scheme.
Let’s begin with our needs. Our algorithm needs to accept data in any form: a document, an email, a security certificate, images, video, whole databases, whatever. From there, we devise a fictional hashing scheme where the output is a single letter from a to z.
Since there are only 26 letters in the English alphabet, we can immediately see a problem: given infinite possible inputs, many inputs are going to produce the same output. This problem scales up to full hash functions, where a limited string of characters cannot be unique given an infinite set of possible inputs, and it is compounded by computers being exceedingly good at making guesses (a single PC can make billions of guesses per hour against standard hashing algorithms).
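To make the pigeonhole problem concrete, here is a toy version of the fictional a-z scheme above. The sum-of-bytes rule is an arbitrary stand-in for the "math and shuffling", chosen only for illustration:

```python
import itertools
import string

def toy_hash(data: bytes) -> str:
    """Map any input to a single letter a-z (a deliberately weak hash)."""
    return string.ascii_lowercase[sum(data) % 26]

# With only 26 possible outputs, the pigeonhole principle guarantees
# collisions: any 27 distinct inputs must include two sharing a letter.
target = toy_hash(b"hello")

# Brute-forcing a collision against 26 outputs is trivial:
for candidate in itertools.count():
    data = str(candidate).encode()
    if toy_hash(data) == target:
        print(f"{data!r} collides with b'hello' (both hash to {target!r})")
        break
```

A real hash function has a vastly larger output space and a far less predictable mixing step, but the pigeonhole argument is identical: the collisions still exist, they are just astronomically harder to locate.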
So what do we do? We make the hash outputs so complex that collisions are exceedingly unlikely to happen by chance, and we use math and shuffling to prevent anyone from easily locating collisions using any known analysis method.
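As a back-of-envelope illustration of "exceedingly unlikely": a birthday attack on an n-bit hash needs roughly 2^(n/2) guesses to find a collision with good probability, so doubling the output size squares the attacker's work. The trillion-guesses-per-second rate below is an assumed, generous figure, not a measurement:

```python
# Birthday bound: an n-bit hash falls to a collision search of
# roughly 2**(n/2) guesses.
for bits in (32, 128, 256):
    guesses = 2 ** (bits // 2)
    seconds = guesses / 1e12            # assume 10^12 guesses per second
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{bits:>3}-bit hash: ~2^{bits // 2} guesses, ~{years:.2e} years")
```

A 32-bit output falls instantly, while the 256-bit case comes out on the order of 10^19 years at the assumed rate, which is why modern hashes use large outputs.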
A minor disclaimer on all of this: this description is very simplistic, and collisions don’t necessarily equate to a fully broken digital signature system. A digital signature system typically combines hashing with a form of keyed cryptography, which prevents collisions from breaking the scheme outright. Quantum computing threatens both the hash functions and the digital signature schemes, hence the search for replacements with a strong preference for collision resistance.
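To show how a key changes the picture, here is a sketch using Python's standard hmac module. Note that HMAC is a symmetric (shared-key) construction standing in for the idea; real digital signatures use asymmetric key pairs, and the key and messages below are made up:

```python
import hashlib
import hmac

# HMAC: a keyed hash. An attacker who finds a hash collision still
# cannot forge a valid tag without also knowing the key.
key = b"shared-secret-key"
message = b"transfer $100 to alice"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, message, tag))                   # True
print(verify(key, b"transfer $100 to bob", tag))   # False
```

The constant-time comparison (`hmac.compare_digest`) matters in practice: a naive `==` check can leak how many leading characters matched through timing.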
Quantum Computing Steps In
The standard hash functions (like MD5, SHA-0, SHA-1, SHA-2, SHA-3, Whirlpool, and BLAKE2) and signature schemes all rely on these same principles with different approaches. The problem is that with the advent of quantum computers, attackers will become exceedingly good at finding collisions in the current schemes, especially if an unforeseen breakthrough dramatically reduces the cost of powerful quantum machines.
Even further, quantum computers (if they continue to advance) will be better than classical computers at attacking current constructions like HMAC and ECDSA, using Grover’s algorithm and Shor’s algorithm respectively.
This has prompted an international effort to find alternative hash functions and digital signature schemes that will protect computers well into the quantum computing age.
The Entries for Post-Quantum Digital Signature Algorithms
CRYSTALS-Dilithium is a lattice-based signature algorithm that derives its security from the difficulty of solving hard lattice problems. The design of the scheme is based on a similar scheme known as Fiat-Shamir with Aborts. (PDF warning)
The primary advantage of CRYSTALS-Dilithium over similar proposals in this contest is that it uses simple uniform sampling instead of Gaussian sampling, (PDF warning) which is very hard to do correctly and very hard to detect if done incorrectly (allowing a malicious actor to tamper with the algorithm to significantly weaken it undetected).
Fast-Fourier Lattice-based Compact Signatures over NTRU (FALCON) is a lattice-based signature algorithm that derives its security from the NTRU lattice model (PDF warning) and is supported by the GPV framework outlined here. (PDF warning)
It features small signatures and key sizes compared to other entries thanks to the NTRU construction, but it also uses Gaussian sampling, (PDF warning) which makes the algorithm easy to get wrong in implementation and makes bad implementations hard to detect (whether through error or malice).
Great Multivariate Short Signature (GeMSS) is an evolution of an older multivariate signature algorithm called QUARTZ, which is built on a variant of the Hidden Field Equations algorithm using the minus and vinegar modifiers (AKA HFEv-).
While this algorithm is efficient and fast, there have been recent improvements in attacks (PDF warning) against the signature scheme GeMSS is based on, which suggest that more problems could surface with further research. It also has rather large public keys, though relatively small signatures.
Lifted Unbalanced Oil and Vinegar (LUOV) (Zip warning) is a multivariate signature scheme that aims to improve on the older and well-studied Unbalanced Oil and Vinegar scheme. (The original algorithm description is only available in Japanese; this is a study of the various algorithms that sprouted from that original paper.)
Like other multivariate entries, it features small signatures but large public keys. It is also conservative in its margin of safety, assuming that quantum attacks against multivariate signatures may improve, which helps ensure that the algorithm can survive modest advances in cryptanalysis.
Multivariate Quadratic Digital Signature Scheme (MQDSS) is a multivariate signature scheme that follows the Fiat-Shamir construction.
It has small keys but large signatures, and was designed with possible hardware acceleration in mind. It is also inherently constant-time, which prevents timing side-channel attacks without having to engineer a complicated mitigation.
The largest drawbacks of MQDSS are that its security proofs are not as robust as other projects’ and that it does not claim NIST level 5 security assurances (equivalent to AES-256 strength). Its strongest variant claims security level 3-4 (AES-192 to SHA-384 equivalent strength).
Picnic is a new signature scheme design championed by Microsoft researchers and multiple researchers from Austrian universities. It is based on the LowMC block cipher and the ZKB++ protocol (which is itself based on ZKBoo).
The scheme is designed to be extremely compact for hardware implementations, allowing for easy hardware acceleration to keep the scheme lightweight on low-power devices if it were to become a worldwide standard. It is also nearly drop-in compatible with current X.509 certificate schemes, only requiring support for larger total signature sizes than existing algorithms, and the project already has working modified OpenSSL code.
It also features integrated tamper resistance, which gives the scheme resistance against a PRNG backdoor with a subtle bias. The security rationale for this claim is not well explained in the paper, but if it were proven, it would be an advancement over current schemes, which rely heavily on their PRNG to create secure signatures.
The project does not propose NIST security level 5 parameters (AES-256 equivalent strength) and currently only claims security level 3 (AES-192 equivalent).
qTESLA is a lattice-based signature scheme whose security rests on the Ring-LWE problem. Ring-LWE has recently seen significant advances in cryptanalysis that weaken its viability, and the qTESLA project does not appear to address this in its documentation. More problems may surface as the NIST contest narrows and research on the individual proposals intensifies.
Rainbow (.zip warning) is a multivariate signature scheme that targets computational efficiency and small signature sizes, but has relatively large key sizes. It is an evolution of the well-researched Unbalanced Oil and Vinegar signature scheme. It is also mathematically simple, which makes the algorithm easy to understand and implement without errors.
Rainbow has been around for quite a while: the scheme was formally proposed in 2005, and security research has not significantly advanced any attacks against it since 2008.
SPHINCS+ is an evolution of the SPHINCS signature scheme, optimized to increase performance while reducing the size of signatures. The project is touted as making few security assumptions, meaning that the foundations of the scheme have sound research backing them and that it isn’t doing anything novel that could backfire after a few more years of research.
The practical SPHINCS+ implementations also share large pieces of code with the XMSS signature scheme, which has been adopted by the Crypto Forum Research Group (CFRG) as the first standardized quantum-resistant signature scheme.
The original paper discloses that fault attacks are possible and that any final implementation should account for this. Not surprisingly, a fault attack against SPHINCS and SPHINCS+ surfaced in 2018 that allows forged signatures at a reasonably low computational cost.
It will be interesting to see how this is mitigated by the team, and if the costs of such a mitigation manage to keep the scheme viable for the competition.