NIST Round 2 and Post-Quantum Cryptography (part 1)

Posted on Feb 6, 2019 by Derek Zimmer

The National Institute of Standards and Technology (NIST) has announced the round 2 candidates for post-quantum cryptography.

What is post-quantum cryptography?

Post-quantum cryptography is encryption that can resist cracking by quantum computers. Quantum computers, which are particularly good at the mathematics underlying current encryption algorithms, have been slowly but steadily advancing. The pace of research suggests that quantum computers capable of breaking current encryption algorithms could be built sometime in the next decade.

These systems are particularly adept at breaking asymmetric encryption like RSA and Diffie-Hellman, which are used for secure handshakes. Typically, an encryption scheme uses asymmetric encryption to create a secure connection, and a secret key is then passed through this connection to start a symmetric secure session using something like AES. This means that a breakdown of the asymmetric exchange at the beginning of the connection lets the attacker read the AES secret key, which breaks the encryption for the rest of the connection.
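The pattern described above can be sketched with textbook Diffie-Hellman followed by a hash-based key derivation; the tiny parameters (p = 23, g = 5) are illustrative choices only and offer no security:

```python
import hashlib
import secrets

# Textbook Diffie-Hellman with toy parameters, purely for illustration;
# real handshakes use 2048-bit groups or elliptic curves.
p, g = 23, 5

# Each side picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1   # Alice's private value
b = secrets.randbelow(p - 2) + 1   # Bob's private value
A = pow(g, a, p)                   # sent over the wire
B = pow(g, b, p)                   # sent over the wire

# Both sides arrive at the same shared secret: g^(a*b) mod p.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob

# The shared secret is hashed into a symmetric session key for, e.g., AES.
session_key = hashlib.sha256(shared_alice.to_bytes(4, "big")).digest()
```

Shor's algorithm recovers the private exponents from the public values regardless of group size, which is why a quantum attacker who records the handshake can reconstruct the symmetric session key.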

Post-quantum research delves into finding the best algorithms for replacing RSA and Diffie-Hellman. The criteria are a combination of security and practicality: a candidate must not only be secure, but also fast enough to be practical. This means it needs to use as few round trips (back-and-forth transmissions over the internet) and as little computing power as possible.

Additionally, quantum computers are also particularly good at breaking digital-signature algorithms. These are algorithms used for integrity and authenticity checking, to ensure that data has not been corrupted or tampered with by a third party. Today's widely deployed signature schemes, which rely on the same RSA and elliptic-curve mathematics as our handshakes (combined with hashes like SHA-2 and SHA-3), are insufficient to protect data from quantum attacks. In a successful quantum attack, a malicious user may be able to forge documents, certificates, or information in order to create “fake” ones. This has significant consequences for secure systems and secure communications, as without signatures there’s no way to verify that a particular person or system made a particular piece of data. Without that assurance you have a breakdown in trust, as a user cannot verify that a message they were sent was not forged. Similarly, a system cannot verify that a file it is loading was not tampered with. This means that without a signature algorithm that has quantum resistance, we lose surety in our messages AND systems become inherently insecure.
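Hash functions themselves are not signature algorithms, but they can be used to build quantum-resistant signatures; one of the round-two candidates, SPHINCS+, is constructed this way. Here is a minimal sketch of the underlying idea, a Lamport one-time signature (the helper names are our own, and real schemes are considerably more involved):

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(s0), H(s1)) for s0, s1 in sk]
    return sk, pk

def message_bits(msg: bytes):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one secret from each pair, selected by the message-hash bits.
    return [sk[i][bit] for i, bit in enumerate(message_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(s) == pk[i][bit]
               for i, (s, bit) in enumerate(zip(sig, message_bits(msg))))
```

Forging a signature would require inverting SHA-256 to recover unrevealed secrets, and Grover's algorithm only halves the effective security of that search. The catch is that each key pair can sign exactly one message, a limitation that schemes like SPHINCS+ are designed to work around.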

The Quest to Find the New Algorithm Standard

With this problem in mind, NIST is looking for algorithms that remain resistant to all known methods of classical attack, but that are also resistant to quantum attacks like Shor's algorithm and Grover's algorithm.
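The two quantum attacks differ sharply in impact, which a little arithmetic makes concrete (the function below is our own illustration):

```python
# Grover's algorithm searches N possibilities in roughly sqrt(N) quantum
# steps, so brute-forcing a k-bit symmetric key costs about 2**(k/2)
# operations instead of 2**k: a quadratic, not exponential, speedup.
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

# AES-128 drops to ~64-bit security; AES-256 retains a comfortable ~128 bits.
assert grover_effective_bits(128) == 64
assert grover_effective_bits(256) == 128

# Shor's algorithm, by contrast, factors an n-bit RSA modulus in polynomial
# time, so no practical key-size increase rescues RSA or Diffie-Hellman.
```

This asymmetry is why symmetric ciphers can survive by doubling key sizes, while the asymmetric algorithms need outright replacement.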

NIST certifies a new encryption standard through a competition. The competition has multiple rounds: a list of candidates is created, then narrowed based on rounds of public comment and on a board of experts who eliminate slower or less safe solutions.

NIST has just released its list of round-two algorithms: public-key encryption and key-establishment candidates to replace RSA, Diffie-Hellman, and Elliptic-Curve Diffie-Hellman, and digital-signature candidates to replace today's RSA- and elliptic-curve-based signature schemes.

The selection process is governed by a body of NIST cryptography experts and follows very specific requirements and measured evaluation metrics.

You can read more about that here: (PDF warning)

The Candidates that have Advanced to Round Two

(for Asymmetric Public-Key Encryption and Key Establishment, to replace RSA, Diffie-Hellman, and Elliptic-Curve Diffie-Hellman)

BIKE
Classic McEliece
CRYSTALS-KYBER
FrodoKEM
HQC
LAC
LEDAcrypt
NewHope
NTRU
NTRU Prime
NTS-KEM
ROLLO
Round5
RQC
SABER
SIKE
Three Bears

(for Digital Signatures, to replace RSA- and elliptic-curve-based schemes such as RSA-PSS and ECDSA)

CRYSTALS-DILITHIUM
FALCON
GeMSS
LUOV
MQDSS
Picnic
qTESLA
Rainbow
SPHINCS+

This narrows the competition down from 69 initial candidates to 26 candidates for round two, with NIST eventually selecting winners in each of the two categories.

Some of these proposals were never really meant to advance, like the joke pqRSA algorithm (PDF Warning), which is just RSA with the key sizes and prime numbers made so massive that even quantum computers couldn't solve them. With keys this large, it would take days to generate a key pair, hundreds of gigabytes of working memory, terabytes of storage, and uncomfortably long times to do anything useful with them. The pqRSA team jokingly suggests that, in the interest of the environment, NIST should generate and distribute all of the keys itself, a deliberately absurd approach that would be both impractical and completely compromised.

As the pqRSA paper itself puts it: “More importantly, it is interesting to see that the conventional wisdom is wrong, and that RSA has enough flexibility to survive the advent of quantum computers—beaten, bruised, and limping, perhaps, but not dead.”

More to come…

Next week I will be writing on the algorithms that have advanced to round two, as these are the serious contenders for being the post-quantum standard.



About Derek Zimmer

Derek is a cryptographer, security expert and privacy activist. He has twelve years of security experience and six years of experience designing and implementing privacy systems. He founded the Open Source Technology Improvement Fund (OSTIF) which focuses on creating and improving open-source security solutions through auditing, bug bounties, and resource gathering and management.
