Quantum Technologies – NIST Drives Post-Quantum Standards Adoption
Overview of the first batch of FIPS Standards supporting PQC
Mechanisms to protect information with secret codes, ciphers, and passwords, and the field of cryptography in general, date back to well before the Roman Empire. Today’s digital security systems evolved from simple substitution and manipulation of symbols and letters of the alphabet. This post will explore the development of the next generation of cryptographic technologies, built to withstand attacks from quantum computers – the artist known as Post-Quantum Cryptography (PQC).
Let’s get started! Net-net, encryption must evolve to survive.
First – a Brief History Lesson
The word cryptography comes from the Greek words kryptos, meaning hidden, and graphein, meaning to write. Humans have been advancing “hidden writing” for thousands of years. Early evidence includes symbol-replacement hieroglyphics used in Egypt around 1900 BC, and around 1500 BC a Mesopotamian scribe used cipher-like substitutions to conceal a formula for pottery glaze.
To protect military messages from adversaries, the Romans (in the time of Julius Caesar ~ 100 BC) created the “Caesar Cipher”, a simple encryption method that shifted letters in the alphabet by a fixed number. For example, shifting each letter forward by three would turn “HELLO” into “KHOOR.” Only someone who knew the shift value (the key) could decipher the message. Variations of the Caesar Cipher were used for centuries, influencing the evolution of encryption into more complex forms.
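To make the idea concrete, here is a minimal Caesar cipher sketch in Python (the function name and shift handling are illustrative, not drawn from any historical source):

```python
def caesar_shift(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions, wrapping around the alphabet."""
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return "".join(out)

print(caesar_shift("HELLO", 3))   # KHOOR
print(caesar_shift("KHOOR", -3))  # HELLO -- decryption is just the reverse shift
```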
Naturally, over time, the tools, techniques, and encryption algorithms evolved from manual systems into automated ones, first using dedicated electromechanical devices, and then software and digital code. One of the most famous moments in encryption history was the battle between Allied code breakers and the German Enigma machine during World War II.
This complex device used rotating cipher wheels to scramble messages. Each keystroke changed the encryption pattern, making every message unique and giving the Nazi military communications that were nearly unbreakable: without the correct key settings, decrypting an intercepted message was practically impossible. Led by British cryptanalyst Alan Turing, the team at Bletchley Park famously cracked the Enigma machine’s coding scheme, and this breakthrough gave the Allies a crucial advantage in the war.
First the scary/dire prediction part – the pending “Q-Day”
Fast forward to today, and the next evolution in cryptography is referred to as Post-Quantum Cryptography (PQC). If you have spent any time exploring quantum computing, you have probably run across the term PQC and the huge looming threat quantum computing poses of breaking today’s core public-key encryption methods like RSA. Instead of billions of years, a sufficiently powerful quantum computer could break RSA encryption in days or even hours, putting everything from state secrets to bank account information at risk.
Currently, many encryption algorithms rely on how difficult it is for conventional computers to factor large numbers. RSA, for example, selects two very large prime numbers (numbers divisible only by 1 and themselves) and multiplies them to obtain an even larger number. While multiplying the primes is easy and fast, it is far more difficult and time-consuming to reverse the process and figure out which two primes were multiplied together. These two numbers are known as the “prime factors.” For large enough numbers, a conventional computer has been estimated to need billions of years to recover the prime factors.
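A toy demonstration of that asymmetry, using deliberately small primes (real RSA moduli run to roughly 600 decimal digits, far beyond trial division):

```python
import math
import time

# Multiplying two primes is instant, even for enormous values.
p, q = 1_000_003, 1_000_033
n = p * q

def trial_factor(n: int) -> tuple[int, int]:
    """Recover the prime factors of n by brute-force trial division."""
    for candidate in range(2, math.isqrt(n) + 1):
        if n % candidate == 0:
            return candidate, n // candidate
    raise ValueError("n is prime")

start = time.perf_counter()
print(trial_factor(n))  # (1000003, 1000033)
# Feasible here, but the work grows explosively with the size of n.
print(f"took {time.perf_counter() - start:.3f}s for a 13-digit n")
```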
Simply stated, PQC is the effort to develop new cryptographic methods that can withstand quantum attacks and so defang the so-called “Q-Day”. Q-Day is the estimated point in time when quantum computers will be able to reliably break existing RSA-2048 cryptography. The most common estimates put Q-Day in the early to mid-2030s, though timelines vary widely; NIST and the NSA warn it could arrive as early as 2030 given unforeseen breakthroughs in quantum computing. Bottom line: we know Q-Day is coming, the exact date is unknown, and current encryption algorithms will not be secure forever. So researchers, cryptography experts, governments, and tech companies are already testing quantum-resistant alternatives.
The NIST PQC Program
A key player in the worlds of cryptography and quantum science is NIST – The National Institute of Standards and Technology in the U.S. NIST is a non-regulatory agency operating within the U.S. Department of Commerce. Its core mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve quality of life.
NIST focuses on three main missions:
Measurement Science (metrology) – providing precise units like time and voltage used in trade, commerce, product safety, and infrastructure
Voluntary Standards and Guidelines – creating formal standards to be adopted globally (including by the U.S. Federal Government and its agencies) for industries like information technology, cybersecurity, and manufacturing
Research in Emerging Fields – including quantum science, AI, and advanced manufacturing.
NIST receives primary funding through annual federal appropriations from Congress, typically around $1–1.3 billion. A world leader in its field, NIST has long played a key role in spurring the research, development, and implementation of standards for the most advanced cryptographic techniques and technologies.
PQC Program and Competition
Academic research on the potential impact of quantum computing on cryptography dates to at least 2001. A NIST report published in April 2016 cites experts who acknowledged that quantum technology could render the commonly used RSA algorithm insecure by 2030 (see Q-Day above).
As a result, NIST recognized the need to standardize quantum-secure cryptographic primitives, and in December 2016 it initiated a new multi-stage standardization process by announcing a call for proposals to advance PQC. On August 13, 2024, NIST released final versions of its first three post-quantum cryptography standards.
The call for proposals attracted submissions for 23 signature schemes and 59 encryption/KEM schemes (public key–based schemes) by the initial deadline at the end of 2017. A total of 69 proposals were deemed complete and proper and moved into the first round of assessment and review. NIST engaged the world’s leading cryptography researchers and experts to analyze and attempt to crack the candidate schemes, which steadily reduced the number of candidates. The entire process was open and transparent.
Candidates moving on to the second round of assessment and review were announced on January 30, 2019. On July 22, 2020, NIST announced 7 finalists (“first track”) as well as 8 alternate algorithms (“second track”). The first track contained the algorithms that appeared most promising and would be considered for standardization at the end of the third round; algorithms in the second track could still become part of the standards after the third round completed.
First a word from our sponsor – NIST FIPS
One of NIST’s core mandates is the development, creation, and publication of Federal Information Processing Standards (FIPS), which are mandatory standards for U.S. federal systems. Given the high level of sophistication and diligence that goes into FIPS, the standards are also adopted voluntarily by organizations around the world, beyond U.S. federal agencies, to support robust and secure information and data handling systems that are consistent and interoperable.
The first 3 PQC FIPS standards provide detailed descriptions of post-quantum encryption and digital signature algorithms so they can be implemented consistently to facilitate secure and interoperable communication. These new standards are designed for two essential tasks for which encryption is typically used:
general encryption, used to protect information exchanged across a public network
digital signatures, used for identity authentication.
Federal Information Processing Standard (FIPS) 203:
FIPS 203 is intended as the primary standard for general encryption. It will be used to establish a shared secret key over open (aka insecure, like the Internet) channels. Among its advantages are comparatively small encryption keys that two parties can exchange easily, as well as its speed of operation. The standard is based on the CRYSTALS-Kyber algorithm, which has been renamed ML-KEM, short for Module-Lattice-Based Key-Encapsulation Mechanism.
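As a sketch of how ML-KEM is used in practice, here is a key-establishment round trip in Python, assuming the open-source liboqs-python bindings from the Open Quantum Safe project (not part of the FIPS document itself) and a build that includes the ML-KEM-768 parameter set:

```python
import oqs  # Open Quantum Safe bindings; assumes liboqs is installed

# The receiving party generates an ML-KEM key pair and publishes the public key.
with oqs.KeyEncapsulation("ML-KEM-768") as receiver:
    public_key = receiver.generate_keypair()

    # The sending party "encapsulates": it derives a shared secret plus a
    # ciphertext that only the holder of the private key can open.
    with oqs.KeyEncapsulation("ML-KEM-768") as sender:
        ciphertext, secret_sender = sender.encap_secret(public_key)

    # The receiver "decapsulates" the ciphertext to recover the same secret.
    secret_receiver = receiver.decap_secret(ciphertext)
    assert secret_sender == secret_receiver  # both parties now share a key
```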
Federal Information Processing Standard (FIPS) 204:
FIPS 204 is intended as the primary standard for protecting digital signatures, covering key generation, signing, and verification. The standard uses the CRYSTALS-Dilithium algorithm, which has been renamed ML-DSA, short for Module-Lattice-Based Digital Signature Algorithm.
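The signing flow looks like this. A minimal sketch, again assuming the liboqs-python bindings and the ML-DSA-65 parameter set:

```python
import oqs  # Open Quantum Safe bindings; assumes liboqs is installed

message = b"The contents being signed"

# The signer generates a key pair; the private key stays inside `signer`.
with oqs.Signature("ML-DSA-65") as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

# Anyone holding the public key can verify; no secret material is needed.
with oqs.Signature("ML-DSA-65") as verifier:
    assert verifier.verify(message, signature, public_key)
```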
Federal Information Processing Standard (FIPS) 205:
FIPS 205 is also designed for digital signatures. The standard employs the SPHINCS+ algorithm, which has been renamed SLH-DSA, short for Stateless Hash-Based Digital Signature Algorithm. It is based on a different mathematical approach than ML-DSA, and much like 204 it is well suited for remote digital signing, so it is intended as a backup/alternative should ML-DSA prove vulnerable.
Federal Information Processing Standard (FIPS) 206 – Work in Progress
Work on FIPS 206 is progressing; the standard is currently in draft and is built around the FALCON algorithm. FIPS 206 will be referred to as FN-DSA, short for FFT (fast Fourier transform) over NTRU-Lattice-Based Digital Signature Algorithm. While there is no formal release date, estimates are that 206 will be published in late 2026.
The three finished standards 203, 204, 205 are now ready for use, and organizations have already started integrating them into their information systems to future-proof them against the looming Q-Day “event”.
What the heck is “Lattice-Based” anything?
Post-quantum cryptography research is unfolding along several approaches and algorithm types. This post will not attempt to go deep into them, but the primary approaches are:
Lattice-based – a very large multi-dimensional grid of points, a lattice, is the mathematical foundation. Certain tasks on the grid appear very hard to compute, such as finding the shortest non-zero vector (see the toy sketch after this list).
Multivariate – based on the difficulty of solving systems of multivariate polynomial equations.
Hash-based – builds on cryptographic hash functions, mathematical functions that map data of arbitrary size into fixed-size values, called hash values or hash codes.
Code-based – leverages error-correcting codes, where recovering a general codeword is made computationally difficult by applying transformations to a public key so it looks like a “random” code that is hard to decode.
Isogeny-based – leverages the properties of isogeny graphs of elliptic curves (isogenies are structure-preserving maps between curves). Elliptic curves will be familiar from today’s elliptic-curve Diffie–Hellman key exchange, though isogeny-based schemes use them in a very different way.
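To give a feel for why lattice problems are believed hard, here is a toy “learning with errors” (LWE) instance in Python; the dimension and modulus are far too small to be secure and are purely illustrative:

```python
import random

q = 97   # modulus (real schemes use much larger values)
n = 4    # lattice dimension (real schemes use hundreds)
secret = [random.randrange(q) for _ in range(n)]

def lwe_sample(secret: list[int]) -> tuple[list[int], int]:
    """One noisy linear equation: b = a . secret + e (mod q)."""
    a = [random.randrange(q) for _ in secret]
    e = random.choice([-1, 0, 1])  # the small "error" term
    b = (sum(ai * si for ai, si in zip(a, secret)) + e) % q
    return a, b

samples = [lwe_sample(secret) for _ in range(8)]
print(samples[0])
# Without the error e, Gaussian elimination would recover `secret` easily.
# With it, each (a, b) pair looks random, and recovering the secret becomes
# a hard lattice problem as the dimension grows -- even for quantum computers.
```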
As mentioned above, NIST standards 203 and 204 leverage the lattice-based schemes known as CRYSTALS-Kyber as the primary key-encapsulation mechanism (KEM), and CRYSTALS-Dilithium as a primary digital signature standard.
But Wait – There Is More! A FIPS 203 Backup!
On March 11, 2025, NIST chose Hamming Quasi-Cyclic (HQC) as the fifth algorithm for post-quantum asymmetric encryption; it will be used for key encapsulation/exchange (KEM). The new algorithm will serve as a backup for ML-KEM (FIPS 203). HQC is a code-based scheme that uses different math than ML-KEM, mitigating possible weaknesses should any be found in 203. The draft FIPS standard for the HQC algorithm is expected in early 2026, with final publication in 2027.
HQC and ML-KEM are both what experts call “key encapsulation mechanisms,” or KEMs. A KEM is used over a public network as a sort of first handshake between two parties that want to exchange confidential information.
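To make that handshake concrete: once the KEM round trip finishes (as in the ML-KEM sketch earlier), both parties hold the same 32-byte secret and can use it to key a fast symmetric cipher. A minimal illustration assuming the widely used `cryptography` package; real protocols would first run the secret through a key-derivation function rather than use it directly:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

shared_secret = os.urandom(32)  # stand-in for the KEM-derived shared secret

aead = AESGCM(shared_secret)    # AES-256-GCM keyed directly, for illustration
nonce = os.urandom(12)          # GCM nonce; must never repeat for a given key
ciphertext = aead.encrypt(nonce, b"confidential payload", None)
assert aead.decrypt(nonce, ciphertext, None) == b"confidential payload"
```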
Wrapping Up!
It is apparent that NIST and cryptography experts worldwide are making good progress on developing new, super-hardened cryptographic algorithms and related standards to deal with the potential of quantum computing cracking current cryptographic algorithms. This work is critical for ALL organizations that use information (who doesn’t? 😊) as they begin planning the replacement of hardware, software, and services that use public-key algorithms, so that data and information are protected from future quantum-based attacks.
NIST has extensive experience developing encryption algorithms, and the publication of FIPS 203, 204, and 205 is a first step on the journey to ensuring “Q-Day” is a non-event. With these new standards, and additional standards to come, technology managers can begin to inventory their systems for applications that use encryption, which will need to be replaced before cryptographically relevant quantum computers appear.
Let us know what you think. Please share your thoughts via the Comments section for this post or open a new SubStack chat thread … and please forward this post to your friends and colleagues. See you next time!


