
Post-quantum cryptography

Post-quantum cryptography (PQC), also known as quantum-resistant cryptography, refers to the development and use of cryptographic algorithms and protocols that remain secure against attacks from both classical computers and future large-scale quantum computers. These algorithms are designed to protect sensitive data in applications such as secure communications, digital signatures, and key exchange, even in an era where quantum computing could render many current systems vulnerable. The need for PQC arises primarily from quantum algorithms like Shor's algorithm, which can efficiently factor large integers and solve discrete logarithm problems—foundational to widely used public-key cryptosystems such as RSA and elliptic curve cryptography (ECC)—exponentially faster than classical methods. A sufficiently powerful quantum computer could thus break these systems, potentially compromising encrypted data harvested today for future decryption, a risk known as "harvest now, decrypt later." Symmetric cryptography, like AES, faces lesser threats from Grover's algorithm, which provides only a quadratic speedup and can be mitigated by larger key sizes.

In response, the U.S. National Institute of Standards and Technology (NIST) launched a standardization process in 2016 to evaluate and select quantum-resistant algorithms. After multiple rounds of public competition, NIST announced its first four selected algorithms in 2022, based on lattice-based (e.g., CRYSTALS-Kyber for key encapsulation, CRYSTALS-Dilithium and FALCON for signatures) and hash-based (e.g., SPHINCS+ for signatures) approaches. In August 2024, NIST published the first three finalized standards—FIPS 203 (ML-KEM, based on Kyber), FIPS 204 (ML-DSA, based on Dilithium), and FIPS 205 (SLH-DSA, based on SPHINCS+)—to enable immediate migration to quantum-safe protections for email, e-commerce, and other critical infrastructure. Additional algorithms, such as the code-based HQC for backup encryption, were selected in 2025.

PQC algorithms draw from diverse mathematical foundations believed to resist quantum attacks, including lattice problems, error-correcting codes, hash functions, and multivariate polynomials, while maintaining compatibility with existing infrastructure through hybrid schemes combining classical and quantum-resistant elements. Ongoing research addresses challenges like larger key sizes, computational overhead, and side-channel vulnerabilities to ensure practical deployment.

Background

Definition and Motivation

Post-quantum cryptography refers to the development of cryptographic algorithms and protocols that remain secure against computational attacks by both classical and quantum computers. Because symmetric-key algorithms are only mildly affected by quantum speedups, the focus is primarily on public-key systems vulnerable to quantum algorithms, such as those relying on integer factorization or discrete logarithms. Classical public-key cryptography, including RSA introduced in 1978 and elliptic curve cryptography (ECC) proposed in 1985, enables secure key exchange and digital signatures without pre-shared secrets but lacks resistance to quantum threats. The roots of post-quantum cryptography trace back to early proposals predating widespread awareness of quantum risks, such as the McEliece cryptosystem introduced in 1978, which uses error-correcting codes for public-key encryption. The field formalized following Peter Shor's 1994 algorithm, which showed that quantum computers could efficiently solve problems underlying systems like RSA and ECC. The term "post-quantum cryptography" was coined by Daniel J. Bernstein in the early 2000s, spurring organized research efforts, including the first International Workshop on Post-Quantum Cryptography in 2006. The urgency stems from advancing quantum technology, which threatens to undermine decades of cryptographic infrastructure protecting communications, financial transactions, and national security data. Without migration to quantum-resistant alternatives, current encryption could be retroactively compromised, exemplified by the "harvest now, decrypt later" threat, where encrypted data is stored today for future quantum decryption. Standardization efforts, led by organizations like NIST since 2016, aim to ensure long-term security for information with extended confidentiality needs.

Quantum Computing Threats

Shor's algorithm, developed by Peter Shor in 1994, provides a polynomial-time quantum solution to the integer factorization problem and the discrete logarithm problem, which underpin the security of widely used public-key cryptosystems. The algorithm begins by selecting a random number and computing a function whose period it seeks to find; this period-finding subroutine leverages quantum superposition to evaluate the function across multiple inputs simultaneously. A key step involves applying the quantum Fourier transform to the resulting quantum state, which efficiently extracts the period by identifying periodic patterns in the frequency domain, enabling the subsequent classical computation of factors or logarithms. Optimized versions of this approach achieve a complexity of approximately O((\log N)^2 (\log \log N) (\log \log \log N)) for factoring an integer N, rendering it exponentially faster than the best known classical algorithms. The implications of Shor's algorithm are profound for current public-key cryptography: it can break RSA encryption by factoring large semiprime moduli, compromise Diffie-Hellman key exchange and DSA signatures by solving discrete logarithms in finite fields, and undermine elliptic curve cryptography (ECC) by addressing the elliptic curve discrete logarithm problem. For instance, a sufficiently powerful quantum computer running Shor's algorithm could recover RSA-2048 private keys, which are standard in secure communications today. These vulnerabilities arise because the hardness assumptions of these systems—large-number factorization and discrete logarithms—collapse under quantum polynomial-time attacks.

Grover's algorithm, introduced by Lov Grover in 1996, offers a quadratic speedup for unstructured search problems, reducing the time complexity from O(N) to O(\sqrt{N}) queries on a database of size N. In cryptographic contexts, this enables faster brute-force attacks on symmetric ciphers and hash functions by effectively halving their security levels; for example, AES-256, which provides 256 bits of classical security, would be reduced to approximately 128 bits against Grover's search, while SHA-256 hashing would face similar degradation for preimage attacks. Unlike Shor's exponential advantage, Grover's impact is more modest but still requires countermeasures, such as increasing key sizes to restore security margins.

Breaking practical cryptosystems with these algorithms demands substantial quantum resources, highlighting the current gap between theory and implementation. Earlier error-corrected estimates put the cost of factoring a 2048-bit RSA integer at around 10,000 logical qubits, translating to 20 million or more noisy physical qubits under standard surface-code error correction. More recent estimates for an optimized version of Shor's algorithm reduce this to fewer than one million noisy physical qubits, corresponding to roughly 1,500 logical qubits encoded with advanced codes such as yoked surface codes, with Toffoli gate counts over 100x lower than prior estimates and a runtime of less than one week, assuming a 0.1% gate error rate.
Grover's algorithm for AES-256 key search requires around 1,300 logical qubits but 2^{128} iterations, demanding significantly more overall resources than Shor's algorithm due to the repeated executions, with runtimes estimated from days to over 100,000 years based on circuit depth and hardware cycle times. Progress toward these capabilities is accelerating, as evidenced by IBM's 2025 quantum roadmap, which emphasizes demonstrations of error correction codes and hybrid quantum-HPC algorithms to scale toward fault-tolerant systems. IBM plans to deploy architectures like Quantum Starling, targeting around 200 logical qubits by integrating advanced qLDPC codes that reduce physical qubit overhead by up to 90%, with milestones including real-time error handling on classical chips. Current quantum hardware, such as IBM's systems with hundreds of noisy qubits, remains far from cryptographically relevant scales, but projections suggest viable threats within a decade if scaling continues. The primary risk from quantum computing falls on public-key systems, which Shor's algorithm can shatter entirely, necessitating immediate migration to post-quantum alternatives for protocols like TLS. Symmetric cryptography faces a lesser but actionable threat from Grover's algorithm, where doubling key lengths (e.g., from AES-128 to AES-256) can mitigate the quadratic speedup and maintain security. Hash functions require analogous adjustments for collision and preimage resistance, underscoring the need for proactive hardening across cryptographic primitives.
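The quadratic nature of Grover's speedup can be made concrete with simple arithmetic; the sketch below (an illustrative cost calculation only, not a quantum simulation) computes the approximate number of Grover iterations, roughly (pi/4)·sqrt(2^n) for an n-bit key, and the halved effective security level for common AES key sizes.

```python
import math

def grover_cost(key_bits: int) -> dict:
    """Approximate brute-force costs for an n-bit key, classically and under Grover's algorithm."""
    classical_ops = 2 ** key_bits                              # exhaustive search, worst case
    grover_iters = (math.pi / 4) * math.sqrt(2 ** key_bits)    # near-optimal number of Grover iterations
    return {
        "key_bits": key_bits,
        "classical_ops": classical_ops,
        "grover_iterations": grover_iters,
        "effective_quantum_bits": key_bits / 2,                # quadratic speedup halves the security level
    }

for bits in (128, 192, 256):
    c = grover_cost(bits)
    print(f"AES-{bits}: ~2^{c['key_bits']} classical ops, "
          f"~2^{math.log2(c['grover_iterations']):.1f} Grover iterations, "
          f"~{c['effective_quantum_bits']:.0f}-bit post-quantum security")
```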

Cryptographic Primitives

Public-Key Encryption and Key Encapsulation

Public-key encryption schemes in post-quantum cryptography (PQC) are designed to ensure confidentiality of data transmitted over public channels, protecting against decryption by adversaries equipped with quantum computers. These schemes replace classical public-key encryption methods, such as RSA, which are vulnerable to Shor's algorithm that efficiently factors large integers and solves discrete logarithms on quantum hardware. In PQC, encryption primitives maintain semantic security while accommodating the computational threats posed by quantum algorithms. Key encapsulation mechanisms (KEMs) serve a complementary role in PQC by enabling secure key exchange between parties, facilitating the establishment of shared symmetric keys for protocols like TLS without direct transmission of the key material. KEMs encapsulate a randomly generated secret key within a ciphertext that only the intended recipient can decapsulate using their private key, providing forward secrecy when used with ephemeral key pairs and resisting quantum attacks that break classical key exchange methods like ECDH. This is critical for hybrid cryptographic systems where PQC KEMs integrate with existing symmetric ciphers to upgrade internet security standards. PQC public-key encryption typically achieves IND-CCA2 security, which protects against adaptive chosen-ciphertext attacks by ensuring that even if an adversary can query a decryption oracle (except for the target ciphertext), they cannot learn the plaintext. For KEMs, the base construction often provides IND-CPA security, which is then elevated to IND-CCA security via the Fujisaki-Okamoto (FO) transform, a hybrid encryption technique that combines the public-key scheme with a symmetric cipher and random oracle to bind the message and randomness securely. The FO transform, originally proposed in 1999 and refined in subsequent works, allows conversion of weakly secure primitives into strongly secure ones, making it a foundational tool for PQC KEM standardization. One of the primary challenges in deploying PQC public-key encryption and KEMs is their increased computational overhead and resource demands compared to classical counterparts. These schemes often require larger public key sizes—typically several kilobytes—to achieve equivalent security levels, leading to higher bandwidth usage and storage needs in protocols like TLS handshakes. Additionally, encryption and decapsulation operations can be slower due to the more complex arithmetic of the underlying constructions, which demands more CPU cycles on standard hardware. Despite these trade-offs, ongoing optimizations aim to mitigate performance impacts while preserving IND-CCA security. A representative example of a PQC KEM is the lattice-based approach, where the recipient generates a public key consisting of structured vectors in a high-dimensional lattice space and the sender encapsulates the shared key by adding noise to a multiple of that public key, allowing decapsulation through noise removal via the private key. This general structure, as instantiated in schemes like Kyber, relies on the hardness of learning with errors (LWE) problems, providing efficient key exchange with provable security reductions.
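To make the Fujisaki-Okamoto wiring visible, the following minimal sketch implements the derandomize-encrypt-then-re-encrypt-check pattern on top of a deliberately toy, insecure placeholder "PKE"; all function names here are illustrative and do not correspond to any standardized KEM.

```python
import hashlib, hmac, secrets

# --- Toy placeholder "PKE" (NOT secure and NOT truly asymmetric): it exists only so the
# --- Fujisaki-Okamoto wiring below has a deterministic scheme to re-encrypt and compare.
def toy_keygen():
    sk = secrets.token_bytes(32)
    pk = sk  # placeholder: a real scheme would publish a one-way image of the secret key
    return pk, sk

def toy_encrypt(pk: bytes, m: bytes, r: bytes) -> bytes:
    # Deterministic in (pk, m, r): XOR the message with a pad derived from pk and r.
    pad = hashlib.sha3_256(b"pad" + pk + r).digest()
    return r + bytes(a ^ b for a, b in zip(m, pad))

def toy_decrypt(sk: bytes, c: bytes) -> bytes:
    r, body = c[:32], c[32:]
    pad = hashlib.sha3_256(b"pad" + sk + r).digest()
    return bytes(a ^ b for a, b in zip(body, pad))

# --- FO-style KEM layered on the toy PKE ---
def encapsulate(pk: bytes):
    m = secrets.token_bytes(32)                           # random seed/message
    r = hashlib.sha3_256(b"G" + m + pk).digest()          # G(m, pk): derandomized coins
    c = toy_encrypt(pk, m, r)                             # deterministic encryption under r
    K = hashlib.sha3_256(b"H" + m + c).digest()           # session key binds m and the ciphertext
    return c, K

def decapsulate(sk: bytes, pk: bytes, c: bytes):
    m = toy_decrypt(sk, c)
    r = hashlib.sha3_256(b"G" + m + pk).digest()
    if toy_encrypt(pk, m, r) != c:                        # re-encryption check (CCA protection)
        return hashlib.sha3_256(b"reject" + sk + c).digest()  # implicit rejection
    return hashlib.sha3_256(b"H" + m + c).digest()

pk, sk = toy_keygen()
c, K_sender = encapsulate(pk)
K_receiver = decapsulate(sk, pk, c)
assert hmac.compare_digest(K_sender, K_receiver)
print("shared secret established:", K_sender.hex()[:16], "...")
```

The essential FO ideas shown here are derandomization (coins derived from the message), the re-encryption check on decapsulation, and implicit rejection, which together lift an IND-CPA base scheme toward IND-CCA security.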

Digital Signatures

Digital signatures serve as a core cryptographic primitive in post-quantum cryptography (PQC), enabling the verification of message authenticity, integrity, and non-repudiation while resisting forgery by quantum adversaries. These schemes aim to provide existential unforgeability under chosen-message attacks (EUF-CMA) security in models that account for quantum computation, such as the quantum random oracle model (QROM). Unlike classical digital signatures based on integer factorization or discrete logarithms, PQC signatures must withstand attacks from algorithms like Shor's, which efficiently solve these problems on large-scale quantum computers, thereby rendering traditional schemes such as RSA and ECDSA insecure. PQC digital signature designs generally adapt classical paradigms, such as the Fiat-Shamir transform, which converts secure identification protocols into signatures via hash functions assumed secure against quantum attacks. A prominent distinction arises in hash-based constructions between stateful and stateless mechanisms. Stateful schemes require the signer to maintain and monotonically update an internal counter or state after each signature generation to prevent key reuse, which could otherwise enable forgery if nonces or subkeys are repeated. In contrast, stateless schemes generate each signature independently using randomized components, avoiding the need for persistent state. These design choices involve inherent trade-offs in performance and practicality. Stateful approaches typically yield smaller public keys, signatures, and faster generation and verification times—often by factors of 2 to 10 compared to stateless alternatives—due to their efficient use of one-time signature primitives extended via structures like Merkle trees. However, they demand robust state management to ensure sequential signing and prevent errors in distributed or multi-threaded environments, potentially complicating deployment. Stateless designs mitigate these management risks, making them more robust for concurrent use cases, but at the expense of larger signatures (frequently 10-50 kB) and higher computational overhead from additional hashing and randomization steps. PQC-specific challenges for digital signatures include ensuring resistance to quantum-enhanced forgery attacks, where adversaries may exploit superposition to query the signing oracle in parallel, necessitating security reductions in the QROM rather than classical random oracle model. Moreover, the underlying quantum-resistant hard problems often result in substantially larger cryptographic objects than classical equivalents, with signature sizes scaling to several kilobytes to achieve comparable security levels (e.g., 128 bits post-quantum), which poses bandwidth and storage constraints in bandwidth-limited applications like IoT or blockchain. These issues underscore the need for optimized implementations balancing security, efficiency, and usability. Among PQC signature families, hash-based schemes construct security from the second-preimage resistance of quantum-secure hash functions, employing one-time signatures (e.g., via Winternitz or Lamport primitives) aggregated through tree-based methods to support multiple signatures from a single key pair without revealing the private key. 
Lattice-based signatures, meanwhile, derive security from problems like the short integer solution or learning with errors over lattices, typically using zero-knowledge proofs with rejection mechanisms to ensure signatures are statistically close to uniform distributions, thereby hiding information about the secret key. These approaches provide diverse options, with hash-based emphasizing simplicity and provable security under minimal assumptions, while lattice-based offer potential for smaller sizes through advanced algebraic structures.

Symmetric-Key Adaptations

Grover's algorithm poses a threat to symmetric cryptography by providing a quadratic speedup for brute-force key searches, effectively halving the security level of a key of length n bits to approximately n/2 bits in terms of quantum computational effort. To maintain equivalent post-quantum security, symmetric key lengths must be doubled; for instance, AES-128, which offers 128 bits of classical security, requires upgrading to AES-256 to achieve 128 bits of post-quantum security against Grover's attack. NIST has confirmed that existing AES variants with 128-, 192-, or 256-bit keys remain suitable for post-quantum use when appropriately sized, without needing entirely new primitives. Adaptations for hash functions follow a similar principle, as Grover's algorithm also accelerates preimage attacks, necessitating doubled output sizes to preserve security margins. For example, SHA-256 should be replaced by SHA-512 to retain 128-bit post-quantum preimage resistance. Sponge-based constructions like SHA-3 exhibit inherent resistance to such attacks due to their design, which absorbs input into a state and squeezes out fixed-length outputs, but larger variants (e.g., SHA-3-512) are recommended for heightened security levels. These adjustments ensure that hash-based applications, such as key derivation, maintain integrity against quantum adversaries without overhauling foundational algorithms. In post-quantum protocols, symmetric cryptography serves as an efficient core for bulk data protection, often integrated into hybrid schemes where quantum-resistant key establishment feeds into symmetric encryption for confidentiality. NIST's standardized key encapsulation mechanisms, for example, pair with AES-GCM for authenticated encryption, leveraging symmetric primitives' performance while relying on post-quantum methods for initial key exchange. For message authentication, quantum-resistant MACs like HMAC, when constructed with doubled-length hashes (e.g., HMAC-SHA-512), provide robust integrity protection, as their security reduces to the underlying hash function's post-quantum strength. This approach balances efficiency and security in resource-constrained environments.
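As an illustration of these sizing recommendations, the sketch below (assuming the third-party Python `cryptography` package is available) pairs AES-256-GCM for authenticated bulk encryption with HMAC-SHA-512 from the standard library, following the doubled-length guidance described above.

```python
import hashlib, hmac, os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# AES-256-GCM: a 256-bit key retains ~128-bit security against Grover-accelerated key search.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)
nonce = os.urandom(12)  # 96-bit nonce, unique per message
ciphertext = aesgcm.encrypt(nonce, b"bulk application data", b"associated data")
plaintext = aesgcm.decrypt(nonce, ciphertext, b"associated data")
assert plaintext == b"bulk application data"

# HMAC-SHA-512: a doubled hash output preserves a comfortable post-quantum security margin.
mac_key = os.urandom(64)
tag = hmac.new(mac_key, b"message to authenticate", hashlib.sha512).digest()
assert hmac.compare_digest(tag, hmac.new(mac_key, b"message to authenticate", hashlib.sha512).digest())
print("AEAD ciphertext bytes:", len(ciphertext), "| HMAC-SHA-512 tag bytes:", len(tag))
```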

Algorithm Families

Lattice-Based Cryptography

Lattice-based cryptography relies on the computational hardness of problems defined over lattices, which are discrete subgroups of Euclidean space generated by integer linear combinations of basis vectors. These problems provide a foundation for quantum-resistant cryptographic primitives due to their resistance to attacks by quantum algorithms like Shor's. The field originated in the 1990s with Miklós Ajtai's seminal work demonstrating worst-case to average-case hardness reductions for lattice problems, enabling the construction of secure hash functions and one-way functions based on the difficulty of approximating the shortest vector problem (SVP) in lattices. This breakthrough laid the groundwork for public-key cryptography, with early efficiency improvements realized in the NTRU cryptosystem, which uses polynomial rings to achieve compact keys and fast operations.

The core hard problem in lattice-based cryptography is the Learning With Errors (LWE) problem, introduced by Oded Regev in 2005, which posits that it is difficult to recover a secret vector \mathbf{s} \in \mathbb{Z}_q^n from samples of the form (\mathbf{a}_i, b_i = \langle \mathbf{a}_i, \mathbf{s} \rangle + e_i \mod q), where \mathbf{a}_i are random vectors in \mathbb{Z}_q^n, e_i are small errors drawn from a discrete Gaussian distribution, and q is a modulus. In matrix form, the search-LWE problem involves finding \mathbf{s} given \mathbf{A} \mathbf{s} + \mathbf{e} = \mathbf{b} \mod q, where \mathbf{A} \in \mathbb{Z}_q^{m \times n} is a random matrix, \mathbf{e} has small entries, and m is polynomial in n. The decision variant, which asks to distinguish LWE samples from uniformly random ones, underpins most constructions and benefits from quantum worst-case hardness reductions to lattice approximation problems like SVP and the shortest independent vectors problem (SIVP). To enhance efficiency, variants like Ring-LWE and Module-LWE restrict the domain to structured rings. Ring-LWE, proposed by Lyubashevsky, Peikert, and Regev in 2010, operates over the polynomial ring \mathbb{Z}_q[x]/(x^n + 1), replacing matrices with polynomial multiplication to reduce key sizes and computation to O(n \log n) via fast Fourier transforms, while preserving average-case hardness tied to ideal lattice problems. Module-LWE, defined by Langlois and Stehlé in 2012, generalizes this to modules over polynomial rings, offering a balance between the flexibility of plain LWE and the efficiency of Ring-LWE by using rank greater than 1, which supports parallelizable operations and has been shown hard under worst-case module lattice assumptions.

Lattice-based constructions leverage these problems for versatile primitives. Public-key encryption schemes, such as Regev's original LWE-based scheme, use the secret \mathbf{s} as the private key, with the public key comprising \mathbf{A} and \mathbf{b}; encryption combines a random subset of the rows of \mathbf{A} and \mathbf{b} with the message scaled by \lfloor q/2 \rfloor, and decryption recovers the message by rounding after subtracting the inner product with \mathbf{s}, succeeding if the accumulated noise remains below q/4. Digital signatures are derived via the Fiat-Shamir with aborts paradigm, introduced by Lyubashevsky in 2009, which transforms zero-knowledge identification protocols over LWE into non-interactive signatures by hashing commitments and challenges, with aborts (rejection sampling) to prevent the signature distribution from leaking the secret. These methods extend to key encapsulation mechanisms (KEMs) for secure key exchange.
The advantages of lattice-based approaches include their broad applicability across encryption, signatures, and KEMs, as well as robust worst-case to average-case reductions that guarantee security even against the hardest lattice instances, unlike average-case assumptions in other paradigms. For instance, the NIST-selected Kyber KEM is based on Module-LWE, providing efficient post-quantum key exchange.
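The encrypt-by-rounding idea sketched above can be made concrete with a deliberately tiny, insecure toy version of Regev-style LWE encryption of a single bit; the parameters are chosen only so the arithmetic is easy to follow.

```python
import random

n, m, q = 16, 64, 3329          # toy dimensions and modulus (far too small to be secure)

def small_error():
    return random.randint(-2, 2)  # stand-in for a narrow discrete Gaussian

def keygen():
    s = [random.randrange(q) for _ in range(n)]                       # secret vector
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]   # public random matrix
    b = [(sum(A[i][j] * s[j] for j in range(n)) + small_error()) % q for i in range(m)]
    return (A, b), s

def encrypt_bit(pk, bit):
    A, b = pk
    r = [random.randint(0, 1) for _ in range(m)]                      # random subset of samples
    u = [sum(A[i][j] * r[i] for i in range(m)) % q for j in range(n)]
    v = (sum(b[i] * r[i] for i in range(m)) + bit * (q // 2)) % q     # encode the bit as 0 or ~q/2
    return u, v

def decrypt_bit(s, ct):
    u, v = ct
    d = (v - sum(s[j] * u[j] for j in range(n))) % q                  # = bit*(q//2) + small noise
    return 1 if q // 4 < d < 3 * q // 4 else 0                        # round to the nearest encoding

pk, sk = keygen()
for bit in (0, 1, 1, 0):
    assert decrypt_bit(sk, encrypt_bit(pk, bit)) == bit
print("toy LWE encryption round-trips correctly")
```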

Code-Based Cryptography

Code-based cryptography encompasses public-key schemes whose security is predicated on the computational difficulty of decoding general linear error-correcting codes. The foundational hard problem is the syndrome decoding problem (SDP), which involves finding a low-weight error vector that produces a given syndrome for a linear code; this problem was proven NP-complete for binary linear codes in 1978. The McEliece problem generalizes this by requiring the recovery of a plaintext from a ciphertext generated using a hidden linear code disguised as a general one, assuming the underlying decoding hardness holds. These assumptions provide resistance against quantum attacks, including Shor's algorithm, as no efficient quantum algorithm is known to solve SDP for random codes. The seminal construction is the McEliece cryptosystem, introduced in 1978, which uses binary Goppa codes to enable public-key encryption: the public key consists of a scrambled generator matrix of the Goppa code, while encryption adds a low-weight error vector to the codeword encoding the message. For digital signatures, code-based schemes often employ quasi-cyclic (QC) codes to achieve compact keys and efficiency; these leverage the algebraic structure of QC codes for syndrome-based verification, as in Fiat-Shamir-style identification protocols adapted to decoding challenges. In March 2025, NIST selected HQC, a code-based KEM built on quasi-cyclic codes, for standardization as a backup to the primary lattice-based ML-KEM. Classic McEliece, a modern instantiation using Goppa codes, remains a secure candidate in the NIST process for key encapsulation mechanisms, though its standardization has been delayed due to large key sizes. Key advantages of code-based schemes include their long-standing resistance to cryptanalysis—over 45 years without structural breaks for properly chosen codes—and their reliance on well-studied problems in coding theory. However, a primary challenge is the large public key sizes, ranging from roughly 260 kB at security levels comparable to AES-128 to about 1 MB at the highest levels, due to the need for dense generator or parity-check matrices to hide code structure. Variants address these issues: the Niederreiter formulation reframes encryption as syndrome computation with an error vector as the message, using a parity-check matrix for potentially smaller keys in certain settings. Additionally, random linear code encryption (RLCE) schemes use truly random codes without special structure, enhancing security proofs against generic attacks while maintaining efficiency through systematic forms.
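To state the syndrome decoding problem concretely: given a parity-check matrix H over GF(2) and a syndrome s, find a low-weight error e with H·eᵀ = s. The toy sketch below plants such an error, publishes only (H, s), and brute-forces the solution, which is feasible only because the parameters are tiny compared to real code-based schemes.

```python
import random
from itertools import combinations

r, n, w = 8, 16, 2            # toy sizes: r parity checks, code length n, error weight w
random.seed(1)

def syndrome(H, e):
    """Compute s = H * e^T over GF(2)."""
    return [sum(H[i][j] & e[j] for j in range(n)) % 2 for i in range(r)]

H = [[random.randint(0, 1) for _ in range(n)] for _ in range(r)]   # random parity-check matrix

# Plant a secret low-weight error and publish only (H, s).
support = random.sample(range(n), w)
e_secret = [1 if j in support else 0 for j in range(n)]
s = syndrome(H, e_secret)

# Syndrome decoding: recover some weight-w error with the same syndrome.
# Brute force works only at toy sizes; its infeasibility at real sizes is the security assumption.
solutions = [c for c in combinations(range(n), w)
             if syndrome(H, [1 if j in c else 0 for j in range(n)]) == s]
print("planted support:", sorted(support), "| candidate supports found:", solutions)
```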

Multivariate Cryptography

Multivariate cryptography relies on the hardness of solving systems of multivariate quadratic equations over finite fields, primarily the Multivariate Quadratic (MQ) problem, which involves finding a solution to a set of quadratic polynomials in several variables. The MQ problem is NP-complete, providing a foundation for cryptographic primitives resistant to both classical and quantum attacks, as no efficient quantum algorithms are known to solve it. A complementary hard problem is the Isomorphism of Polynomials (IP), which asks whether two sets of multivariate polynomials are equivalent under an invertible affine transformation of variables; this serves as a basis for trapdoor constructions in multivariate schemes. General constructions in multivariate cryptography use trapdoors based on these problems so that the public polynomial map is easy to evaluate but hard to invert without the private key. For digital signatures, the Unbalanced Oil and Vinegar (UOV) scheme introduces a trapdoor by partitioning variables into "oil" and "vinegar" sets, where the private key allows solving a structured system that appears random publicly; this was proposed to resist certain algebraic attacks. Encryption variants employ similar layered trapdoors, akin to those in Rainbow-like structures, where the public key is a composition of affine transformations around a central quadratic map, enabling decryption via inversion of the trapdoor. These schemes offer advantages such as fast signature generation and verification, making them suitable for resource-constrained environments in post-quantum settings. However, drawbacks include large public key sizes due to the need to specify numerous polynomial coefficients, and vulnerability to recent attacks; for instance, the Rainbow signature scheme, a prominent multivariate candidate, was broken via key recovery in 2022 using under 53 hours on a single-core laptop. Multivariate cryptography originated in the late 1980s with the Matsumoto-Imai scheme, an early public-key encryption based on quadratic polynomials over finite fields. Following attacks on encryption variants in the 1990s, research shifted focus to signature schemes like UOV, which have proven more resilient.
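To make the MQ problem concrete, the snippet below generates a random system of quadratic polynomials over GF(2) and brute-forces a planted solution; recovering a solution from only the public system and target values is the direction assumed hard, and trapdoor schemes such as UOV hide structure that makes private inversion easy.

```python
import random
from itertools import product

n_vars, n_eqs = 8, 10
random.seed(7)

def random_quadratic():
    """A random quadratic polynomial over GF(2): (quadratic coefficients, linear coefficients, constant)."""
    quad = {(i, j): random.randint(0, 1) for i in range(n_vars) for j in range(i, n_vars)}
    lin = [random.randint(0, 1) for _ in range(n_vars)]
    return quad, lin, random.randint(0, 1)

def evaluate(poly, x):
    quad, lin, const = poly
    q_part = sum(c & x[i] & x[j] for (i, j), c in quad.items()) % 2
    l_part = sum(l & xi for l, xi in zip(lin, x)) % 2
    return (q_part + l_part + const) % 2

system = [random_quadratic() for _ in range(n_eqs)]
x_secret = [random.randint(0, 1) for _ in range(n_vars)]        # planted solution
target = [evaluate(p, x_secret) for p in system]                # public: the system and its target vector

# The MQ problem: given (system, target), find some x with P(x) = target.
# Exhaustive search works only at toy sizes; real schemes use far more variables over larger fields.
solutions = [list(x) for x in product((0, 1), repeat=n_vars)
             if [evaluate(p, list(x)) for p in system] == target]
print("planted:", x_secret, "| solutions found:", solutions)
```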

Hash-Based Cryptography

Hash-based cryptography relies on the security of cryptographic hash functions, which are one-way functions designed to be computationally infeasible to invert or find collisions under classical and quantum attacks. In the context of post-quantum cryptography, these schemes primarily focus on digital signatures, building on one-time signatures (OTS) and on hash-function properties such as second-preimage resistance. An OTS allows signing a single message securely but becomes insecure if reused, so hash-based methods extend this capability using tree structures to enable multiple signatures from a single key pair while maintaining security. Second-preimage resistance ensures that, given a hash output, it is hard to find a different input producing the same output, which is crucial for preventing forgery in these constructions. The foundational general construction is the Merkle signature scheme, introduced in 1987, which combines OTS with Merkle trees—a binary tree where leaf nodes represent OTS public keys and the root serves as the overall public key—to allow multiple message signatures without revealing the entire secret key set. In this scheme, signing a message involves generating an OTS for it and providing the authentication path (sibling hashes) from the corresponding leaf to the root, enabling verification against the public root. Modern variants distinguish between stateful and stateless schemes: stateful ones, like the eXtended Merkle Signature Scheme (XMSS), track used keys to avoid reuse and support a fixed number of signatures (e.g., up to 2^60), while stateless schemes, such as SPHINCS, generate signatures randomly from the full key space without state management, trading efficiency for simplicity in deployment. These schemes offer provable security reductions to the underlying hash function's properties, such as collision resistance and second-preimage resistance, without depending on number-theoretic assumptions vulnerable to quantum algorithms like Shor's. They are quantum-resistant provided the hash function remains secure against quantum attacks; for instance, using SHA-3 with doubled output size mitigates Grover's algorithm, which provides a quadratic speedup for searching preimages. However, hash-based signatures typically produce large outputs—often tens of kilobytes due to authentication paths—limiting their use in bandwidth-constrained environments, and they do not naturally extend to encryption or key encapsulation mechanisms.
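A minimal sketch of the hash-based approach: a Lamport one-time signature over SHA-256 plus the Merkle-root computation that aggregates several OTS public keys under a single verifying key (authentication-path handling is omitted for brevity, and the construction is illustrative rather than a standardized scheme).

```python
import hashlib, secrets

H = lambda data: hashlib.sha256(data).digest()

def lamport_keygen():
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]  # one secret pair per message bit
    pk = [[H(s) for s in pair] for pair in sk]
    return sk, pk

def lamport_sign(sk, message: bytes):
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return [sk[i][b] for i, b in enumerate(bits)]           # reveal one preimage per bit

def lamport_verify(pk, message: bytes, sig) -> bool:
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits))

def merkle_root(leaves):
    """Aggregate hashed OTS public keys into a single root (the long-term public key)."""
    level = [H(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

sk, pk = lamport_keygen()
sig = lamport_sign(sk, b"firmware image v1")
assert lamport_verify(pk, b"firmware image v1", sig)
assert not lamport_verify(pk, b"firmware image v2", sig)

# Four OTS key pairs under one Merkle root; each may be used to sign exactly once.
leaves = [b"".join(h for pair in lamport_keygen()[1] for h in pair) for _ in range(4)]
root = merkle_root(leaves)
print("Merkle root (long-term public key):", root.hex()[:32], "...")
```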

Isogeny-Based Cryptography

Isogeny-based cryptography relies on the mathematical structure of isogenies, which are rational maps preserving the group law between elliptic curves defined over finite fields, to construct cryptographic primitives resistant to quantum attacks. Unlike classical elliptic curve cryptography, which is vulnerable to Shor's algorithm for solving the elliptic curve discrete logarithm problem, isogeny-based schemes derive their security from the presumed hardness of computing isogenies between supersingular elliptic curves. These curves have complex endomorphism rings that enable non-commutative group actions, providing a foundation for post-quantum security. The core hard problems in this family include the Supersingular Isogeny Diffie-Hellman (SIDH) problem and the more general computational Diffie-Hellman problem on isogeny graphs. In SIDH, given a starting supersingular elliptic curve and two public isogenies from it, the challenge is to compute a secret isogeny connecting the codomains of those public isogenies. Isogeny graphs model the connections between elliptic curves via isogenies of fixed degrees, forming expander graphs where finding short paths (isogeny walks) is computationally difficult, underpinning the security of related protocols. Key exchange protocols like SIDH and its encapsulation variant SIKE enable secure shared key generation by having parties compute complementary isogenies along secret paths in the supersingular isogeny graph, revealing only the resulting curves publicly. For digital signatures, constructions such as SQISign use isogeny walks to commit to messages via structured paths in these graphs, allowing verification through reconstruction without revealing the private path. These schemes were developed primarily in the 2010s, with SIDH introduced in 2011 as a quantum-resistant alternative to traditional Diffie-Hellman. Isogeny-based systems offer advantages in compactness, with public keys and ciphertexts typically around 1 KB for security levels comparable to AES-128, making them suitable for resource-constrained environments. However, in July 2022, a classical polynomial-time attack by Castryck and Decru broke SIKE by exploiting torsion points and glue-and-split techniques on polarized products of supersingular curves, invalidating its security claims and leading to its withdrawal from NIST standardization. This breakthrough has shifted research toward hardened variants, such as those incorporating commutative isogeny actions or alternative graph structures, to restore quantum resistance while preserving small key sizes.

Standardization Efforts

NIST Process and Timeline

The National Institute of Standards and Technology (NIST) launched its post-quantum cryptography (PQC) standardization process in December 2016 through a formal call for proposals aimed at identifying public-key algorithms secure against both classical and quantum computing threats. This initiative sought submissions for key encapsulation mechanisms (KEMs) and digital signature schemes, with evaluations spanning multiple rounds to assess candidates rigorously. The process emphasizes algorithms that maintain compatibility with existing cryptographic infrastructures while addressing potential vulnerabilities from quantum algorithms like Shor's. Evaluation criteria include security levels categorized by resistance to quantum attacks (NIST security levels 1–5, benchmarked against the difficulty of key search on AES-128 through AES-256 and collision search on SHA-2), computational efficiency (runtime and resource demands across hardware platforms), and implementation characteristics such as algorithmic simplicity, randomness requirements, and resilience to side-channel attacks. NIST formed an internal panel and solicited expert feedback to prioritize candidates balancing these factors, with a focus on practical deployment in constrained environments like embedded systems. Submissions were required to provide detailed security analyses, performance benchmarks, and open-source implementations for testing.

The timeline began with the April 2016 release of NISTIR 8105, a report outlining PQC needs, followed by the December 2016 call. By November 2017, 82 proposals were received, with 69 advancing to Round 1 evaluation through 2018. In January 2019, 26 candidates (17 KEMs and 9 signatures) proceeded to Round 2, which concluded with a July 2020 report advancing 15 algorithms (seven finalists and eight alternate candidates) to Round 3. Round 3, running from 2020 to 2022, narrowed the field through extensive cryptanalysis, culminating in a July 2022 report selecting the initial candidates for standardization. Round 4, initiated in 2022 for additional KEM diversity, evaluated remaining candidates like HQC, BIKE, and Classic McEliece amid ongoing security reviews. On August 13, 2024, NIST published the first three Federal Information Processing Standards (FIPS): FIPS 203 for ML-KEM (key encapsulation), FIPS 204 for ML-DSA (signatures), and FIPS 205 for SLH-DSA (signatures). In a separate call for additional digital signature schemes, NIST advanced 14 candidates to a second round of evaluation in October 2024, with analysis continuing and a standardization conference held in September 2025. In March 2025, HQC was selected as an additional KEM for standardization, as detailed in NISTIR 8545, to provide a code-based alternative enhancing portfolio robustness. Standardization of further candidates, including signature alternates, continues into late 2025.

To support adoption, NIST issued SP 1800-38 in September 2025, a multi-volume guide from the National Cybersecurity Center of Excellence outlining preparation steps for PQC migration, including inventory tools and integration frameworks. This complements a September 2025 draft white paper (CSWP 48) mapping PQC migration activities to the NIST Cybersecurity Framework (CSF) and SP 800-53 security controls, aiding federal and organizational risk management. Internationally, efforts align with NIST through ETSI's March 2025 standard for quantum-safe hybrid key exchanges and ISO/IEC initiatives to incorporate PQC into global standards such as ISO/IEC 19790 and 24759, ensuring interoperability.

Selected Algorithms

In August 2024, the National Institute of Standards and Technology (NIST) finalized its initial post-quantum cryptography standards through Federal Information Processing Standards (FIPS) 203, 204, and 205, selecting algorithms designed to provide security equivalent to AES-128, AES-192, and AES-256 levels against quantum attacks. These include ML-KEM for key encapsulation, and ML-DSA and SLH-DSA for digital signatures, with parameter sets tailored to specific security levels. In March 2025, NIST selected HQC as an additional key encapsulation mechanism to serve as a backup option, with a draft standard anticipated in 2026 and finalization by 2027. ML-KEM, standardized in FIPS 203, is a module-lattice-based key encapsulation mechanism derived from the CRYSTALS-Kyber algorithm, intended primarily for secure key exchange in protocols like TLS. It supports three parameter sets corresponding to NIST security levels 1, 3, and 5 (equivalent to AES-128, AES-192, and AES-256 classical security):
| Parameter Set | Security Level | Public Key Size (bytes) | Private Key Size (bytes) | Ciphertext Size (bytes) |
|---|---|---|---|---|
| ML-KEM-512 | 1 | 800 | 1,632 | 768 |
| ML-KEM-768 | 3 | 1,184 | 2,400 | 1,088 |
| ML-KEM-1024 | 5 | 1,568 | 3,168 | 1,568 |
ML-DSA, defined in FIPS 204, is a module-lattice-based digital signature scheme based on CRYSTALS-Dilithium, suitable for general-purpose signing applications such as code signing and software updates. Its parameter sets align with NIST security levels 2, 3, and 5:
| Parameter Set | Security Level | Public Key Size (bytes) | Private Key Size (bytes) | Signature Size (bytes) |
|---|---|---|---|---|
| ML-DSA-44 | 2 | 1,312 | 2,528 | 2,420 |
| ML-DSA-65 | 3 | 1,952 | 4,000 | 3,293 |
| ML-DSA-87 | 5 | 2,592 | 4,864 | 4,956 |
SLH-DSA, specified in FIPS 205, is a stateless hash-based digital signature algorithm derived from SPHINCS+, recommended as a backup to ML-DSA for scenarios requiring long-term security without reliance on lattice assumptions, such as firmware signing. It offers parameter sets optimized for small signatures ("s") or fast signing ("f") at NIST security levels 1, 3, and 5, with signature sizes ranging from approximately 7 KB to 50 KB; for example, SLH-DSA-SHA2-128s provides a 7,856-byte signature at level 1. HQC, selected in 2025 as a code-based key encapsulation mechanism built on quasi-cyclic codes, addresses potential vulnerabilities in lattice-based schemes by offering an alternative for key exchange in hybrid deployments. Its proposed parameter sets for NIST security levels 1, 3, and 5 feature larger keys compared to ML-KEM but efficient decryption:
| Parameter Set | Security Level | Public Key Size (bytes) | Private Key Size (bytes) | Ciphertext Size (bytes) |
|---|---|---|---|---|
| HQC-128 | 1 | 2,249 | 2,305 | 4,433 |
| HQC-192 | 3 | 4,522 | 4,586 | 8,978 |
| HQC-256 | 5 | 7,245 | 7,317 | 14,421 |
NIST supports hybrid schemes combining these post-quantum algorithms with classical ones (e.g., ECDH with ML-KEM) during the transition period to mitigate risks from unforeseen weaknesses. ML-KEM and HQC are prioritized for key establishment in general-purpose protocols, while ML-DSA serves as the primary general-purpose signature scheme and SLH-DSA provides a conservative hash-based alternative for long-lived or high-assurance uses such as firmware signing.

Security Analysis

Reductions to Hard Problems

In post-quantum cryptography, the security of proposed schemes is frequently established through provable reductions that connect the scheme's security properties—such as indistinguishability under chosen-ciphertext attack (IND-CCA) for encryption or existential unforgeability for signatures—to the computational hardness of specific mathematical problems. These reductions show that an efficient adversary breaking the scheme could be transformed into an efficient algorithm solving the underlying hard problem, often with only a polynomial loss in security parameters. For instance, many lattice-based encryption schemes reduce IND-CCA security to the hardness of the Learning With Errors (LWE) problem. A prominent example in lattice-based cryptography is the reduction from worst-case lattice problems to average-case instances of LWE, originally established by Regev using a quantum algorithm that preserves hardness across these settings. This reduction underpins the security of schemes based on Ring-LWE, a variant over polynomial rings, which in turn supports constructions such as NTRU encryption and BLISS signatures; specifically, the decision NTRU problem is shown to be at least as hard as search Ring-LWE under certain parameter distributions. These connections rely on Regev's quantum reduction, which ensures that quantum algorithms solving average-case problems imply solutions to worst-case lattice approximation problems like GapSVP. For the Dilithium signature scheme (ML-DSA), security reduces to the hardness of Module-LWE and Module-SIS, providing existential unforgeability under chosen-message attacks (EUF-CMA). In code-based cryptography, the McEliece cryptosystem's security reduces to the syndrome decoding problem for binary linear codes, which involves recovering an error vector from a given syndrome and is proven NP-complete, ensuring that no efficient worst-case algorithm exists unless P=NP. Multivariate schemes like Unbalanced Oil and Vinegar (UOV) reduce unforgeability to the multivariate quadratic (MQ) problem, where finding a solution to a random system of quadratic equations over finite fields is assumed intractable; this reduction holds under the MinRank assumption for the trapdoor structure. Hash-based signatures, such as the Merkle signature scheme, reduce to the collision resistance of the underlying hash function combined with the security of one-time signatures like Winternitz one-time signatures (WOTS), where forging a signature would imply finding hash collisions or reusing one-time keys. Isogeny-based schemes like Supersingular Isogeny Diffie-Hellman (SIDH) base their security on the computational difficulty of finding isogeny walks between supersingular elliptic curves, specifically the Supersingular Computational Diffie-Hellman (SSCDH) assumption, which posits that an adversary cannot compute shared secrets without knowing the private isogeny paths in the supersingular isogeny graph. Key types of reductions in post-quantum cryptography include worst-case to average-case transformations, particularly in lattices, where Regev's quantum reduction links hard worst-case problems (e.g., approximating shortest vectors) to solvable average-case LWE instances via a quantum sampling procedure. Another concerns proofs in the quantum random oracle model (QROM), which must ensure that adversaries able to query the random oracle in superposition cannot circumvent classical proof techniques.
Despite these advances, some post-quantum schemes face limitations in their reductions; for example, hash-based signatures often rely on the random oracle model (ROM), which idealizes hash functions as truly random but introduces gaps when instantiated with concrete functions like SHA-3, potentially weakening provable security against non-ideal attacks. Tightness issues also arise in quantum reductions, where security loss factors can be super-polynomial without careful parameter tuning.
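Stated informally in the notation used above, Regev's worst-case to average-case connection can be summarized as follows (a math sketch, with parameters given only up to polylogarithmic factors):

```latex
% Informal statement of Regev's quantum reduction (2005); constants and polylog factors omitted.
\[
\text{If there is an efficient algorithm solving } \mathrm{LWE}_{n,q,\alpha}
\text{ on average, with noise rate } \alpha \text{ satisfying } \alpha q > 2\sqrt{n},
\]
\[
\text{then there is an efficient \emph{quantum} algorithm for worst-case }
\mathrm{GapSVP}_{\tilde{O}(n/\alpha)} \text{ and } \mathrm{SIVP}_{\tilde{O}(n/\alpha)}.
\]
```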

Efficiency and Security Comparisons

Post-quantum cryptography (PQC) algorithms are evaluated based on their efficiency metrics, including public key sizes, ciphertext or signature sizes, and computational costs measured in clock cycles or milliseconds on standard hardware, alongside security levels defined by NIST as equivalents to AES-128 (level 1), AES-192 (level 3), and AES-256 (level 5) against both classical and quantum adversaries. These metrics reveal trade-offs: lattice-based schemes like ML-KEM and ML-DSA offer balanced sizes and speeds suitable for general use, while code-based systems like Classic McEliece provide strong security but with significantly larger keys, and hash-based signatures like SLH-DSA excel in verification speed at the cost of larger signatures. Benchmarks from the Open Quantum Safe (OQS) project, using the liboqs library, provide standardized performance data across platforms, showing lattice algorithms typically require 10^5 to 10^6 cycles for key operations on x86-64 CPUs, compared to 10^7 or more for code-based key generation. Recent 2025 hardware accelerations, such as GPU implementations for lattice operations, have reduced encapsulation times by up to 10x for ML-KEM on NVIDIA devices. The following table summarizes representative sizes for selected PQC algorithms at NIST security level 1 (AES-128 equivalent), highlighting family differences; higher levels scale sizes proportionally larger for increased security. Lattice-based public keys remain under 2 KB, enabling compatibility with existing protocols, whereas code-based keys exceed 250 KB, posing challenges for bandwidth-constrained environments like IoT.
| Algorithm Family | Algorithm | Public Key Size (bytes) | Secret Key Size (bytes) | Ciphertext/Signature Size (bytes) |
|---|---|---|---|---|
| Lattice-based (KEM) | ML-KEM-512 | 800 | 1,632 | 768 (ciphertext) |
| Lattice-based (Signature) | ML-DSA-44 | 1,312 | 2,560 | 2,420 (signature) |
| Hash-based (Signature) | SLH-DSA-SHA2-128s | 32 | 64 | 7,856 (signature) |
| Code-based (KEM) | Classic McEliece mceliece348864 | 261,120 | 6,492 | 128 (ciphertext) |
| Multivariate (Signature, historical) | Rainbow (pre-breakage) | ~50,000 | ~50,000 | ~2,000 (signature) |
| Isogeny-based (KEM, historical) | SIKE (pre-breakage) | 564 | 48 | 564 (ciphertext) |
In terms of computational efficiency, lattice-based algorithms demonstrate superior performance for key exchange and signing: for instance, ML-KEM-512 encapsulation requires approximately 100,000 cycles on an Intel Skylake CPU, enabling sub-millisecond latencies in TLS handshakes, while ML-DSA-44 signing takes around 1 million cycles. Hash-based SLH-DSA offers comparatively fast verification but much slower signing, by orders of magnitude, due to its tree-based structure, making it best suited to scenarios prioritizing verifier efficiency over signer load. Code-based schemes like Classic McEliece exhibit fast encapsulation (similar to lattice) but key generation exceeding 10^8 cycles, often taking seconds, which limits their use in dynamic environments without precomputation. Multivariate schemes, though compact historically, suffered security breaks like the 2022 Rainbow attack, shifting focus away from them, while isogeny-based schemes like SIKE provided small sizes but were invalidated by a classical key-recovery attack in 2022. Overall, Grover's algorithm halves effective security against brute-force search, so applications requiring 256-bit quantum resistance target level 5 parameter sets, with lattice families providing the most versatile balance for practical deployment.
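Using the representative sizes from the table above (and the usual 32-byte X25519 public values as the classical baseline), the rough handshake payload of public key plus ciphertext can be compared directly; the sketch below is back-of-the-envelope arithmetic, ignoring certificates and signatures.

```python
# Rough key-establishment payload: one public key sent out, one ciphertext/public value sent back.
kem_sizes = {                       # (public key bytes, ciphertext bytes), taken from the table above
    "X25519 (classical)": (32, 32),
    "ML-KEM-512": (800, 768),
    "Classic McEliece 348864": (261_120, 128),
}

baseline = sum(kem_sizes["X25519 (classical)"])
for name, (pk, ct) in kem_sizes.items():
    total = pk + ct
    print(f"{name:28s} {total:>9,d} bytes  (~{total / baseline:,.0f}x classical)")
```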

Practical Implementation

Hybrid Schemes

Hybrid schemes in post-quantum cryptography integrate classical cryptographic primitives with post-quantum algorithms to ensure security against both conventional attacks and those enabled by quantum computers, serving as a transitional measure during the migration to fully post-quantum systems. The core rationale is to provide a hedge against unforeseen vulnerabilities: classical schemes like elliptic curve Diffie-Hellman (ECDH) remain secure against classical adversaries but are vulnerable to quantum attacks via Shor's algorithm, while post-quantum schemes such as ML-KEM offer quantum resistance but may face novel classical exploits not yet discovered. By combining them, hybrid constructions maintain overall security as long as at least one component remains unbroken, enhancing confidence in deployed systems amid ongoing standardization. Key exchange in hybrid schemes typically employs composite key encapsulation mechanisms (KEMs), where multiple KEMs are executed in parallel, and their shared secrets are concatenated before key derivation. Such a composite remains secure as long as at least one component KEM is secure, under the assumption that the key derivation function is secure, as formalized in NIST's generic composite technique. For instance, in TLS 1.3, a common construction pairs X25519 (a classical ECDH variant) with ML-KEM, encapsulating separate secrets that are combined via concatenation and processed with a key derivation function like HKDF; this ensures IND-CCA security if either KEM holds. The IETF has proposed standardized hybrid key exchanges, such as X25519MLKEM768, which combines an X25519 exchange with ML-KEM-768. Hybrid signatures extend this approach to digital signatures, requiring both a classical and a post-quantum signature on the same message, with verification succeeding only if both pass. A representative construction combines ECDSA (on NIST P-256 curves) with Dilithium (now ML-DSA), where the aggregate signature includes both components serialized together; the construction remains unforgeable as long as at least one component scheme does, providing classical security from ECDSA and lattice-based quantum resistance from Dilithium. This design has been prototyped in hardware security modules, demonstrating feasibility with minimal performance degradation beyond the sum of individual operations. IETF drafts further specify dual-signature authentication for TLS, enabling hybrid certificates with composite public keys. NIST formalized support for hybrid modes in 2024 through updates to SP 800-56C and recommendations in IR 8547, allowing composites of approved algorithms like those from FIPS 203 (ML-KEM) with classical counterparts for key establishment during the post-quantum transition. The IETF's TLS working group has advanced multiple drafts for PQ-TLS hybrids, emphasizing backward compatibility and deployment in protocols like SSH and IPsec. These standards prioritize simplicity, with concatenation-based combining functions to avoid complex interactions that could introduce weaknesses. The primary benefits of hybrid schemes include bolstered transitional security and interoperability, as they leverage the maturity of classical algorithms while incorporating post-quantum protection, reducing the risk of "harvest now, decrypt later" attacks on long-lived data.
However, drawbacks arise from increased overhead: key exchanges require transmitting larger public keys and ciphertexts (e.g., ML-KEM ciphertexts of roughly 770-1,570 bytes versus ECDH's 32-byte public values), and hybrid signatures must carry both components (e.g., ECDSA plus ML-DSA totals well over 2 KB), leading to 20-50% higher bandwidth and computational costs in protocols like TLS, though optimizations like selective use in handshakes mitigate this in practice.
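A minimal sketch of the concatenate-then-derive combiner described above, using random placeholder byte strings for the two component secrets (a real deployment would obtain them from an X25519 exchange and an ML-KEM encapsulation) and the HKDF implementation from the Python `cryptography` package; the `info` label is illustrative, not a registered protocol constant.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Placeholders standing in for the two component secrets of a hybrid key exchange.
ss_classical = os.urandom(32)   # e.g., X25519 shared secret
ss_pq = os.urandom(32)          # e.g., ML-KEM-768 decapsulated secret

# Concatenate-then-derive: the session key is secure if either component secret is.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-kex-demo",    # illustrative context label
).derive(ss_classical + ss_pq)

print("derived hybrid session key:", session_key.hex())
```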

Migration Challenges

Transitioning to post-quantum cryptography (PQC) presents several practical challenges, particularly related to the larger key and signature sizes of PQC algorithms compared to classical ones, which can increase bandwidth consumption and strain existing protocols. For instance, lattice-based schemes like Kyber require public keys of 800 to 1,568 bytes (0.8 to 1.6 KB), leading to higher latency in network transmissions and potential bottlenecks in bandwidth-limited environments such as IoT devices or mobile networks. These size differences necessitate protocol modifications to accommodate expanded payloads without compromising performance. Another significant hurdle is the vulnerability of PQC implementations to side-channel attacks, where physical leakage such as power consumption or timing variations can reveal sensitive information, despite the algorithms' resistance to quantum threats. Lattice-based PQC candidates, in particular, are susceptible to these attacks due to operations like Gaussian sampling, requiring specialized countermeasures like masking or constant-time implementations during migration. Crypto-agility—the ability to swiftly update cryptographic primitives without overhauling systems—poses further difficulties, as legacy infrastructure often lacks modular designs, complicating the integration of PQC while maintaining interoperability. A critical aspect of migration involves ensuring forward secrecy through post-quantum perfect forward secrecy (PQ-PFS) in protocols, typically achieved via ephemeral key-encapsulation mechanisms (KEMs) that generate fresh keys per session to protect past communications even if long-term keys are compromised. This is especially urgent given the "harvest now, decrypt later" threat, where adversaries collect encrypted data today—such as long-lived sessions or stored backups—and decrypt it once quantum computers become viable, potentially exposing sensitive information retroactively. To address these issues, organizations must first conduct a comprehensive inventory of cryptographic usage to identify vulnerable algorithms and dependencies, followed by a phased rollout that prioritizes high-risk assets. NIST's IR 8547 provides guidance on migrating to quantum-resistant cryptography, recommending hybrid schemes that combine classical and PQC elements for interim security during the transition. In September 2025, the NIST National Cybersecurity Center of Excellence (NCCoE) published a draft white paper (CSWP 48) mapping PQC migration practices to the NIST Cybersecurity Framework and SP 800-53 controls, offering actionable alignments for risk management. Regulatory pressures amplify these efforts; for example, the U.S. National Security Memorandum 10 mandates that federal agencies complete migration to quantum-resistant cryptography by 2035, with interim milestones to mitigate risks.

Open Quantum Safe Project

The Open Quantum Safe (OQS) project is an open-source initiative launched in late 2016 to support the prototyping, evaluation, and integration of quantum-resistant cryptographic algorithms. It evolved from efforts in the post-quantum cryptography community, including contributions from researchers involved in the PQCrypto project, and focuses on developing practical tools for transitioning to post-quantum security. Central to OQS is liboqs, a C library that provides implementations of over 50 quantum-safe algorithms, encompassing key encapsulation mechanisms (KEMs), digital signatures, and related primitives from various families such as lattice-based, code-based, and hash-based schemes. Key features of the OQS project include seamless integrations with widely used cryptographic libraries like OpenSSL and BoringSSL, enabling post-quantum key exchange and authentication in protocols such as TLS 1.3. The project also offers prototype implementations for SSH via a fork of OpenSSH, allowing experimentation with hybrid and pure post-quantum key exchanges. Additionally, OQS provides benchmarking tools and a continuous performance dashboard to assess algorithm efficiency across different hardware platforms, facilitating standardized comparisons of speed and resource usage. In 2025, OQS achieved significant milestones with releases of liboqs version 0.13.0 in April and 0.14.0 in July, incorporating the final NIST-standardized algorithms including ML-KEM (FIPS 203), ML-DSA (FIPS 204), and SLH-DSA (FIPS 205), alongside support for additional candidates like HQC. In November 2025, OQS released liboqs version 0.15.0, which is the last version to support certain deprecated features and includes additional optimizations. These updates were driven by community contributions from over 100 developers worldwide, with optimizations for hardware platforms such as Intel x86_64 and ARM architectures to ensure broad compatibility and performance. The project maintains active forks and prototypes, including recent enhancements for integrating code-based schemes like HQC into protocol implementations. The OQS project has had a substantial impact by lowering barriers to post-quantum cryptography adoption, enabling developers and researchers to test and benchmark algorithms in real-world applications without proprietary dependencies. Its open-source nature has fostered widespread experimentation, including integrations into production-like environments for TLS and SSH, and has influenced industry efforts toward quantum-safe migrations.
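As a usage illustration, the snippet below follows the key-encapsulation pattern documented for the liboqs-python wrapper; the class and mechanism names (`oqs.KeyEncapsulation`, `"ML-KEM-768"`) are taken from that project's examples and assume liboqs and its Python bindings are installed.

```python
import oqs  # liboqs-python wrapper; requires liboqs to be installed

kem_name = "ML-KEM-768"  # FIPS 203 parameter set exposed by recent liboqs releases

# Receiver generates a key pair and publishes the public key.
with oqs.KeyEncapsulation(kem_name) as receiver:
    public_key = receiver.generate_keypair()

    # Sender encapsulates a fresh shared secret against the receiver's public key.
    with oqs.KeyEncapsulation(kem_name) as sender:
        ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # Receiver decapsulates the ciphertext with its private key.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

assert shared_secret_sender == shared_secret_receiver
print(f"{kem_name}: ciphertext {len(ciphertext)} bytes, shared secret {len(shared_secret_receiver)} bytes")
```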