
Key authentication

Key authentication is a fundamental security property in cryptographic protocols, particularly authenticated key-exchange mechanisms, that assures a participating party that an established key is known exclusively to the intended communicating entities and not to any unauthorized party. This assurance can be achieved through implicit key authentication, where the property is inferred from successful protocol execution without explicit confirmation messages, or explicit key authentication, which involves additional messages to verify key possession and entity identity.

In practice, key authentication relies on underlying cryptographic principles, including both symmetric and asymmetric techniques. Symmetric key authentication employs a shared secret key to encrypt and decrypt challenges, such as random nonces, allowing parties to mutually verify identity without revealing the key itself. Asymmetric approaches, conversely, utilize public-private key pairs, where the private key holder proves possession by signing or decrypting challenges, often supported by public key infrastructure (PKI) and standards like X.509 certificates to ensure trusted key distribution. Recent developments include the adoption of post-quantum algorithms, such as those standardized by NIST in 2024 (e.g., ML-KEM for key encapsulation), to mitigate vulnerabilities from advances in quantum computing. These methods address vulnerabilities in unauthenticated key exchanges, such as man-in-the-middle attacks, by binding the key to authenticated identities.

Key authentication underpins numerous real-world protocols essential for secure communication. For instance, the Transport Layer Security (TLS) protocol incorporates key authentication during its handshake to establish encrypted sessions for HTTPS web traffic. Similarly, Secure Shell (SSH) leverages public key authentication to enable passwordless remote access, enhancing convenience while mitigating risks associated with password reuse. Other examples include Kerberos for ticket-based network authentication and Extensible Authentication Protocol (EAP) methods like EAP-TLS for wireless networks, the latter adhering to standards such as IEEE 802.1X for port-based access control. By integrating key authentication, these protocols provide robust defenses in applications ranging from web browsing to Internet of Things (IoT) devices, where scalability and resource efficiency are critical challenges.

Fundamentals

Definition and Principles

Key authentication is a fundamental security property in cryptographic protocols, particularly authenticated key-exchange mechanisms, that assures a participating party that an established key is known exclusively to the intended communicating entities and not to any unauthorized party. This assurance can be achieved through implicit key authentication, where the property is inferred from successful protocol execution without explicit confirmation messages, or explicit key authentication, which involves additional messages to verify key possession and entity identity.

At its core, key authentication supports the achievement of broader security goals, such as confidentiality through encryption of sensitive data with the authenticated key, integrity via detection of tampering using mechanisms like message authentication codes, and non-repudiation by linking messages or actions irrevocably to a specific identity through digital signatures. Unlike password-based authentication, which depends on human-memorable secrets susceptible to social engineering or weak entropy, key authentication emphasizes computational assumptions, such as the infeasibility of inverting one-way functions, to resist attacks even when algorithm details are public.

Key authentication involves two fundamental entities: the prover, which asserts its identity by demonstrating knowledge or possession of the key in response to a challenge, and the verifier, which issues the challenge and assesses the proof's validity. Keys fall into two types: symmetric keys, shared secretly between parties for joint verification, and asymmetric keys, consisting of a public component for verification and a private one for proof generation, enabling different trust models. Authentication goals vary between one-way, where only the verifier confirms the prover's identity, and mutual, where both parties authenticate each other to prevent impersonation in both directions.

The mathematical underpinnings of key authentication include high-entropy key generation to create unpredictable keys from random sources, ensuring the key space resists exhaustive search. For an n-bit key drawn uniformly, the key space has size $2^n$, providing n bits of entropy and requiring approximately $2^{n-1}$ operations on average for exhaustive brute-force recovery, which becomes impractical for n ≥ 128 due to computational limits. This foundation, derived from information-theoretic principles, underpins the security of both symmetric and asymmetric methods.
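To make the arithmetic concrete, the following minimal Python sketch (illustrative, not drawn from any specific standard) generates a uniformly random 128-bit key and computes the size of the resulting key space and the expected brute-force workload:

```python
import secrets

n = 128                              # key length in bits
key = secrets.token_bytes(n // 8)    # cryptographically strong randomness

key_space = 2 ** n                   # number of possible keys: 2^n
avg_guesses = 2 ** (n - 1)           # expected brute-force work: 2^(n-1)

print(f"key: {key.hex()}")
print(f"key space size: 2^{n} = {key_space}")
print(f"average guesses to recover: 2^{n - 1} = {avg_guesses}")
```

Each additional key bit doubles the search space, which is why 128-bit symmetric keys are regarded as beyond exhaustive search with classical hardware.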

Historical Development

The origins of key authentication trace back to the mid-1970s, when foundational cryptographic concepts emerged to address secure key distribution without physical exchange. In 1976, Whitfield Diffie and Martin Hellman introduced the Diffie-Hellman key exchange protocol, which allowed two parties to establish a shared key over an insecure channel by relying on the hardness of the discrete logarithm problem, laying the groundwork for both symmetric and asymmetric mechanisms. This innovation shifted the focus from pre-shared secrets to dynamic key generation, influencing subsequent developments in public-key cryptography.

Symmetric key authentication advanced in the late 1970s with the standardization of encryption algorithms suitable for authentication. The Data Encryption Standard (DES), developed by IBM and adopted by the National Bureau of Standards in 1977 as Federal Information Processing Standard (FIPS) 46, provided a symmetric block cipher that enabled initial forms of shared key authentication in federal systems by ensuring message integrity and confidentiality. Building on this, the Kerberos protocol was developed in the 1980s at MIT's Project Athena to facilitate secure network authentication using symmetric keys and trusted third parties, with its core design outlined in a 1988 USENIX paper by Jennifer Steiner, Clifford Neuman, and Jeffrey Schiller.

Asymmetric key breakthroughs occurred concurrently, revolutionizing cryptography by decoupling encryption and decryption keys. In 1977, Ron Rivest, Adi Shamir, and Leonard Adleman devised the RSA algorithm, the first practical public-key cryptosystem, which supported digital signatures and encryption for authentication without shared secrets, as detailed in their 1978 Communications of the ACM paper. This work spurred the growth of public key infrastructure (PKI), formalized in 1988 through the X.509 standard, which defined certificate formats for binding public keys to identities in hierarchical trust models.

The 1990s saw key authentication integrated into internet protocols, with Netscape's Secure Sockets Layer (SSL) version 2.0 released in 1995 to provide encrypted web communications using both symmetric and asymmetric keys for server authentication. SSL evolved into Transport Layer Security (TLS) version 1.0 in 1999, standardized by the IETF as RFC 2246, enhancing authentication resilience against evolving threats. In the post-2000 era, concerns over quantum computing vulnerabilities prompted shifts toward quantum-resistant methods; the National Institute of Standards and Technology (NIST) initiated its post-quantum cryptography standardization process in 2016 with a comprehensive report evaluating algorithms for key authentication resistant to quantum attacks. By August 2024, NIST published the first three post-quantum cryptography standards as FIPS 203 (ML-KEM for key encapsulation), FIPS 204 (ML-DSA for digital signatures), and FIPS 205 (SLH-DSA for digital signatures), advancing quantum-resistant key authentication.

Symmetric Key Methods

Shared Secret Authentication

Shared secret authentication, also known as symmetric authentication, relies on a pre-shared secret key that both communicating parties possess to verify each other's identity through encryption or integrity checks. In this mechanism, one party encrypts a challenge or generates an authentication tag using the shared key, and the recipient decrypts or verifies it with the same key; successful processing confirms the sender's knowledge of the secret, assuming secure initial key distribution. This approach is particularly suited for closed systems where parties can establish trust offline or through trusted channels.

Key generation for shared secret authentication requires producing random keys with sufficient entropy to match the desired security strength, such as 256 bits for AES-256 to resist brute-force attacks. High-entropy sources, like hardware random number generators, ensure unpredictability, preventing derivation from weaker inputs. Storage must protect against extraction; hardware security modules like Trusted Platform Modules (TPMs) provide tamper-resistant environments for holding keys, isolating them from software access and enabling secure operations without exposure.

The primary advantages of shared secret authentication include low computational overhead, as symmetric operations are significantly faster than asymmetric alternatives, making it ideal for resource-constrained devices such as embedded sensors. This efficiency enables real-time authentication in bandwidth-limited environments without the need for complex certificate management. Practical examples include message authentication codes (MACs) using HMAC, where a shared key combined with a hash function like SHA-256 produces a tag appended to messages for integrity and authenticity verification; this construction is widely used in protocols like TLS for the symmetric phases of communication.

A key limitation is the distribution problem: securely sharing the initial symmetric key requires a trusted channel, such as physical couriers or prior secure setup, which scales poorly for large or open networks; asymmetric methods like Diffie-Hellman key exchange can mitigate this by enabling secure key establishment over insecure channels.
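The HMAC construction described above can be sketched with Python's standard library; the key handling and message here are illustrative assumptions rather than any specific protocol's wire format:

```python
import hashlib
import hmac
import secrets

# Pre-shared 256-bit key, assumed to have been distributed over a trusted channel.
shared_key = secrets.token_bytes(32)

def tag_message(key: bytes, message: bytes) -> bytes:
    """Sender: compute an HMAC-SHA-256 tag to append to the message."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify_message(key: bytes, message: bytes, tag: bytes) -> bool:
    """Receiver: recompute the tag and compare it in constant time."""
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

msg = b"transfer 100 units to account 42"
tag = tag_message(shared_key, msg)
assert verify_message(shared_key, msg, tag)              # authentic message accepted
assert not verify_message(shared_key, msg + b"!", tag)   # tampering detected
```

Note that `hmac.compare_digest` performs a constant-time comparison, avoiding the timing leaks discussed under Security Considerations.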

Challenge-Response Protocols

Challenge-response protocols represent a dynamic approach to symmetric authentication, where a verifier issues a unique challenge to a prover, who responds using the shared secret without revealing it. In the standard protocol flow, the verifier generates and sends a random nonce, denoted $n$, to the prover. The prover then computes the response $R = E_K(n)$, where $E_K$ is the encryption function under the shared symmetric key $K$, and returns $R$ to the verifier. The verifier decrypts $R$ with $K$ and verifies that the result matches the original $n$. This process authenticates the prover while keeping the key confidential.

The primary security benefits stem from the nonce's uniqueness and the non-transmission of the key. By requiring a fresh nonce for each session, the protocol thwarts replay attacks, as an intercepted response cannot be reused against a new challenge. Eavesdroppers gain no useful information about $K$ from observing the exchange, since the nonce is public but the encryption obscures the key. Nonces are typically generated using pseudorandom functions (PRFs) to ensure unpredictability and resistance to guessing. Mathematically, a PRF $F_K(m)$ can produce the nonce or augment it, where $m$ is a varying input, providing output indistinguishable from true randomness under the shared key.

Variants extend the basic protocol for enhanced functionality. In mutual authentication, both parties act as verifier and prover sequentially: the first issues a challenge, receives and verifies the response, then sends its own challenge for the reverse process, confirming bidirectional knowledge of $K$. Time-based challenges replace nonces with timestamps, where the prover encrypts the current time $t$ as $R = E_K(t)$; this requires loosely synchronized clocks between parties to validate freshness but avoids explicit challenge transmission.

Practical implementations demonstrate the protocol's efficacy in real-world systems. The Challenge-Handshake Authentication Protocol (CHAP), introduced for the Point-to-Point Protocol (PPP) in the early 1990s, adapts this mechanism using a hash-based response, the MD5 digest of the challenge concatenated with the shared secret, for periodic peer verification over serial links. In Wi-Fi Protected Access 2 with Pre-Shared Key (WPA2-PSK), the 4-way handshake integrates challenge-response elements: the access point and client exchange nonces (ANonce and SNonce), deriving session keys via a PRF on the PSK, enabling mutual authentication and protection against unauthorized access.
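A minimal sketch of the $R = E_K(n)$ flow, assuming the third-party `cryptography` package and modeling $E_K$ as a single-block AES evaluation on a random 16-byte nonce (a hash-based response, as in CHAP, would be structured similarly):

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# Shared 128-bit key K, assumed to have been distributed in advance.
K = os.urandom(16)

def encrypt_block(key: bytes, block: bytes) -> bytes:
    """Single-block AES evaluation, modeling R = E_K(n) for a 16-byte nonce.
    ECB on one fresh random block is safe here only because exactly one
    block is ever encrypted per challenge."""
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return enc.update(block) + enc.finalize()

# Verifier: issue a fresh random 16-byte nonce n.
n = os.urandom(16)

# Prover: respond with R = E_K(n), proving knowledge of K without sending it.
R = encrypt_block(K, n)

# Verifier: recompute E_K(n) (equivalently, decrypt R) and compare.
assert encrypt_block(K, n) == R   # prover accepted
```

Because the nonce is fresh for every session, replaying a captured $R$ fails against any new challenge.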

Asymmetric Key Methods

Public Key Infrastructure

Public Key Infrastructure (PKI) provides a structured framework for managing asymmetric key pairs to facilitate secure authentication in distributed environments, particularly over untrusted networks. At its core, PKI relies on key pairs consisting of a public key, which can be freely shared, and a corresponding private key, kept secret by the owner, to enable operations like encryption and signature verification. Digital certificates bind a public key to an entity's identity, such as a person, organization, or server, through a signed statement issued by a trusted authority. Certificate authorities (CAs) serve as the issuers of these certificates, verifying the identity of the requester before signing and distributing the certificate.

PKI operates through a hierarchical trust model to establish chains of trust. A root CA acts as the top-level trust anchor, self-signing its own certificate to initiate the chain. Intermediate CAs, certified by the root or other superiors, issue certificates to lower-level CAs or end entities, forming certificate paths that allow verifiers to trace back to a trusted root. This delegation enables scalability, as root CAs can remain offline for protection while intermediates handle daily operations. Certificate chains are validated by following the path of signatures, ensuring each certificate is issued by a verifiable superior.

The lifecycle of keys in PKI encompasses generation, distribution, and revocation to maintain trust. Key pairs are generated using algorithms like RSA with a minimum modulus of 2048 bits to resist current computational attacks, ensuring sufficient cryptographic strength. Public keys are distributed via certificates rather than direct exchange, allowing recipients to verify authenticity without prior secure channels. Upon detection of compromise or expiration, certificates are revoked using mechanisms such as Certificate Revocation Lists (CRLs), which enumerate invalid certificates, or the Online Certificate Status Protocol (OCSP), which provides real-time status queries for relying parties.

Key standards underpin PKI interoperability. The X.509 format, defined by the ITU-T in 1988, specifies the structure of digital certificates and CRLs, including fields for subject identity, public key details, validity periods, and extensions for additional attributes. As an alternative to this hierarchical model, Pretty Good Privacy (PGP) employs a web-of-trust approach, where users sign each other's public keys to build decentralized trust networks without relying on central authorities.

Despite its robustness, PKI faces challenges in maintaining trust. Certificate pinning allows clients to associate specific public keys or certificates with a domain, mitigating man-in-the-middle attacks by rejecting unauthorized substitutions even from compromised CAs. Handling key compromise requires prompt revocation and reissuance, but delays or incomplete status checks can expose systems to ongoing risks.
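One link of certificate-path validation, checking that a certificate carries a valid signature from its issuer, can be sketched with the `cryptography` package. This sketch assumes an RSA-signed certificate with PKCS#1 v1.5 padding, and it omits the validity-period, extension, and revocation checks a full path validator also performs:

```python
from cryptography import x509
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import padding

def verify_issued_by(cert_pem: bytes, issuer_pem: bytes) -> bool:
    """Verify one chain link: was `cert` signed by `issuer`'s key?"""
    cert = x509.load_pem_x509_certificate(cert_pem)
    issuer = x509.load_pem_x509_certificate(issuer_pem)
    try:
        issuer.public_key().verify(
            cert.signature,
            cert.tbs_certificate_bytes,      # the signed portion of the cert
            padding.PKCS1v15(),              # assumes an RSA-signed certificate
            cert.signature_hash_algorithm,
        )
        return True
    except InvalidSignature:
        return False
```

A verifier applies this check at each step of the path until it reaches a self-signed root it already trusts.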

Digital Signature Schemes

Digital signature schemes enable authentication in asymmetric key systems by allowing a signer to produce a verifiable tag on a message using their private key, which a verifier can check against the signer's public key to confirm the message's origin and integrity without exposing the private key. This process relies on the mathematical difficulty of inverting certain one-way functions, ensuring that only the legitimate signer can generate valid signatures. In key authentication contexts, these schemes prove possession of the private key corresponding to a public key, often integrated with PKI for certificate validation.

The core mechanism involves the signer first computing a cryptographic hash of the message to produce a fixed-size digest, then applying the private key to this digest using scheme-specific operations to form the signature. For RSA-based schemes, this entails encrypting the digest with the private key; the verifier decrypts the signature using the signer's public key to recover the digest and independently recomputes the hash of the received message; if the two values match, the signature is valid, confirming both integrity and authenticity. Other schemes, such as ECDSA, generate a signature as a pair of values derived from the hash and elliptic curve operations. This approach, known as signing the hash, mitigates the inefficiency of directly signing long messages while binding the signature tightly to the message content. Formally, for RSA-based schemes, the signing operation can be expressed as $S = \text{Sign}_{sk}(m) = \text{Encrypt}_{sk}(\text{Hash}(m))$, where $sk$ is the private key, $m$ is the message, and Hash is a collision-resistant hash function; verification succeeds if $\text{Hash}(m) = \text{Decrypt}_{pk}(S)$, with $pk$ the public key.

Prominent algorithms include RSA-PSS, which enhances the original RSA signature scheme with probabilistic padding to achieve provable security against existential forgery. RSA-PSS incorporates a random salt during signing, processed via a mask generation function, making it resistant to deterministic attacks and suitable for high-assurance applications. ECDSA, based on elliptic curve cryptography, provides security comparable to RSA but with smaller key sizes and greater computational efficiency, leveraging the elliptic curve discrete logarithm problem for faster operations on resource-constrained devices. DSA, the foundational discrete-logarithm-based signature algorithm, generates signatures as pairs of integers derived from modular exponentiation, offering a standardized alternative to RSA without relying on factoring hardness.

Security in these schemes is primarily measured by existential unforgeability under chosen-message attacks (EUF-CMA), where an adversary, given access to a signing oracle for messages of their choice, cannot produce a valid signature on a new message. This property holds under standard assumptions like the hardness of the RSA problem for RSA-PSS or the elliptic curve discrete logarithm problem for ECDSA, provided the underlying hash function is collision-resistant. The hash function, such as SHA-256, plays a critical role by compressing the message into a digest that resists preimage and collision attacks, ensuring that altering the message changes the digest and invalidates the signature; without a secure hash, signatures could be forged via collisions.

In authentication protocols, digital signatures prove private key possession during key exchanges, as in TLS handshakes where the server signs handshake messages with its private key to authenticate its identity to the client, enabling secure session establishment.
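A hedged example of the sign-then-verify flow, using ECDSA over the P-256 curve with the `cryptography` package; the key names and message content are illustrative:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Signer: generate a P-256 key pair; in practice the public key would be
# distributed in a certificate and the private key kept secret.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

message = b"handshake transcript to authenticate"

# Sign: hash the message with SHA-256 and apply the private key (ECDSA).
signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))

# Verify: anyone holding the public key can check origin and integrity.
try:
    public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```

Tampering with even one byte of the message changes its digest and causes verification to raise `InvalidSignature`.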

Applications

Network Protocols

Key authentication plays a critical role in network protocols by enabling secure session establishment between communicating parties over potentially untrusted networks. These protocols typically combine asymmetric key methods for initial authentication and key agreement with symmetric keys for efficient data protection, ensuring confidentiality, integrity, and authenticity during transmission.

In the Transport Layer Security (TLS) protocol, formerly known as Secure Sockets Layer (SSL), the handshake process initiates secure sessions using asymmetric cryptography to authenticate parties and agree on a shared secret. The client and server exchange certificates and perform key agreement, often via ephemeral Diffie-Hellman keys, to derive a symmetric session key for encrypting subsequent data traffic. This hybrid approach minimizes computational overhead after the initial exchange. SSL 3.0, released in 1996, introduced the foundational handshake mechanism, while TLS evolved through versions including TLS 1.0 in 1999 (RFC 2246), TLS 1.2 in 2008 (RFC 5246), and TLS 1.3 in 2018 (RFC 8446), which streamlined the handshake to a single round trip for faster connection setup.

IPsec secures IP communications at the network layer through protocols like the Authentication Header (AH) and Encapsulating Security Payload (ESP). AH provides integrity and authentication using Hash-based Message Authentication Code (HMAC) algorithms, such as HMAC-SHA-1-96, to protect the entire IP packet without encryption. Key exchange occurs via the Internet Key Exchange (IKE) protocol, which employs Diffie-Hellman to establish shared secrets for symmetric keys, supporting mutual authentication in IKEv2 (RFC 7296). This setup authenticates endpoints and derives keys for AH or ESP, ensuring protection against tampering in transit.

The Secure Shell (SSH) protocol facilitates secure remote access and file transfer, relying on public-key authentication to verify user identity without transmitting passwords. During connection, the client authenticates using a private key corresponding to a public key stored on the server, often combined with digital signatures for proof of possession. To prevent man-in-the-middle (MITM) attacks, SSH employs host keys: the server presents its host key, which the client verifies against a known-hosts list, ensuring the endpoint's authenticity before proceeding. This mechanism, defined in the SSH protocol architecture (RFC 4251) and authentication methods (RFC 4252), binds the session to verified identities.

For wireless networks, WPA3 introduces Simultaneous Authentication of Equals (SAE), a password-authenticated key exchange based on the Dragonfly handshake, to derive session keys from a shared password without exposing it to offline attacks. Unlike WPA2's pre-shared key vulnerabilities, SAE uses a balanced exchange where both the client and access point contribute to key derivation via Dragonfly's password-to-key mapping, incorporating a zero-knowledge proof of password possession for mutual authentication. Specified in IEEE 802.11-2016 and enhanced in later amendments, SAE replaces direct PSK-based key derivation prior to the four-way handshake, providing forward secrecy and resilience against brute-force attempts.

Key exchanges in these protocols introduce performance overhead, particularly in high-latency networks, due to multiple round trips for authentication and key agreement. For instance, TLS 1.3 handshakes add approximately 1-2 round-trip times (RTTs) in low-latency environments but scale poorly over satellite links with 500+ ms delays, increasing connection setup by up to 1 second. IPsec IKE exchanges, involving Diffie-Hellman computations, can impose 2-4 RTTs and 10-20% throughput reduction in high-latency scenarios like edge networks, though optimizations like pre-shared keys mitigate this. Studies on deployments show overhead remains under 5% for data transfer after the handshake but highlight the need for session resumption in volatile connections.
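As an illustration of TLS server authentication from the client side, Python's standard `ssl` module performs the certificate-chain and hostname checks automatically; the host below is a placeholder and the sketch assumes network access:

```python
import socket
import ssl

# The default context enables certificate verification and hostname checking,
# authenticating the server via its X.509 certificate chain.
context = ssl.create_default_context()

host = "example.com"  # placeholder host for illustration
with socket.create_connection((host, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        print("negotiated:", tls.version())   # e.g. 'TLSv1.3'
        print("cipher:", tls.cipher())
        print("server subject:", tls.getpeercert()["subject"])
```

If the presented chain does not validate to a trusted root or the hostname does not match, `wrap_socket` raises an `ssl.SSLCertVerificationError` rather than completing the handshake.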

System Access Control

Key authentication plays a critical role in system access control, enabling secure user logins and device verification in local and enterprise environments through structured protocols and hardware mechanisms. One foundational method is Kerberos, a ticket-based authentication system that relies on symmetric keys to verify identities without transmitting passwords over the network. Developed in the 1980s at MIT's Project Athena, Kerberos uses a Key Distribution Center (KDC) to issue Ticket Granting Tickets (TGTs) to authenticated users, allowing subsequent access to services via time-limited session tickets encrypted with shared secrets. This design supports realms for cross-domain trust, facilitating authentication across organizational boundaries while maintaining mutual verification between clients and servers. The protocol was standardized in RFC 4120, which specifies its version 5 implementation for robust network authentication.

In enterprise settings, Kerberos integrates seamlessly with Microsoft Active Directory for Windows domains, where the domain controller acts as the KDC to manage user and service principals. This integration enables single sign-on (SSO) across domain-joined systems, reducing login friction for large user bases while enforcing key-based access policies. For federated scenarios, protocols like the Security Assertion Markup Language (SAML) and OAuth 2.0 extend key authentication using token-based mechanisms with asymmetric signatures, allowing secure delegation across identity providers and service providers. SAML assertions, defined by the OASIS SAML 2.0 standard, carry digitally signed claims for user attributes, verified via public keys to enable SSO in distributed systems. Similarly, OAuth 2.0, outlined in RFC 6749, issues access tokens for API authorization, often secured with JSON Web Tokens (JWTs) signed using private keys to ensure integrity and authenticity during transit. These JWTs, per RFC 7519, compactly encode claims and are verified against corresponding public keys, supporting scalable access in web applications.

Hardware tokens enhance system access control by incorporating public key infrastructure (PKI) for two-factor authentication, combining something-you-have with knowledge or biometric factors. Devices like smart cards and YubiKeys store private keys in tamper-resistant chips, generating cryptographic challenges for login verification without exposing secrets. YubiKeys, for instance, support PKI-based authentication via standards like PIV for smart card emulation, enabling secure two-factor logins to enterprise systems. For passwordless access, the FIDO2 standard, finalized in 2019 by the FIDO Alliance in collaboration with the W3C, allows device-bound authenticators to perform public-key authentication directly in browsers or operating systems, replacing passwords with resident keys for phishing-resistant verification.

To handle scalability in large deployments, enterprise systems implement key rotation policies, periodically regenerating symmetric or asymmetric keys to limit exposure windows and maintain performance across thousands of users. Automated rotation, often policy-driven, ensures minimal disruption while supporting high-volume authentications, as recommended in industry best practices.
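A sketch of asymmetric JWT issuance and verification, assuming the third-party PyJWT library (imported as `jwt`) alongside `cryptography`; the claims and key size are illustrative:

```python
import jwt  # PyJWT, a third-party library assumed to be installed
from cryptography.hazmat.primitives.asymmetric import rsa

# Identity provider: generate an RSA key pair; the public key is published
# to relying services, the private key stays with the issuer.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Issue a token carrying signed identity claims (RS256 = RSA + SHA-256).
token = jwt.encode({"sub": "alice", "role": "admin"}, private_key,
                   algorithm="RS256")

# Relying service: verify the signature with the public key before trusting
# any claim inside the token; a forged or altered token raises an exception.
claims = jwt.decode(token, public_key, algorithms=["RS256"])
print(claims)  # {'sub': 'alice', 'role': 'admin'}
```

Pinning `algorithms=["RS256"]` on the verifying side is the conventional defense against algorithm-confusion attacks, where an attacker tries to downgrade verification to a weaker or symmetric algorithm.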

Security Considerations

Key Management Practices

Key generation is a foundational step in key management for authentication, requiring the use of approved random bit generators to ensure unpredictability and high entropy. Secure random number generators (RNGs), as specified in NIST SP 800-90A, must be employed, with generation occurring within cryptographic modules validated under FIPS 140 to prevent predictable outputs. For symmetric keys, minimum security strengths of 112 bits are recommended, while asymmetric keys, such as those used in public key authentication, follow guidelines in FIPS 186, with examples including RSA moduli of at least 2048 bits for 112-bit security or elliptic curve keys providing equivalent strength. Practical implementations include operating system-provided RNGs like /dev/urandom on Unix-like systems, which draw from kernel entropy pools when configured properly.

Key distribution must occur over secure channels to protect confidentiality and integrity during transit, preventing interception or tampering. For symmetric keys used in authentication, ephemeral Diffie-Hellman key agreement, as detailed in NIST SP 800-56A, enables secure establishment of shared secrets over insecure networks without prior shared keys. Initial setup often relies on out-of-band methods, such as physical delivery via trusted couriers or manual distribution with tamper-evident packaging, ensuring no exposure to untrusted paths. All distribution techniques must align with the required security strength, typically matching or exceeding 112 bits, and incorporate authentication to verify identities.

Storage practices emphasize protection against unauthorized access and physical compromise, with keys encrypted at rest using approved algorithms like AES-128 or stronger. Hardware security modules (HSMs), validated to FIPS 140 Levels 3 or 4, provide tamper-resistant environments for storing sensitive keys, isolating them from software vulnerabilities and enabling secure operations without key exposure. Cloud-based key vaults, such as AWS Key Management Service (KMS), offer managed HSM-like functionality with automatic rotation and access controls, ensuring keys remain within protected boundaries. Physical storage in locked containers with role-based access controls further mitigates risks in non-HSM setups.

Rotation and revocation are essential for limiting exposure over time, with cryptoperiods defining usage limits to balance security and operational feasibility. For authentication keys, NIST recommends rekeying private and public keys every 1 to 2 years, or sooner upon reaching the end of the originator-usage period, to reduce the window for cryptanalytic attacks. In cases of compromise, immediate revocation is required through dedicated protocols that notify systems to cease use, transition the key to a destroyed state, and log the event for traceability; this may integrate with PKI components for certificate revocation lists in asymmetric scenarios. Periodic rotation involves generating new keys via secure methods and phasing out old ones without service interruption.

Best practices in key management include zeroization for secure deletion and comprehensive auditing to detect anomalies. Zeroization entails overwriting keys with zeros, ones, or random data to irretrievably destroy all copies when no longer needed, preventing forensic recovery as per NIST guidelines. Auditing requires maintaining detailed logs of key generation, distribution, usage, and access events, with regular reviews to ensure compliance and identify potential misuse; these logs should be protected against tampering and retained for forensic purposes. Implementing these practices holistically across the key lifecycle enhances the resilience of authentication systems.
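The encrypt-keys-at-rest practice can be sketched as envelope encryption, where a data-encryption key (DEK) is wrapped under a key-encryption key (KEK). In production the KEK would live in an HSM or a cloud key vault; generating it locally here, with the `cryptography` package, is purely for illustration:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Data-encryption key (DEK): 256 bits from the OS entropy source.
dek = AESGCM.generate_key(bit_length=256)

# Key-encryption key (KEK): in production this would reside in an HSM or
# cloud key vault; generated locally only for this sketch.
kek = AESGCM.generate_key(bit_length=256)

# Wrap the DEK at rest under the KEK with AES-256-GCM; the associated data
# (here a hypothetical key identifier) is bound to the ciphertext.
nonce = os.urandom(12)
wrapped_dek = AESGCM(kek).encrypt(nonce, dek, b"key-id-001")

# Later: unwrap the DEK for use; GCM's tag check detects any tampering.
assert AESGCM(kek).decrypt(nonce, wrapped_dek, b"key-id-001") == dek
```

Rotation then amounts to rewrapping stored DEKs under a new KEK, without re-encrypting the bulk data they protect.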

Common Vulnerabilities

Key compromise represents a fundamental threat to systems reliant on secret keys, where attackers exploit implementation flaws or inherent weaknesses to extract or guess the key material. Side-channel attacks, such as timing attacks on RSA decryption, leverage variations in computation time to infer private keys; for instance, differences in timing can reveal bits of the secret exponent when decryption operations are not constant-time. Brute-force attacks further endanger systems using weak keys with insufficient entropy, such as those below 80-bit strength, which can be exhaustively searched using modern computational resources in feasible timeframes.

Man-in-the-middle (MITM) attacks undermine key-based authentication by intercepting and altering communications, often forcing the use of compromised protocols or credentials. In TLS, downgrade attacks occur when an attacker manipulates the handshake to revert to weaker protocol versions, such as SSLv3, exposing keys to interception despite initial secure negotiation attempts. Impersonation becomes viable without proper certificate validation, allowing attackers to present forged public keys that mimic legitimate entities, thereby bypassing authentication checks.

Replay and forgery attacks exploit predictable or reusable elements in key authentication protocols, enabling unauthorized repetition or fabrication of valid messages. Nonce reuse in challenge-response mechanisms violates the uniqueness required for freshness, permitting attackers to replay intercepted responses and gain unauthorized access, as seen in vulnerabilities where the same nonce-key pair is reused across sessions. Collision attacks on weak hash functions amplify forgery risks; the 2004 demonstration of practical MD5 collisions allowed attackers to generate identical hashes for distinct inputs, undermining integrity checks in authentication tokens or signatures derived from such hashes.

Quantum threats pose an existential risk to asymmetric key methods, with Shor's algorithm enabling efficient factorization of large integers and discrete logarithm computations on quantum hardware, thereby breaking RSA and ECDSA by deriving private keys from public ones in polynomial time. This vulnerability necessitates migration to quantum-resistant alternatives, such as the lattice-based ML-KEM scheme, finalized by NIST in August 2024, or the code-based HQC scheme selected for standardization in March 2025, as part of efforts to secure key encapsulation against such attacks.

Insider risks, often stemming from poor key storage practices, can lead to inadvertent exposure of sensitive material through software flaws or misconfigurations. The Heartbleed vulnerability in OpenSSL, disclosed in 2014, exemplified this by allowing remote attackers to read server memory, potentially leaking private keys used in TLS authentication without detection, affecting millions of systems worldwide.
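As a defensive illustration of the timing side channel, compare a naive tag check with the constant-time comparison Python's standard library provides; the function names are hypothetical, chosen only for this sketch:

```python
import hmac

def verify_tag_naive(expected: bytes, received: bytes) -> bool:
    # Anti-pattern: '==' may return at the first mismatching byte, so
    # response time leaks how much of the tag an attacker has guessed.
    return expected == received

def verify_tag_constant_time(expected: bytes, received: bytes) -> bool:
    # compare_digest examines the inputs in time independent of where a
    # mismatch occurs, closing this timing side channel.
    return hmac.compare_digest(expected, received)
```

The same principle motivates constant-time implementations of private-key operations such as RSA decryption, where data-dependent timing can leak bits of the secret exponent.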
