
History of cryptography

The history of cryptography documents the progression of techniques designed to safeguard messages and data from unauthorized disclosure, evolving from primitive symbolic methods in ancient societies to mathematically rigorous systems underpinning digital infrastructure. This evolution has been inextricably linked to conflicts, statecraft, and technological advancement, with warfare and espionage often driving improvements in encryption strength. The earliest evidence appears around 1900 BC in Egypt, where irregular hieroglyphs in tomb chambers likely concealed incantations from profane eyes. Classical antiquity saw transposition methods like the Spartan scytale—a rod-wrapped strip rearranging letters—and substitution ciphers such as the Caesar shift, displacing letters by a fixed number to encode military directives. In the 9th century, Al-Kindi formalized frequency analysis, statistically decrypting ciphers by matching ciphertext letter distributions to plaintext language norms, marking the dawn of systematic codebreaking. Renaissance innovations introduced polyalphabetic ciphers, cycling through multiple alphabets to evade frequency-based attacks, as in the Vigenère tableau. The 20th century heralded electromechanical rotor machines, exemplified by the German Enigma, whose daily settings encrypted Nazi commands during World War II. Allied efforts, initiated by Polish mathematicians and advanced by British teams using Turing's devices at Bletchley Park, routinely deciphered Enigma traffic, yielding intercepts estimated to have accelerated Axis defeat by years. The postwar era integrated computing, birthing public-key cryptography via Diffie and Hellman's 1976 framework for key agreement without trusted channels, enabling asymmetric schemes like RSA and securing digital communications. Contemporary cryptography grapples with quantum threats, spurring lattice-based and hash-based primitives.

Ancient Origins

Earliest Uses in Mesopotamia and Egypt

The earliest documented cryptographic practice occurred around 1900 BC in Egypt, where scribes intentionally employed non-standard hieroglyphs in tomb inscriptions to obscure ritualistic or religious content. In the main chamber of the tomb of the nobleman Khnumhotep II at Menet Khufu, anomalous glyphs—deviating from orthodox phonetic and ideographic conventions—were carved to conceal spells or invocations, as evidenced by archaeological analysis of the artifacts. This rudimentary obscurity, lacking any systematic algorithm or key, served primarily to protect sacred knowledge from profane interpretation, reflecting an intent to maintain ritual exclusivity rather than enable secret communication. Subsequently, circa 1500 BC in Mesopotamia, clay tablets inscribed with enciphered recipes for pottery glazes represent another foundational example of secrecy techniques. One such tablet, unearthed near the Tigris River, used rare phonetic values of cuneiform signs to mask the precise formula for ceramic production, safeguarding proprietary trade knowledge from competitors. These inscriptions, verified through archaeological study, demonstrate early substitution-like methods applied to economic assets, prioritizing the preservation of craftsmanship recipes over military or diplomatic uses. Both Egyptian and Mesopotamian instances were ad hoc and non-systematic, devoid of shared keys, algorithms, or reversible encoding schemes, as confirmed by artifact studies. Their scope remained confined to ritual and trade secrecy, underscoring motivations rooted in cultural and economic exclusivity rather than broader strategic concealment.

Classical Greek and Roman Techniques

The Spartans utilized the scytale, a transposition device, for secure military messaging as early as the 5th century BC. This method involved wrapping a strip of parchment or leather around a cylindrical baton of fixed diameter, writing the message longitudinally along the baton, and then unwinding the strip to produce a scrambled text that appeared incoherent without a matching baton. Plutarch, in his Life of Lysander, describes its use for authenticating and encrypting dispatches between Spartan commanders, such as during the Peloponnesian War, ensuring messages could only be read by recipients with an identical scytale. The technique relied on a physical key—the baton itself—offering transposition security through mechanical alignment rather than complex algorithms, though its simplicity limited it to short, operational commands vulnerable to physical compromise or trial-and-error reconstruction if the diameter was guessed. Hebrew scribes employed the Atbash cipher, a monoalphabetic substitution method, around 500 BC for encoding sensitive texts. Atbash reversed the Hebrew alphabet, mapping the first letter (aleph) to the last (tav), bet to shin, and so forth, creating a fixed reciprocal substitution without variable keys. Evidence appears in the Book of Jeremiah, such as chapter 25:26, where "Sheshach" substitutes for "Babel" (Babylon), likely to veil prophetic references during a period of political tension with Babylon. This straightforward mirroring provided obfuscation for religious or diplomatic writings but offered minimal resistance to frequency analysis, as frequent letters like yod remained prominent in the ciphertext, enabling decryption through educated guesses absent modern analytical tools. Romans advanced substitution techniques with Julius Caesar's cipher during the Gallic Wars (58–50 BC), shifting each letter three positions forward in the Latin alphabet (e.g., A to D). Suetonius records in The Life of the Divine Julius that Caesar encrypted military orders and correspondence this way to thwart interception by Gallic tribes or rivals, employing runners for delivery. The fixed shift served state dispatches effectively in an era without widespread literacy, prioritizing speed over robustness, though its monoalphabetic nature preserved letter frequencies, rendering it breakable via word patterns or brute-force trials of the 25 possible shifts. Augustus reportedly adapted a one-position variant, underscoring continuity in practice, but both lacked key secrecy beyond shared convention, exposing them to insider threats or basic frequency analysis if messages were voluminous. These methods conferred tactical edges in asymmetric conflicts by deterring casual readers, yet their predictability contrasted sharply with later polyalphabetic innovations that diffused statistical vulnerabilities.
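
The mechanics and the principal weakness of the Caesar shift can be illustrated in modern code. The following Python sketch (the function names and sample message are illustrative, not drawn from any historical source) enciphers with a fixed shift and then recovers the plaintext by trying all 25 non-trivial shifts, the brute-force attack noted above.

```python
import string

ALPHABET = string.ascii_uppercase  # 26-letter Latin alphabet

def caesar_encrypt(plaintext: str, shift: int) -> str:
    """Shift each letter forward by `shift` positions, wrapping around Z."""
    result = []
    for ch in plaintext.upper():
        if ch in ALPHABET:
            result.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
        else:
            result.append(ch)  # leave spaces and punctuation unchanged
    return "".join(result)

def caesar_brute_force(ciphertext: str) -> list[tuple[int, str]]:
    """Enumerate all 25 non-trivial shifts; the reader picks the legible one."""
    return [(s, caesar_encrypt(ciphertext, -s)) for s in range(1, 26)]

ciphertext = caesar_encrypt("ATTACK AT DAWN", 3)   # -> "DWWDFN DW GDZQ"
for shift, guess in caesar_brute_force(ciphertext):
    print(shift, guess)  # the shift-3 row reads "ATTACK AT DAWN"
```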

Medieval and Renaissance Innovations

Islamic Contributions and Frequency Analysis

During the Islamic Golden Age, particularly in the 9th century under the Abbasid Caliphate, scholars developed systematic methods of cryptanalysis that surpassed earlier ad hoc techniques, enabling the breaking of monoalphabetic substitution ciphers prevalent in diplomatic and military correspondence. Al-Kindi (c. 801–873 CE), a Baghdad-based polymath, produced the earliest known treatise on the subject, Risālah fī Taḥrīr al-Rasāʾil ("A Manuscript on Deciphering Cryptographic Messages"), which outlined frequency analysis as a core technique. This method involved tallying letter occurrences in ciphertext and matching them against established frequency distributions in Arabic, such as the high prevalence of letters like alif and lam derived from Quranic texts, allowing decryption without keys. Surviving manuscripts of Al-Kindi's work, roughly 1,200 years old, confirm these principles through detailed tables of letter probabilities. To counter such analysis, Al-Kindi himself proposed polyalphabetic substitution schemes, employing multiple cipher alphabets shifted variably to disrupt single-letter frequencies, marking an early recognition of statistical vulnerabilities in monoalphabetic ciphers. These innovations stemmed from first-principles statistical reasoning applied to language patterns, contrasting with the largely empirical, non-systematic cipher use in contemporaneous Europe, where cryptanalysis remained undeveloped until the Renaissance. Abbasid administrators integrated cryptography into state diplomacy and intelligence, as evidenced by caliphal decrees mandating secure communications, which preserved and advanced classical knowledge amid broader scholarly translations of Greek texts. In the 13th century, further refinements emerged, including homophonic substitution—assigning multiple symbols to frequent letters to equalize frequencies—and grid-based methods for added complexity, as documented in treatises by later scholars building on Abbasid foundations. These developments facilitated systematic codebreaking operations, enabling Abbasid and successor states to intercept and decode rival messages, thereby influencing regional power dynamics through superior cryptographic intelligence. Overall, Islamic contributions emphasized empirical analysis over rote concealment, laying groundwork for modern cryptology while Europe's medieval period saw cryptographic stagnation.
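
The core of frequency analysis is a tally-and-rank procedure, which can be sketched in a few lines of Python. The English frequency ranking below stands in for the Arabic tables Al-Kindi compiled; the mapping produced is only a first guess, to be refined with digram patterns and probable words, as the names and data here are illustrative rather than historical.

```python
from collections import Counter

# Approximate English letter ranking, most to least common; Al-Kindi used
# analogous tables compiled from Arabic texts such as the Quran.
ENGLISH_RANK = "ETAOINSHRDLCUMWFGYPBVKJXQZ"

def frequency_rank(ciphertext: str) -> str:
    """Return ciphertext letters ordered from most to least frequent."""
    counts = Counter(c for c in ciphertext.upper() if c.isalpha())
    return "".join(letter for letter, _ in counts.most_common())

def guess_substitution(ciphertext: str) -> dict[str, str]:
    """Naively align the ciphertext frequency ranking with the expected
    plaintext ranking -- the essence of frequency analysis against a
    monoalphabetic substitution cipher."""
    return {c: p for c, p in zip(frequency_rank(ciphertext), ENGLISH_RANK)}
```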

European Polygraphic and Mechanical Advances

In 1466 or 1467, Leon Battista Alberti, an Italian polymath, described the first known polyalphabetic cipher system in his treatise De componendis cifris (or De cifris), utilizing a mechanical cipher disk composed of two concentric rotating rings—one fixed with a standard alphabet and the other movable with a mixed alphabet including numbers and symbols—to enable variable shifts controlled by a keyword or index letters. This innovation allowed plaintext letters to be enciphered using different alphabets sequentially, thwarting simple frequency analysis by distributing letter frequencies across multiple alphabets, and marked an early mechanical aid to manual encryption for diplomatic and state purposes. Building on such principles, Johannes Trithemius, a German abbot and scholar, published Polygraphia in 1518, the first printed book dedicated to cryptography, which introduced systematic steganographic methods disguised as magical invocations alongside a tabula recta—a square table of alphabets with progressive Caesar shifts—that formed the basis for later polyalphabetic tableaux. Trithemius's progressive cipher, where each successive plaintext letter was shifted by an increasing amount (e.g., 1, 2, 3, ...), effectively created a running-key variant resistant to standard monoalphabetic attacks, though its predictable progression limited security for long messages; he advocated these for concealing sensitive ecclesiastical and political correspondence. Blaise de Vigenère, a 16th-century French diplomat, advanced these ideas in his 1586 Traicté des chiffres, describing a polyalphabetic cipher using a repeating keyword to select rows from a Trithemian-style tableau for encipherment, alongside an autokey variant in which the plaintext itself extended the key after an initial primer, further obfuscating frequencies and enhancing resistance to analysis. Employed in diplomatic communications, Vigenère's methods addressed the vulnerabilities of fixed-period systems but proved laborious for manual operation without aids, prompting critiques of inefficiency for high-volume statecraft; the keyword-based scheme now commonly attributed to Vigenère had in fact been described earlier by Giovan Battista Bellaso in 1553, whose keyed variants informed later refinements.
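
The repeating-keyword scheme associated with Vigenère reduces to adding a cyclic sequence of shifts to the plaintext, which the short Python sketch below makes explicit (keyword and message are arbitrary examples, not taken from period correspondence).

```python
def vigenere(text: str, keyword: str, decrypt: bool = False) -> str:
    """Repeating-keyword polyalphabetic substitution over A-Z."""
    out = []
    key = [ord(k) - 65 for k in keyword.upper()]
    i = 0
    for ch in text.upper():
        if "A" <= ch <= "Z":
            shift = -key[i % len(key)] if decrypt else key[i % len(key)]
            out.append(chr((ord(ch) - 65 + shift) % 26 + 65))
            i += 1
        else:
            out.append(ch)  # pass through spaces and punctuation
    return "".join(out)

ciphertext = vigenere("DIVERT TROOPS TO EAST RIDGE", "LEMON")
assert vigenere(ciphertext, "LEMON", decrypt=True) == "DIVERT TROOPS TO EAST RIDGE"
```

The periodic reuse of the keyword is exactly the regularity that Babbage and Kasiski later exploited, as discussed in the 19th-century section below.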

Enlightenment to Industrial Era

17th-18th Century Theoretical Foundations

The 17th century marked a shift toward more sophisticated cipher designs that addressed the predictability of monoalphabetic substitutions through homophonic encoding, where frequent plaintext elements received multiple ciphertext equivalents to flatten statistical distributions. The Rossignol family's Grand Chiffre, developed for the court of Louis XIV of France following his 1643 accession, represented a pinnacle of this era's refinements; it employed over 500 numerical symbols for letters, syllables, and common words, augmented by nulls and homophones to resist frequency-based attacks, rendering it impervious to contemporary cryptanalysis. This system's enduring security—remaining unbroken until Étienne Bazeries exploited structural patterns in archived messages to solve it in 1893—illustrated the efficacy of frequency-flattening techniques in protecting state secrets against empirical decryption efforts. Cryptanalytic theory similarly progressed via statistical reasoning, as evidenced by John Wallis's contributions during the English Civil War (1642–1651). Appointed to decipher Royalist intercepts for Parliament after demonstrating proficiency on a captured dispatch in 1642, Wallis applied frequency analysis, probable-word substitutions, and rudimentary statistical inference to unravel nomenclators and simple polyalphabetics without keys, aiding Parliamentary intelligence on matters such as troop movements. His methods, which quantified letter occurrences and contextual redundancies, highlighted inherent vulnerabilities in low-diffusion ciphers where regularities leaked through despite enciphering, establishing a foundational link between mathematical probability and codebreaking success that outpaced brute-force alternatives. By the 18th century, these intertwined advancements informed state-level applications, emphasizing that cipher strength derived not merely from key confidentiality but from resisting systematic analysis, though over-reliance on undisseminated methods exposed systems to insider compromise. Encrypted dispatches underpinned statecraft in conflicts like the Seven Years' War (1756–1763), where secure diplomatic channels enabled deception operations, such as misleading enemy dispositions, with decipherments occasionally tipping operational balances through revealed intents. Critiques of limited-diffusion designs persisted, as statistical regularities allowed reconstruction via first-order approximations, underscoring the need for layered substitutions to approach the unicity distance in practice.
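
The flattening effect of homophonic encoding can be sketched with a toy symbol table; the numeric homophones below are invented for illustration and are not a reconstruction of the Rossignols' actual codebook.

```python
import random

# Hypothetical homophone table: common letters receive several numeric symbols,
# flattening the ciphertext frequency distribution that analysts like Wallis
# exploited against simple substitutions.
HOMOPHONES = {
    "E": [17, 42, 73, 88, 305],
    "T": [25, 61, 210],
    "A": [9, 134, 256],
    "O": [12, 77],
    "N": [33, 191],
    # a full nomenclator would assign at least one symbol to every letter,
    # plus symbols for syllables, whole words, and nulls
}

def homophonic_encrypt(plaintext: str, table: dict) -> list[int]:
    """Replace each letter with a randomly chosen numeric homophone."""
    return [random.choice(table[c]) for c in plaintext.upper() if c in table]

print(homophonic_encrypt("ATTENTATE", HOMOPHONES))
```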

19th Century Polygraphic Ciphers and Devices

In the 19th century, polygraphic ciphers, which substitute multiple letters simultaneously (typically digraphs or trigraphs), gained prominence as a response to frequency analysis, which had rendered simple monoalphabetic substitutions vulnerable. These systems disrupted single-letter statistics by treating letter groups as units, making cryptanalysis more computationally intensive for manual methods. The rise of the electric telegraph from the 1840s onward amplified the need for such ciphers in military and diplomatic contexts, where rapid, secure transmission over long distances was essential, yet manual encipherment had to remain feasible for field operators without computational aids. Charles Babbage advanced cryptanalytic techniques against polyalphabetic systems in the mid-19th century, demonstrating in unpublished work around 1846–1854 that ciphers like the Vigenère could be broken by identifying repeated n-grams to deduce the key period, followed by frequency analysis on each subsequence as a shifted Caesar cipher. This approach debunked the prevailing myth of polyalphabetic "indecipherability" propagated since the 16th century, revealing that periodicity introduced exploitable regularities even in multi-alphabet substitutions. Babbage's method, later independently formalized by Friedrich Kasiski in 1863, highlighted inherent weaknesses in periodic polyalphabetics, influencing subsequent designs to seek non-periodic or more complex substitutions. A key innovation was the Playfair cipher, devised by Charles Wheatstone in 1854 as a manual digraphic system tailored for British governmental use. It employs a 5×5 grid (combining I/J) filled first with a keyword, then the remaining letters of the alphabet; plaintext digraphs are formed by inserting X or Z to separate doubled letters or to complete odd-length messages, then substituted based on grid positions—same row (shift right), same column (shift down), or rectangle corners (opposing corners). Wheatstone's friend Lyon Playfair promoted it to the British government for secure telegraphic communications, emphasizing its speed and resistance to casual interception despite vulnerability to known-plaintext attacks or exhaustive digram analysis. Adoption in British military field manuals underscored its practicality for low-volume, operator-handled encryption, though it required pre-shared grids and was limited to English text without numerals. Mechanical aids emerged to streamline polygraphic processes, exemplified by Étienne Bazeries' cylindrical cryptograph of 1891, comprising 20 rotatable disks each inscribed with a permuted alphabet (omitting J or similar). Encipherment involves aligning the disks to a numeric key (one value per disk position) and indexing letters across the aligned rows to yield ciphertext, enabling variable polygraphic substitutions via wheel permutations. This device, an evolution of 18th-century wheel ciphers like Thomas Jefferson's, offered faster manual operation than tabular methods—up to 100 characters per minute with practice—and supported longer effective keys through disk reordering, but remained prone to mechanical misalignment and key compromise if disks were captured intact. Bazeries' design addressed telegraphy's demand for reversible, error-resistant tools, yet its bulk and manual rotation limited scalability for high-volume wartime traffic. These polygraphic advancements, while enhancing security over unigraphic substitution, exposed scalability issues for industrialized communication: manual tabulation or disk alignment fatigued operators, introduced transcription errors in transmission, and struggled with message volumes exceeding hundreds of words daily, as seen in diplomatic cables. Such constraints, coupled with Babbage's revelations on periodicity, underscored the inadequacy of purely manual systems for emerging mass-signaling needs, paving the way for electromechanical innovations without fully resolving key distribution or operator training demands.
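
The Playfair digraph rules described above translate directly into code. The sketch below builds the keyed 5×5 square and enciphers a single digraph; the keyword is the conventional textbook example, and padding of full messages with X between doubled letters is noted but not implemented.

```python
def build_grid(keyword: str) -> list[str]:
    """5x5 Playfair square: keyword letters first, then the rest of the
    alphabet, with I and J sharing one cell."""
    seen, cells = set(), []
    for ch in keyword.upper() + "ABCDEFGHIKLMNOPQRSTUVWXYZ":
        ch = "I" if ch == "J" else ch
        if ch.isalpha() and ch not in seen:
            seen.add(ch)
            cells.append(ch)
    return cells  # 25 letters in row-major order

def encrypt_digraph(a: str, b: str, grid: list[str]) -> str:
    ra, ca = divmod(grid.index(a), 5)
    rb, cb = divmod(grid.index(b), 5)
    if ra == rb:                       # same row: take the letter to the right
        return grid[ra * 5 + (ca + 1) % 5] + grid[rb * 5 + (cb + 1) % 5]
    if ca == cb:                       # same column: take the letter below
        return grid[((ra + 1) % 5) * 5 + ca] + grid[((rb + 1) % 5) * 5 + cb]
    return grid[ra * 5 + cb] + grid[rb * 5 + ca]  # rectangle: swap columns

grid = build_grid("PLAYFAIR EXAMPLE")
print(encrypt_digraph("H", "I", grid))   # -> "BM"; full messages are padded
                                         # with X between doubled letters
```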

Early 20th Century Conflicts

World War I Mechanical Ciphers

The introduction of radio and field telephones in World War I's trench warfare generated vast volumes of interceptable traffic, prompting innovations in cipher complexity to protect tactical commands amid static fronts and rapid coordination. While manual systems dominated due to technological constraints, the era marked the conceptual shift toward electromechanical encryption for faster, more secure processing, driven by the need to counter enemy radio direction-finding and interception. Germany deployed the ADFGX cipher on March 5, 1918, designed by Fritz Nebel for encrypting high-level tactical messages between divisions, corps, and army headquarters during the Spring Offensive; in June 1918 it was extended into the ADFGVX variant. The field-usable system fractionated plaintext into coordinate pairs via a keyworded 6×6 square mapping letters and digits to the symbols A, D, F, G, V, and X, followed by columnar transposition using a keyword-derived numerical order, yielding a ciphertext resistant to standard frequency analysis or partial anagramming. French cryptanalyst Georges Painvin deciphered this traffic starting with a breakthrough on April 5, 1918, employing cribs—postulated phrases from operational contexts—and exhaustive manual testing on captured messages, despite an approximate key complexity exceeding 10^50 possibilities. Decrypts revealed German dispositions and plans, enabling preemptive Allied artillery strikes that slowed the offensive and contributed to its failure by July 1918, with cryptanalytic intelligence linked to disrupting enemy concentrations and reducing projected casualties through targeted countermeasures. Edward Hebern developed the first practical rotor cipher machine starting in 1917, constructing a single-rotor machine that electrically substituted letters via a wired, rotating permuting disk driven by keystrokes, automating polyalphabetic shifts for typewriter-like encipherment and serving as a direct precursor to multi-rotor designs. Though not fielded during the war, U.S. military evaluations confirmed its viability for secure transmission while revealing vulnerabilities to known-plaintext attacks that informed later refinements, amid the period's parallel inventions like Arthur Scherbius's 1918 Enigma patent in Germany. British and American forces relied on Playfair variants—5×5 digraphic substitution keyed to a keyworded square—for tactical communications, while French units employed interrupted columnar transposition, adding diagonal encipherment to standard columns for added security. These polygraphic manuals proved empirically breakable through captured codebooks, traffic volume analysis from intercepts, and material seizures during raids, yielding actionable intelligence on unit movements without relying on theoretical insecurities alone.
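
The two-stage structure of ADFGVX—fractionation followed by keyed columnar transposition—can be sketched briefly. The square and keyword below are arbitrary examples rather than Nebel's operational keys, and the sketch omits the padding and decryption steps used in the field.

```python
# Minimal ADFGVX-style sketch: a 6x6 Polybius square fractionates each
# character into a coordinate pair, then a keyword-ordered columnar
# transposition scrambles the resulting symbol stream.
SYMBOLS = "ADFGVX"
SQUARE = "PH0QG64MEA1YL2NOFDXKR3CVS5ZW7BJ9UTI8"  # 36 cells: A-Z plus 0-9

def fractionate(plaintext: str) -> str:
    out = []
    for ch in plaintext.upper():
        if ch in SQUARE:
            row, col = divmod(SQUARE.index(ch), 6)
            out += [SYMBOLS[row], SYMBOLS[col]]
    return "".join(out)

def columnar_transpose(stream: str, key: str) -> str:
    # Write the stream row-wise under the keyword, then read the columns
    # in alphabetical order of the keyword letters (keyword letters unique here).
    columns = {k: stream[i::len(key)] for i, k in enumerate(key)}
    return "".join(columns[k] for k in sorted(key))

intermediate = fractionate("ATTACK AT 1200")
print(columnar_transpose(intermediate, "GERMAN"))
```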

Interwar Period Developments

The interwar period saw the commercialization of rotor-based cipher machines, with Arthur Scherbius founding Chiffriermaschinen-Aktiengesellschaft in 1923 to produce and sell the Enigma machine for commercial use. Initial models lacked a plugboard, which was introduced in military variants by 1926 to enhance security through additional wiring permutations. These developments built on pre-war patents, focusing on electromechanical encryption for secure business and diplomatic communications amid rising global tensions. Polish cryptologists advanced rotor machine cryptanalysis in 1932, when Marian Rejewski exploited mathematical properties of permutations to reconstruct the Enigma's internal wiring without physical access. This theoretical breakthrough, leveraging permutation group theory and cycle structures, enabled systematic recovery of daily keys and laid groundwork for mechanical aids like the Bomba, though details remained classified until post-war disclosures. Such innovations highlighted the vulnerability of rotor systems to permutation-based attacks when message indicators provided exploitable patterns. Other nations pursued rotor technologies, with Britain developing the Typex machine, prototyped in 1937 as an Enigma derivative incorporating fixed rotors and irregular stepping for improved security. Japan introduced the Type A cipher machine, codenamed Red by Allied intelligence, in the early 1930s for diplomatic traffic, employing stepping switches in a complex asynchronous design. These efforts were hampered by export restrictions on cryptographic technologies, which limited commercial diffusion and arguably slowed broader innovation by isolating developments within national boundaries. Theoretical progress included refinement of the one-time pad by U.S. Army Major Joseph Mauborgne around 1919, recognizing that truly random, non-repeating keys rendered the system unbreakable under certain conditions, a principle later formalized by Claude Shannon. Mauborgne's work, building on Gilbert Vernam's 1917 teleprinter cipher, emphasized perfect secrecy through key entropy matching message length, influencing interwar manual encryption practices despite logistical challenges in key distribution.

World War II Turning Point

Axis Powers' Systems

The German armed forces adopted the Enigma rotor machine in the late 1920s, with the Army version introduced in 1928 featuring three rotating rotors, a fixed reflector, and a plugboard added in 1930 to increase complexity. Variants proliferated in World War II, including models with four rotors for naval use by 1942, employing irregular stepping mechanisms to permute substitutions dynamically for each character. These engineering innovations provided vast key spaces, theoretically exceeding 10^14 possibilities for three-rotor configurations, yet inherent design limitations, such as the reflector's fixed wiring and the impossibility of any letter encrypting to itself, created exploitable patterns under known-plaintext conditions. Japan deployed the Type B cipher machine, codenamed PURPLE by adversaries, in February 1939 for diplomatic communications, utilizing 25-position stepping switches to emulate rotor-like polyalphabetic substitution rather than true rotors. Developed in the mid-1930s, it featured banks of stepping switches for enciphering, with a plugboard for additional permutations, achieving complexity through asynchronous stepping that advanced subsets of switches per keystroke. Despite mechanical sophistication, PURPLE's reliance on telephone-style stepping switches introduced periodicities vulnerable to analysis when keys were reused or messages aligned predictably. Italy employed the Hagelin C-38 mechanical cipher device during World War II, particularly for naval and military traffic, which used pin-and-lug wheels to generate pseudorandom keystreams added modulo 26 to plaintext. Introduced in the late 1930s, the C-38 offered portability and speed via its six pin-and-lug wheels, but its short period and lack of true randomness allowed recovery via depth attacks—multiple messages enciphered with identical settings yielding identical keystream for matching segments. Italian overreliance on such mechanical aids, assuming complexity sufficed without rigorous procedural discipline, amplified these flaws. Across Axis systems, operator practices exacerbated cryptographic weaknesses; German users frequently reused message keys or selected predictable indicators, such as repeating phrases in headers, reducing effective security despite daily key changes. Similar procedural lapses in PURPLE and C-38 operations, including key reuse and failure to vary settings adequately, enabled cryptanalysis that compromised communications, contributing to intelligence failures like the Allied penetrations of Axis networks. These human factors, rooted in overconfidence in mechanical ingenuity over disciplined procedures, undermined the systems' theoretical strengths.
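
The structural consequence of Enigma's fixed reflector, that no letter can ever encrypt to itself, can be checked in a compact model: if the combined plugboard-and-rotor path is a permutation P and the reflector R is a fixed-point-free involution, the machine computes P^-1(R(P(x))), which likewise has no fixed points and is self-reciprocal. The sketch below is a deliberate simplification that ignores rotor stepping; it only demonstrates the property that crib-based Allied attacks exploited.

```python
import random
import string

LETTERS = string.ascii_uppercase

def random_reflector(rng: random.Random) -> dict[str, str]:
    """Pair the 26 letters into 13 reciprocal couples: an involution
    with no fixed points, like Enigma's reflector wiring."""
    shuffled = list(LETTERS)
    rng.shuffle(shuffled)
    pairs = {}
    for a, b in zip(shuffled[0::2], shuffled[1::2]):
        pairs[a], pairs[b] = b, a
    return pairs

def random_permutation(rng: random.Random) -> dict[str, str]:
    """Stand-in for the combined plugboard-and-rotor wiring at one position."""
    targets = list(LETTERS)
    rng.shuffle(targets)
    return dict(zip(LETTERS, targets))

rng = random.Random(1918)
P = random_permutation(rng)
P_inv = {v: k for k, v in P.items()}
R = random_reflector(rng)

encrypt = lambda c: P_inv[R[P[c]]]            # in through the rotors, reflect, back out
assert all(encrypt(c) != c for c in LETTERS)            # no letter maps to itself
assert all(encrypt(encrypt(c)) == c for c in LETTERS)   # encryption is reciprocal
```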

Allied Codebreaking and Machines

The British Bombe, an electromechanical device designed primarily by Alan Turing, became operational in March 1940 to decrypt German Enigma-encrypted messages by testing rotor settings against known plaintext "cribs." Building on Polish cryptanalytic techniques, the Bombe automated the exhaustive search process, simulating multiple Enigma machines simultaneously to identify daily keys. By 1945, over 200 Bombes were deployed across British and American sites, processing thousands of intercepts daily and enabling rapid decryption of naval and military traffic. In parallel, British engineer Tommy Flowers developed the Colossus series starting in 1943, marking the advent of the world's first large-scale programmable electronic digital computer dedicated to cryptanalysis. Targeted at the Lorenz SZ40/42 teleprinter cipher—used for high-command communications between Hitler and field generals—Colossus employed 1,500–2,400 vacuum tubes to perform statistical analysis on encrypted teleprinter streams, exploiting non-uniform character frequencies in the keystream. The initial machine went live in December 1943, with nine more constructed by war's end, processing an estimated 63 million characters per week by 1945. The United States introduced the SIGABA (ECM Mark II in the Navy) rotor-based cipher machine in the late 1930s, incorporating 10–15 rotors with irregular stepping controlled by independent rotor banks, which thwarted cryptanalytic attacks throughout the war. Unlike the more predictable Enigma, SIGABA's design ensured output periods exceeding 10^26, rendering depth analysis infeasible without massive resources. Approximately 10,000 units were fielded by 1945, securing Allied command communications and proving superior in resistance to compromise compared to contemporaneous systems. Decryptions from these machines, aggregated under the Ultra program, yielded actionable intelligence that British official historian Sir Harry Hinsley estimated shortened the European war by two to four years. In the Atlantic theater, Ultra directed convoys away from U-boat wolfpacks and guided antisubmarine strikes, contributing to the sinking of over 700 German submarines and averting millions of tons in Allied merchant shipping losses; for example, May 1943 alone saw 41 U-boats destroyed amid a sharp decline in convoy sinkings from prior peaks of over 350,000 tons monthly.

Human Elements and Intelligence Impacts

At Bletchley Park, the British codebreaking center, personnel numbered around 9,000 by 1943, expanding to nearly 10,000 by 1945, with women comprising approximately 75% of the workforce in roles ranging from machine operation to analysis. Mathematicians such as Alan Turing, who headed Hut 8 and contributed to decrypting the naval Enigma traffic used by U-boats, exemplified the intellectual core, while the volume of intercepts processed—reaching thousands of messages daily at peak—relied on the operational efficiency of support staff despite human fatigue and compartmentalized workflows. German procedural lapses and operator errors significantly facilitated Allied penetrations of Enigma systems, including the reuse of message keys, predictable "cribs" from repeated phrases like weather reports, and failure to vary rotor settings systematically, which reduced the cipher's effective security despite its mechanical complexity. Earlier betrayals compounded these vulnerabilities; in 1931, the German official Hans-Thilo Schmidt supplied French intelligence with Enigma operating manuals and daily keys, enabling initial reconstructions and subsequent Polish cryptanalytic advances that were shared with Britain and France in 1939. The resulting Ultra intelligence yielded tactical advantages, such as routing convoys around wolfpacks during the Battle of the Atlantic, but its causal role in Allied victory was enabling rather than determinative, as material superiority in production and manpower provided the decisive edge. Estimates that codebreaking shortened the European war by two to four years, while cited by figures like Harry Hinsley, overstate its isolated impact absent corroborating analyses of alternative scenarios, where intelligence alone could not offset strategic errors or substitute for Allied logistics. Post-war, the Official Secrets Act bound veterans to silence until the 1970s, suppressing publication of cryptanalytic techniques and delaying their integration into civilian fields like computer science, where academic progress stagnated without access to wartime heuristics until disclosures such as F.W. Winterbotham's 1974 book The Ultra Secret. This veil prioritized ongoing operations over broader dissemination, arguably retarding non-military cryptographic innovation for decades.

Post-War Theoretical Foundations

Claude Shannon and Information Security

Claude Elwood Shannon, a mathematician and electrical engineer at Bell Laboratories, published his foundational paper "Communication Theory of Secrecy Systems" in the Bell System Technical Journal in October 1949. In this work, Shannon applied principles of information theory to analyze secrecy systems, defining perfect secrecy as a condition where the ciphertext yields no information about the plaintext to an adversary, even one with unlimited computational power; formally, the a posteriori probability of any plaintext given the ciphertext equals its a priori probability. He proved that perfect secrecy requires the key space to be at least as large as the message space, establishing a fundamental limit: no cryptosystem can achieve unconditional security without keys of length comparable to the data being protected. Shannon demonstrated that the one-time pad—a cipher using a random key as long as the message, added modulo the alphabet size (e.g., XOR for binary)—attains perfect secrecy when the key is truly random, used only once, and kept secret from the adversary. This proof showed the one-time pad to be unbreakable in theory, as every possible plaintext is equally likely given any ciphertext, rendering cryptanalytic attacks futile without key compromise. However, Shannon emphasized practical constraints: generating, distributing, and storing such keys securely imposes heavy burdens, limiting viability to scenarios with trusted channels, such as diplomatic pouches or pre-shared materials in low-volume military contexts. To guide the design of practical ciphers approximating security without perfect secrecy, Shannon introduced the principles of confusion and diffusion. Confusion involves operations that complicate the statistical relationship between the key, plaintext, and ciphertext, such as nonlinear substitutions that obscure direct mappings. Diffusion ensures that local changes in the plaintext influence many ciphertext positions, typically via permutations or linear mixing, thereby dissipating statistical patterns across the output. These concepts, rooted in Shannon's analysis of product ciphers, provided a theoretical framework for resisting known-plaintext and statistical attacks, influencing subsequent symmetric designs by quantifying how redundancy reduction enhances resistance to exhaustive search or statistical cryptanalysis. Shannon's theoretical contributions established rigorous bounds on cryptographic strength, shifting the field from ad hoc methods to quantifiable information-theoretic measures, and informed the post-World War II prioritization of secure communications in U.S. signals intelligence. While his work underpinned the mathematical foundations for agencies like the National Security Agency, formed on November 4, 1952, to centralize cryptologic efforts, it highlighted the tension between theoretical ideals and operational realities. Critiques note that perfect secrecy remains impractical for scalable, non-military applications due to key-management overhead—secure distribution demands channels as protected as the messages themselves, often infeasible without physical proximity or trusted couriers—and the randomness requirements for key material exceed routine generation capabilities. Thus, real-world systems trade unconditional security for computational assumptions, accepting probabilistic risks under resource constraints.
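
Shannon's conditions for the one-time pad are easy to exercise mechanically. The sketch below XORs a message with uniformly random key material of equal length; the message is an arbitrary example, and the closing comment flags the key-reuse failure mode that later proved fatal in practice.

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Vernam-style one-time pad: XOR each plaintext byte with a key byte.
    Perfect secrecy holds only if the key is truly random, at least as long
    as the message, used exactly once, and never disclosed."""
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"PROCEED AT ONCE"
key = secrets.token_bytes(len(message))          # uniformly random key material
ciphertext = otp_encrypt(message, key)
assert otp_encrypt(ciphertext, key) == message   # XOR is its own inverse

# Reusing a pad destroys the guarantee: the XOR of two ciphertexts under the
# same key equals the XOR of the two plaintexts, leaking structure to an attacker.
```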

Early Computer-Aided Cryptography

The transition to computer-aided cryptography in the 1950s and 1960s reflected the imperative to leverage emerging electronic computing for both encryption and cryptanalysis, moving beyond purely mechanical rotors to hybrid and fully digital prototypes primarily developed by government agencies like the newly formed National Security Agency (NSA), established in 1952. These efforts built on wartime experiences with devices like Colossus but shifted toward vacuum-tube-based systems for handling larger data volumes and key spaces, enabling feasibility assessments of brute-force resistance amid exponential growth in computational power—early analyses indicated that key lengths beyond 50–60 bits could resist exhaustive search with 1960s-era machines costing millions in equivalent resources. NSA's first-generation systems, introduced in the mid-1950s, retained electromechanical elements for compatibility with legacy teletype networks but incorporated punched-card readers for key loading, as in early TSEC-series off-line cipher machines deployed for secure tactical and diplomatic use. A pivotal application of early machine-aided cryptanalysis was the Venona project, initiated by U.S. Army cryptanalysts in 1943 and expanded under NSA oversight into the 1950s and 1960s, which decrypted over 3,000 Soviet messages by exploiting repeated one-time pad keys, using punched-card tabulators for coincidence counting and permutation testing—processing rates reached thousands of comparisons per day by the late 1940s, revealing espionage networks without full algorithmic breaks. By the early 1960s, NSA procured supercomputers like the IBM 7950 Harvest to accelerate such tasks, marking cryptology's influence on computer procurement and highlighting automation's edge over manual methods, though limitations in storage and speed constrained real-time applications. Electronic prototypes evolved into vacuum-tube cipher machines for on-line encryption, using punched cards or tape for key material to mitigate interception risks, with designs emphasizing resistance to known-plaintext attacks feasible via early digital simulation. Government prototypes prioritized classified robustness, but declassified records reveal internal debates on backdoor vulnerabilities in custom hardware, where control over specifications raised long-term risks of insider compromise or foreign reverse-engineering, contrasting with academic explorations of pure algorithmic security devoid of such oversight. Feasibility studies underscored the trade-offs: each added key bit doubled the minimal cracking effort under brute-force projections, prompting prototypes with variable key lengths up to 128 bits, though implementation favored shorter keys for operational speed. These systems laid groundwork for digital ciphers, with partial declassifications affirming their role in securing early network precursors against Soviet intercepts, albeit with acknowledged gaps in foresight about later threats.

Standardization and Symmetric Advances

DES and the Data Encryption Standard

In 1973, the National Bureau of Standards (NBS), now the National Institute of Standards and Technology (NIST), solicited proposals from industry for a federal data encryption standard suitable for non-classified applications, aiming to protect electronic data in government and commercial systems. IBM submitted a variant of its earlier Lucifer cipher, originally designed with a 128-bit key length in the late 1960s and early 1970s, which had been evaluated by the National Security Agency (NSA) during patent reviews. The submitted algorithm featured a 64-bit block size and underwent modifications, including a reduction of the effective key length to 56 bits (with 8 parity bits in the 64-bit input), to enable efficient implementation in hardware on a single chip, addressing computational constraints of the era. The NBS process was notably open, incorporating public comments and independent reviews following the submission, with the NSA providing non-binding technical input on design aspects. Modifications included redesigned substitution boxes (S-boxes) suggested by the NSA, which reduced the algorithm's vulnerability to then-unpublished cryptanalytic techniques while maintaining compatibility goals. On January 15, 1977, NBS published the finalized algorithm as Federal Information Processing Standard (FIPS) 46, mandating its use for unclassified federal data and encouraging voluntary adoption in the private sector. Cryptographers Whitfield Diffie and Martin Hellman critiqued the 56-bit key length in their June 1977 paper, arguing it offered inadequate long-term security against brute-force attacks and estimating that specialized hardware costing under $1 million could exhaust the keyspace by the mid-1980s to early 1990s—a prediction validated empirically in 1998, when the Electronic Frontier Foundation's purpose-built Deep Crack machine recovered a DES challenge key in 56 hours, and again in 1999, when Deep Crack working alongside the distributed.net network of roughly 100,000 machines broke a key in under a day. Suspicions arose over NSA influence on the S-boxes and the key reduction, with fears of embedded weaknesses favoring government decryption; however, declassified analyses and independent research in the early 1990s, including Biham and Shamir's discovery of differential cryptanalysis, demonstrated that the S-boxes specifically countered this attack—knowledge the NSA had anticipated but not publicized—without introducing exploitable backdoors, affirming DES's robustness against known threats at adoption. DES saw rapid uptake in banking and financial sectors for securing transactions, such as PIN verification and electronic funds transfers, establishing a foundation for scalable symmetric encryption in emerging electronic payment networks despite its eventual obsolescence.

AES Competition and Adoption

In 1997, the National Institute of Standards and Technology (NIST) initiated a public competition to select a successor to the Data Encryption Standard (DES), driven by DES's 56-bit effective key length proving insufficient against brute-force attacks as computational capabilities advanced, with DES also susceptible to differential cryptanalysis requiring approximately 2^47 chosen plaintexts. On January 2, 1997, NIST announced the effort's start, followed by a formal call for algorithm submissions on September 12, 1997, emphasizing an open international process inviting global participation. Fifteen candidate algorithms were received and accepted for initial evaluation in 1998, with the cryptographic community contributing extensive analysis. In August 1999, NIST announced five finalists—Rijndael, MARS, RC6, Serpent, and Twofish—after narrowing the field based on preliminary security and performance assessments. Following additional rounds of scrutiny, including workshops and peer review through May 2000, NIST selected Rijndael, developed by Belgian cryptographers Joan Daemen and Vincent Rijmen, as the winner in October 2000. Selection criteria prioritized a combination of security against known attacks, high software and hardware performance, low memory requirements, ease of implementation across platforms, and flexibility in block and key sizes, where Rijndael excelled over competitors such as Serpent and Twofish. Standardized as Federal Information Processing Standard (FIPS) 197 in November 2001, AES defines a symmetric-key block cipher operating on 128-bit blocks with key sizes of 128, 192, or 256 bits, using 10, 12, or 14 rounds respectively to provide robust confusion and diffusion. This addressed DES's limitations by offering exponentially larger key spaces—2^128 to 2^256 possibilities—rendering brute force infeasible with foreseeable technology. AES achieved rapid global adoption post-standardization, integrating into protocols including Transport Layer Security (TLS) for web encryption, IPsec for VPNs, and Wi-Fi Protected Access (WPA2/WPA3), displacing DES and its variants like Triple DES due to superior efficiency and security. Its design enables fast execution in both software and hardware, though implementations face side-channel vulnerabilities such as cache-timing, power analysis, and differential fault attacks that leak key information via physical measurements rather than algorithmic weaknesses. Hardware accelerations like Intel's AES-NI instruction set, introduced in 2010 with the Westmere architecture, deliver 3- to 10-fold performance gains over pure software implementations by offloading rounds to dedicated CPU operations, sustaining AES's relevance in high-throughput applications.
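
In practice AES is invoked through vetted libraries rather than reimplemented. The following sketch assumes the third-party Python cryptography package is installed and uses its AES-GCM interface (an authenticated mode layered on the AES block cipher) with a 256-bit key, one of the FIPS 197 key sizes; the message and header strings are placeholders.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit AES key
aead = AESGCM(key)
nonce = os.urandom(12)                      # 96-bit nonce, must be unique per message

# Encrypt and authenticate the payload, binding it to unencrypted header data.
ciphertext = aead.encrypt(nonce, b"wire transfer: 1000 units", b"header-v1")
plaintext = aead.decrypt(nonce, ciphertext, b"header-v1")
assert plaintext == b"wire transfer: 1000 units"
```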

Asymmetric Revolution

Public-Key Cryptography Invention

In the mid-1970s, the longstanding challenge of securely distributing symmetric keys over potentially compromised channels—previously reliant on trusted couriers or pre-shared secrets—was addressed through novel protocols that leveraged computational asymmetry rather than physical security. Ralph Merkle, then a graduate student at the University of California, Berkeley, proposed one of the earliest such systems in 1974, using "puzzles" to enable key agreement. In Merkle's scheme, one party generates a large set (e.g., 2^20) of short encrypted messages, each concealing a puzzle identifier and a potential key from a restricted keyspace; these are publicly broadcast. The recipient selects one at random, expends effort to decrypt it (requiring brute-force search over the small keyspace), and responds with the puzzle's identifier, allowing the sender to identify the shared key. While inefficient—an eavesdropper's workload grows only quadratically relative to the legitimate parties' effort—this demonstrated that secure key exchange could occur without prior secrets or trusted intermediaries, albeit at high cost. Building on similar insights, Whitfield Diffie and Martin Hellman formalized the broader paradigm of public-key cryptography in their November 1976 paper "New Directions in Cryptography," published in IEEE Transactions on Information Theory. They introduced the concept of one-way functions—easy to compute in one direction but intractable to invert—and trapdoor variants where inversion becomes feasible with secret knowledge, laying the theoretical groundwork for asymmetric systems. Central to their contribution was the Diffie-Hellman key exchange, which uses modular exponentiation over a finite field: two parties publicly agree on a large prime p and a generator g; Alice privately selects exponent a and sends g^a mod p to Bob; Bob selects b and sends g^b mod p to Alice; each then computes the shared secret (g^a)^b mod p = (g^b)^a mod p = g^(ab) mod p. This commutative property ensures a common key without the secret ever being transmitted, with security predicated on the computational difficulty of the discrete logarithm problem—extracting a from g^a mod p for large p. These inventions marked a pivot away from symmetric cryptography's dependency on centralized or physically secure key distribution, which had perpetuated the assumption that large-scale networks inevitably required trusted third parties or protected channels. Merkle's and Diffie and Hellman's approaches enabled decentralized key establishment, facilitating applications like open network protocols without universal trusted authorities. However, the protocols' efficacy rests on unproven computational assumptions, such as the intractability of discrete logarithms under classical computation; advances in algorithms or hardware could undermine them, as no proof guarantees perpetual hardness. Despite such critiques, the verifiable mathematics of the underlying number theory provides empirical robustness for parameter choices resistant to known attacks as of 2025.
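
The commutativity at the heart of the exchange can be checked directly with toy parameters; the prime below (the Mersenne prime 2^127 - 1) and generator are for illustration only, as real deployments use vetted groups of 2048 bits or more, or elliptic curves.

```python
import secrets

# Toy public parameters -- not secure, chosen only to make the math visible.
p = 2**127 - 1   # a Mersenne prime
g = 3

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent, kept secret
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent, kept secret

A = pow(g, a, p)   # Alice publishes g^a mod p
B = pow(g, b, p)   # Bob publishes g^b mod p

# Each side raises the other's public value to its own secret exponent;
# both arrive at g^(ab) mod p without ever transmitting a, b, or the secret.
assert pow(B, a, p) == pow(A, b, p) == pow(g, a * b, p)
shared_key = pow(B, a, p)
```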

Diffie-Hellman and RSA Developments

The RSA public-key cryptosystem, developed in 1977 by Ronald Rivest, Adi Shamir, and Leonard Adleman at MIT, relies on the computational difficulty of factoring the product of two large prime numbers to ensure security. The algorithm was first publicly described in their paper "A Method for Obtaining Digital Signatures and Public-Key Cryptosystems," published in the February 1978 issue of Communications of the ACM, which included a practical demonstration of encryption and decryption using a 129-decimal-digit modulus. This marked the initial viable realization of asymmetric encryption beyond theoretical key-exchange protocols like Diffie-Hellman. RSA's commercial deployment accelerated with the inventors' formation of RSA Data Security in 1982 and the grant of U.S. Patent 4,405,829 on September 20, 1983, covering the core method of public-key encryption via trapdoor one-way functions. The patent's expiration on September 21, 2000, eliminated royalty requirements, spurring widespread unlicensed adoption in software, hardware, and protocols such as SSL/TLS precursors. Prior to expiration, licensing disputes arose, notably with Phil Zimmermann's 1991 release of Pretty Good Privacy (PGP), an email-encryption tool employing RSA for key management and digital signatures, which prompted a U.S. Customs and State Department investigation for alleged munitions-export violations under the International Traffic in Arms Regulations due to its strong cryptography. The three-year probe, initiated after the program's uncontrolled international spread, ended without charges in 1996, underscoring early resistance to restrictions on cryptographic dissemination. Post-1977 developments in Diffie-Hellman focused on integrating the 1976 key-exchange protocol into hybrid systems, where it generated symmetric session keys protected by asymmetric methods like RSA, enabling secure communication over insecure channels without prior shared secrets. In the 1990s, elliptic curve variants enhanced efficiency, with elliptic curve Diffie-Hellman (ECDH) proposed as a discrete-logarithm analogue using elliptic curves over finite fields, offering security equivalent to classical Diffie-Hellman or RSA at reduced computational cost and key sizes—such as 256-bit keys matching the strength of 3072-bit RSA moduli—making it suitable for resource-limited mobile and embedded devices. Standardization efforts, including NIST's adoption of recommended curves in the late 1990s, verified these equivalences through rigorous analysis of curve security against known attacks.
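
The arithmetic behind RSA key generation, encryption, and decryption can be reproduced with textbook-sized numbers; the primes below are toys for illustration, whereas real keys use primes of 1024 or more bits each and padding schemes such as OAEP rather than raw exponentiation.

```python
from math import gcd

# Textbook RSA with tiny primes -- for illustration only.
p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # Euler's totient of n
e = 17                       # public exponent, chosen coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)          # private exponent: modular inverse of e mod phi

message = 42
ciphertext = pow(message, e, n)    # encryption: m^e mod n
recovered = pow(ciphertext, d, n)  # decryption: c^d mod n
assert recovered == message
```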

Hashing and Integrity Mechanisms

Early Hash Functions

The concept of cryptographic hash functions as one-way primitives for integrity and authentication traces back to the late 1970s, with early proposals emphasizing resistance to collisions and preimage attacks to distinguish them from non-cryptographic checksums. These functions were motivated by the need for efficient message digests in digital signature systems, particularly to support compact representations of messages for signing without exposing full content. Initial designs, such as those explored in the 1980s using block ciphers like DES for hashing (e.g., MDC variants), laid groundwork but lacked formalized constructions with broad security proofs. A pivotal advancement came with the Merkle-Damgård construction in 1989, introduced by Ralph Merkle as a method to build iterated hash functions from a collision-resistant compression function. This paradigm processes input messages by dividing them into fixed-size blocks, padding the final block, appending the message length, and iteratively applying the compression function initialized with a fixed initial value. Merkle's design showed that if the underlying compression function resists collisions, the overall hash inherits this property, providing a foundation for the collision resistance essential to integrity applications. Independently, Ivan Damgård demonstrated in 1989 that the construction preserves collision resistance under the assumption of a collision-resistant compression function. Building on this, Ronald Rivest developed MD4 in 1990 as a fast, 128-bit hash function tailored for software implementation on 32-bit processors. MD4 operates on 512-bit message blocks through three rounds of 16 operations each, combining bitwise functions (AND, OR, XOR), modular addition, and left rotations on 32-bit words, with round constants derived from the square roots of 2 and 3 as nothing-up-my-sleeve numbers. It evolved into MD5 in 1991, incorporating a fourth round to bolster security against attacks observed in analysis of MD4, though both retained the Merkle-Damgård structure. Early evaluations revealed flaws in MD4, including partial-round collisions by 1991, highlighting the challenges in achieving full collision resistance. An inherent limitation of Merkle-Damgård-based designs like MD4 and MD5 is vulnerability to length-extension attacks, where knowledge of a hash value H(M) for message M and its length allows computation of H(M || padding || additional data) without knowing M, because the appended length and block-wise processing expose the internal state. This property undermines certain uses, such as naive MAC constructions (e.g., Hash(secret || message)), as attackers can forge extensions without the secret. Despite these issues, early hash functions enabled practical digital signatures by digesting variable-length messages into fixed 128-bit values for efficient public-key operations, aligning with emerging standards for verifiable authenticity in protocols.
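
The chaining pattern of the Merkle-Damgård construction, and why the final state enables length extension, can be seen in a toy implementation; the compression function below is a deliberately weak placeholder, chosen only to keep the sketch short, so the point is the structure rather than the mixing.

```python
import struct

BLOCK = 8  # toy 64-bit block size

def compress(state: int, block: bytes) -> int:
    """Toy compression function -- NOT cryptographically secure."""
    x = int.from_bytes(block, "big")
    return ((state * 1099511628211) ^ x) % 2**64   # FNV-style mixing, illustration only

def md_hash(message: bytes, iv: int = 0xCBF29CE484222325) -> int:
    """Merkle-Damgard: pad, append the message length, then chain the
    compression function block by block starting from a fixed IV."""
    padded = message + b"\x80"
    padded += b"\x00" * (-(len(padded) + 8) % BLOCK)
    padded += struct.pack(">Q", len(message))      # length strengthening
    state = iv
    for i in range(0, len(padded), BLOCK):
        state = compress(state, padded[i:i + BLOCK])
    return state

# Length extension: because md_hash(M) is exactly the chaining state after the
# last block, an attacker knowing only the digest and len(M) can resume the
# chain and compute a valid digest for M || padding || suffix without knowing M.
```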

Secure Hash Algorithms Evolution

The Secure Hash Algorithm (SHA) family, designed by the National Security Agency and standardized by the National Institute of Standards and Technology (NIST), evolved in response to emerging cryptanalytic vulnerabilities in earlier designs. SHA-1, specified in Federal Information Processing Standard (FIPS) 180-1 in 1995, produced 160-bit digests using the Merkle-Damgård construction but faced theoretical collision attacks by 2005, culminating in the first practical collision demonstrated in February 2017 by researchers from Google and the CWI Institute, who generated two dissimilar PDF files yielding identical hashes after computation equivalent to roughly 6,500 CPU-years plus 110 GPU-years. NIST subsequently mandated phasing out SHA-1 by December 31, 2030, for all applications, citing its inadequacy for security-critical uses like digital signatures. To address these weaknesses, NIST published the SHA-2 family in 2001 via FIPS 180-2, featuring variants such as SHA-256 (256-bit output) and SHA-512 (512-bit output), which retained the Merkle-Damgård construction but incorporated longer digests and modified compression functions for enhanced resistance to collision and preimage attacks. These algorithms offer security margins of about 2^128 operations for collisions in SHA-256, far surpassing SHA-1's 2^80 bound, and became integral to standards like TLS and digital certificates. However, growing concerns over length-extension attacks inherent to Merkle-Damgård—exploitable in naive MAC constructions—prompted NIST to seek structural diversity beyond incremental improvements. In November 2007, NIST launched an open competition for a new hash standard, SHA-3, receiving 64 submissions and narrowing them to 51 first-round candidates by December 2008, 14 second-round candidates in 2010, and 5 finalists in 2011. On October 2, 2012, NIST selected Keccak, designed by Guido Bertoni, Joan Daemen, Michaël Peeters, and Gilles Van Assche, as the winner, finalizing it in FIPS 202 on August 5, 2015. Unlike SHA-2's Merkle-Damgård paradigm, Keccak employs a sponge construction, absorbing input into a large internal state via a permutation function and squeezing output iteratively, which eliminates length-extension vulnerabilities and enables variable-rate hashing with tunable security levels (e.g., SHA3-256 offers 128-bit collision resistance). This shift to the sponge construction diversified the SHA family, providing strong bounds against differential and algebraic attacks under the wide-trail design strategy, while supporting extensions like tree hashing without domain-specific tweaks. Nonetheless, SHA-3 variants exhibit performance trade-offs, often 2–5 times slower than SHA-2 on commodity processors due to the permutation's bit-oriented operations, though hardware accelerations (e.g., via AVX instructions) narrow the gap for SHA3-256 to near parity in some implementations. NIST recommends SHA-2 for most uses where performance matters, reserving SHA-3 for scenarios requiring construction-independent security or future-proofing against unforeseen Merkle-Damgård flaws, with both families undergoing ongoing evaluation for quantum-era adjustments.
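
Both families are exposed through Python's standard hashlib module, which makes the structural contrast easy to exercise; the input string below is an arbitrary example.

```python
import hashlib

data = b"transaction #4021: transfer approved"

# Same input, two structurally different families: SHA-256 (Merkle-Damgard)
# and SHA3-256 (Keccak sponge), each yielding a 256-bit digest.
print(hashlib.sha256(data).hexdigest())
print(hashlib.sha3_256(data).hexdigest())

# SHAKE128, the extendable-output member of the SHA-3 family, can emit a
# digest of any requested length thanks to the sponge's squeeze phase.
print(hashlib.shake_128(data).hexdigest(32))   # 32 bytes = 256 bits
```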

Cryptanalytic and Political Challenges

Modern Cryptanalysis Techniques

Differential cryptanalysis, developed by Eli Biham and Adi Shamir around 1990, exploits probabilistic relationships between plaintext pairs and their corresponding ciphertexts to recover keys in block ciphers. Applied to the Data Encryption Standard (DES), it demonstrated that the full 16-round version could be broken using approximately 2^47 chosen plaintexts, faster than exhaustive search, though practical implementation required significant computational resources. The technique also effectively targeted the FEAL ciphers, breaking FEAL-4 with 2^6 chosen plaintexts and revealing structural weaknesses in its Feistel network. Linear cryptanalysis, introduced by Mitsuru Matsui in 1993, approximates the cipher's operations with linear equations over GF(2) to correlate key bits with known plaintext-ciphertext pairs. For DES, Matsui's method broke the full 16 rounds using about 2^43 known plaintexts, an improvement over differential approaches in data requirements, and was experimentally verified on reduced-round variants. These chosen- and known-plaintext attacks highlighted DES's vulnerability to non-brute-force methods, contributing to its deprecation by prompting transitions to stronger standards like AES. Side-channel attacks, pioneered by Paul Kocher in 1996, leverage physical implementations' unintended information leaks rather than algorithmic weaknesses. Timing analysis exploits variations in execution time correlated with secret data, such as in modular exponentiation, where Kocher demonstrated key recovery from remote observations without physical access. Power analysis extends this by measuring power-consumption patterns: simple power analysis distinguishes operations like squaring versus multiplication in exponentiation, while differential power analysis statistically aggregates traces to extract keys from devices like smart cards, validated through laboratory demonstrations on real hardware. A notable empirical break involved Dual_EC_DRBG, a pseudorandom number generator standardized by NIST in 2006 using elliptic curve points P and Q. Cryptanalysts observed in 2007 that specific P and Q values would enable prediction of outputs if the NSA possessed a secret scalar relating the points, effectively creating a backdoor; this was corroborated in 2013 through documents leaked by Edward Snowden revealing NSA influence over the parameter selection, leading to the standard's withdrawal and RSA Security's recommendation against its use. Cryptanalysis contests, such as NIST's AES selection process from 1997 to 2000, rigorously tested candidates against differential, linear, and other attacks, selecting Rijndael for its resistance—it has endured no practical breaks despite extensive scrutiny—which contributed to the retirement of DES and Triple DES (withdrawn in 2005 and deprecated from 2017, respectively) due to proven vulnerabilities. These evaluations empirically strengthened standards by exposing flaws early, fostering designs with wider security margins against evolving techniques.
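
Kocher-style timing leakage arises whenever execution time depends on secret data. A common modern instance, shown in the hedged sketch below, is verifying an authentication tag with an equality comparison that can stop at the first mismatching byte, versus a constant-time comparison; the key, message, and function names are illustrative.

```python
import hashlib
import hmac

key = b"server-secret-key"
message = b"amount=100&to=alice"
expected_tag = hmac.new(key, message, hashlib.sha256).digest()

def verify_naive(received: bytes) -> bool:
    # A short-circuiting comparison can return sooner the earlier the first
    # mismatch occurs, so measured response times leak how many leading bytes
    # of the tag an attacker has already guessed correctly.
    return received == expected_tag

def verify_constant_time(received: bytes) -> bool:
    # Examines the inputs without data-dependent early exit, removing the
    # timing signal; this is the standard mitigation for tag comparison.
    return hmac.compare_digest(received, expected_tag)
```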

Government Controls and Crypto Wars

In the 1990s, the United States government classified strong cryptographic software and hardware as munitions under the International Traffic in Arms Regulations (ITAR), administered by the State Department, subjecting exports to stringent licensing requirements aimed at preventing adversaries from acquiring tools that could evade intelligence collection. This policy stemmed from national security concerns, including the fear that widespread strong encryption would blind U.S. signals intelligence to terrorist and foreign threats, but it imposed economic burdens on American firms by forcing them to develop weakened "export-grade" versions for international markets, such as 40-bit keys vulnerable to brute-force attacks. Critics argued these controls prioritized government access over innovation and privacy, reflecting a tension between state surveillance capabilities and individual rights, with industry losses—estimated in the billions of dollars—demonstrating how restrictions ceded market share to foreign competitors unhindered by similar regimes. A pivotal episode was the 1993 Clipper Chip initiative, proposed by the National Security Agency (NSA) and endorsed by the Clinton administration, which sought to establish a hardware encryption standard for voice and data communications featuring built-in government access via key escrow, where unique device keys would be split and held by government-approved escrow agents for law enforcement access under court order. The Skipjack algorithm powering Clipper was classified until partially declassified in 1994, but the proposal faltered amid demonstrations of technical flaws in its escrow mechanism, widespread opposition from privacy advocates and technologists who viewed escrow as an unreliable trust mechanism prone to abuse or compromise, and minimal market adoption, leading to its official abandonment by 1996. Proponents justified it as balancing security needs against crime, yet the failure underscored that mandated backdoors erode user confidence without demonstrably enhancing national defense, as private-sector alternatives proliferated unchecked. The case of Philip Zimmermann exemplified enforcement zeal: the creator of Pretty Good Privacy (PGP)—released in 1991 as free software enabling strong public-key encryption for email—faced a federal criminal probe in 1993 for allegedly violating export controls after PGP circulated globally via the Internet, despite his intent for domestic use against perceived surveillance threats. The investigation, involving the Customs Service and Justice Department, treated code publication as tantamount to munitions export, potentially carrying penalties of fines and imprisonment, but was dropped in January 1996 after federal prosecutors declined to pursue an indictment, partly due to mounting arguments that online dissemination of source code constituted protected speech under the First Amendment. This saga catalyzed legal challenges, including Phil Karn's lawsuit against ITAR's classification of published source code, reinforcing arguments that export rules stifled free expression and innovation without verifiable security gains, as PGP's viral spread demonstrated the futility of containment in a networked era. Revelations about programs like ECHELON, a signals-intelligence collection network operational since the Cold War but publicly scrutinized in the late 1990s through journalistic exposés and European parliamentary inquiries, heightened crypto-wars tensions by exposing allied mass interception of global communications, including commercial traffic, which strong encryption could thwart.
Initially designed for targeting Soviet-bloc signals, ECHELON's dictionary-based filtering of vast data streams raised alarms over indiscriminate surveillance of ordinary citizens, fueling demands for robust cryptography as a bulwark; later confirmations via the 2013 Snowden leaks on successor programs such as PRISM validated these concerns, revealing persistent bulk collection that export controls implicitly supported by limiting encryption's reach. Governments countered that such capabilities were essential for counterterrorism and national security, citing post-9/11 threats, yet empirical critiques noted overreach's role in eroding public trust and driving underground adoption of unregulated tools.

By 2000, mounting pressure from the tech sector, demonstrated by U.S. firms being forced to ship deliberately weakened products abroad, and recognition of competitive disadvantages led to liberalization: new Commerce Department regulations effective January 14, 2000, reclassified most commercial encryption products under dual-use controls, allowing license-free exports of strong cryptography (e.g., 128-bit keys) to non-embargoed nations after streamlined reviews, effectively ending ITAR's munitions stranglehold. This shift acknowledged that technological diffusion had rendered the controls obsolete, prioritizing economic vitality over absolute control, though vestigial restrictions persisted for military-grade items. In retrospect, the era's battles affirmed individual privacy's precedence in democratic frameworks against expansive state rationales, with liberalization empirically boosting U.S. dominance in secure communications without compromising core intelligence functions, as adversaries adapted independently regardless.

Quantum Era and Beyond

Shor's Algorithm and Quantum Threats

Peter Shor published an algorithm in 1994 that factors large composite integers into primes in polynomial time on a quantum computer, exploiting the quantum Fourier transform to efficiently find the period of the function f(x) = a^x mod N, where N is the number to factor and a is a coprime base. This capability directly undermines public-key cryptosystems like RSA, whose security rests on the presumed hardness of factoring products of two large primes, as recovering the private key from the public modulus becomes feasible with sufficient quantum resources. Similarly, the algorithm solves the discrete logarithm problem, threatening systems such as Diffie-Hellman and elliptic curve cryptography.

Complementing Shor's work, Lov Grover described a quantum algorithm in 1996 for unstructured search problems, achieving a quadratic speedup over classical methods by iteratively amplifying the amplitude of the target state in a superposition. Applied to symmetric cryptography, this halves the effective security level of key-search attacks: breaking a 128-bit AES key classically requires about 2^128 operations but only about 2^64 quantum queries via Grover iterations. While less devastating than Shor's exponential speedup for factoring, it necessitates larger keys, typically doubling from 128 to 256 bits, for equivalent post-quantum resistance in block ciphers and hash functions against preimage or collision searches.

Experimental implementations have verified Shor's algorithm on small instances, such as factoring 21 using five qubits on IBM quantum processors, confirming entanglement and period finding but remaining limited to toy problems because of noise and decoherence. By 2023, leading systems such as IBM's 1,121-qubit Condor exceeded 1,000 physical qubits yet fell far short of the millions of physical qubits (equivalently, thousands of error-corrected logical qubits) estimated as necessary for cryptographically relevant factoring of 2048-bit moduli, since quantum error correction inflates hardware requirements by orders of magnitude. These hardware constraints temper immediate risks but highlight a causal imperative: scalable fault-tolerant quantum computers, once realized, would render vulnerable keys irretrievably compromised, prompting scrutiny of migration timelines amid periodic overstatements of near-term breakthroughs.
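The quantum speedup in Shor's algorithm is confined to the period-finding step; turning a period into factors is classical number theory. The sketch below finds the period by brute force as a stand-in for the quantum Fourier transform and shows the classical post-processing on the textbook instance N = 15 with base a = 7 (period 4, yielding factors 3 and 5). It illustrates that post-processing only, not a quantum implementation.

```python
# Classical post-processing of Shor's algorithm on a toy modulus.
# A quantum computer would find the period r of f(x) = a^x mod N
# efficiently; here it is found by brute force to show how a period
# is converted into nontrivial factors of N.
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a^r == 1 (mod N); brute force, toy sizes only."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(N, a):
    g = gcd(a, N)
    if g != 1:
        return g, N // g                   # the guess already shares a factor
    r = find_period(a, N)
    if r % 2 == 1:
        return None                        # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                        # trivial square root: retry
    return gcd(y - 1, N), gcd(y + 1, N)

if __name__ == "__main__":
    print(shor_classical_part(15, 7))      # period 4 -> factors (3, 5)
```

On cryptographic moduli the brute-force period search is hopeless, which is precisely the gap the quantum Fourier transform is designed to close.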

Post-Quantum Cryptography Standardization

In December 2016, the National Institute of Standards and Technology (NIST) issued a call for proposals to develop public-key cryptographic algorithms resistant to attacks by both classical and quantum computers, initiating a multi-round standardization process to address the vulnerabilities exposed by Shor's algorithm. The effort received 82 submissions by the November 2017 deadline, with NIST advancing candidates through rounds of evaluation focused on security, performance, and implementation feasibility, emphasizing algorithms based on mathematical problems presumed hard even for quantum adversaries, such as structured lattice problems. By July 2022, after three rounds of analysis, NIST selected CRYSTALS-Kyber as the primary key-encapsulation mechanism (KEM) for general encryption and CRYSTALS-Dilithium as the primary digital-signature scheme, both relying on the hardness of lattice problems like the learning-with-errors (LWE) problem, which offer provable security reductions under reasonable computational assumptions, unlike some classical schemes. Additional selections included FALCON and SPHINCS+ for signatures, providing diversity: FALCON as a lattice-based alternative and SPHINCS+ as a stateless hash-based option against quantum threats. These choices prioritized algorithms with strong empirical resistance to known attacks, smaller key and signature sizes relative to other finalists, and efficient performance on resource-constrained devices, though they generally require larger keys and signatures than classical schemes (e.g., Kyber-512 public keys at 800 bytes versus 32 bytes for X25519), potentially increasing bandwidth and storage demands during transition.

NIST published the first three Federal Information Processing Standards (FIPS) for post-quantum cryptography on August 13, 2024: FIPS 203 (ML-KEM, derived from Kyber), FIPS 204 (ML-DSA, from Dilithium), and FIPS 205 (SLH-DSA, from SPHINCS+), mandating their use in federal systems to replace vulnerable RSA and elliptic curve schemes. To mitigate risks from "harvest now, decrypt later" attacks, where data encrypted today could be retroactively broken by future quantum computers, NIST recommends hybrid modes combining post-quantum algorithms with classical ones (e.g., X25519 plus Kyber for key exchange) during migration, ensuring backward compatibility while building quantum resistance; this approach trades minor efficiency losses for layered security until full replacement.

Early adoptions demonstrate practical viability: in August 2023, Google Chrome enabled hybrid X25519-Kyber key exchange in TLS for a subset of users, marking a milestone in web-scale testing without widespread compatibility issues, and Cloudflare began supporting post-quantum connections to origin servers in September 2023. These implementations causally drive ecosystem readiness by validating performance in real networks and encouraging protocol updates such as TLS 1.3 extensions, though challenges persist in areas such as side-channel resistance, underscoring the need for ongoing cryptanalysis. NIST's process, informed by public submissions and independent reviews, contrasts with prior closed-government efforts by fostering global collaboration, yet requires vigilant monitoring as quantum hardware advances remain speculative but carry non-zero risk.
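As a conceptual sketch of the hybrid approach described above, each party can derive the session key from a classical shared secret and a post-quantum one together, so an attacker must break both components. In the sketch the two 32-byte secrets are placeholders standing in for a real X25519 shared secret and a real ML-KEM (Kyber) decapsulation output, and the derivation label is invented for illustration; deployed protocols such as the TLS 1.3 hybrid key-exchange proposals additionally bind handshake transcripts and context into the derivation.

```python
# Minimal sketch of hybrid key derivation: combine a classical ECDH
# shared secret with a post-quantum KEM shared secret so the session
# key survives a break of either component. Placeholder secrets stand
# in for real X25519 / ML-KEM outputs.
import hashlib
import hmac
import os

def hkdf_sha256(ikm: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF (RFC 5869) with an empty salt, using SHA-256."""
    prk = hmac.new(b"\x00" * 32, ikm, hashlib.sha256).digest()   # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                                      # expand
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Placeholders: in practice these come from X25519 and ML-KEM decapsulation.
classical_secret = os.urandom(32)    # e.g., X25519 shared secret
pq_secret = os.urandom(32)           # e.g., ML-KEM (Kyber) shared secret

session_key = hkdf_sha256(classical_secret + pq_secret,
                          info=b"hybrid-handshake-example")
print(session_key.hex())
```

Feeding both secrets through a key-derivation function, rather than simply XORing them, keeps the combined key strong even if one component's output turns out to be biased or partially predictable.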
