
History of computer science

The history of computer science traces the development of algorithms, computational theory, and digital systems from ancient precursors and early mechanical devices through the 20th century to the sophisticated hardware and software ecosystems of the 21st, fundamentally transforming human problem-solving and information processing. Pioneered by mathematicians and engineers, it encompasses foundational concepts like computability and information theory, evolving through wartime innovations, commercial adoption, and the rise of personal and networked computing.

Key early milestones include Charles Babbage's conceptualization of the Analytical Engine in the 1830s, an ambitious mechanical general-purpose computer design that incorporated punched cards for input and laid groundwork for programmable machines, with Ada Lovelace providing the first computer program intended for such a device in 1843. In the 1930s, theoretical foundations solidified with Alan Turing's 1936 paper on the Entscheidungsproblem, which formalized the notion of computability and influenced modern computing, while Claude Shannon's 1937 master's thesis linked Boolean algebra to electrical switching circuits, enabling digital logic design.

These ideas converged in the 1940s with the construction of electronic computers: Konrad Zuse's Z3 in 1941 became the first functional program-controlled digital computer using relays, and the ENIAC, completed in 1945 by John Mauchly and J. Presper Eckert, marked the first general-purpose electronic computer with 18,000 vacuum tubes for high-speed calculations. John von Neumann's 1945 report on the EDVAC further advanced the field by outlining the stored-program architecture, where instructions and data reside in the same memory, a principle still central to today's computers.

The post-World War II era saw computer science emerge as a discipline, with the 1951 delivery of the UNIVAC I—the first commercial computer—to the U.S. Census Bureau, demonstrating practical applications in data processing and gaining public attention through its 1952 election night predictions. Programming languages advanced concurrently; Noam Chomsky's 1950s work on formal languages and grammars provided frameworks for compiler design and programming languages, while John Backus's development of FORTRAN in 1956 introduced the first high-level programming language, abstracting machine code for broader accessibility.

By the 1960s, institutions formalized the field: Purdue established the first computer science department in 1962, with Stanford following in 1965, and ARPANET's launch in 1969 by the U.S. Department of Defense laid the foundation for the internet through packet-switching networks. The 1970s and 1980s brought miniaturization and democratization, highlighted by Intel's 1971 release of the 4004 microprocessor, the first single-chip CPU with 2,300 transistors, enabling compact computing devices. This paved the way for personal computers: the 1976 Apple I by Steve Wozniak and Steve Jobs targeted hobbyists, while IBM's 1981 PC standardized the market with open architecture, fostering software ecosystems like MS-DOS. Theoretical breakthroughs included Stephen Cook's 1971 identification of NP-complete problems, shaping complexity theory and algorithm design, and the 1977 invention of the RSA cryptosystem by Rivest, Shamir, and Adleman, revolutionizing secure communications. Subsequent decades witnessed explosive growth: Tim Berners-Lee's 1989 proposal of the World Wide Web transformed information sharing, while the 1990s saw the internet's commercialization and the rise of graphical user interfaces.
The 21st century introduced mobile and ubiquitous computing with the 2007 iPhone, cloud services, and big data; Google's 2019 demonstration of quantum supremacy using the Sycamore processor marked a leap in computational power; the 2022 activation of the exascale Frontier supercomputer produced the first machine to exceed one quintillion calculations per second; and as of November 2025, El Capitan holds the title of the world's fastest supercomputer with performance over 1.7 exaFLOPS. Today, computer science integrates with fields like artificial intelligence and cybersecurity, continuing to drive innovation amid challenges in scalability and ethics.

Ancient Precursors

Early Calculation Devices

The abacus, one of the earliest known mechanical aids for calculation, originated in Mesopotamia around 2300 BCE during the Akkadian period, where it served as a counting board for basic operations like addition and subtraction using pebbles or tokens moved across marked lines or grooves. This device evolved over millennia into various forms across ancient civilizations, including adaptations in Egypt, Greece, and particularly in China, where it became a portable tool for manual calculation in commerce and daily accounting. In Rome, the abacus took the form of a handheld bronze tablet with grooves and loose calculi (small stones) for tracking values, facilitating trade and taxation computations by the 1st century CE.

The suanpan, a sophisticated bead-based variant, exemplifies the abacus's refinement and widespread adoption in economic activities. Emerging around the 2nd century BCE and maturing by the Song dynasty (960–1279 CE), the suanpan consists of a wooden frame with multiple vertical rods, each threaded with beads divided by a horizontal beam: two beads above the beam (each valued at 5) and five below (each valued at 1), allowing representation of values up to 15 on a single rod—more than enough for each digit of a decimal number. Users manipulate the beads toward the beam to activate their values, enabling rapid addition, subtraction, multiplication, and division through sequential bead movements—a process that supported intricate trade calculations in imperial China, such as balancing merchant ledgers and currency exchanges along the Silk Road.

In the early 17th century, Scottish mathematician John Napier introduced "Napier's bones" in his 1617 treatise Rabdologia, marking a transitional tool toward more advanced computational aids like logarithmic tables. Crafted from ivory rods etched with multiples of digits from 0 to 9, these bones were aligned in a frame to form multiplication tables on their sides, allowing users to perform multiplication and division by reading sums along diagonals without carrying over manually. This rod-based alignment simplified complex operations for astronomers and navigators, reducing errors in lengthy calculations and prefiguring the slide rule's principles, though it required manual assembly for each problem.

Blaise Pascal's Pascaline, invented in 1642 at age 19, represented the first geared mechanical calculator designed for automated arithmetic, primarily to assist his father's tax work. The device featured a series of interlocking toothed wheels (similar to modern odometers) within a brass box, where turning dials input numbers and propagated carries via gear ratios, enabling direct addition and subtraction of up to six-digit values aligned with French currency units (livres, sols, and deniers). However, its design limitations—restriction to addition and subtraction without native support for multiplication or division (achieved only through repeated additions), vulnerability to gear jamming from manufacturing inconsistencies, and accommodation of the era's irregular 20-sous-per-livre system—hindered reliability and commercial adoption, with only about 50 units built by 1652.
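Napier's diagonal-reading procedure corresponds to what is now taught as lattice multiplication, and its mechanics can be sketched in a few lines of modern code. The following Python function is an illustrative reconstruction (the name and data layout are ours, not Napier's): each "rod" contributes a tens/units pair, and the diagonals are summed with explicit carries, just as a user of the bones would.

```python
def napier_multiply(number: int, digit: int) -> int:
    """Multiply an integer by a single digit by summing along the
    diagonals of Napier's rods, as the bones mechanize."""
    # One rod per digit of the multiplicand; each cell holds (tens, units).
    pairs = [divmod(int(d) * digit, 10) for d in str(number)]
    total, carry = [], 0
    # Walk the diagonals right to left: the units cell of rod i aligns
    # with the tens cell of rod i+1 (the rightmost diagonal is alone).
    for i in range(len(pairs) - 1, -1, -1):
        cell = pairs[i][1] + (pairs[i + 1][0] if i + 1 < len(pairs) else 0) + carry
        carry, d = divmod(cell, 10)
        total.append(d)
    # Leftmost diagonal: the tens cell of the first rod plus any carry.
    head = pairs[0][0] + carry
    if head:
        total.append(head)
    return int("".join(map(str, reversed(total))))

assert napier_multiply(4839, 7) == 4839 * 7   # 33873
```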

Algorithmic Thinking in Antiquity

Early civilizations developed systematic procedures for solving mathematical problems, foreshadowing the procedural and iterative approaches central to modern algorithms. These methods emphasized repeatable steps to achieve precise results, often without formal notation but through geometric or verbal descriptions. In ancient Mesopotamia, Greece, and India, such techniques addressed problems in arithmetic, geometry, and astronomy, establishing foundational principles of step-wise reasoning.

Around 1800 BCE, Babylonian mathematicians employed geometric methods to solve quadratic equations, representing an early form of algorithmic problem-solving recorded on clay tablets. These procedures typically involved completing the square or iterative approximations for equations of the form x^2 + px = q, using sexagesimal (base-60) notation. For instance, the Tell Dhibayi tablet outlines a step-by-step procedure: given a rectangle with area 0;45 and diagonal 1;15 (in sexagesimal), the method computes 2xy = 1;30, then derives x - y = 0;15 via square-root extraction, and solves for x = 1, y = 0;45. The Plimpton 322 tablet, dated between 1800 and 1650 BCE and housed at Columbia University, lists 15 Pythagorean triples—such as (45, 60, 75)—generated through similar geometric approximations, likely for constructing right triangles and estimating areas without explicit algebraic formulas. These methods relied on verbal instructions and iterative bounding, demonstrating a procedural approach to solutions.

In ancient Greece, Euclid's algorithm for finding the greatest common divisor (GCD) of two integers, described around 300 BCE in Elements Book VII, exemplifies iterative reduction as a core algorithmic technique. Presented geometrically with numbers as line segments, the process for unequal lengths AB and CD (AB > CD) proceeds as follows:
  1. Subtract the smaller segment CD from the larger AB repeatedly until the remainder is less than CD.
  2. Replace AB with CD and CD with the new remainder, repeating the subtraction.
  3. Continue until the remainder measures the previous segment exactly or equals a unit.
This yields the GCD: if a unit remains, the numbers are coprime (Proposition 1); otherwise, the final remainder is the common measure (Proposition 2). Proposition 2 states: "The less of the numbers AB, CD being continually subtracted from the greater, some number will be left which will measure the one before it." The algorithm's efficiency stems from its recursive reduction, later refined into division-based variants, highlighting antiquity's grasp of termination and correctness.

Archimedes, around 250 BCE, applied the method of exhaustion to approximate π in his treatise On the Measurement of the Circle, using inscribed and circumscribed polygons to bound the circle's circumference. Starting with a regular hexagon (perimeter 6 for unit radius), he iteratively bisected angles to form polygons with 12, 24, 48, and 96 sides, calculating perimeters via identities derived from applications of the Pythagorean theorem. For the 96-sided polygons, the inscribed perimeter gave a lower bound of 3\frac{10}{71} \approx 3.1408, and the circumscribed an upper bound of 3\frac{1}{7} \approx 3.1429, thus \frac{223}{71} < \pi < \frac{22}{7}. This exhaustion technique—refining bounds through successive approximations without assuming limits—demonstrates algorithmic refinement for irrational values, influencing later numerical methods.

In ancient India, Pingala's Chandaḥśāstra (circa 200 BCE) introduced algorithmic enumeration of poetic meters using binary-like patterns of short (laghu, L) and long (guru, G) syllables, prefiguring combinatorics. The text describes a prastara listing all 2^n combinations for n syllables, generated recursively: for n=1, [G, L]; for n=2, [GG, GL, LG, LL]. Algorithms like nashtam recover missing rows (e.g., row 5 for n=3 is GGL) and uddishtam find a pattern's position (e.g., GLG as 3rd), employing recursive, stack-like procedures akin to modern algorithms. These procedures systematized pattern generation, laying groundwork for binary representation in computing.

Brahmagupta's Brahmasphuṭasiddhānta (628 CE) advanced algorithmic handling of zero and negative numbers, defining operations in Chapter 18 to enable arithmetic with signed quantities. He stipulated that zero results from subtracting a number from itself, that addition or subtraction of zero leaves a number unchanged, and that multiplication by zero yields zero. For negatives (termed "debts" or rina), rules include: positive plus negative equals their difference (larger minus smaller); negative times negative is positive; and, in his formulation, zero divided by zero is zero. These prescriptions formed a procedural framework for algebraic computation, resolving ambiguities in earlier positional systems and facilitating systematic calculation. For example, Brahmagupta solved quadratics like x^2 = 8x + 9 by isolating terms and extracting roots iteratively. His work integrated zero as an operational entity, essential for algorithmic consistency in arithmetic.
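Of these ancient procedures, Euclid's translates most directly into a modern program. The sketch below implements the subtraction form described in the steps above, together with the division-based refinement mentioned in passing; the function names are illustrative, and positive integer inputs are assumed, as Euclid's line segments are.

```python
def euclid_gcd(a: int, b: int) -> int:
    """GCD by repeated subtraction, following Elements Book VII:
    subtract the smaller number from the larger until both are equal;
    that common value measures both inputs."""
    while a != b:
        if a > b:
            a -= b
        else:
            b -= a
    return a  # equals 1 exactly when the inputs are coprime

def euclid_gcd_division(a: int, b: int) -> int:
    """The division-based variant: each remainder step collapses
    many subtractions into a single modulo operation."""
    while b:
        a, b = b, a % b
    return a

assert euclid_gcd(252, 105) == euclid_gcd_division(252, 105) == 21
```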

Foundations in Logic and Mathematics

Binary Representation and Leibniz

Early explorations of binary-like systems appeared in ancient India, particularly in the work of Pingala around 200 BCE. In his Chandahshastra, a treatise on Sanskrit prosody, Pingala analyzed poetic meters composed of short (laghu) and long (guru) syllables, generating all possible sequences for a given length n using recursive methods. These sequences effectively formed binary patterns, where short syllables could be represented as 0 and long as 1, yielding 2^n combinations; for example, for n=4, the 16 patterns mirrored binary enumeration from 0000 to 1111. This combinatorial approach, detailed in Chapter 8, laid foundational algorithms for pattern generation and counting, predating formal binary notation by centuries.

Gottfried Wilhelm Leibniz advanced binary representation significantly in the early 18th century, publishing "Explication de l'Arithmétique Binaire" in 1703, based on ideas from 1679. In this essay, he described "dyadic" or binary arithmetic using only the digits 0 and 1, where numbers progress by powers of 2 (e.g., 3 as 11, 5 as 101). Leibniz included a dyadic arithmetic table demonstrating counting and addition, such as:
Decimal | Binary | Addition Example (3 + 2 = 5)
1       | 1      | 11 + 10 = 101
2       | 10     |
3       | 11     |
4       | 100    |
5       | 101    |
This table highlighted patterns in figurate numbers and cycles, emphasizing binary's simplicity for computation without complex carrying rules. Leibniz imbued binary with philosophical depth, interpreting it as a metaphor for creation: 0 symbolizing nothingness or void, and 1 representing God or unity from which all complexity arises through repetition and combination. He explicitly linked this to the ancient Chinese I Ching, noting in correspondence with Jesuit Joachim Bouvet that the hexagrams' broken (0) and solid (1) lines formed binary sequences predating European knowledge, suggesting a universal foundation for philosophy and science. This theological framing positioned binary not merely as arithmetic but as a "universal characteristic" for reasoning.

Leibniz's interest in binary extended to mechanical computation: he recognized its efficiency over decimal systems for implementation in calculators, since it requires only two states—presence or absence—rather than ten distinct positions. In 1673, he invented the stepped reckoner, the first mechanical device capable of direct multiplication and division alongside addition and subtraction, using stepped-cylinder gears that engaged variable teeth counts for digit-wise operations. Although the stepped reckoner operated in decimal, Leibniz's later binary advocacy stemmed from this work, proposing simpler mechanical designs like rolling balls to represent 0s and 1s, which would reduce gear complexity and errors in automation. This foresight underscored binary's superiority for reliable, scalable mechanical arithmetic.
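Leibniz's dyadic table can be reproduced mechanically. The following Python sketch (modern notation and function names, not Leibniz's) converts decimals to dyadic strings and performs the column-wise addition his 1703 essay tabulates, including the 11 + 10 = 101 example:

```python
def to_dyadic(n: int) -> str:
    """Render n in Leibniz's dyadic notation (successive powers of 2)."""
    digits = ""
    while n:
        n, bit = divmod(n, 2)
        digits = str(bit) + digits
    return digits or "0"

def dyadic_add(x: str, y: str) -> str:
    """Column-by-column binary addition with a single carry rule."""
    x, y = x.zfill(len(y)), y.zfill(len(x))   # pad to equal length
    result, carry = "", 0
    for a, b in zip(reversed(x), reversed(y)):
        carry, bit = divmod(int(a) + int(b) + carry, 2)
        result = str(bit) + result
    return ("1" + result) if carry else result

assert to_dyadic(5) == "101"
assert dyadic_add("11", "10") == "101"   # 3 + 2 = 5
```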

Boolean Algebra and Symbolic Logic

George Boole, a self-taught English mathematician, laid the groundwork for Boolean algebra in his 1847 pamphlet The Mathematical Analysis of Logic, where he proposed treating logical propositions as algebraic variables taking only the values true or false, and operations on them analogous to arithmetic. This work generalized Aristotelian syllogistic logic into a symbolic system, using elective symbols to represent classes and their intersections, thereby enabling the manipulation of logical statements through equations. Boole expanded this framework in his 1854 book An Investigation of the Laws of Thought, introducing the constants 0 (representing falsehood or the empty class) and 1 (representing truth or the universal class), and formalizing operations such as multiplication for conjunction (AND), addition for disjunction (OR, under the condition of disjoint classes), and subtraction for negation (NOT). These innovations transformed logic from a rhetorical art into a precise algebraic discipline, essential for later digital computation, as binary states could represent these true/false values.

The core of Boolean algebra consists of axioms that govern these operations, including commutativity, associativity, and distributivity, which Boole derived from his analysis of logical forms. Commutativity states that the order of operands does not matter: for variables x and y, x \land y = y \land x (AND) and x \lor y = y \lor x (OR). Associativity allows grouping to be ignored: (x \land y) \land z = x \land (y \land z) and (x \lor y) \lor z = x \lor (y \lor z). Distributivity mirrors arithmetic distribution: x \land (y \lor z) = (x \land y) \lor (x \land z) and x \lor (y \land z) = (x \lor y) \land (x \lor z). These axioms, along with idempotency (x \land x = x, x \lor x = x) and complements (existence of \neg x such that x \land \neg x = 0 and x \lor \neg x = 1), form the foundation of the algebra.

To illustrate these basic operations, truth tables enumerate all possible input combinations and their outputs, a method that clarifies Boolean functions despite originating later in the development of symbolic logic. For the AND operation (\land):
A | B | A ∧ B
0 | 0 | 0
0 | 1 | 0
1 | 0 | 0
1 | 1 | 1
For OR (\lor):
A | B | A ∨ B
0 | 0 | 0
0 | 1 | 1
1 | 0 | 1
1 | 1 | 1
For NOT (\neg):
A | ¬A
0 | 1
1 | 0
These tables demonstrate how the operations satisfy the axioms; for instance, commutativity is evident in the symmetry of the AND and OR tables. In the same year as Boole's initial work, Augustus De Morgan published Formal Logic; or, the Calculus of Inference, Necessary and Probable, where he independently advanced symbolic logic by emphasizing relational inferences and introducing laws that highlight the duality between conjunction and disjunction under negation. De Morgan's theorems state that the negation of a conjunction is the disjunction of the negations, \neg (A \land B) = \neg A \lor \neg B, and the negation of a disjunction is the conjunction of the negations, \neg (A \lor B) = \neg A \land \neg B. These laws, derived from his analysis of syllogistic forms and propositional duality, provided tools for transforming logical expressions, complementing Boole's algebraic system and influencing subsequent developments in circuit design and proof theory.

Building on these algebraic foundations, Gottlob Frege introduced a more expressive formal notation in his 1879 monograph Begriffsschrift, eine der arithmetischen nachgebildete Formelsprache des reinen Denkens, which pioneered predicate logic and quantified expressions. Frege's two-dimensional "concept-script" used tree-like diagrams to represent implications, negations, and quantifiers (universal and existential), allowing for the precise formulation of predicates and arguments as functions mapping to truth values. This system bridged propositional logic to full predicate calculus by enabling the analysis of generality and relations, such as defining sequences through recursive ancestry, but it remained focused on static formalization without addressing computability limits.
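Because the algebra ranges over only {0, 1}, every law above can be verified by exhaustive enumeration rather than proof. A short Python sketch (using bitwise operators as stand-ins for Boole's connectives) checks the axioms and De Morgan's theorems across all input combinations:

```python
from itertools import product

# Boole's connectives interpreted over {0, 1}.
AND = lambda x, y: x & y
OR  = lambda x, y: x | y
NOT = lambda x: 1 - x

for a, b, c in product((0, 1), repeat=3):
    assert AND(a, b) == AND(b, a)                        # commutativity
    assert AND(a, AND(b, c)) == AND(AND(a, b), c)        # associativity
    assert AND(a, OR(b, c)) == OR(AND(a, b), AND(a, c))  # distributivity
    assert OR(a, AND(b, c)) == AND(OR(a, b), OR(a, c))   # dual distributivity
    assert AND(a, a) == a and OR(a, a) == a              # idempotency
    assert AND(a, NOT(a)) == 0 and OR(a, NOT(a)) == 1    # complements
    assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))          # De Morgan (1)
    assert NOT(OR(a, b)) == AND(NOT(a), NOT(b))          # De Morgan (2)
```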

Mechanical and Analytical Innovations

Babbage's Difference and Analytical Engines

Charles Babbage conceived the Difference Engine in 1821 as a mechanical device to automate the calculation and tabulation of mathematical tables, particularly polynomials, addressing the prevalent and costly errors in human-computed astronomical and navigational data. The machine operated on the method of finite differences, which allowed polynomial evaluation through repeated addition rather than multiplication or division, simplifying mechanical implementation. By 1822, Babbage had built a small working model using spare parts, demonstrating its feasibility for computing six-digit numbers and basic functions. The detailed design of Difference Engine No. 1 was advanced by 1832, specifying a structure of brass wheels and levers for precision arithmetic, with an initial capacity for 16-digit numbers and up to sixth-order differences, though the later Difference Engine No. 2 (designed 1847–1849) expanded to 31 digits and seventh-order polynomials. The British government provided initial funding of £1,500 in 1823, later increasing it, but escalating costs—reaching over £17,000 by 1842—led to disputes with engineer Joseph Clement and ultimate termination of support that year, leaving the project incomplete despite partial prototypes.

In 1837, Babbage shifted focus to the Analytical Engine, a more versatile general-purpose design that separated computation from data storage, featuring a "mill" analogous to a modern processor for arithmetic operations and a "store" for holding up to 1,000 50-digit numbers on rotating shafts. Input and control were managed via punched cards inspired by Jacquard looms—one set for operational instructions and another for variable data—enabling reprogrammability, while conditional branching was achieved through control cards that allowed the machine to alter its sequence based on intermediate results. Neither engine was fully realized during Babbage's lifetime; an assembled demonstration piece represented only one-seventh of Difference Engine No. 1, and efforts produced mere fragments by his death in 1871, hampered by the funding loss of 1842.

Modern reconstructions vindicated Babbage's designs: the Science Museum in London completed the calculating section of Difference Engine No. 2 in 1991 for his bicentennial, using over 8,000 parts of bronze, steel, and brass, and finished the full machine with its printing apparatus in 2002, proving its operational accuracy without electricity. These innovations influenced later tabulating machines, such as those used in 19th-century censuses, by popularizing punched-card methods for automated sorting and counting.
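The method of finite differences reduces polynomial tabulation to cascaded additions, which is precisely what the engine's columns of wheels mechanized. A minimal Python sketch illustrates the idea; the polynomial f(x) = x^2 + x + 41 is the worked example often associated with Babbage's 1822 demonstration model, though the code itself is a modern illustration:

```python
def difference_engine(initial_diffs, steps):
    """Tabulate a polynomial by the method of finite differences.
    `initial_diffs` holds [f(0), Δf(0), Δ²f(0), ...]; each step of the
    table requires only additions, as in Babbage's design."""
    diffs = list(initial_diffs)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        # Each column absorbs the difference column below it.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# f(x) = x^2 + x + 41: f(0) = 41, Δf(0) = 2, Δ²f = 2 (constant).
print(difference_engine([41, 2, 2], 6))   # [41, 43, 47, 53, 61, 71]
```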

Lovelace's Contributions and Early Programming

Ada Lovelace, born Augusta Ada Byron in 1815, played a pivotal role in the early conceptualization of programming through her work on Charles Babbage's Analytical Engine. In 1842, Italian mathematician Luigi Menabrea published a paper in French detailing Babbage's design, and Lovelace, fluent in multiple languages and trained in mathematics by her mother, translated it into English the following year. Her translation, published in 1843 in the journal Scientific Memoirs, extended far beyond mere translation, incorporating extensive original notes that comprised three times the length of Menabrea's original text. These notes, particularly Note G, represent one of the earliest examples of a computer program, outlining a step-by-step method for the Analytical Engine to compute Bernoulli numbers—a sequence used in number theory and analysis. In Note G, Lovelace described the process using a table of operational cards, demonstrating how the machine could perform iterative calculations through loops and conditional branches, anticipating modern programming constructs. She explicitly differentiated between the engine's mechanical operations on numerical data and the abstract, symbolic instructions that could direct those operations, emphasizing that the device was not limited to arithmetic but capable of broader symbolic manipulation.

Lovelace further envisioned the Analytical Engine's versatility beyond numerical computation, suggesting applications such as composing elaborate pieces of music encoded on punched cards, where the machine would manipulate symbols representing musical notes rather than just digits. This foresight highlighted her understanding of computers as general-purpose symbol processors, a concept that prefigured the universal Turing machine by nearly a century. Her ideas were developed in close collaboration with Babbage, who provided technical clarifications, though Lovelace's interpretive expansions were her own. Tragically, Lovelace's contributions were cut short by her death from cancer in 1852 at the age of 36, limiting her direct influence on subsequent developments. Nonetheless, her notes laid foundational groundwork for programming as a discipline, bridging mechanical computation with abstract symbolic reasoning. Modern recognition of her work, including her designation as the first computer programmer, stems from the prescience in these 1843 publications.
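Note G evaluates a recurrence for the Bernoulli numbers through a tabulated sequence of engine operations. The sketch below computes the same quantities using the standard modern recurrence \sum_{k=0}^{m} \binom{m+1}{k} B_k = 0 with B_0 = 1, so it should be read as a functional analogue of the note's computation, not a transcription of Lovelace's card table:

```python
from fractions import Fraction
from math import comb

def bernoulli(m: int) -> Fraction:
    """Bernoulli numbers from B_m = -1/(m+1) * sum_{k<m} C(m+1, k) B_k,
    starting from B_0 = 1, computed exactly with rationals."""
    B = [Fraction(1)]
    for n in range(1, m + 1):
        s = sum(comb(n + 1, k) * B[k] for k in range(n))
        B.append(-s / (n + 1))
    return B[m]

# Note G targets the number Lovelace labeled B7, which corresponds to
# B_8 = -1/30 in the modern indexing used here.
assert bernoulli(8) == Fraction(-1, 30)
```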

Theoretical Computing Models

Hilbert's Problems and Entscheidungsproblem

In 1900, at the Second International Congress of Mathematicians in Paris, David Hilbert delivered an address outlining 23 unsolved problems intended to guide mathematical research for the century. Among these, the second problem called for a rigorous proof of the consistency of the axioms of arithmetic, aiming to demonstrate that the fundamental principles of mathematics do not lead to contradictions. Similarly, the tenth problem sought a finite algorithmic process to determine whether any given Diophantine equation—a polynomial equation with integer coefficients—admits integer solutions, emphasizing the decidability of such equations in number theory.

Hilbert's formalism program, which sought to mechanize mathematics through axiomatic systems and finitary proofs, gained prominence in his 1928 address at the International Congress of Mathematicians in Bologna, where he elaborated on foundational challenges. Central to this was the Entscheidungsproblem (decision problem), formally posed earlier that year with Wilhelm Ackermann in their book Grundzüge der theoretischen Logik: whether there exists a general procedure to decide, for any statement in a formal system such as first-order predicate logic, if it is a valid formula. This problem encapsulated Hilbert's vision of resolving all mathematical questions through effective procedures, extending his earlier concerns about decidability to broader logical frameworks. The pursuit of solutions to these problems profoundly motivated efforts to formalize and mechanize mathematical reasoning, laying groundwork for computability theory by highlighting the boundaries of algorithmic solvability.

Kurt Gödel's incompleteness theorems, published in 1931, provided a pivotal partial resolution, demonstrating that any consistent formal system capable of expressing basic arithmetic is incomplete—containing true statements that cannot be proved within the system—and that such a system's consistency cannot be proved using only its own finitary methods. These results directly undermined key aspects of Hilbert's program, revealing inherent undecidability in sufficiently powerful axiomatic systems. In response to these foundational challenges, Alonzo Church developed lambda calculus in the early 1930s as an alternative formal system for expressing computation and functions, first outlined in his 1932 paper on the foundations of logic and further detailed in subsequent works. Lambda calculus provided a basis for defining computable functions and proved instrumental in addressing the Entscheidungsproblem, with Church demonstrating in 1936 its unsolvability for first-order logic. This work, alongside Alan Turing's independent analysis, underscored the limits of mechanical decision procedures in mathematics.

Turing Machines and Computability

In 1936, Alan Turing published his seminal paper "On Computable Numbers, with an Application to the Entscheidungsproblem," which provided a rigorous foundation for the theory of computation by introducing an abstract model now known as the Turing machine. This work directly addressed the Entscheidungsproblem posed by David Hilbert, seeking an effective procedure to determine the truth of any mathematical statement within a formal system. The Turing machine serves as a theoretical device that captures the essence of algorithmic processes, enabling precise analysis of what functions are computable.

The machine operates on an infinite, one-dimensional tape divided into cells, each holding a symbol from a finite alphabet, such as 0, 1, or a blank. A read/write head scans the tape, moving left or right one cell at a time, while the machine exists in one of a finite number of internal states, including an initial state and a halting state. Behavior is governed by a fixed transition table that specifies, for each combination of current state and scanned symbol, the symbol to write, the direction to move the head, and the next state to enter. This simple mechanism allows the machine to simulate any step-by-step computation, defining "computable numbers" as those real numbers whose digits can be generated by such a device in finite time. Turing proved that not all real numbers are computable, establishing key boundaries in mathematical expressibility.

A central result in the paper is the undecidability of the halting problem: there exists no algorithm that can determine, for arbitrary machines and starting configurations, whether they will eventually halt or run indefinitely. Turing demonstrated this by assuming such a machine exists and constructing a paradoxical case via diagonalization, similar to Cantor's method for uncountable sets, leading to a contradiction. This undecidability result not only resolved the Entscheidungsproblem negatively but also revealed inherent limitations of mechanical computation, influencing the understanding of provability in logic.

Concurrently, Alonzo Church developed lambda calculus as another formalization of computation in 1936, encoding functions and numbers through abstraction and application. Turing established the equivalence between Turing machines and lambda calculus, forming the basis of the Church-Turing thesis, which asserts that any function effectively computable in the intuitive sense is computable by a Turing machine (or equivalently, by lambda calculus). Though not formally provable, the thesis has been corroborated by subsequent models of computation and is foundational to computer science.

Turing further introduced the concept of a universal Turing machine in his paper, a single device that can simulate the behavior of any other machine when supplied with a description of that machine's states, transition table, and initial tape contents encoded as input. This universality anticipates the stored-program paradigm of modern computers, where instructions and data are treated uniformly. The model also offered a mechanical lens on Kurt Gödel's 1931 incompleteness theorems, illustrating that no consistent formal system capable of basic arithmetic can mechanically verify all its own truths, as such verification would solve the halting problem. Turing's theoretical insights extended to practical domains, informing his cryptanalysis efforts at Bletchley Park, where he contributed to breaking German codes using early computing devices inspired by his abstract models.
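The tape/state/transition formalism is compact enough to simulate directly. Below is a minimal Python sketch; the dictionary encoding and the example machine (which flips every bit of its input and halts at the first blank) are illustrative choices rather than Turing's own notation:

```python
def run_turing_machine(transitions, tape, state="start", max_steps=1000):
    """Minimal Turing machine: `transitions` maps (state, symbol) to
    (write, move, next_state). Blank cells read as '_'; the machine
    stops on entering the 'halt' state or after max_steps."""
    cells = dict(enumerate(tape))   # sparse tape, extendable both ways
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# A toy machine: flip each bit, then halt on the first blank cell.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
assert run_turing_machine(flip, "1011").strip("_") == "0100"
```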

Early Electrical and Electronic Developments

Switching Circuits and Relay-Based Systems

The application of Boolean logic to electrical switching circuits marked a pivotal step toward automated computation, with early explorations predating widespread electronic implementations. In 1886, philosopher and logician Charles Sanders Peirce proposed using electrical switching circuits to perform logical operations, sketching designs for basic gates such as AND and OR using relays, which anticipated the integration of logic into hardware decades before digital electronics. This conceptual foundation highlighted how symbolic logic could be realized physically through electromechanical means, influencing subsequent designs for relay-based logic.

A landmark advancement came in 1937 with Claude Shannon's master's thesis, A Symbolic Analysis of Relay and Switching Circuits, where he rigorously mapped Boolean algebra onto relay networks. Shannon demonstrated that series connections of relay contacts represent logical AND operations (conjunction), parallel connections represent OR (disjunction), and break contacts implement negation (NOT), providing algebraic methods to simplify and synthesize circuits. His work included diagrams illustrating these mappings—such as a series circuit for AND where current flows only if both inputs are closed, and a parallel setup for OR where current flows if either input is closed—enabling the design of complex switching systems with minimal components and establishing a formal framework for analysis.

These principles informed the construction of early electromechanical computers, exemplified by Konrad Zuse's Z1, completed in 1938 as a binary mechanical computer, electrically driven, with punched 35mm film for program and data input. The Z1 featured a 64-word floating-point memory using mechanical storage for random access and supported floating-point arithmetic operations, allowing computations with 22-bit words (including sign, exponent, and mantissa) despite its mechanical constraints. This device represented a practical realization of binary logic for general-purpose calculation, operating under program control via the film strips to execute arithmetic and control instructions.

The wartime urgency of World War II accelerated switching-based systems for specialized tasks, notably in the British Colossus, operational from 1943 for cryptanalytic code-breaking at Bletchley Park. Colossus employed approximately 1,500 vacuum tubes alongside relays for auxiliary functions to perform logical operations on teleprinter signals, processing encrypted messages by comparing bit streams from paper tapes and generating pulse outputs for decryption. Designed by engineer Tommy Flowers, it handled high-speed evaluations to test cipher-wheel settings, contributing significantly to Allied intelligence efforts by automating what would otherwise require manual computation.
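Shannon's central move—treating circuit topology as algebra—can be miniaturized into a toy example. In the hypothetical network below, two series pairs in parallel share a contact x; the algebraic simplification xy + xz = x(y + z) saves a contact, and Python's boolean operators stand in for the relays while an exhaustive check confirms the two networks switch identically:

```python
from itertools import product

# Shannon's mapping: series contacts conduct like AND, parallel
# branches like OR. A hypothetical network and its simplification:
original   = lambda x, y, z: (x and y) or (x and z)   # two x-contacts
simplified = lambda x, y, z: x and (y or z)           # one x-contact

# Exhaustive verification that the simplification preserves behavior,
# the kind of algebraic circuit reduction the 1937 thesis formalized.
assert all(original(*inputs) == simplified(*inputs)
           for inputs in product((False, True), repeat=3))
```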

First Stored-Program Computers

The transition to electronic stored-program computers marked a pivotal shift in the post-World War II era, enabling instructions to be stored and modified in the same memory as data, unlike earlier machines that required physical rewiring or plugboard reconfiguration. This innovation built briefly on relay-based precursors by replacing mechanical relays with vacuum tubes for faster electronic operation, but the core advancement lay in memory technologies that allowed dynamic programming.

The ENIAC (Electronic Numerical Integrator and Computer), completed in 1945 at the Moore School of Electrical Engineering under John Mauchly and J. Presper Eckert for the U.S. Army, was the first general-purpose electronic digital computer, utilizing approximately 18,000 vacuum tubes to perform ballistic trajectory calculations at speeds up to 5,000 additions per second. It was programmed by setting switches and inserting cables on plugboards, a process that could take days for reconfiguration, limiting its flexibility despite its programmable nature for tasks like hydrogen bomb simulations after the war.

The Manchester Baby, or Small-Scale Experimental Machine (SSEM), developed by Frederic C. Williams and Tom Kilburn at the University of Manchester, achieved the world's first successful execution of a stored program on June 21, 1948. This prototype used a Williams-Kilburn cathode-ray tube for memory, storing 32 words of 32 bits each, and ran Kilburn's initial program to find the highest proper factor of 2^{18} (262,144) through repeated subtraction, completing the task in about 52 minutes. The machine's simple instruction set, centered on subtraction and negation, demonstrated the feasibility of electronic storage for both data and instructions, paving the way for scalable computing.

In 1949, the EDSAC (Electronic Delay Storage Automatic Calculator) at the University of Cambridge, led by Maurice Wilkes, became the first practical stored-program computer for regular scientific use, operational on May 6, 1949, with its inaugural program computing squares from 0 to 99. It employed mercury delay lines, capable of holding about 1,024 17-bit words circulating as acoustic pulses in 32 mercury-filled tanks, and introduced subroutine libraries to facilitate complex calculations like differential equations for research in physics and other sciences. This design emphasized reliability and ease of use, running continuously for users and influencing subsequent British computing efforts.

Kathleen Booth, while at Birkbeck College in London, developed the first assembly language in 1947 for the Automatic Relay Calculator (ARC), introducing symbolic mnemonics and labels to simplify programming over binary codes. Her work, detailed in the 1947 manual Coding for A.R.C., allowed instructions like "T" for transfer and "S" for subtract to represent operations, with labels for memory addresses, making it easier to write and debug programs for the relay-based ARC and later electronic systems. This innovation, born from Booth's six-month collaboration at the Institute for Advanced Study in Princeton, bridged the gap toward more abstract programming methods.
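Because the Baby's instruction set centered on subtraction and negation, Kilburn's program had to find the highest factor by trial subtraction alone. The following Python sketch reconstructs that arithmetic strategy (the control flow is a modern paraphrase, not the original machine code):

```python
def highest_proper_factor(n: int) -> int:
    """The task of the Manchester Baby's first program (21 June 1948):
    find the largest proper factor of n using only subtraction, since
    the machine had no divide (or even add) instruction."""
    candidate = n - 1
    while candidate > 0:
        remainder = n
        while remainder > 0:           # trial "division" performed as
            remainder -= candidate     # repeated subtraction
        if remainder == 0:             # candidate divides n exactly
            return candidate
        candidate -= 1

assert highest_proper_factor(2 ** 18) == 2 ** 17   # 131,072
```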

Information Processing Theories

Shannon's Information Theory

Claude Shannon's foundational work began with his 1937 master's thesis at MIT, titled "A Symbolic Analysis of Relay and Switching Circuits," which applied Boolean algebra to the design and analysis of electrical switching circuits, laying the groundwork for digital logic and extending early ideas toward probabilistic models of communication systems. This thesis demonstrated how complex relay networks could be simplified using symbolic logic, influencing the transition from mechanical to electronic computing by modeling circuits as decision processes.

In 1948, Shannon published "A Mathematical Theory of Communication" in the Bell System Technical Journal, establishing a rigorous framework for quantifying information and its transmission. Central to this was the concept of entropy as a measure of uncertainty in a message source, defined by the formula H = -\sum_{i} p_i \log_2 p_i, where p_i represents the probability of each symbol in the source, providing a statistical limit on the average information content per symbol. Shannon also introduced the concept of channel capacity, which specifies the capacity C as the maximum rate at which information can be reliably transmitted over a noisy channel, given by C = B \log_2 (1 + S/N) for a band-limited channel with bandwidth B, signal power S, and noise power N, proving that reliable communication is possible up to this limit using appropriate encoding. These ideas abstracted information into binary digits, or "bits"—a term coined by John W. Tukey—representing the smallest unit of information as a choice between two equally likely outcomes.

Building on Shannon's framework, David Huffman developed optimal prefix codes for lossless data compression in his 1952 paper "A Method for the Construction of Minimum-Redundancy Codes," published in the Proceedings of the IRE. Huffman coding constructs variable-length codes where more frequent symbols receive shorter codes, achieving compression rates approaching the source entropy without ambiguity in decoding, which became a cornerstone for efficient data representation in computing.

Shannon's theory profoundly influenced computing through the development of error-correcting codes, which add redundancy to detect and correct transmission errors, enabling reliable storage and processing in digital systems. For instance, these codes ensured data integrity in early supercomputers and storage media by approaching the Shannon limit for error-free transmission. In data transmission, the theory defined fundamental limits on data rates and error rates, guiding the design of early computer networks by establishing maximum reliable throughput over noisy channels.
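Both formulas are directly computable. The Python sketch below evaluates the entropy of a two-symbol source and the capacity of a hypothetical channel; the 3,000 Hz bandwidth and signal-to-noise ratio of 1,000 are illustrative values reminiscent of a telephone line, not figures from Shannon's paper:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum p_i log2 p_i, in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def capacity(bandwidth_hz, snr):
    """Channel capacity C = B log2(1 + S/N), in bits per second."""
    return bandwidth_hz * log2(1 + snr)

assert entropy([0.5, 0.5]) == 1.0   # a fair coin carries one bit
print(entropy([0.9, 0.1]))          # ~0.469 bits: a biased source says less
print(capacity(3000, 1000))         # ~29,900 bit/s for the assumed channel
```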

Cybernetics and Feedback Systems

Cybernetics emerged as an interdisciplinary field in the mid-20th century, pioneered by mathematician Norbert Wiener, who coined the term from the Greek word for "steersman" to describe the study of control and communication in machines and living organisms. In his seminal 1948 book, Cybernetics: Or Control and Communication in the Animal and the Machine, Wiener introduced the concept of feedback loops as central to understanding adaptive systems, drawing parallels between mechanical devices and biological processes. This work built on Wiener's wartime research, particularly the development of servomechanisms for anti-aircraft predictors during World War II, where predictive devices used feedback to track fast-moving targets like enemy aircraft by continuously adjusting aim based on observed deviations.

A core idea in Wiener's framework was negative feedback, which promotes system stability by counteracting deviations from a desired state, much like a thermostat that activates heating when temperature drops below a setpoint and shuts it off upon reaching equilibrium. Wiener also conceptualized information as a form of negative entropy, arguing that organized messages reduce uncertainty and disorder in a system, akin to injecting order into chaotic processes. These principles extended to broader applications, including Wiener's earlier contributions to harmonic analyzers—mechanical devices for decomposing signals into frequency components—which foreshadowed automated signal processing in control systems.

The Macy Conferences, a series of ten interdisciplinary meetings held in New York from 1946 to 1953, played a pivotal role in disseminating cybernetic ideas and fostering collaboration among scientists. Organized by the Josiah Macy Jr. Foundation, these gatherings brought together figures like Norbert Wiener, John von Neumann, and Margaret Mead to discuss feedback mechanisms, circular causality, and their implications for neuroscience, psychology, and the social sciences, ultimately laying the groundwork for general systems theory. Cybernetics' emphasis on feedback influenced early robotics and automation, enabling the design of self-regulating machines that mimicked biological adaptation, such as automated governors and predictive controls in industrial processes.
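The stabilizing effect of negative feedback can be seen numerically in a toy simulation. In the Python sketch below (all constants arbitrary illustrations), heater output is proportional to the observed deviation from the setpoint, so each step counteracts the error; like any purely proportional controller, it settles slightly below the setpoint rather than exactly on it:

```python
def thermostat(setpoint=20.0, ambient=5.0, steps=50):
    """Toy proportional negative-feedback loop: heating power tracks
    the deviation (setpoint - temperature), counteracting it."""
    temp, gain, leak = ambient, 0.5, 0.1
    history = []
    for _ in range(steps):
        error = setpoint - temp                    # observed deviation
        heating = max(0.0, gain * error)           # corrective action
        temp += heating - leak * (temp - ambient)  # heat in minus heat lost
        history.append(round(temp, 2))
    return history

print(thermostat()[-3:])   # converges near 17.5: stable, with offset
```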

Emergence of Modern Computer Science

Von Neumann Architecture

The von Neumann architecture, formalized in a collaborative effort documented in John von Neumann's 1945 "First Draft of a Report on the EDVAC," established the foundational design for stored-program digital computers by integrating instructions and data within a unified memory system. This report, dated June 30, 1945, and drafted under contract with the U.S. Army Ordnance Department and the University of Pennsylvania's Moore School of Electrical Engineering, outlined five primary components: a central arithmetic unit for performing computations, a central control unit for sequencing operations, a memory unit capable of storing both data and instructions at electronic speeds, and input and output mechanisms for interfacing with external devices. However, the report's solo authorship by von Neumann, without crediting key contributors from the Moore School team such as J. Presper Eckert and John Mauchly, led to significant controversy over attribution, contributing to their departure from the project.

The stored-program concept enabled flexible reprogramming without hardware reconfiguration, marking a shift from fixed-function machines like ENIAC to general-purpose systems. As a consultant to the EDVAC project starting in 1944, von Neumann synthesized ideas from the Moore School team, producing a cohesive blueprint that prioritized modularity and scalability. To prototype this architecture, von Neumann led the development of the Institute for Advanced Study (IAS) machine at Princeton, operational in 1952 as the first fully functional implementation of the design. The IAS machine featured 1,024 words of 40-bit electrostatic memory (using Williams-Kilburn cathode-ray tubes), a processing unit with arithmetic and control elements, and support for up to 512 auxiliary storage words on a drum. This prototype demonstrated the feasibility of electronic stored-program computing, performing approximately 16,000 additions per second and serving as a template for subsequent systems.

An inherent limitation of the architecture, later termed the von Neumann bottleneck, arises from the shared memory bus for fetching both instructions and data, constraining throughput as computational demands increase. This sequential access creates a linear processing constraint, where the processor must alternate between instruction retrieval and data operations, potentially slowing performance in data-intensive applications. To mitigate this, later designs evolved toward Harvard-architecture variants, employing separate buses for instructions and data to enable parallel access and reduce contention.

Post-World War II, the architecture gained standardization through commercial implementations, notably the UNIVAC I delivered in 1951 by Remington Rand to the U.S. Census Bureau. Based directly on stored-program principles, UNIVAC I incorporated a memory of 1,000 words using mercury delay lines, a central processor, and tape-based input/output, processing up to 1,905 additions per second. Its success in census tabulation and business applications propelled the architecture's adoption, leading to over 40 units sold and influencing the trajectory of electronic data processing.
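A stored-program machine can be caricatured in a few lines to show why instructions and data sharing one memory both enables reprogrammability and creates the bottleneck. In this illustrative Python sketch (the instruction format and opcodes are invented for the example), every cycle fetches from the same memory that holds the operands:

```python
def run(memory):
    """Minimal stored-program machine: instructions and data occupy the
    same memory, and each cycle fetches over the same path (the von
    Neumann bottleneck). Instructions are (op, address) pairs."""
    acc, pc = 0, 0
    while True:
        op, addr = memory[pc]       # instruction fetch
        pc += 1
        if op == "LOAD":
            acc = memory[addr]      # data fetch: same memory, same path
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

program = [
    ("LOAD", 5), ("ADD", 6), ("STORE", 7), ("HALT", 0),
    None,        # unused cell
    2, 3, 0,     # data: operands at addresses 5 and 6, result at 7
]
assert run(program)[7] == 5
```

Because the program is ordinary memory content, "reprogramming" is just overwriting cells—the flexibility the EDVAC report made central.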

Birth of Artificial Intelligence

The formal inception of artificial intelligence as a distinct subfield of computer science occurred at the Dartmouth Summer Research Project on Artificial Intelligence in 1956, where researchers gathered to explore the possibility of machines simulating human intelligence. Organized by John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon, the workshop coined the term "artificial intelligence" and proposed that significant progress could be achieved within a generation, including the development of machines capable of using language, forming abstractions and concepts, solving problems in diverse domains, and improving themselves through learning. This event marked a pivotal shift from theoretical foundations toward practical efforts in symbolic reasoning and problem-solving programs.

One of the earliest demonstrations of AI's potential was the Logic Theorist, developed in 1956 by Allen Newell and Herbert A. Simon, which automated the proof of mathematical theorems from Alfred North Whitehead and Bertrand Russell's Principia Mathematica. Implemented on the JOHNNIAC computer at the RAND Corporation, the program used heuristic search methods to explore proof trees, successfully proving 38 of the first 52 theorems in the book and even discovering more elegant proofs than the originals. Building on this, Newell and Simon introduced the General Problem Solver (GPS) in 1959, a more general framework for problem-solving that employed means-ends analysis to reduce differences between current states and goals, applicable to puzzles like the Tower of Hanoi and theorem proving. These programs exemplified the symbolic AI paradigm, emphasizing rule-based manipulation of abstract representations over numerical computation.

John McCarthy advanced symbolic AI through the creation of Lisp in 1958, the first programming language designed specifically for AI research, which treated code and data as interchangeable lists to facilitate symbolic processing. Featuring innovations like conditional expressions, recursion for iterative processes, and automatic garbage collection to manage memory, Lisp enabled efficient implementation of AI algorithms on computers like the IBM 704 and became the dominant language for AI development for decades.

Marvin Minsky contributed to early neural-network approaches with the SNARC in 1951, the first electronic neural network simulator, built using vacuum tubes to model stochastic reinforcement learning in a rat-like maze navigation task with 40 simulated neurons. Later, in his 1969 book Perceptrons, co-authored with Seymour Papert, Minsky rigorously analyzed the mathematical limitations of single-layer perceptrons, proving they could not solve linearly inseparable problems like the XOR function without additional mechanisms, which temporarily dampened enthusiasm for connectionist models in favor of symbolic methods.
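The Minsky-Papert observation about XOR can be made concrete with a brute-force search: a single threshold unit computes the predicate w1·a + w2·b ≥ t, and no weight assignment reproduces XOR. The Python sketch below searches only a coarse grid of weights, so it is a demonstration consistent with the theorem rather than the geometric proof given in Perceptrons:

```python
from itertools import product

def separable(target):
    """Search a coarse grid of weights and thresholds for a single
    threshold unit w1*a + w2*b >= t reproducing `target` on {0,1}^2."""
    grid = [x / 2 for x in range(-8, 9)]   # weights/thresholds in [-4, 4]
    for w1, w2, t in product(grid, repeat=3):
        if all((w1 * a + w2 * b >= t) == bool(target(a, b))
               for a, b in product((0, 1), repeat=2)):
            return True
    return False

assert separable(lambda a, b: a & b)       # AND is linearly separable
assert separable(lambda a, b: a | b)       # OR is linearly separable
assert not separable(lambda a, b: a ^ b)   # XOR: no single unit suffices
```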
