
MANIAC I

MANIAC I (Mathematical Analyzer, Numerical Integrator, and Computer Model I) was an early electronic digital computer developed and built at the Los Alamos Scientific Laboratory under the direction of physicist Nicholas Metropolis. Operational from 1952 until its replacement in 1958, the machine was modeled on John von Neumann's stored-program architecture from the Institute for Advanced Study computer. Featuring electrostatic memory with a capacity of 1,024 40-bit words operating in parallel mode, MANIAC I achieved speeds of up to 10,000 operations per second, establishing it as one of the most capable computers of its era upon debut. Designed primarily for scientific and numerical computation at the laboratory, including simulations for physics problems, it advanced computational capabilities essential to nuclear research and theoretical modeling. Among its notable applications, MANIAC I executed the first computer program to defeat a human opponent in a simplified chess variant, known as Los Alamos chess, in 1956, demonstrating early potential for algorithmic game-playing.

Development

Origins and Design

The development of MANIAC I stemmed from the escalating computational requirements at Los Alamos for advanced nuclear simulations after World War II. Prior to its construction, laboratory researchers depended on remote access to machines such as the ENIAC, which proved inadequate for the volume and complexity of calculations needed in thermonuclear research and Monte Carlo methods. In 1948, Nicholas Metropolis, a key figure in wartime computing efforts, returned to Los Alamos to lead the Theoretical Division's computing initiatives, prompting the decision to construct an on-site electronic digital computer. Construction commenced in mid-1949, with the project organized into dedicated hardware and software teams to achieve computational independence. MANIAC I's design drew directly from John von Neumann's architecture for the Institute for Advanced Study (IAS) machine in Princeton, adopting a stored-program paradigm in which instructions and data resided in the same memory, allowing reprogramming via software rather than hardware reconfiguration. Metropolis directed the effort, collaborating with chief engineer Jim Richardson and a team of physicists and engineers to adapt the IAS blueprint to Los Alamos's specific needs, including high-precision arithmetic for scientific computation. The resulting machine was a general-purpose, parallel binary processor, optimized for flexibility in handling diverse numerical problems while prioritizing reliability in a research environment. This design philosophy emphasized modularity and expandability, facilitating subsequent upgrades and influencing early supercomputing practice.

Construction and Initial Operation

Construction of the MANIAC I, the first digital computer built at Los Alamos, began in mid-1949 following initial design work started in 1948 by Nicholas Metropolis upon his return to the laboratory. Metropolis directed the project, drawing on John von Neumann's stored-program architecture from the Institute for Advanced Study machine, with von Neumann serving as a consultant. The effort involved separate hardware and software teams; the software group comprised about 10 members, roughly half of whom were women, including mathematicians Mary Tsingou, Verna Ellingson, Lois Cook, and Marjory Jones. The hardware team, led by chief engineer Jim Richardson and including engineers such as Dick Merwin, Howard Parsons, Bud Demuth, Walter Orvedahl, and Ed Klein, assembled the machine from approximately 3,000 vacuum tubes in a structure measuring 7 feet tall and 9 feet wide and weighing about 1,000 pounds. Lacking a commercial supplier, the laboratory relied on in-house fabrication, incorporating lessons from the IAS project to accelerate assembly while addressing early electronic computers' vulnerabilities, such as tube failures and occasional fires that required fire extinguishers during testing. Components underwent testing in late 1951, marking progress toward completion. The MANIAC I became fully operational in March 1952, with initial programs loaded via punched cards or paper tape to minimize human intervention and programming time. Early applications focused on thermonuclear simulations and Fermi's pion-proton phase-shift analysis, supporting Los Alamos's computational needs previously met by borrowed time on machines such as the ENIAC. At startup, it achieved up to 10,000 operations per second, facilitating Monte Carlo methods and other statistical computations critical to laboratory research.

Technical Specifications

Architecture and Components

![The MANIAC’s arithmetic unit nearing completion in 1952.jpg][float-right] The MANIAC I employed a von Neumann architecture, featuring a stored-program design in which instructions and data shared the same memory space, enabling sequential execution managed by a central control unit. It utilized a one-address instruction format, with each 20-bit order specifying an operation code and a memory address, allowing access to up to 4,096 locations in primary memory. The system operated on 40-bit words, comprising 39 magnitude bits and one sign bit, supporting fixed-point binary arithmetic along with conversions to coded-decimal notation for input and output. Arithmetic operations were performed in a dedicated unit with six 40-bit registers (R1 through R6), where R1-R2 served as the accumulator pair and R3-R4 handled quotient storage during division (http://www.bitsavers.org/pdf/lanl/LA-1725_The_MANIAC_Jul54.pdf). Primary memory consisted of 1,024 40-bit words stored electrostatically in 40 cathode-ray tubes (CRTs), each holding a 32-by-32 raster of bits refreshed continuously to maintain the stored charge pattern, with access times under 10 microseconds. This Williams-tube-style storage allowed parallel readout and was addressed via deflection circuitry for precise electron-beam positioning. An auxiliary magnetic drum provided secondary storage for up to 10,000 words across 200 tracks, organized in 50-word blocks, though access latencies ranged from 68 to 85 milliseconds per block, necessitating buffering in designated memory locations for efficient transfers. The control unit included an order register to decode instructions, a clock for timing operations, and an instruction register managing fetch-execute cycles via a control counter. The arithmetic unit executed addition and subtraction in approximately 158-160 pulse times, multiplication via successive additions and shifts over 40 steps, and division using a pseudo-non-restoring method over 40 steps, achieving overall performance of up to 10,000 operations per second.
Input was handled through a photoelectric paper-tape reader accepting punched decimal-coded programs, while output devices included a high-speed synchroprinter capable of 36,000 characters per minute and a slower Flexowriter for verification. Magnetic tape units stored blocks of words on 1,200-foot reels, facilitating bulk data exchange. All major components relied on vacuum-tube technology, with flip-flop circuits for registers and logic, contributing to the machine's total weight of around 1,000 pounds (http://www.bitsavers.org/pdf/lanl/LA-1725_The_MANIAC_Jul54.pdf).
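The one-address layout described above packs two 20-bit orders into each 40-bit word, with an 8-bit operation code and a 12-bit address per order. The following sketch (illustrative modern Python, not original MANIAC code; the opcode and address values are arbitrary examples) shows the bit arithmetic implied by that format:

```python
# Illustrative sketch of the MANIAC I word layout described above:
# two 20-bit one-address orders per 40-bit word, each order holding
# an 8-bit operation code and a 12-bit address (4,096 locations).

OP_BITS, ADDR_BITS = 8, 12
ORDER_BITS = OP_BITS + ADDR_BITS          # 20 bits per order
WORD_BITS = 2 * ORDER_BITS                # 40 bits per word

def pack_order(opcode: int, address: int) -> int:
    """Combine an 8-bit opcode and a 12-bit address into one 20-bit order."""
    assert 0 <= opcode < (1 << OP_BITS)
    assert 0 <= address < (1 << ADDR_BITS)   # up to 4,096 memory locations
    return (opcode << ADDR_BITS) | address

def pack_word(left_order: int, right_order: int) -> int:
    """Store two 20-bit orders in a single 40-bit word."""
    return (left_order << ORDER_BITS) | right_order

def unpack_word(word: int):
    """Recover both (opcode, address) pairs from a 40-bit word."""
    def split(order):
        return order >> ADDR_BITS, order & ((1 << ADDR_BITS) - 1)
    return split(word >> ORDER_BITS), split(word & ((1 << ORDER_BITS) - 1))

word = pack_word(pack_order(0x1A, 100), pack_order(0x2B, 2047))
assert unpack_word(word) == ((0x1A, 100), (0x2B, 2047))
```

Packing two orders per word is what makes the 1,024-word electrostatic store hold 2,048 instructions when fully devoted to code.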

Memory and Processing Capabilities

The MANIAC I featured a high-speed electrostatic memory using 40 Williams-Kilburn cathode-ray tubes, each storing bits in a 32×32 raster pattern, operating in parallel to provide 1,024 words of 40 bits each, for a total capacity of approximately 40,960 bits. Access time to this main memory was under 10 microseconds, with continuous regeneration via deflection circuitry to maintain data persistence. An auxiliary magnetic drum provided secondary storage of 10,000 forty-bit words across 200 tracks, with each track holding 50 words serially; average access time per 50-word block was 78.5 milliseconds at a rotation speed of 3,450 rpm. Processing was handled by a vacuum-tube arithmetic unit comprising six 40-bit registers (R1–R6), where R1–R5 supported arithmetic operations in complement notation with a fixed binary point and R6 managed control functions. The system employed approximately 2,500 vacuum tubes and 800 germanium diodes for logic and amplification. Basic addition and subtraction executed directly on register contents, while multiplication and division required 40 steps each; the machine supported both fixed-point binary and coded-decimal modes, with floating-point arithmetic added later via programming. Overall instruction throughput reached about 10,000 operations per second, facilitated by a one-address instruction format using 20-bit orders (8 bits for the operation code, drawn from a set of 36 available orders, and 12 bits for the address), stored two per 40-bit word.
Component | Capacity | Access Time | Notes
High-Speed Electrostatic Memory (Williams Tubes) | 1,024 × 40-bit words | <10 μs | 40 tubes in parallel; regeneration required
Magnetic Drum (Auxiliary) | 10,000 × 40-bit words | ~78.5 ms (avg. per block) | 200 tracks × 50 words; 3,450 rpm rotation
The design prioritized parallel binary processing for efficiency in scientific computations, with instructions enabling shifts, conditional branches, and input/output to the drum, though indeterminate operations such as drum transfers introduced variable pulse timings. This configuration, operational by March 1952, supported the machine's role in complex simulations despite its speed limitations compared with later systems.
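The capacity figures above follow from simple arithmetic, sketched here as an illustrative check (not historical code; the revolution-count reading of the drum latency is an inference, not a documented timing breakdown):

```python
# Back-of-the-envelope check of the memory figures quoted above (illustrative).

# Electrostatic main memory: 40 Williams tubes, each a 32x32 bit raster,
# operated in parallel so that each tube holds one bit of every word.
tubes = 40
bits_per_tube = 32 * 32                    # 1,024 bit positions per raster
total_bits = tubes * bits_per_tube         # 40,960 bits in total
words = bits_per_tube                      # 1,024 forty-bit words
assert total_bits == 40_960 and words == 1_024

# Auxiliary drum: at 3,450 rpm, one revolution takes 60/3,450 seconds,
# about 17.4 ms, so the quoted 68-85 ms per 50-word block corresponds to
# roughly four to five revolutions of serial transfer plus positioning delay.
rev_ms = 60_000 / 3_450
print(f"one drum revolution ≈ {rev_ms:.1f} ms")
```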

Key Applications

Nuclear and Thermonuclear Simulations

The MANIAC I, operational at Los Alamos starting in early 1952, was deployed primarily for high-precision simulations of implosion dynamics and thermonuclear processes essential to weapons refinement. Its von Neumann-based architecture facilitated iterative numerical solutions to the partial differential equations governing shock waves, compression, and energy release in fission triggers. These capabilities exceeded prior punched-card systems such as IBM's Selective Sequence Electronic Calculator (SSEC), enabling faster convergence on design parameters for plutonium pits and uranium assemblies. In March 1952, MANIAC I executed its inaugural large-scale hydrodynamic computation, modeling the spherical convergence of explosive compression in a nuclear device, which required tracking fluid instabilities and yield predictions over thousands of time steps. This calculation, processed at speeds up to 5,000 additions per second, validated theoretical models against limited experimental data from earlier tests, reducing reliance on empirical scaling laws. Subsequent runs refined asymmetry tolerances in explosive-lens configurations, critical for achieving supercriticality in implosion-type bombs. For thermonuclear applications, MANIAC I supported post-1951 Teller-Ulam staging by simulating radiation-driven ablation and inertial confinement in secondaries, addressing uncertainties in lithium deuteride ignition and neutron multiplication. Operational until 1958, it processed ensembles of scenarios to forecast energy partitioning between primary and secondary stages, informing the Ivy Mike device's 10.4-megaton yield on November 1, 1952, achieved without any prior full-yield predecessor. These efforts employed deterministic hydrocodes alongside probabilistic tools to quantify variances in opacities and cross sections.
Monte Carlo techniques, formalized at Los Alamos by Nicholas Metropolis, Stanislaw Ulam, and Enrico Fermi, were adapted for MANIAC I to model stochastic neutron diffusion and reaction probabilities in thermonuclear cascades, sampling millions of particle histories to estimate criticality thresholds. Unlike deterministic finite-difference methods, these variance-reduced samplings handled the exponential proliferation of fusion neutrons, yielding statistical errors below 1% for tamper reflection efficiencies. Fermi's direct oversight of MANIAC runs exemplified physicist-programmer collaboration; he debugged codes mid-execution to calibrate fusion boost factors.
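The particle-history sampling described above can be illustrated with a toy one-dimensional slab model (purely pedagogical Python with invented cross-section parameters; the actual Los Alamos transport codes are not public):

```python
# Toy 1-D Monte Carlo neutron transport (illustrative only; parameters are
# invented, and this is not the historical Los Alamos code).
import random

def neutron_history(slab, mfp, p_fission, p_absorb, nu, rng):
    """Follow one neutron through a slab; return the secondaries it spawns."""
    x, direction = 0.0, 1.0                       # start at one face, heading in
    while True:
        x += direction * rng.expovariate(1.0 / mfp)   # sample a free flight
        if x < 0.0 or x > slab:
            return 0                              # leaked out of the slab
        r = rng.random()
        if r < p_fission:
            return nu                             # fission: nu new neutrons
        if r < p_fission + p_absorb:
            return 0                              # captured without fission
        direction = rng.choice((-1.0, 1.0))       # isotropic scatter (1-D)

def k_estimate(histories=20_000, seed=1, **kw):
    """Mean secondaries per source neutron, a proxy for multiplication k."""
    rng = random.Random(seed)
    total = sum(neutron_history(rng=rng, **kw) for _ in range(histories))
    return total / histories

k = k_estimate(slab=5.0, mfp=1.0, p_fission=0.3, p_absorb=0.3, nu=2.5)
print(f"estimated multiplication factor k ≈ {k:.2f}")
```

Averaging many independent histories is exactly the structure that made the method a natural fit for a stored-program machine: the per-history logic is trivial, and accuracy comes from volume.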

Monte Carlo Methods and Statistical Computing

The MANIAC I computer significantly advanced the practical application of Monte Carlo methods at Los Alamos, enabling statistical sampling techniques for solving complex probabilistic problems in physics that were intractable by deterministic means. Developed in the late 1940s and in routine machine use by 1952, these methods—initially conceived by Stanislaw Ulam and formalized by Metropolis and Ulam in 1949—relied on random sampling to model phenomena such as neutron diffusion, with MANIAC providing the computational speed necessary for large-scale simulations beyond the capabilities of predecessors such as the ENIAC. A landmark application occurred in 1953, when Metropolis, along with Arianna Rosenbluth, Marshall Rosenbluth, Augusta Teller, and Edward Teller, used MANIAC to compute the equation of state for a two-dimensional system of 224 rigid spheres interacting via hard-core repulsion. The algorithm generated random particle displacements within a square of side length 2a, accepting moves based on the Boltzmann factor exp(-ΔE/kT) to sample configurations in proportion to their probability, while employing periodic boundary conditions to minimize surface effects. Pressure was derived from the virial theorem, with simulations run over 48–64 cycles for parameters yielding areas A/A₀ from approximately 1.4 to 2.5; results aligned well with free-volume theory at high densities and virial expansions at low densities, with errors around 3%. Each cycle required about 3 minutes on MANIAC, totaling 4–5 hours per pressure point, demonstrating the machine's efficiency for the Markov chain-based sampling that became foundational to modern Markov chain Monte Carlo methods. MANIAC also supported Monte Carlo simulations of neutron transport in thermonuclear designs, extending wartime efforts by modeling particle paths and interactions with improved variance-reduction techniques. In 1952, Enrico Fermi leveraged the system for pion-proton scattering phase-shift analysis, using statistical methods to evaluate the required integrals.
These applications, overseen by Metropolis as project leader, expanded Monte Carlo's scope from neutronics to broader statistical physics, fostering algorithms that prioritized computational feasibility over exhaustive enumeration and influencing subsequent codes for radiation transport.
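The 1953 acceptance rule is easy to state in code: for hard cores the Boltzmann factor exp(-ΔE/kT) is 1 for any non-overlapping configuration and 0 otherwise, so a trial displacement is accepted exactly when it creates no overlap. A minimal sketch follows (illustrative Python with arbitrary small parameters, not the original 224-particle MANIAC run):

```python
# Minimal hard-disk Metropolis sketch in the spirit of the 1953 calculation:
# single-particle trial displacements, accepted iff no overlap results,
# with periodic boundary conditions. All parameters here are illustrative.
import random

def metropolis_hard_disks(n_side=4, step=0.1, cycles=200, seed=2):
    n = n_side * n_side                    # particles, started on a lattice
    box = 1.0                              # periodic square box of side 1
    sigma = 0.3 / n_side                   # disk diameter (dilute, illustrative)
    pts = [((i + 0.5) / n_side, (j + 0.5) / n_side)
           for i in range(n_side) for j in range(n_side)]
    rng = random.Random(seed)

    def overlaps(k, x, y):
        for m, (px, py) in enumerate(pts):
            if m == k:
                continue
            dx = (x - px + 0.5 * box) % box - 0.5 * box   # minimum image
            dy = (y - py + 0.5 * box) % box - 0.5 * box
            if dx * dx + dy * dy < sigma * sigma:
                return True
        return False

    accepted = 0
    for _ in range(cycles):
        for k in range(n):                 # one cycle = one trial per particle
            x, y = pts[k]
            nx = (x + rng.uniform(-step, step)) % box
            ny = (y + rng.uniform(-step, step)) % box
            if not overlaps(k, nx, ny):    # hard core: accept iff no overlap
                pts[k] = (nx, ny)
                accepted += 1
    return pts, accepted / (cycles * n)

pts, acc_rate = metropolis_hard_disks()
print(f"acceptance rate ≈ {acc_rate:.2f}")
```

In the general (soft-potential) Metropolis method, the overlap test is replaced by accepting a move with probability min(1, exp(-ΔE/kT)), which reduces to this rule in the hard-core limit.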

Other Computational Uses

In 1956, MANIAC I executed a program for Los Alamos chess, a simplified variant of chess played on a 6×6 board without bishops, marking the first instance of a computer defeating a human opponent in such a game. The program, developed by Paul Stein and Mark Wells with contributions from Stanislaw Ulam, required approximately 20 minutes per move owing to the machine's processing limitations. This application demonstrated early efforts in computer game-playing and artificial intelligence, distinct from the system's primary scientific simulations. Additionally, physicist George Gamow used MANIAC I around 1954 for pioneering computations in genetics, modeling aspects of the genetic code through numerical simulation. These efforts involved exploring possible nucleotide assignments for amino acids, predating the full elucidation of DNA structure and function. Gamow's work on MANIAC highlighted the computer's versatility for biological problems, bridging physics and early bioinformatics.

Personnel and Programming

Designers and Builders

The MANIAC I was designed and constructed by a team of scientists and engineers at the Los Alamos Scientific Laboratory, with Nicholas Metropolis serving as project leader. Metropolis, who had contributed to the Manhattan Project's theoretical physics efforts under Enrico Fermi and Edward Teller, shifted focus postwar to computational mathematics and led the Los Alamos group in developing electronic computing capabilities by 1948. The machine's architecture drew directly from John von Neumann's design for the Institute for Advanced Study (IAS) computer at Princeton, adapting stored-program principles to meet the laboratory's needs for high-speed arithmetic in weapons simulations. This influence stemmed from von Neumann's consultations with Los Alamos personnel, including Metropolis, during the IAS project's development in the late 1940s. Engineering oversight fell to chief engineer Jim Richardson, who coordinated hardware assembly alongside a core group including Dick Merwin, Howard Parsons, Bud Demuth, Walter Orvedahl, and Ed Klein. Assembly began in earnest around 1950, with major components taking shape that year and the arithmetic unit nearing completion by 1952, enabling initial operations later that year. The team's efforts produced a general-purpose electronic digital computer capable of approximately 3,500 additions per second, prioritizing reliability for scientific computation over commercial scalability.

Notable Programmers and Users

Marjorie Devaney (also known as Marjory Jones) was one of the earliest programmers for MANIAC I, beginning her work at Los Alamos in 1951 by encoding programs onto paper tape for machine input. Her contributions spanned programming and operation, supporting scientific computations over a subsequent 40-year career in the field. Mary Tsingou served as a pioneering programmer, implementing algorithms on MANIAC I for simulations including theoretical nuclear weapons modeling and the 1955 Fermi–Pasta–Ulam experiment, which revealed unexpected periodic energy behavior in a simulated nonlinear chain of oscillators. Lois Cook (later Leurgans) worked as a programmer, operator, and problem analyst, directly interfacing with MANIAC I's arithmetic unit to execute and troubleshoot computational tasks in the 1950s. Paul Stein contributed to specialized programming, co-authoring in 1956 the first chess program for MANIAC I—playing a variant on a 6×6 board omitting bishops—which achieved the milestone of defeating a human opponent, albeit requiring approximately 20 minutes per move. Prominent users included physicists Stanislaw Ulam and Enrico Fermi, who leveraged MANIAC I for Monte Carlo methods in thermonuclear design simulations starting around 1952, along with other laboratory researchers who used the machine for statistical computing applications tied to laboratory research.

Impact and Legacy

Advancements in Scientific Computing

The MANIAC I, operational at Los Alamos from March 1952 until its shutdown in 1958, marked a pivotal shift in scientific computing by implementing a stored-program architecture derived from John von Neumann's design principles, which allowed flexible reprogramming without hardware rewiring, unlike predecessors such as the ENIAC. This capability supported sustained execution of intricate numerical algorithms, processing up to 10,000 operations per second with roughly five kilobytes (about 41 kilobits) of electrostatic memory, thereby enabling physicists to tackle multidimensional physics problems that demanded iterative refinement over thousands of cycles. A primary advancement lay in its application to Monte Carlo techniques, in which random sampling simulated probabilistic processes such as neutron diffusion and hard-sphere interactions. In 1953, computations on MANIAC yielded the inaugural equation-of-state estimates for two-dimensional hard-sphere systems via modified Monte Carlo integration over configuration space, validating statistical methods for configuration-space averaging in dense-matter physics and establishing precedents for handling irreducible uncertainties in many-body simulations. These runs, often spanning days, refined the acceptance-rejection sampling protocols co-developed by Metropolis and collaborators, enhancing efficiency for subsequent thermonuclear yield predictions. MANIAC further propelled progress in deterministic numerical methods, including two-dimensional hydrodynamics codes that modeled fluid flow and shock propagation with finite-difference schemes, critical for implosion symmetry analysis. It executed iterative solvers for elliptic partial differential equations discretized on grids, achieving convergence rates that informed radiation transport approximations. Programmers developed reusable subroutines for these tasks, fostering modular coding practices that accelerated algorithm porting across physics domains.
The 1955 Fermi-Pasta-Ulam simulations on MANIAC exemplified emergent computational insight, probing nonlinear lattice vibrations and uncovering prolonged recurrence patterns rather than the expected thermalization, a result that challenged classical ergodic assumptions and laid groundwork for nonlinear dynamics and soliton theory. Collectively, these feats underscored MANIAC's role in transitioning scientific inquiry from analytical approximation to data-driven validation, amplifying computational leverage in high-stakes empirical modeling.
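The Fermi-Pasta-Ulam(-Tsingou) setup can be re-created in a few lines today. The sketch below is illustrative rather than the original code: N=32 and α=0.25 follow the commonly cited configuration, while the step size and run length are arbitrary choices. It integrates the α-model chain with all initial energy in the lowest normal mode and reports how that energy spreads among the first few modes:

```python
# Illustrative re-creation of the Fermi-Pasta-Ulam-Tsingou alpha model:
# fixed-end chain with quadratic nonlinearity, velocity-Verlet integration.
import math

def fput_alpha(n=32, alpha=0.25, dt=0.05, steps=8000):
    # Initial condition: displacement in the lowest normal mode, zero velocity.
    x = [math.sin(math.pi * i / n) for i in range(n + 1)]   # x[0] = x[n] = 0
    v = [0.0] * (n + 1)

    def accel(x):
        a = [0.0] * (n + 1)
        for i in range(1, n):
            dl, dr = x[i] - x[i - 1], x[i + 1] - x[i]
            a[i] = (dr - dl) + alpha * (dr * dr - dl * dl)  # alpha model force
        return a

    a = accel(x)
    for _ in range(steps):                 # velocity-Verlet time stepping
        for i in range(1, n):
            x[i] += v[i] * dt + 0.5 * a[i] * dt * dt
        a_new = accel(x)
        for i in range(1, n):
            v[i] += 0.5 * (a[i] + a_new[i]) * dt
        a = a_new

    # Harmonic energy in normal mode k of a fixed-end chain.
    def mode_energy(k):
        s = math.sqrt(2.0 / n)
        A = s * sum(x[i] * math.sin(math.pi * k * i / n) for i in range(1, n))
        Ad = s * sum(v[i] * math.sin(math.pi * k * i / n) for i in range(1, n))
        w = 2.0 * math.sin(math.pi * k / (2 * n))
        return 0.5 * (Ad * Ad + w * w * A * A)

    return [mode_energy(k) for k in range(1, 5)]

energies = fput_alpha()
print("energy in modes 1-4:", [f"{e:.4f}" for e in energies])
```

Tracking these mode energies over long runs is what exposed the famous near-return of energy to mode 1 instead of equipartition.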

Influence on Subsequent Systems

The MANIAC I's operational success in handling complex numerical simulations from 1952 onward directly informed the design of its successor, the MANIAC II, which entered service at Los Alamos Scientific Laboratory in 1957. Built by the University of California under contract for the laboratory, MANIAC II retained the core von Neumann architecture but incorporated enhancements such as magnetic-core memory replacing Williams-Kilburn tube storage, providing greater reliability and capacity of up to 4,096 words. Its initial processing speed matched MANIAC I's approximately 10,000 instructions per second but grew through modifications to reach 50,000 operations per second by the mid-1960s, supporting expanded applications in weapons design and plasma physics until decommissioning in 1977. MANIAC I's demonstrated capabilities in high-speed arithmetic and Monte Carlo methods for thermonuclear calculations highlighted the practical demand for scalable electronic digital computers in national laboratories, prompting commercial responses such as IBM's Defense Calculator (the 701), released in 1953 specifically for scientific and defense workloads. Although MANIAC I initially outperformed the 701 in raw speed for certain tasks, the latter's production scalability and reliability accelerated the shift from laboratory machines to mass-produced systems, influencing IBM's subsequent 704 model, introduced in 1954 with index registers and floating-point hardware that addressed limitations observed in early designs like MANIAC. As one of the earliest functional implementations of stored-program computing outside Princeton's Institute for Advanced Study, MANIAC I was widely emulated in academic and government settings, contributing to the spread of compatible architectures such as the ILLIAC series at the University of Illinois and the JOHNNIAC at the RAND Corporation, which adopted similar parallel arithmetic units and electrostatic storage principles refined through MANIAC's operational experience.
This emulation helped standardize expectations for scientific computing, emphasizing the modularity and upgradability that became hallmarks of later mainframes, though MANIAC II marked the end of in-house computer construction at Los Alamos owing to rising costs and the availability of commercial alternatives.

Ethical and Strategic Implications

The MANIAC I's computational capabilities significantly advanced the United States' thermonuclear weapons program during the early Cold War, providing strategic leverage through enhanced simulation accuracy. Operational from March 1952, it executed complex hydrodynamics and transport calculations essential for validating the Teller-Ulam staged implosion design, which used fission primaries to trigger fusion secondaries via radiation compression. These simulations, building on ENIAC's preliminary work, enabled iterative refinements that minimized uncertainties in weapon yield predictions, supporting the transition from one-off experimental devices like Ivy Mike (detonated November 1, 1952, yielding 10.4 megatons) to deployable thermonuclear warheads by the late 1950s. This efficiency reduced dependence on costly full-scale tests, allowing scientists to explore design variants rapidly and maintain a technological edge over the Soviet Union, which lagged until its 1955 thermonuclear test. Strategically, the MANIAC reinforced U.S. nuclear superiority by facilitating the scaling of arsenal production and integration with delivery systems, underpinning doctrines such as massive retaliation, articulated by Secretary of State John Foster Dulles in 1954. By processing up to 10,000 operations per second—far surpassing human calculators or punch-card machines—it democratized access to high-fidelity modeling within the laboratory, empowering physicists to optimize fusion efficiency and material configurations. This not only expedited the development of multi-megaton weapons, such as the B41 bomb (yield up to 25 megatons, deployed 1960), but also informed broader deterrence strategies amid escalating Soviet capabilities, including their 1953 weapons test and ICBM pursuits. Ethically, the MANIAC's prioritization for weapons simulations exemplified tensions between scientific innovation and moral accountability in nuclear research.
While its Monte Carlo methods and numerical integrators advanced verifiable physics modeling, their application to amplifying destructive yields—potentially enabling continental-scale devastation—intensified debates among Manhattan Project alumni. J. Robert Oppenheimer, whose security clearance was revoked in 1954 partly over his opposition to unchecked thermonuclear pursuit, argued that hydrogen bombs crossed into moral abomination by blurring civilian-military distinctions and risking global catastrophe, a view echoed in his testimony but overridden by proponents emphasizing deterrence imperatives. In contrast, laboratory leadership and project leader Metropolis focused on technical feasibility without documented public ethical qualms, reflecting an era of classified imperatives in which national survival trumped restraint. Retrospectively, the MANIAC's role has been critiqued for accelerating the arms race, lowering barriers to escalation by making high-yield weapons computationally routine, though contemporary secrecy limited overt discourse.
