
Earth Simulator

The Earth Simulator is a series of supercomputers operated by the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) to conduct large-scale simulations of Earth's environmental systems, including atmospheric dynamics, ocean circulation, and solid-earth processes. Launched in 2002 with the first-generation system (ES1) developed by NEC, it achieved a sustained performance of 35.86 teraflops on the LINPACK benchmark, securing the top position on the TOP500 list from June 2002 to June 2004 and demonstrating 87.5% computational efficiency. The project, initiated in 1997 at a cost of 60 billion yen, aimed to model climate change impacts and geophysical phenomena to inform disaster prediction and mitigation. Subsequent iterations advanced the platform's capabilities: ES2 operated from 2009 to 2015, ES3 from 2015 to 2021, and the current ES4, deployed in March 2021, integrates hybrid architectures featuring AMD EPYC CPUs, NEC SX-Aurora TSUBASA vector engines, and NVIDIA A100 GPUs for a total peak of 19.5 petaflops. These systems have enabled high-fidelity Earth system modeling, supporting research into phenomena such as typhoon formation, mantle dynamics, and long-term climate variability, while fostering international collaboration and industrial applications in Japan. The Simulator's early dominance spurred global advancements in high-performance computing for scientific simulation, underscoring the value of vector processing in handling complex geophysical datasets.

Overview

Purpose and Objectives

The Earth Simulator Project was initiated in 1997 by Japan's government through the Science and Technology Agency (a predecessor to MEXT) to develop advanced computational capabilities for predicting global environmental changes and mitigating natural disasters. Its foundational objectives centered on simulating complex geophysical phenomena, including climate variability, solid-earth dynamics such as earthquakes, and fluid systems like ocean currents and atmospheric circulation, to enable accurate forecasting of environmental shifts. This initiative aimed to safeguard human safety and support sustainable development by reducing uncertainties in long-term predictions of events like typhoons and seismic activity. A core purpose was to "ensure a bright future for human beings by accurately predicting variable global environment," as articulated in project outlines, with simulations focused on integrated Earth systems to interpret phenomena such as global warming and climate oscillations. These efforts emphasized causal modeling of interactions between the atmosphere, oceans, and solid earth, prioritizing empirical validation through high-resolution computations to inform disaster preparedness and mitigation. The project was propelled by strategic national investment from the Ministry of Education, Culture, Sports, Science and Technology (MEXT), reflecting Japan's commitment to high-performance computing for earth sciences, and has been operated by the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) since its inception to facilitate shared access for researchers while ensuring peaceful applications. A secondary objective involved advancing simulation technologies themselves, fostering innovations in computational methods applicable to broader scientific domains.

Organizational Context

The Earth Simulator supercomputer was developed by NEC Corporation in partnership with the Japan Agency for Marine-Earth Science and Technology (JAMSTEC), a government-affiliated research institution focused on marine and earth sciences. The initial project, encompassing design, construction, and deployment of the first-generation system, incurred costs of 60 billion yen, equivalent to roughly $500 million USD based on contemporaneous exchange rates. This investment was primarily sourced from Japanese national government budgets allocated through the Ministry of Education, Culture, Sports, Science and Technology (MEXT), underscoring a deliberate prioritization of specialized supercomputing for earth system modeling over broader commercial or general-purpose applications. Operated continuously from JAMSTEC's Yokohama Institute for Earth Sciences facility since its inception, the system has undergone successive upgrades funded via similar national mechanisms, maintaining operational control under JAMSTEC while leveraging NEC's expertise in vector processor architecture. This framework exemplifies Japan's pursuit of technological self-sufficiency in supercomputing during periods of international restrictions on advanced computing exports, particularly from the United States in the late 1990s, which prompted domestic innovation in vector-based systems to support critical simulations in climate science, seismology, and disaster prediction. The collaboration emphasizes public research missions, with JAMSTEC coordinating interdisciplinary research access while NEC provides hardware tailored to scientific workloads, distinct from global competitors' scalar-focused designs.

Development History

Inception and First Generation (2002)

The Earth Simulator project was launched by Japan's Ministry of Education, Culture, Sports, Science and Technology in 1997, aiming to develop a supercomputer capable of simulating global environmental changes, particularly atmosphere-ocean interactions. The initiative sought to integrate advanced vector processing for handling the large-scale, data-parallel computations required in geophysical modeling. Development was undertaken by NEC Corporation, resulting in the first-generation Earth Simulator (ES1) becoming operational on March 11, 2002, at the Earth Simulator Center in Yokohama. The system consisted of 640 interconnected nodes, each featuring eight vector arithmetic processors built on the NEC SX-6 architecture with 0.15 μm process technology, achieving a peak theoretical performance of 40 teraflops and 10 terabytes of main memory. This vector-based design excelled in sustained high performance for iterative solvers and grid-based simulations prevalent in earth sciences. Upon its benchmark evaluation, the ES1 topped the TOP500 list in June 2002 with a Linpack performance of 35.86 teraflops, outpacing the previous leader, the U.S. Department of Energy's ASCI White, by over fourfold. This dominance, maintained until late 2004, underscored Japan's engineering prowess in specialized supercomputing hardware and prompted widespread international scrutiny of vector architectures' viability against emerging microprocessor clusters.
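As a cross-check, the often-rounded 40 TFLOPS peak and the 87.5% efficiency quoted elsewhere in this article follow directly from the per-processor ratings (640 nodes, eight processors per node, 8 GFLOPS per processor); a quick sketch:

```python
# Sketch: ES1 peak and LINPACK efficiency reconstructed from per-processor figures.
nodes = 640
processors_per_node = 8
gflops_per_processor = 8.0

peak_tflops = nodes * processors_per_node * gflops_per_processor / 1000.0  # 40.96
linpack_tflops = 35.86
print(f"peak {peak_tflops:.2f} TFLOPS, "
      f"sustained/peak {linpack_tflops / peak_tflops:.1%}")  # 87.5%
```

The exact peak works out to 40.96 TFLOPS, which is the denominator behind the 87.5% figure.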

Second Generation (2009)

The second-generation Earth Simulator, designated ES2, was completed as an upgrade to the original system in March 2009 by the Japan Agency for Marine-Earth Science and Technology (JAMSTEC). This renewal incorporated 160 nodes of SX-9/E vector processors, delivering a peak performance of 131 teraflops (TFLOPS). The configuration included 1,280 arithmetic processors and 20 terabytes (TB) of main memory, interconnected via a Fat-Tree topology to support distributed-memory parallelism. The ES2 addressed constraints of the first-generation system by leveraging the advanced SX-9 architecture, which featured higher per-processor performance of 102.4 gigaflops (GFLOPS) peak per arithmetic unit, resulting in over three times the overall computational capacity. Enhanced memory access speeds and interconnect efficiency in the SX-9 enabled handling of expanded datasets required for multi-physics simulations, facilitating higher-resolution modeling of earth system phenomena. These upgrades sustained the vector processing paradigm optimized for the long-loop computations prevalent in JAMSTEC's geophysical workloads. Operational continuity emphasized JAMSTEC's mandate in marine-earth science research, with the ES2 prioritizing simulations for climate, ocean, and solid-earth geophysics. Efficiency improvements inherent to vector systems helped mitigate power and maintenance costs relative to scalar alternatives, though specific operational expenditure details were not publicly itemized beyond the upgrade's focus on sustained high utilization rates for core missions. The system operated until the transition to the third generation in 2015.
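The ES2 totals are internally consistent; a quick sketch multiplying the stated processor count by the per-AP rating reproduces the 131 TFLOPS figure and the roughly threefold step over ES1's 40.96 TFLOPS peak:

```python
# Sketch: ES2 peak from its arithmetic-processor count and per-AP rating.
arithmetic_processors = 1280   # 160 nodes x 8 APs
gflops_per_ap = 102.4          # SX-9/E peak per arithmetic processor
es2_peak_tflops = arithmetic_processors * gflops_per_ap / 1000.0  # 131.072
es1_peak_tflops = 40.96
print(f"{es2_peak_tflops:.0f} TFLOPS, {es2_peak_tflops / es1_peak_tflops:.1f}x ES1")
```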

Third Generation (2015)

The third-generation Earth Simulator (ES3) commenced operations on June 1, 2015, following its deployment in March of that year to succeed the second-generation system. Developed by NEC Corporation, ES3 comprised 5,120 nodes of SX-ACE vector processors, delivering a theoretical peak performance of 1.31 petaFLOPS. This configuration emphasized sustained vector processing capabilities, enabling efficient handling of computationally intensive workloads while incorporating scalar processing elements for enhanced compatibility with diverse simulation codes. ES3 facilitated advancements in modeling complex, coupled systems, including atmosphere-ocean interactions critical for climate prediction and disaster risk assessment. The system's architecture supported higher-resolution simulations of tectonic processes, directly addressing Japan's post-2011 Great East Japan Earthquake research imperatives by improving fidelity in seismic wave propagation and fault dynamics analyses conducted by JAMSTEC. Enhancements in interconnect and node-level capacity, upgraded from prior generations, allowed for finer-grained meshes in global-scale models without proportional increases in runtime. Fault tolerance mechanisms were refined to minimize disruptions in extended simulations exceeding weeks, incorporating redundant pathways and error-correcting protocols suited to large-scale parallelism. These features bridged the series' vector heritage with incremental steps toward architectural flexibility, preparing for integrated earth system computations that demand varied processing paradigms in subsequent iterations.

Fourth Generation (2021)

The fourth-generation Earth Simulator (ES4), developed by NEC for the Japan Agency for Marine-Earth Science and Technology (JAMSTEC), entered operation in March 2021 as a hybrid multi-architecture supercomputer tailored for advanced Earth system modeling. This system integrates AMD EPYC 7742 CPU nodes for general-purpose computing, NEC SX-Aurora TSUBASA vector engine nodes for high-throughput vector operations, and NVIDIA A100 GPU nodes for accelerator-intensive tasks, comprising 720 CPU nodes, 684 vector-equipped nodes, and 8 GPU-equipped nodes. The architecture employs a 200 Gb/s InfiniBand HDR200 interconnect to enable low-latency data exchange across heterogeneous components, supporting scalable parallel simulations. ES4 achieves a total peak performance of 19.5 PFLOPS, representing approximately 15 times the computational capacity of its third-generation predecessor while maintaining comparable overall power consumption levels. This efficiency gain, effectively reducing power per floating-point operation, stems from the selective use of specialized accelerators matched to workload demands, such as vector engines for legacy codes and GPUs for machine learning-augmented predictions, rather than uniform reliance on power-hungry general-purpose processors. Initial deployments emphasized kilometer-scale global simulations for climate variability, ocean dynamics, and seismic hazard assessment, leveraging the system's hybrid flexibility to handle diverse numerical models without extensive code rewrites. The design facilitates exascale pathway exploration by incorporating modular node types that allow incremental upgrades and workload partitioning, addressing bottlenecks in high-resolution Earth system forecasting amid escalating demands for predictive accuracy in disaster mitigation and climate adaptation.

Technical Architecture

Vector-Based Design in Early Generations

The early generations of the Earth Simulator, spanning ES1 (deployed in 2002), ES2 (2009), and ES3 (2015), centered on a vector processing paradigm implemented via NEC's SX series architecture, which leveraged single-instruction multiple-data (SIMD) units to handle the structured, data-parallel computations inherent in geophysical modeling. This design emphasized sustained throughput for iterative algorithms solving partial differential equations (PDEs), such as those modeling fluid flow and wave propagation in earth systems, where long, contiguous data arrays from finite-difference or finite-volume discretizations could be processed in parallel across vector registers. In the SX-6 processors powering ES1, each CPU incorporated eight replicated vector pipeline sets, operating at 500 MHz and capable of executing arithmetic, logical, and masking operations concurrently to achieve 8 GFLOPS per processor, with nodes aggregating eight such processors alongside 16 GB of shared main memory accessed at 256 GB/s bandwidth via 2048-way interleaved banks. Subsequent upgrades in ES2 and ES3 retained this multi-pipe design (up to eight pipes per SX-9/SX-ACE CPU, clocked at up to 3.2 GHz for peaks exceeding 100 GFLOPS per processor) while scaling to 64 GB per node in later configurations, prioritizing memory bandwidth over capacity to sustain data feeds for bandwidth-bound kernels like flux updates in finite-volume methods. This vector-centric approach contrasted with contemporaneous U.S. supercomputers, such as those based on scalar-dominant clusters (e.g., Power or Xeon systems in ASCI programs), by favoring actual simulation throughput, often 60-70% of peak on real PDE workloads, over inflated theoretical ratings that scalar designs achieved through wider but less efficient instruction streams, thereby delivering superior efficiency for the Earth Simulator's target codes without relying on peak-oriented benchmarks like LINPACK.
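Machine balance helps explain the sustained-efficiency claim: using the per-node figures above (256 GB/s shared-memory bandwidth, eight 8-GFLOPS processors), an ES1 node offered about 4 bytes of memory traffic per flop. A brief sketch; the triad byte count and the sub-1-B/flop figure for scalar nodes of the era are illustrative assumptions, not figures from this article:

```python
# Sketch: machine balance (bytes of memory bandwidth per flop) for one ES1 node.
node_bandwidth_gb_s = 256.0    # shared main-memory bandwidth per node
node_peak_gflops = 8 * 8.0     # eight processors x 8 GFLOPS each

bytes_per_flop = node_bandwidth_gb_s / node_peak_gflops
# A stream-style triad a[i] = b[i] + s * c[i] moves ~24 bytes per 2 flops
# (~12 B/flop demanded); 4 B/flop supplied is much closer to that demand than
# the well-under-1 B/flop typical of commodity scalar nodes of the era
# (illustrative comparison).
print(bytes_per_flop)  # 4.0
```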

Transition to Hybrid and Accelerator-Based Systems

The third-generation Earth Simulator (ES3), operational from March 2015, introduced hybrid scalar-vector processing through NEC's SX-ACE architecture, diverging from the pure vector designs of prior systems. Each SX-ACE processor paired scalar cores with dedicated multi-pipe vector units, enabling partial scalar computation capabilities alongside high-performance vector operations for numerical workloads. This evolution addressed limitations in handling non-vectorizable tasks, such as irregular data access patterns common in emerging coupled simulations, while maintaining compatibility with legacy vector-optimized codes through compiler support. The transition was primarily driven by escalating energy constraints and the pursuit of sustainable scalability toward exascale computing, as pure vector architectures proved inefficient for scalar-dominant operations, leading to suboptimal flops-per-watt ratios in diverse applications. By incorporating scalar elements, ES3 improved overall system versatility without requiring wholesale code rewrites, allowing mixed execution modes to bridge vector-specific software with general-purpose scalar processing. In the fourth-generation Earth Simulator (ES4), deployed in early 2021, the hybrid model advanced to a partitioned configuration with 684 vector-accelerated nodes using SX-Aurora TSUBASA vector engines (each node delivering 19.6 teraflops peak), 720 scalar nodes based on dual AMD EPYC 7742 processors (64 cores each), and 8 GPU nodes equipped with 64 Nvidia A100 accelerators. This design optimized throughput and efficiency by allocating specialized hardware to workload types, with vector engines for traditional earth modeling, scalar CPUs for control and I/O tasks, and GPUs for non-traditional accelerations, achieving balanced resource utilization and supporting legacy vector codes via offload mechanisms without full redesigns.
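Reading the stated 19.6 TFLOPS as a per-node figure for the vector partition (an interpretation on our part, since a single SX-Aurora engine individually peaks well below that), the vector nodes alone supply roughly two thirds of ES4's 19.5 PFLOPS total:

```python
# Sketch: contribution of the vector partition to ES4's total peak.
vector_nodes = 684
tflops_per_vector_node = 19.6  # assumed per-node peak (multiple SX-Aurora
                               # engines per node; interpretation of the text)
vector_pflops = vector_nodes * tflops_per_vector_node / 1000.0  # ~13.4
total_pflops = 19.5
print(f"{vector_pflops:.1f} of {total_pflops} PFLOPS "
      f"({vector_pflops / total_pflops:.1%} from vector nodes)")
```

The remainder is attributable to the EPYC CPU and A100 GPU partitions.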
The hybrid framework in ES4 facilitated integration of earth system models with machine learning techniques for data-driven analysis, such as 3D convolutional neural networks for seismic analysis, yielding up to 2.9 times faster processing compared to prior systems. This adaptability stemmed from causal pressures in computational geoscience, where growing model complexity demanded coupling physics-based simulations with empirical data-driven methods, infeasible on vector-only platforms due to architectural rigidity and power overheads.

Interconnect and Scalability Features

The Earth Simulator's initial generations utilized a proprietary interconnection network that facilitated low-latency all-to-all communication across 640 nodes, each comprising multiple processing elements. This network employed a single-stage full crossbar switch for global addressing and synchronization, ensuring coherent data exchange in distributed simulations scaling to over 5,000 processing elements. Subsequent upgrades, particularly in the fourth generation operational since March 2021, transitioned to a high-bandwidth InfiniBand fabric operating at 200 Gb/s, while adopting a fat-tree topology for interconnecting clusters of CPU, vector-engine, and GPU nodes. This configuration connected all system components to a unified fabric, enabling shared file systems and seamless resource pooling across diverse computational workloads. Scalability was enhanced through these topologies' support for elastic expansion, accommodating variable problem sizes in ensemble-based forecasting by maintaining high bisection bandwidth and minimizing contention in large-scale parallel operations. The InfiniBand upgrade in particular allowed dynamic node allocation without physical reconfiguration, promoting adaptability for time-sensitive simulations requiring sustained system-wide coherence.

Applications and Simulations

Climate and Ocean Modeling

The Earth Simulator enabled execution of high-resolution global ocean circulation models, such as the CCSR Ocean Component Model (COCO), integrated within the MIROC coupled climate framework, achieving horizontal resolutions of approximately 20 km for simulating mesoscale ocean dynamics. These simulations replicate key processes including El Niño-Southern Oscillation (ENSO) variability and oceanic carbon cycling, with sub-degree grid spacing permitting explicit resolution of eddies and fronts that influence heat and nutrient transport. Coupled atmosphere-ocean-sea ice models run on the Earth Simulator, such as variants of MIROC, incorporate bidirectional interactions to forecast seasonal climate patterns and generate projections for intergovernmental assessments. These models contributed data to ensembles underlying IPCC assessment reports, emphasizing realistic teleconnections and ice-ocean feedbacks in mid-latitude circulation. Parameterizations for sub-grid phenomena, including tropical cyclone genesis and intensification, undergo empirical adjustment using in-situ observations from buoys and altimetry to calibrate convective schemes and surface flux representations against historical events. This tuning process leverages the simulator's computational capacity to iterate over ensembles, reducing biases in simulated storm tracks and oceanic responses.

Solid Earth and Seismic Simulations

The Earth Simulator has facilitated high-resolution simulations of seismic wave propagation in subduction zones, employing finite-difference methods (FDM) to model three-dimensional wavefields from major earthquakes. These simulations achieve resolutions sufficient to capture fault dynamics at scales down to meters, enabling detailed analysis of rupture processes and ground motion amplification. For instance, parallel implementations on the system have simulated broadband seismograms from events like the 2002 Denali fault earthquake, utilizing spectral-element methods to propagate waves across global heterogeneous structures. In geodynamic modeling, the system supports finite-element and multigrid-based codes for simulating earthquake cycles and rupture segmentation along plate interfaces, such as the Nankai Trough. These efforts include long-term tectonic reconstructions of slab dynamics, incorporating stress-history dependent plate motions to predict potential volcanic triggers and tsunami generation mechanisms linked to slab dehydration. A notable application involved post-event analysis of the 2011 Tohoku-Oki earthquake, where simulations reconstructed rupture propagation and seismic and tsunami wave interactions to refine fault models. Mantle convection simulations on the Earth Simulator utilize optimized codes like ACuTEMan to model whole-mantle circulation, including descending slabs and upwelling plumes that drive tectonic deformation over geological timescales. These computations integrate temperature- and pressure-dependent rheologies to simulate low-degree convection patterns consistent with seismic tomography data. Validation occurs through comparison with observational datasets from networks such as Hi-net, where simulated waveforms and derived hazard maps are calibrated against real-time borehole seismograms to assess ground shaking predictions.
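A minimal sketch of the finite-difference approach named above, reduced to a 1-D uniform medium with a hypothetical wave speed and grid; real seismic codes are 3-D, heterogeneous, and massively parallel, but the centered-difference leapfrog update and the CFL stability constraint carry over:

```python
# Sketch: 1-D wave equation u_tt = c^2 * u_xx via second-order centered
# finite differences (leapfrog). All parameter values are illustrative.
import math

c, dx = 3000.0, 100.0            # wave speed (m/s) and grid spacing (m), assumed
dt = 0.5 * dx / c                # respects the CFL stability limit c*dt/dx <= 1
r2 = (c * dt / dx) ** 2          # Courant number squared

n = 200
prev = [math.exp(-((i - n // 2) * dx / 500.0) ** 2) for i in range(n)]  # Gaussian pulse
curr = prev[:]                   # zero initial velocity

for _ in range(100):             # leapfrog time stepping, fixed boundaries
    nxt = [0.0] * n
    for i in range(1, n - 1):
        nxt[i] = 2 * curr[i] - prev[i] + r2 * (curr[i + 1] - 2 * curr[i] + curr[i - 1])
    prev, curr = curr, nxt

peak = max(abs(v) for v in curr)
print(f"pulse amplitude after 100 steps: {peak:.3f}")
```

With zero initial velocity the pulse splits into two half-amplitude waves traveling in opposite directions, so the amplitude settles near half its initial value while remaining bounded, as expected for a CFL-stable scheme.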

Broader Computational Uses

The Earth Simulator has supported simulations in computational chemistry, such as predicting photochemical reactions of molecules, achieving high computational efficiency for complex chemical processes that extend its utility beyond core geophysical modeling. These applications leverage the system's architecture for detailed molecular simulations, including nanoscience and materials property predictions, as demonstrated in benchmarks showing superior performance in such workloads. Resource allocation occasionally accommodates fluid dynamics computations in non-earth contexts, though such uses remain secondary to JAMSTEC's core mission and are constrained by priority quotas favoring marine-earth sciences. Domestic industrial collaborations have utilized the platform for applied research, integrating its high-fidelity simulations into manufacturing and materials development projects. International access is facilitated through dedicated project allocations for external collaborators, enabling joint efforts with institutions like the UK's Hadley Centre, typically under restricted computational quotas to maintain focus on mission-related objectives. Public proposals from non-JAMSTEC researchers are reviewed for relevance to earth, ocean, and atmospheric sciences or allied fields, with approvals ensuring adaptability while preserving mission priorities.

Performance Milestones

TOP500 Rankings and Benchmarks

The first-generation Earth Simulator secured the number one position on the TOP500 supercomputer list upon its debut in June 2002, achieving an Rmax of 35.86 teraflops on the High-Performance LINPACK (HPL) benchmark, and retained this ranking across five consecutive biannual lists through June 2004. This sustained dominance reflected the system's vector architecture's proficiency in HPL's dense matrix operations, yielding an efficiency ratio of approximately 90% relative to its 40-teraflops theoretical peak, far exceeding typical scalar-based competitors. In direct comparison to contemporaneous U.S. Accelerated Strategic Computing Initiative (ASCI) systems, such as ASCI Q at Los Alamos National Laboratory, the Earth Simulator demonstrated a clear vector-processing advantage in HPL performance, where sustained throughput on linear algebra kernels benefited from high memory bandwidth and vector unit utilization unavailable in the dominant scalar architectures of the era. Later generations prioritized specialized vector and hybrid designs for earth-system modeling over generalized HPL optimization, resulting in rankings within the top 20-50 range initially, though slipping to around 95th by June 2024 for the fourth-generation system based on its 9.99-petaflops HPL result against a higher theoretical peak. These positions underscored persistent HPL efficiency, often exceeding 70-80% sustained-to-peak ratios, but highlighted trade-offs against massively parallel GPU-accelerated generalists dominating modern lists.
| Generation | Debut Year | Peak TOP500 Rank | HPL Rmax | Notes on Efficiency |
|---|---|---|---|---|
| First (ES1) | 2002 | 1 (June 2002–June 2004) | 35.86 TFLOPS | ~90% Rmax/Rpeak; vector dominance over ASCI scalar systems |
| Second (ES2) | 2009 | 16 | Not specified in aggregate | Highest Japanese efficiency; vector focus maintained high sustained HPL |
| Third (ES3) | 2015 | ~Top 50 | ~1 PFLOPS peak context | Specialized vector (SX-ACE) yielded competitive HPL for science codes |
| Fourth (ES4) | 2021 | ~Top 100 (e.g., 95 in June 2024) | 9.99 PFLOPS | Hybrid vector/GPU; efficiency prioritized for targeted benchmarks over raw ranking |

Computational Power Evolution

The Earth Simulator's first generation (ES1), operational from 2002, delivered a peak performance of 40 teraflops (TFLOPS) across 640 nodes, each comprising eight vector processors, with a total main memory of 10 terabytes (TB). Sustained LINPACK performance reached 35.86 TFLOPS, equivalent to 87.5% of peak, owing to the architecture's efficiency in handling structured numerical workloads typical of geophysical simulations. Subsequent generations scaled computational capacity significantly while retaining vector processing for high sustained fractions. The second generation (ES2), introduced around 2009, featured 1,280 arithmetic processors with a collective peak of approximately 131 TFLOPS and 20 TB of main memory, enabling finer-grained simulations of geophysical phenomena. By the third generation (ES3), deployed in 2015 with 5,120 SX-ACE nodes, peak performance advanced to 1.31 petaflops (PFLOPS), supported by roughly 328 TB of memory, where vector optimizations continued to yield sustained efficiencies exceeding 50% for domain-specific modeling codes. The fourth generation (ES4), activated in 2021, achieved a peak of 19.5 PFLOPS through a hybrid configuration integrating vector engines with scalar processors, paired with 556.5 tebibytes (TiB) of main memory, over 50 times the ES1 capacity, and petabyte-scale shared storage evolving toward exabyte-class archival for petascale datasets in coupled earth system models. This progression reflects hardware maturation, with post-ES2 improvements in flops per watt for vector nodes enhancing effective throughput for memory-bound geophysical algorithms without compromising simulation fidelity.
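The generational steps quoted in this article ("over three times", "approximately 15 times") can be recomputed directly from the stated peaks:

```python
# Sketch: peak-performance growth factors across the four generations (TFLOPS).
peaks_tflops = {"ES1": 40.0, "ES2": 131.0, "ES3": 1310.0, "ES4": 19500.0}
names = list(peaks_tflops)
for a, b in zip(names, names[1:]):
    factor = peaks_tflops[b] / peaks_tflops[a]
    print(f"{a} -> {b}: {factor:.1f}x")   # ES3 -> ES4 lands near the ~15x quoted
```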

Energy Efficiency and Operational Metrics

The first-generation Earth Simulator (ES1), operational from 2002, required approximately 12 MW of power to operate, encompassing compute, cooling, and ancillary systems, reflecting the high energy demands of its vector architecture with 5,120 arithmetic processors. Subsequent generations improved efficiency; for instance, the third-generation ES3 utilized SX-ACE systems with advanced cooling to manage power within similar facility constraints, though exact figures for ES3 full-load draw remain around 1-2 MW based on scaled deployments. The fourth-generation ES4, deployed in 2021, achieves an actual power consumption of approximately 1.5 MW while delivering over 19 PFLOPS peak performance, more than 15 times that of ES3, through hybrid integration of AMD EPYC CPUs, SX-Aurora vector engines, and A100 GPUs, enabling sustained workloads at reduced energy per flop compared to vector-only predecessors. Operational reliability is maintained via modular redundancy in hardware and scheduled preventive maintenance, limited to twice annually, yielding uptime exceeding 99% and supporting uninterrupted multi-year simulation campaigns essential for earth science modeling. JAMSTEC operational metrics emphasize job throughput in batch-oriented environments, utilizing Network Queuing System II (NQSII) with dedicated large (L) queues for multi-node production runs and small (S) queues for single-node tasks, prioritizing ensemble simulations that process thousands of parallel jobs for statistical robustness in climate and seismic forecasts. This queue structure facilitates high utilization rates, with reports indicating efficient allocation for long-running geophysical computations without frequent interruptions beyond routine upkeep.
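Combining the power and peak figures in this section gives the implied energy efficiency; a rough sketch, since real efficiency depends on sustained rather than peak throughput:

```python
# Sketch: implied energy efficiency from the stated power draw and peak ratings.
def gflops_per_watt(peak_tflops: float, power_mw: float) -> float:
    return peak_tflops * 1000.0 / (power_mw * 1e6)

es1 = gflops_per_watt(40.0, 12.0)      # ES1: ~12 MW for 40 TFLOPS peak
es4 = gflops_per_watt(19500.0, 1.5)    # ES4: ~1.5 MW for 19.5 PFLOPS peak
print(f"ES1: {es1:.4f} GFLOPS/W, ES4: {es4:.1f} GFLOPS/W ({es4 / es1:.0f}x gain)")
```

The roughly three-orders-of-magnitude gain per watt is what the text means by "reducing power per floating-point operation" across generations.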

Scientific and Technological Impact

Key Achievements in Earth Science Research

The Earth Simulator enabled the Ocean General Circulation Model for the Earth Simulator (OFES), a global eddy-resolving ocean model at 0.1° horizontal resolution, which produced 50-year climatological hindcasts and interannual simulations from 1950 onward that accurately reproduced observed mesoscale variabilities, including the decadal oscillations of the Kuroshio Extension and associated heat transport anomalies. These simulations demonstrated fidelity to altimetry and in-situ observations, revealing causal links between western boundary currents and basin-scale climate signals previously unresolved in coarser models. In atmospheric and coupled climate modeling, the system supported integrations of a 20 km resolution global atmospheric model, achieving projections of future warming patterns under elevated CO2 scenarios while enhancing hindcast accuracy for El Niño-Southern Oscillation (ENSO) events through improved representation of air-sea interactions and convective parameterization. Preliminary validations of these coupled ocean-atmosphere frameworks confirmed alignment with historical reanalysis data for tropical Pacific variability, providing empirical evidence for refined dynamical processes in forecasting ENSO precursors. For solid-earth research, Earth Simulator computations advanced three-dimensional elasto-static analyses with 100 million degrees of freedom, simulating seismic ground motions and crustal deformations to predict responses in heterogeneous media, which matched observed waveforms from regional earthquakes and informed geophysical hazard assessments. These efforts extended to mantle convection modeling by incorporating high-fidelity numerical schemes that integrated seismological constraints, yielding patterns consistent with long-wavelength mantle heterogeneity data and challenging uniform-layer assumptions through evidence of depth-dependent variations.
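For scale, OFES's 0.1° horizontal grid corresponds to roughly 11 km at the equator (simple spherical geometry with an assumed mean Earth radius), which is why the model counts as eddy-resolving for mesoscale features of order tens of kilometers:

```python
# Sketch: physical grid spacing of a 0.1-degree ocean model at the equator.
import math

earth_radius_km = 6371.0                               # mean radius, assumed
km_per_degree = 2 * math.pi * earth_radius_km / 360.0  # ~111.2 km per degree
spacing_km = 0.1 * km_per_degree                       # ~11 km
print(f"{spacing_km:.1f} km")
```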

Contributions to Global HPC Landscape

The Earth Simulator's architecture, centered on 640 vector-processor nodes interconnected via a high-bandwidth, low-latency network, exemplified a resurgence in vector-parallel computing paradigms that had waned since the 1990s in favor of commodity scalar clusters. Achieving 35.86 TFLOPS sustained performance on the LINPACK benchmark upon its 2002 debut, equivalent to 87.5% of its 41 TFLOPS peak, it outperformed contemporary U.S. systems like ASCI Q by demonstrating superior efficiency for memory-intensive scientific workloads, thereby challenging the dominance of massively parallel processing (MPP) scalar designs. This prompted renewed U.S. interest in vector technologies, with reports indicating that the system's success highlighted the strategic value of government-backed vector development to maintain competitiveness in HPC, influencing subsequent explorations into hybrid vector-scalar architectures during the mid-2000s. Japan's ¥60 billion (approximately $500 million USD in 2002) national investment in the Earth Simulator underscored a model of self-funded, vertically integrated HPC development, yielding demonstrable returns through accelerated progress toward petascale and exascale capabilities, as seen in the lineage to the 10.51 PFLOPS K computer by 2011. This sovereign approach, prioritizing domain-specific optimizations for earth sciences over general-purpose scalability, contrasted with Europe's reliance on multinational consortia like the EuroHPC Joint Undertaking, which distributes resources across member states for broader accessibility but often incurs coordination overheads. By validating high ROI via targeted applications, such as enabling simulations infeasible on prior global systems, the project informed international debates on funding models, emphasizing national autonomy in fostering HPC innovation amid geopolitical competition.

Empirical Validations and Predictive Successes

The Earth Simulator facilitated the development and execution of coupled general circulation models, such as SINTEX-F, which achieved notable predictive success in forecasting El Niño-Southern Oscillation (ENSO) events. These models accurately predicted the onset and intensity of major ENSO episodes, including the 1997/98 El Niño, with skill extending up to 1.5 years in advance, by resolving causal interactions between equatorial heat content anomalies and atmospheric teleconnections. This performance was validated against historical observations, demonstrating the efficacy of high-resolution coupled dynamics over uncoupled statistical approaches. Simulations of Indian Ocean Dipole (IOD) events on the Earth Simulator similarly exhibited strong hindcast and forecast alignment with observations, capturing the dipole's oscillatory heat content gradients and associated Indian monsoon impacts through physics-based parameterization of air-sea fluxes. Empirical tuning to prior events ensured conservative estimates of event strength, avoiding overprediction of extremes and highlighting the role of subsurface ocean diffusion in modulating surface variability, consistent with observational data. In oceanic component validations, models like MSSG-O run on the Earth Simulator reproduced observed heat uptake patterns, aligning with Argo float measurements of upper-ocean temperature profiles and supporting first-principles representations of turbulent mixing and meridional overturning circulation. These matches underscored causal realism in simulating heat redistribution, where eddy-resolving simulations outperformed coarser grids by explicitly resolving eddy-induced transports that correlate with in-situ observations from 2000 onward.
For seismic applications, rate-state friction laws, calibrated from laboratory rock mechanics experiments, enabled Earth Simulator-based models to forecast short-term aftershock decay rates following large events, including sequences after the 2004 Sumatra-Andaman earthquake, with statistical agreement to observed seismicity patterns derived from global catalogs. Such implementations grounded frictional evolution in empirically measured velocity-weakening behaviors, providing probabilistic hazard estimates that captured spatial clustering better than purely statistical models. Overall, these validations reflect the Simulator's role in prioritizing observationally constrained parameters, yielding projections of climate variability that underpredicted transient warming spikes relative to higher-sensitivity ensembles and thereby tracked post-2000 global temperature records more closely through restrained feedback amplification.
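The aftershock-decay benchmark referenced here is conventionally the modified Omori law, n(t) = K/(c + t)^p. A minimal sketch with illustrative parameter values (K, c, and p below are not fitted to any real catalog):

```python
def omori_rate(t_days, K=100.0, c=0.1, p=1.1):
    """Modified Omori law: expected aftershock rate (events/day)
    t_days after the mainshock, for illustrative K, c, p."""
    return K / (c + t_days) ** p

# Decay is roughly hyperbolic: for p near 1, the rate at day 10 sits
# about an order of magnitude below the rate at day 1.
rate_day1 = omori_rate(1.0)
rate_day10 = omori_rate(10.0)
```

Rate-state formulations such as Dieterich's yield a similar transient decay but derive it from frictional evolution rather than empirical curve fitting, which is what distinguishes the physics-based forecasts described above from purely statistical ones.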

Criticisms and Limitations

Development Costs and Resource Allocation

The development of the original Earth Simulator, initiated in 1997 and completed in 2002, cost approximately ¥60 billion (around $500 million at contemporary exchange rates), funded primarily by the Japanese government through the Ministry of Education, Culture, Sports, Science and Technology (MEXT) and associated agencies. This investment covered hardware from NEC, facility construction in Yokohama, and initial software adaptations for earth system modeling. Subsequent generations have entailed further substantial outlays: plans for an upgraded system announced in 2005 projected costs of ¥80–100 billion, while Japan's next-generation flagship supercomputer, announced in 2025, is expected to exceed ¥110 billion in development costs. These figures reflect not only hardware and integration but also ongoing maintenance and upgrades tailored to vector processing architectures optimized for climate simulations.

Across its four generations since 2002, the Earth Simulator program has required cumulative public investment well in excess of ¥100 billion, drawn entirely from taxpayer-supported budgets managed by JAMSTEC (Japan Agency for Marine-Earth Science and Technology) under government directives. This dependence on state funding reflects a national priority for specialized supercomputing in geosciences, yet it sits against Japan's elevated public debt, projected to reach 230% of GDP by late 2025, one of the highest ratios globally. Such fiscal pressures have fueled broader scrutiny of cost-effectiveness in public R&D, particularly when weighed against alternative investments in diversified economic technologies or pressing needs like defense capabilities, where opportunity costs could redirect funds toward initiatives with wider industrial applicability. In comparison to private-sector HPC clusters, such as those deployed by U.S. oil and gas firms for seismic analysis and reservoir modeling, the Earth Simulator's bespoke design for earth science applications imposes higher per-generation costs and narrower utility.
Private efforts often leverage commodity hardware for multipurpose tasks, achieving economic returns through enhanced resource extraction efficiency and risk reduction, whereas the ES series' vector-centric focus limits adaptability to commercial or general-purpose workloads despite its pioneering performance benchmarks. This specialization, while advancing targeted simulations, highlights potential inefficiencies in taxpayer-funded pursuits that prioritize niche scientific goals over scalable, revenue-generating platforms.

Challenges in Simulation Accuracy

Early simulations of tropical cyclones using high-resolution atmospheric general circulation models (AGCMs) on the Earth Simulator, such as AFES (the AGCM for the Earth Simulator), demonstrated pronounced sensitivities in cumulus parameterization schemes. These led to overestimations of storm intensity that were particularly responsive to perturbations in initial conditions; iterative refinements to the schemes partially addressed the issue, underscoring the difficulty of representing sub-grid-scale convection through parameterization. Coupled ocean-atmosphere models executed on the Earth Simulator, including variants of the COCO and SINTEX-F systems, projected tropospheric warming trends with amplified sensitivity compared to satellite-derived observations, with modeled upper-air warming outpacing empirical measurements by margins linked to biases in cloud and related feedbacks; such divergences persisted despite the Simulator's computational advantages, reflecting systemic limitations in capturing radiative-convective processes. The chaotic nature of atmospheric and oceanic dynamics posed inherent barriers to long-range predictability in Earth Simulator-based ensemble forecasts: in studies using AFES and large-ensemble frameworks like d4PDF, spreads in projected variables, such as sea-level pressure anomalies and storm tracks, regularly surpassed observed interannual variances, indicating that initial-condition uncertainties amplified errors beyond decadal horizons despite high-fidelity integrations.
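The spread-versus-variance comparison in such ensemble studies reduces to comparing the standard deviation across members against the interannual standard deviation of observations. A minimal sketch with invented values, not d4PDF output:

```python
import statistics

def ensemble_spread(members):
    """Sample standard deviation across ensemble members at one lead time."""
    return statistics.stdev(members)

# Hypothetical sea-level pressure anomalies (hPa) from five members:
members = [1.2, -0.8, 2.1, -1.5, 0.4]
observed_interannual_std = 1.0  # assumed observed variability, hPa

spread = ensemble_spread(members)
# Spread exceeding observed variability signals that forecast uncertainty
# has grown beyond the climatological signal, i.e., predictability is lost.
exceeds = spread > observed_interannual_std
```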

Environmental and Operational Drawbacks

The original Earth Simulator (ES1), operational from 2002 to 2009, exhibited substantial power demands, with a peak consumption of around 10 MW, comparable to the electricity usage of several thousand households and imposing notable loads on Yokohama's regional grid infrastructure. This high energy footprint contributed to operational inefficiencies: the system's architecture generated significant heat, necessitating advanced high-efficiency air-cooling mechanisms to prevent thermal throttling despite the absence of liquid immersion. Although Japan's grid at the time drew on a diverse mix of generation sources, the concentrated draw exacerbated local supply challenges during peak computational runs focused on climate and geophysical simulations.

Maintenance of the ES1's bespoke SX-6 vector nodes presented ongoing operational hurdles: the custom integration of 5,120 processors required specialized expertise unavailable from off-the-shelf alternatives, fostering dependency on NEC for diagnostics, repairs, and component sourcing. This vendor-specific design, while enabling high efficiency in targeted workloads, resulted in protracted upgrade cycles; subsequent iterations like ES2 and ES3 adhered to NEC's proprietary vector paradigms, delaying adoption of the commoditized scalar and GPU-based architectures that emerged in global HPC by the mid-2000s. The resource intensity of sustaining such a specialized platform underscored broader operational trade-offs in earth system science, where investment in exascale precursors like the ES diverted attention and budget from augmenting sparse empirical observation networks, such as satellite or in-situ sensors critical for validating simulations, potentially hindering causal insights into underrepresented phenomena like deep-ocean circulation or regional effects. In practice, this prioritization amplified reliance on model ensembles over diversified data collection, as evidenced by persistent gaps in observational coverage that limit the causal fidelity of even high-fidelity simulations.

Legacy and Future Directions

Influence on Subsequent Supercomputing Projects

The Earth Simulator's demonstration of high-performance vector processing in 2002 catalyzed Japan's sustained investment in domestic supercomputing initiatives, paving the way for successor systems like the Fugaku supercomputer, which took the top ranking on the TOP500 list in June 2020 at 415.5 petaflops of LINPACK performance, rising to 442 petaflops by November 2020. While Fugaku shifted to scalar ARM-based A64FX processors developed by Fujitsu and RIKEN, in contrast to the Earth Simulator's NEC SX-6 vector architecture, it inherited strategic lessons in scaling workloads for applications beyond earth sciences, such as industrial simulations and societal challenges, reflecting a broadened scope from the Earth Simulator's specialized climate focus. This evolution underscored Japan's emphasis on custom architectures tailored to national priorities, fostering resilience against global supply chain dependencies.

The Earth Simulator's vector-centric design also sustained specialized vector technologies in Japanese HPC, even as scalar and GPU-dominant systems prevailed globally for general-purpose computing. NEC, the Earth Simulator's builder, continued refining vector processors in the SX series, integrating them into hybrid configurations for niches like weather and climate modeling, where vector efficiency excels over scalar parallelism. For instance, the fourth-generation Earth Simulator, deployed in 2021 at JAMSTEC, combined NEC SX-Aurora TSUBASA vector engines with NVIDIA A100 GPUs, achieving up to 19.5 theoretical peak petaflops while leveraging vector strengths for accuracy in memory-intensive simulation workloads. This enduring vector lineage, rooted in the original Earth Simulator's 35.86 teraflops LINPACK result from 640 nodes of eight vector processors each, validated alternative paths to peak performance amid the industry's scalar shift.
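The original system's peak figure follows directly from its node arithmetic. A minimal sketch, assuming the commonly cited 8 GFLOPS per vector processor (500 MHz, 16 floating-point operations per cycle), eight processors per node, and 640 nodes:

```python
# Theoretical peak of ES1 from published node counts.
NODES = 640
PROCS_PER_NODE = 8
GFLOPS_PER_PROC = 8.0  # 500 MHz vector processor, 16 FLOPs/cycle

peak_tflops = NODES * PROCS_PER_NODE * GFLOPS_PER_PROC / 1000.0  # 40.96 TFLOPS
efficiency = 35.86 / peak_tflops  # LINPACK result over peak, ~0.875
```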
By showcasing Japan's capacity for indigenous HPC leadership, the Earth Simulator bolstered efforts to cultivate domestic chip and system design, enhancing technological sovereignty amid U.S.-China semiconductor tensions and export restrictions. This legacy supported projects like Fugaku's A64FX processor, engineered to comply with international controls while advancing exascale capabilities, thereby reducing reliance on foreign components in critical infrastructure. Such developments positioned Japan to navigate global tech frictions, prioritizing self-sufficient innovation over off-the-shelf imports.

Integration with AI and Emerging Technologies

The fourth-generation Earth Simulator (ES4), operational since 2021, incorporates NVIDIA A100 GPUs alongside AMD EPYC CPUs and NEC SX-Aurora vector engines, enabling hardware-level acceleration for machine learning workloads in earth system simulations. This multi-architecture design facilitates the integration of neural networks to approximate subgrid-scale processes, such as turbulence and cloud microphysics, where traditional parameterizations struggle with unresolved physics at coarse resolutions. Researchers have leveraged ES4's GPU partitions for training models that enhance micrometeorological simulations, including probabilistic super-resolution techniques for urban heat islands, demonstrating improved fidelity over purely physics-based approaches. However, such ML emulations risk overfitting to training data, particularly given sparse observational datasets for rare events, potentially undermining causal inferences in long-term climate projections.

JAMSTEC has applied ES4 to machine learning-driven predictions beyond core atmospheric modeling, such as outbreak forecasting via ensemble models integrating environmental variables, highlighting the supercomputer's role in hybrid physics-ML frameworks for interdisciplinary research. These efforts align with broader pursuits of digital twins, virtual replicas fusing simulations with real-time observations, for ocean and earth systems, as pursued by JAMSTEC in collaborations like the Digital Twins of the Ocean initiative. Yet realizing comprehensive digital twins demands vast, high-quality observational data, and current limitations in sensor coverage and model-data discrepancies constrain predictive accuracy, often requiring conservative validation against empirical benchmarks.
Emerging explorations of quantum-HPC hybrids for tasks like seismic inversion remain theoretical for ES4, with no deployed implementations; while quantum annealing shows promise for traveltime tomography in controlled tests, real-world benchmarks reveal scalability gaps compared to classical GPU-accelerated methods on systems like ES4. Overall, ES4's AI synergies enhance simulation efficiency but hinge on rigorous cross-validation to mitigate biases from data paucity and algorithmic approximations, ensuring advancements prioritize empirical robustness over unsubstantiated hype.
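The overfitting caveat raised above comes down to evaluating an emulator on data it never saw during fitting. A toy train/holdout sketch, purely illustrative and not JAMSTEC's pipeline:

```python
def fit_mean_model(train):
    """Toy 'emulator': always predicts the training-set mean."""
    return sum(train) / len(train)

def mse(prediction, data):
    """Mean squared error of a constant prediction against data."""
    return sum((x - prediction) ** 2 for x in data) / len(data)

# Hypothetical target values; the holdout contains a regime (an extreme)
# absent from training, mimicking sparse observations of rare events.
train = [0.9, 1.1, 1.0, 1.2, 0.8]
holdout = [2.5, 2.7]

model = fit_mean_model(train)
train_err = mse(model, train)      # small: the emulator fits what it saw
holdout_err = mse(model, holdout)  # large: it fails on the unseen regime
```

A holdout error far exceeding the training error is the basic signal of poor generalization to under-sampled regimes, which is why cross-validation against withheld observations matters for these emulators.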

References

  1. [1]
    EARTH SIMULATOR JAMSTEC-JAPAN AGENCY FOR MARINE ...
    The Earth Simulator (ES4) is a multi-architecture supercomputer based on AMD EPYC CPUs, combined with accelerators (NEC SX-Aurora TSUBASA and NVIDIA GPU A100).Missing: facts | Show results with:facts
  2. [2]
    Earth Simulator Center - TOP500
    Built by NEC, the Earth Simulator was based on their SX-6 architecture. It consisted of 640 nodes with eight vector processors and 16 gigabytes of memory at ...Missing: facts | Show results with:facts
  3. [3]
    JAMSTEC Goes Hybrid On Many Vectors With Earth Simulator 4 ...
    Sep 27, 2021 · The Earth Simulator 1, or ES1, system had 700 TB of storage and 1.6 PB of tape archive, and it delivered a world-record 40 teraflops of peak ...Missing: facts | Show results with:facts
  4. [4]
    Earth Simulator Manufacturing Begins - HPCwire
    Jun 2, 2000 · The Earth Simulator project is designed to enable environmental research through analysis and simulation of the global environment through the ...Missing: objectives MEXT
  5. [5]
    JAPANESE EARTH SIMULATOR: A CHALLENGE AND AN ...
    May 3, 2002 · The Earth Simulator project started in 1997 and has been in the public domain ever since. The Japanese reported its progress in ...Missing: MEXT | Show results with:MEXT
  6. [6]
    [PDF] Outline of the Earth Simulator Project
    The first aim is to ensure a bright future for human begins by accurately predicting variable global environment. The second is to contribute to the development ...Missing: objectives MEXT
  7. [7]
    [PDF] Outline of the Earth Simulator Project - JAMSTEC
    The first aim is to ensure a bright future for human beings by accurately predicting variable global environment. The second is to contribute to the development ...Missing: objectives MEXT
  8. [8]
    NEC to provide JAMSTEC with the SX-Aurora TSUBASA vector ...
    Sep 25, 2020 · JAMSTEC is scheduled to begin operating the new system in March 2021 as part of the "Next Earth Simulator" project for research and development ...Missing: funding | Show results with:funding
  9. [9]
    NEC's Vector Supercomputer to Power Japan's 'Next Earth Simulator'
    Oct 13, 2020 · According to JAMSTEC, the system will deliver 19.5 theoretical peak petaflops – nearly 15 times the 1.3 petaflops delivered by Earth Simulator 3 ...Missing: objectives MEXT<|separator|>
  10. [10]
    Yokohama Institute for Earth Sciences (YES) | JAMSTEC
    YES is located about 5 minutes from Sugita Exit on Shuto Expressway Bayshore Route (also called Wangan-sen). By public transport. YES can be reached via three ...
  11. [11]
    [PDF] The Earth Simulator System - Computer Engineering Group
    The Japanese government's initiative “Earth. Simulator project” started in 1997. Its target was to promote research for global change predictions by using ...Missing: SX- | Show results with:SX-
  12. [12]
    [PDF] Earth Simulator Running
    The Earth Simulator project had been initiated for aiming at understanding and elucidation of the global change as precisely as possible, and an ultra high ...Missing: goals | Show results with:goals
  13. [13]
    WORLD'S FASTEST ULTRA COMPUTER STARTS OPERATION
    Mar 15, 2002 · This is the world's fastest ultra computer configured with 640 nodes (64GFLOPS/node, 5,120 CPUs in total), each of which consists of eight ...Missing: TFLOPS | Show results with:TFLOPS
  14. [14]
    [PDF] high-end computing research and development in japan
    May 10, 2004 · Earth Simulator. Dedicated to atmosphere, ocean and materials simulation, manufactured by NEC, 640 nodes with 40 Tflop/s peak; 8 PR per node ...<|separator|>
  15. [15]
    June 2002 - TOP500
    TOP 10 Sites for June 2002 ; 1, Earth-Simulator, NEC Japan Agency for Marine-Earth Science and Technology Japan, 5,120 ; 2, ASCI White, SP Power3 375 MHz, IBM
  16. [16]
    THE EARTH SIMULATOR HERALDS "NEW AGE OF SIMULATION"
    The awesome results of the Earth Simulator in the 19th TOP500 supercomputer list engendered intense interest. The List was analyzed in detail by ...
  17. [17]
    Earth Simulator(ES2) System Overview
    The upgrade of the Earth Simulator has been completed in March 2009. The renewed system(ES2) 160 nodes of NEC's SX-9E. *1 Japan Aerospace Exploration ...Missing: second generation 9
  18. [18]
    “Earth Simulator System” Achieves World's Top Computing ...
    Jun 4, 2009 · The renewed Earth Simulator System is a large-scale vector supercomputer that consists of 160 nodes of NEC's SX-9/E and boasts peak performance ...Missing: ES2 | Show results with:ES2
  19. [19]
    Earth Simulator(ES2) System Overview
    The ES is a highly parallel vector supercomputer system of the distributed-memory type, and consisted of 160 processor nodes connected by Fat-Tree Network.Missing: second generation 2009 NEC SX- 9
  20. [20]
    Souped Up Earth Simulator 2 Supercomputer Unveiled - HPCwire
    Mar 2, 2009 · Japan's Earth Simulator 2 (ES2) supercomputer has been unveiled to the public, following a set of upgrades that have more than tripled the ...<|separator|>
  21. [21]
    Earth Simulator - Wikipedia
    ES3, from 2017 to 2018, ran alongside Gyoukou, a supercomputer with immersion cooling that can achieve up to 19 PFLOPS.
  22. [22]
    NEC Delivers SX-ACE Vector Supercomputers for use as the Earth ...
    May 26, 2015 · JAMSTEC will begin operating the system as the Earth Simulator on June 1 this year. Consisting of 5,120 nodes of SX-ACE, the Earth Simulator is ...
  23. [23]
    Earth Simulator(ES3) System Overview
    Earth Simulator (ES3) 2015-2021. System Overview · System Operation Status. 地球シミュレータ(ES3). move to Japan Agency for Marine-Earth Science and ...
  24. [24]
    System Overview JAMSTEC-JAPAN AGENCY FOR MARINE ...
    The Earth Simulator (ES4) is a multi-architecture supercomputer based on AMD EPYC CPUs, combined with accelerators (NEC's SX-Aurora TSUBASA and NVIDIA's GPU A ...Missing: A64FX | Show results with:A64FX
  25. [25]
    New Earth Simulator to Take on Planet's Biggest Challenges
    Sep 25, 2020 · The new system, scheduled to become operational in March, will be based around SX-Aurora TSUBASA vector processors from NEC and NVIDIA A100 ...<|separator|>
  26. [26]
    Predictive Simulations of Climate Change and Natural Disasters ...
    Oct 8, 2025 · Uehara: The Earth Simulator is a supercomputer managed by JAMSTEC that was developed for the purpose of researching climate change, including ...Missing: objectives MEXT
  27. [27]
    NEC Earth Simulator and the SX-Aurora TSUBASA - ResearchGate
    The Earth Simulator (ES) is a parallel supercomputer based on the NEC SX vector computer system. The first-generation ES started its operation in 2002.
  28. [28]
    [PDF] Performance of Ultra-Scale Applications on Leading Vector ... - OSTI
    Results demonstrate that the ES vector systems achieve excellent performance on our application suite – the highest of any architecture tested to date.
  29. [29]
    NEC SX-6i: A DEPARTMENTAL VECTOR SUPERCOMPUTER
    Mar 8, 2002 · The processor consists of 8-way vector pipes running at 500 MHz. ... In Japan NEC installed all 640 nodes, 5120 processors, at the Earth Simulator ...
  30. [30]
    The NEC SX-6. - The Netlib
    All models are based on the same processor, an 8-way replicated vector processor where each set of vector pipes contains a logical, mask, add/shift ...Missing: Earth Simulator
  31. [31]
    [PDF] Supercomputer SX-9 Development Concept - NEC Corporation
    The Earth Simulator began operation in March 2002, and its overwhelming perform- ance not only amazed the world, but also continues to make a large contribution ...
  32. [32]
    NEC SX-9 - Wikipedia
    The NEC SX-9 processors run at 3.2 GHz, with eight-way replicated vector pipes, each having two multiply units and two addition units; this results in a peak ...Missing: ES2 | Show results with:ES2
  33. [33]
    Can Vector Supercomputing Be Revived? - The Next Platform
    Oct 26, 2017 · In early 2009, the Japanese government upgraded Earth Simulator to the ES2 ... peak performance of 131 teraflops. Again, this used to sound ...
  34. [34]
    [PDF] Performance Evaluation of the SX-6 Vector Architecture for Scientific ...
    The. Earth Simulator, based on NEC SX-61 vector technology, achieves five times the LINPACK performance with almost half the number of processors of the IBM SP- ...
  35. [35]
    The Earth Simulator system - ResearchGate
    Aug 5, 2025 · The Earth Simulator achieved 35.86TFLOPS, or 87.5% of peak performance of the system, in LINPACK benchmark, and has been proven as the most ...
  36. [36]
    [PDF] The Current Status of the Earth Simulator
    The Earth Simulator is a highly parallel vector supercomputer system consisting of 640 processor nodes connected by high performance interconnection network ...
  37. [37]
    [PDF] “New” Earth Simulator starts operation in March, 2021
    Upgrade to 2nd Generation Earth Simulator. The second generation Earth Simulator which was based on NEC SX-9 system consisted 160 nodes. The whole system ...
  38. [38]
    CCSR Ocean Component Model (COCO) - ResearchGate
    This study compares two simulations of CCSR Ocean Component Model (COCO) ... model with a horizontal resolution of 20 km were run on the Earth Simulator.
  39. [39]
    Improved Climate Simulation by MIROC5: Mean States, Variability ...
    The ocean model is the CCSR Ocean Component Model (COCO; Hasumi 2006), which includes a sea ice model. A land model that includes a river module is also coupled ...
  40. [40]
    Convective Control of ENSO Simulated in MIROC in - AMS Journals
    The high sensitivity of the El Niño–Southern Oscillation (ENSO) to cumulus convection is examined by means of a series of climate simulations using an updated ...
  41. [41]
    Two decades of Earth system modeling with an emphasis on Model ...
    Oct 20, 2020 · The past 20 years of research using Earth system models (ESMs) is reviewed with an emphasis on results from the ESM based on MIROC (Model for Interdisciplinary ...
  42. [42]
    Development of the MIROC-ES2L Earth system model and ... - GMD
    May 13, 2020 · The Earth Simulator and JAMSTEC Super Computing System were used for the simulations, and the administration staff provided much support.
  43. [43]
    [PDF] Evaluation of Climate Models
    This chapter evaluates climate models, including their characteristics, model types, and performance using techniques for assessment.
  44. [44]
    Development of a High-Resolution Climate Model for ... - NASA ADS
    We are planning to conduct a series of future climate change projection experiments on the Earth Simulator with a high-resolution coupled ocean-atmosphere ...Missing: simulations | Show results with:simulations
  45. [45]
    [PDF] Broadband modeling of the 2002 Denali fault earthquake ... - specfem
    Abstract. We use a spectral-element method implemented on the Earth Simulator in Japan to simulate broadband seismic waves generated by the 3 November 2002 ...<|separator|>
  46. [46]
    Large Scale Parallel Simulation and Visualization of 3D Seismic ...
    Aug 6, 2025 · ... earthquake is demonstrated by high-resolution 3D parallel FDM simulation of seismic wave propagation using the Earth Simulator supercomputer.
  47. [47]
    A numerical simulation of earthquake cycles along the Nankai ...
    The Earth Simulator was used for all the simulations. The Generic Mapping tools package [40] was used for depicting the plate configuration and creating ...
  48. [48]
    A cause of rupture segmentation and synchronization in the Nankai ...
    Sep 1, 2006 · All of the numerical simulations have been done using the Earth Simulator. R. Honda kindly provided the gravity data. We thank K. Hirahara ...
  49. [49]
    Seismic‐ and Tsunami‐Wave Propagation of the 2011 Off the Pacific ...
    May 1, 2013 · ... rupture processes of the subduction‐zone earthquake. Data and ... Earth Simulator at JAMSTEC. Acknowledgments. This study was supported ...Earth And Fault Models · Seismic‐wave Propagation · Tsunami
  50. [50]
    Tectonic plates in 3D mantle convection model with stress-history ...
    May 24, 2020 · Kameyama M (2005) ACuTEMan: A multigrid-based mantle convection simulation code and its optimization to the Earth Simulator. J Earth Simulator 4 ...
  51. [51]
    High accuracy mantle convection simulation through modern ...
    ... mantle convection simulation code and its optimization to the Earth simulator ... modeling active compositional fields in mantle convection simulations.
  52. [52]
    [PDF] Numerical Simulation of the Mantle Convection and Subduction ...
    3 The Earth Simulator Center, Japan Agency for Marine-Earth Science and Technology ... Our goal is to construct the models of mantle convection and caused vol-.
  53. [53]
    Low‐degree mantle convection with strongly temperature‐ and ...
    Mar 30, 2006 · All the simulations were carried out on the Earth Simulator at Japan Agency for Marine-Earth Science and Technology. Some of figures in this ...
  54. [54]
    Velocity increase in the uppermost oceanic crust of subducting ...
    Mar 18, 2015 · ... Hi-net waveform data and the CMT solutions from the F-net. The computations were conducted on the Earth Simulator at the Japan Agency for ...
  55. [55]
    NEC's 'Earth simulator' succeeds in prediction of photochemical ...
    Jan 4, 2011 · The Earth Simulator demonstrated the world's top-level computing efficiency, especially for complicated applications, including nanoscience, ...
  56. [56]
    [PDF] Outline of the Earth Simulator Project
    2) In principle, the research achievements obtained by using the Earth Simulator should be promptly published and returned to the public. 3) The Mission ...
  57. [57]
    EARTH SIMULATOR DELIVERS NEW SCIENCE RESULTS - HPCwire
    Jul 11, 2003 · This successful simulation can contribute greatly to disaster relief. Yet another example is the study of the thermal conductivity of CNT ...
  58. [58]
    JAMSTEC Earth Simulator Resource Allocation
    In this category, there are the public applications focus of ocean science, earth science, and related objects for the outside of JAMSTEC. 3. JAMSTEC ...Missing: beyond | Show results with:beyond
  59. [59]
    25 Year Anniversary | TOP500
    The Earth Simulator supercomputer at the Earth Simulator Center in yokohama, Japan, took the No. 1 spot in June 2002 with a performance of 35.86 Tflop/s ( ...
  60. [60]
    World's Fastest Supercomputers - Science
    1, NEC Earth Simulator, Earth Simulator Ctr., Japan ; 2 & 3, Hewlett-Packard ASCI Q · Los Alamos Natl. Lab., U.S. ; 4, IBM ASCI White, Livermore Natl. Lab., U.S. ...<|separator|>
  61. [61]
    [PDF] HPCS HPCchallenge Benchmark Suite
    Not only does the Japanese Earth Simulator outperform the top American systems on the HPL benchmark. (Tflop/s), the differences in bandwidth performance on ...
  62. [62]
    [PDF] Earth Simulator Achieves High Performance on LINPACK Benchmark
    Jun 4, 2009 · The sustained performance of the Earth Simulator would claim the No.1 position among supercomputers in Japan (No.16 worldwide) according to the ...Missing: HPL | Show results with:HPL
  63. [63]
    Pure Vector Engines Get GPU Math Help for Next Earth Simulator
    Sep 28, 2020 · The fourth Earth Simulator system will be coming online in March 2021 with a blend of classic NEC pure vector processors mixed with Nvidia GPU accelerators.<|separator|>
  64. [64]
    SX-Aurora TSUBASA B401-8, Vector Engine Type20B 8C 1.6GHz ...
    Earth Simulator -SX-Aurora TSUBASA - SX-Aurora TSUBASA B401-8, Vector Engine ... HPL of optimized run, 9.99 PFlop/s. Power Measurement Level: 2. Software.
  65. [65]
    Potential of a modern vector supercomputer for practical applications
    Mar 7, 2017 · This paper examines the potential of the modern vector-parallel supercomputer through the performance evaluation of SX-ACE using practical ...
  66. [66]
    TOP500 List - June 2024
    TOP500 List - June 2024 ; 95, Earth Simulator -SX-Aurora TSUBASA - SX-Aurora TSUBASA B401-8, Vector Engine Type20B 8C 1.6GHz, Infiniband HDR200, NEC Japan Agency ...
  67. [67]
    Successful Achievement in Developing the Earth Simulator
    Each processor node is a shared memory system composed of eight vector processors. The total peak performance and main memory capacity are 40 Tfłop/s and 10 ...
  68. [68]
    Practical power consumption estimation for real life HPC applications
    For example, the 2004's top ranking machine, the Japanese Earth Simulator, required 12 MW of power to operate, roughly the amount required to power a small town ...
  69. [69]
    JAMSTEC Earth Simulator Operation Status
    (The whole system is taken out of service twice per year for preventive maintenance, and during Yokohama Institute facilities' yearly checkup.) and except ...Missing: uptime reliability
  70. [70]
    Earth Simulator(ES1) System Overview
    S batch queue is designed for single-node batch jobs and L batch queue is for multi-node batch queue. There are two-type queues. One is L batch queue and the ...Missing: throughput | Show results with:throughput
  71. [71]
    OFES NCEP forced hindcast 0.1 deg global 3day - apdrc
    The OGCM For the Earth Simulator (OFES) is a global 0.1x0.1 degree model forced by NCEP winds. It was run with climatological forcing for 50 years.
  72. [72]
    Decadal Variability of the Kuroshio Extension: Observations and an ...
    ... resolution) OGCM hindcast performed on Japan's Earth Simulator supercomputer. ... resolution climate model. Geophys. Res. Lett., 32 .L14617, doi:10.1029 ...
  73. [73]
    (PDF) A Fifty-Year Eddy-Resolving Simulation of the World Ocean
    Benefiting from the support of the "Earth Simulator", Japan achieved a series of remarkable accomplishments in Earth system simulation in the years since its ...
  74. [74]
    Global Warming Projection by an Atmospheric Global Model with 20 ...
    We projected global warming on the Earth Simulator using a very high horizontal resolution ... Climate Model and Convective Parameterization,” J. Climate ...
  75. [75]
    Development of a High-Resolution Climate Model for Future Climate ...
    ... Earth Simulator with a high-resolution coupled ocean-atmosphere climate model. The main scientific aims for the experiments are to investigate 1) the change ...
  76. [76]
    [PDF] Developing coupled ocean-atmosphere global climate model for the ...
    Developing coupled ocean-atmosphere global climate model for the Earth Simulator and its preliminary physical validation. Keiko TAKAHASHI, Nobumasa KOMORI ...
  77. [77]
    Three-dimensional elasto-static analysis of 100 million degrees of ...
    The Science and Technology Agency is promoting 'Earth Simulator Project' to establish a method for predicting changes in the geophysical features of the Earth.<|separator|>
  78. [78]
    Earth Simulator Proposed Research Projects
    Earth Simulator Proposed Research Projects ... Research on development of global climate model with remarkably high resolution and climate model with cloud ...<|separator|>
  79. [79]
    U.S. PLANS VECTOR SUPERCOMPUTER REVIVAL - HPCwire
    What became clear is that the Earth Simulator's importance is not as much the installation of a single powerful machine, but the commitment of the Japanese ...Missing: influence resurgence
  80. [80]
    Hardware system of the Earth Simulator - ScienceDirect
    The Earth Simulator (ES), developed under the Japanese government's initiative “Earth Simulator project”, is a highly parallel vector supercomputer system.Missing: resurgence | Show results with:resurgence
  81. [81]
    [PDF] The International Exascale Software Project Roadmap1 Abstract
    The. European Union (EU) is also planning to launch projects aimed at petascale and exascale computing and simulation. Japan has a project to build a 10- ...
  82. [82]
    Extended ENSO Predictions Using a Fully Coupled Ocean ...
    Jan 1, 2008 · The El Niño condition in the 1997/98 winter can be predicted to some extent up to about a 11⁄2-yr lead but with a weak intensity and large phase.
  83. [83]
    [PDF] Annual Report of the Earth Simulator Center - JAMSTEC
    ... Earth Simulator. The SINTEX-F1 seasonal prediction system has so far demonstrated high performance of predicting the occurrences of El Niño-Southern ...
  84. [84]
    [PDF] Outline of the Earth Simulator Project
    ○ Improvement of computational performance on the Earth Simulator 2 (ES2) to fit the architectures of discretization schemes for ultra high resolution ...
  85. [85]
    [PDF] Outline of the Earth Simulator Project
    as ES1, ES2 and NEC SX-series, the length of an inner-most loop should be long enough to achieve high performance. On ES1, ADVENTURE Solid had an enough ...
  86. [86]
    "VIRTUAL" ATMOSPHERIC AND OCEANIC CIRCULATION IN THE ...
    One of the main goals of the Earth Simulator (ES) Project is to reduce ambiguities in global warming projections. The effect of cumulus ...
  87. [87]
    2002 | Timeline of Computer History
    Developed by the Japanese government to create global climate models, the Earth Simulator is a massively parallel, vector-based system that costs nearly 60 ...
  88. [88]
    Japan to Build Fastest Supercomputer - BetaNews
    Jul 25, 2005 · Construction would cost some 80 to 100 billion yen ($715-900 million US) of which the government would request 10 billion yen ($90 million US) ...
  89. [89]
    Nvidia to join team developing Japan's next supercomputer
    Aug 22, 2025 · Total development cost is expected to exceed 110 billion yen ($740 million). The system will be installed on Kobe's Port Island where the ...
  90. [90]
    Japan General Government Gross Debt to GDP - Trading Economics
    Government Debt to GDP in Japan is expected to reach 230.00 percent of GDP by the end of 2025, according to Trading Economics global macro models and analysts ...
  91. [91]
    [PDF] Real-World Examples of Supercomputers Used For Economic and ...
    Through advanced modeling and simulation, private sector participants have improved well recovery, reduced operating costs, and reduced failure risk.
  92. [92]
    Comparison of Explicitly Simulated and Downscaled Tropical ...
    Oct 11, 2010 · The simulations were performed on the Earth Simulator/JAMSTEC, under the framework of the Innovative Program of Climate Change Projection ...
  93. [93]
    Variations of Tropical Lapse Rates in Climate Models and Their ...
    A major criticism of climate model simulations has been their overestimation of warming in the tropical upper troposphere, between 8- and 13-km altitude, ...
  94. [94]
    Predictability of Explosive Cyclogenesis over the Northwestern ...
    The ensemble spread of the SLP is large to the west and southwest of the ... Earth Simulator: Preliminary outcomes of AFES (AGCM for the Earth Simulator).
  95. [95]
    Seasonal predictability of baroclinic wave activity - Nature
    Oct 26, 2021 · ... chaotic climate system and to adequately sample the ... The d4PDF large ensemble simulations are produced with the Earth Simulator ...
  96. [96]
    [PDF] BlueGene/L Applications: Parallelism on a Massive Scale - OSTI.gov
    Sep 12, 2006 · previous number one machine on the Top500 list, the Earth Simulator, occupies 34,000 square feet and has a peak power consumption of 10 MW ...
  97. [97]
    [PDF] Hardware Technology of the Earth Simulator
    The Earth Simulator is a system that optimally merges the most advanced technologies so that 8 processors can form one node that shares 16GB memory. The ...
  98. [98]
    Learning earth system models from observations - Journals
    Feb 15, 2021 · The problem of using real observations is that we do not have regular-gridded vertical profiles of the full state of the atmosphere at the ...
  99. [99]
    Earth System Modeling 2.0: A Blueprint for Models That Learn From ...
    Nov 30, 2017 · We outline how parameterization schemes can learn from global observations and targeted high-resolution simulations, for example, of clouds and convection.
  100. [100]
    Building the Fugaku supercomputer | Scientific Computing World
    ... Earth Simulator or the Numerical Wind Tunnel, is that the Fugaku machine has been designed around industrial usage and societal impact rather than a focus ...
  101. [101]
    Is This The End Of The Line For NEC Vector Supercomputers?
    Mar 23, 2023 · JAMSTEC is home of the legendary original Earth Simulator system that was built using 640 nodes with eight vector engines each, with a total of ...
  102. [102]
    Decoupling proceeding amid lingering U.S.-China tensions
    Aug 25, 2023 · In order to circumvent the restrictions, Japanese companies will have to shift their raw material procurement to other regions, which will ...
  103. [103]
    Japan tightens chipmaking export controls amid US-China tech ...
    Feb 3, 2025 · Japan makes it harder for Chinese entities to obtain chipmaking tools, advanced processors, and cryogenic cooling from Japanese companies.
  104. [104]
    [PDF] Artificial Intelligence for Earth System Predictability (AI4ESP)
    17.2.3 Energy Efficiency ... “Efficient Surrogate Modeling Methods for Large-Scale Earth System Models on ...
  105. [105]
    Infectious Disease Prediction|JAMSTEC 4D Virtual Earth Project
    ・Malaria prediction using machine learning models. The Earth Simulator (ES4) is a multi-architecture ...
  106. [106]
    International Activities | JAMSTEC
    Jul 10, 2025 · Visit of JAMSTEC by chair and speakers of “Digital Twins of Ocean” session at STS forum on October 4 ... JAMSTEC's practices for SDGs ...
  107. [107]
    Seismic traveltime inversion with quantum annealing - Nature
    May 23, 2025 · This study demonstrates the application of quantum computing based quantum annealing to seismic traveltime inversion, a critical approach for inverting highly ...