
Climate model

A climate model is a computer program that numerically solves systems of differential equations derived from fundamental physical laws to simulate interactions within Earth's climate system, encompassing the atmosphere, oceans, land surface, biosphere, and cryosphere. These models approximate continuous processes on discrete grids, incorporating resolved dynamics alongside parameterized representations of sub-grid-scale phenomena such as convection, turbulence, and cloud formation, which introduce inherent uncertainties due to incomplete knowledge of those processes. Ranging from simplified one-dimensional energy balance models to comprehensive three-dimensional general circulation models (GCMs) and Earth system models, they enable hindcasting of paleoclimates, attribution of observed changes to natural and anthropogenic forcings, and projections of future conditions under radiative forcing scenarios. Notable achievements include replicating observed large-scale circulation patterns, such as Hadley cells and jet streams, and elucidating mechanisms like the amplification of polar warming via ice-albedo feedback, though empirical evaluations highlight persistent discrepancies, including overestimation of tropospheric warming rates and precipitation extremes in many models relative to satellite and surface observations. Controversies arise from evidence that multimodel ensembles, particularly in recent phases like CMIP6, exhibit a tendency to run "hot" compared to realized warming since the late 20th century, often linked to inflated estimates of equilibrium climate sensitivity exceeding empirical constraints from paleoclimate data and instrumental records, raising questions about parameter tuning, structural biases in cloud and aerosol feedbacks, and the reliability of long-term projections for policy applications. 
Despite advancements in resolution and process inclusion through international efforts like the Coupled Model Intercomparison Project (CMIP), fundamental challenges persist in capturing chaotic variability, regional details, and emergent phenomena, underscoring the need for rigorous validation against empirical data over reliance on ensemble means that may mask individual model flaws.

Fundamentals

Definition and Purpose

Climate models are computational representations of the Earth's climate system, comprising mathematical equations that describe the dynamics and thermodynamics of its primary components: the atmosphere, oceans, land surface, and sea ice. These models discretize the planet into a three-dimensional grid, solving fundamental physical laws—such as the Navier-Stokes equations for fluid motion, the thermodynamic energy equation, and laws of radiative transfer—numerically to simulate interactions among these components. The core purpose of climate models is to replicate observed climate patterns and variability for validation against empirical data, enabling attribution of historical changes to specific forcings like solar variability or greenhouse gas concentrations. By prescribing external forcings and initial conditions, models hindcast past climates—such as reproducing the cooling after the 1991 Mount Pinatubo eruption—and project future trajectories under scenarios of varying emissions, as in the Representative Concentration Pathways used in assessments since 2010. Beyond projection, climate models facilitate hypothesis testing through controlled simulations that isolate causal mechanisms, such as the role of aerosols in modulating radiative forcing or ocean heat uptake in delaying surface warming. This approach underpins efforts to distinguish anthropogenic signals from natural oscillations like El Niño-Southern Oscillation, though model outputs depend on parameterizations for sub-grid processes unresolved at typical resolutions of 50–250 km horizontally. Empirical tuning and ensemble methods address structural uncertainties, with multi-model intercomparisons like CMIP6 (initiated in 2016) providing robust diagnostics of performance against satellite and reanalysis datasets.

Core Components and Principles

Climate models integrate multiple components to represent the Earth's climate system, primarily the atmosphere, oceans, land surface, and sea ice or cryosphere. The atmospheric component simulates air motions, temperature, humidity, and radiative processes, while the oceanic component models currents, stratification, and heat storage. Land surface models handle vegetation, soil moisture, and runoff, and cryospheric models depict ice sheets and permafrost dynamics. These components exchange fluxes of momentum, heat, freshwater, and biogeochemical tracers to capture system interactions. The foundational principles derive from physical laws, including conservation of mass, momentum, energy, and water vapor. Governing equations encompass the Navier-Stokes equations for fluid motion, thermodynamic equations for heat transfer, and continuity equations for mass balance, augmented by equations for water substance phase changes and radiative transfer. These partial differential equations describe continuous processes but are discretized on spatial grids using numerical methods such as finite differences or spectral transforms to enable computation. Oceanic components similarly apply primitive equations adapted for incompressible fluids with density variations. Sub-grid scale processes, unresolved by typical grid resolutions of tens to hundreds of kilometers, require parameterization schemes to approximate their average effects. Examples include convective precipitation, cloud formation, turbulence in the planetary boundary layer, and gravity wave propagation, which are represented through empirical or semi-empirical relations tuned to observations or higher-resolution simulations. Such parameterizations introduce uncertainties, as they rely on assumptions about scale separation and process representation, necessitating validation against empirical data from field campaigns and satellite observations. 
Conservation properties are enforced explicitly in model formulations to prevent spurious drifts in long-term simulations.
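The discretization and conservation principles above can be illustrated with a minimal one-dimensional example: a flux-form finite-difference diffusion solver, in which every flux that leaves one grid cell enters its neighbor, so the domain total is conserved to rounding error. This is an illustrative sketch, not a dynamical core; the grid size, diffusivity, and time step are arbitrary choices.

```python
import numpy as np

# Flux-form finite-difference solver for 1-D heat diffusion,
#   dT/dt = -dF/dx,  F = -D dT/dx,
# illustrating how conservation is enforced by construction: each face flux
# leaves one cell and enters its neighbor. Parameters are arbitrary.
n, dx, dt, D = 100, 1.0, 0.1, 1.0    # dt satisfies dt <= dx**2 / (2 * D)

T = np.zeros(n)
T[n // 2] = 100.0                    # initial heat spike

total_before = T.sum() * dx
for _ in range(1000):
    F = np.zeros(n + 1)              # fluxes at cell faces; zero-flux boundaries
    F[1:-1] = -D * np.diff(T) / dx
    T -= dt * np.diff(F) / dx        # conservative (flux-form) update
total_after = T.sum() * dx
# total_after equals total_before to rounding error while the spike spreads out
```

A non-conservative update (e.g., applying diffusion pointwise with inconsistent boundary handling) would instead drift over long integrations, which is exactly the spurious behavior the text says explicit conservation enforcement prevents.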

Types of Climate Models

Simple Energy Balance Models

Simple energy balance models (EBMs) represent the climate system through the conservation of energy at global or zonal scales, equating absorbed shortwave radiation from the Sun to emitted longwave radiation plus any heat storage or transport terms. These models treat the Earth as a single point (zero-dimensional) or meridionally varying slab (one-dimensional), neglecting horizontal and vertical atmospheric dynamics, ocean circulation, and detailed radiative transfer. The foundational equation for a zero-dimensional EBM is (1 - a) \frac{S}{4} = \epsilon \sigma T^4, where S is the solar constant (approximately 1361 W/m², with ~0.1% solar-cycle variability), a is the Bond albedo (about 0.3), \epsilon is the effective emissivity (less than 1 due to greenhouse gases), \sigma is the Stefan-Boltzmann constant (5.67 × 10^{-8} W m^{-2} K^{-4}), and T is the effective emitting temperature. This yields an effective temperature of roughly 255 K without greenhouse effects, rising to about 288 K when accounting for atmospheric absorption. Such models were pioneered independently in 1969 by Soviet climatologist Mikhail Budyko and American meteorologist William Sellers to explore ice-albedo feedbacks and meridional heat transport. Budyko's zonally averaged model incorporated latitudinal diffusion of sensible heat and variable surface albedo, simulating poleward energy flux via a diffusion term proportional to the meridional temperature gradient. Sellers' formulation similarly balanced radiative fluxes with turbulent heat exchange, predicting warmer poles if Arctic ice were removed. These early EBMs demonstrated multiple steady states, including "snowball Earth" solutions triggered by albedo feedbacks, where initial cooling expands ice cover, further reducing absorption and amplifying temperature drops. 
Extensions include time-dependent versions adding a heat-storage term C \frac{dT}{dt} to the balance, enabling study of transient responses to forcings like volcanic eruptions or solar variations, with equilibrium climate sensitivity derived from linearized feedbacks around a reference state. Zonal EBMs parameterize meridional heat transport diffusively, adding a convergence term D \frac{\partial^2 T}{\partial y^2} (where D is a diffusion coefficient and y latitude) to the balance, or employ explicit ocean-atmosphere coupling. Water vapor and cloud feedbacks are often approximated via temperature-dependent emissivity or albedo. These models have been applied to paleoclimate transitions, such as Neoproterozoic glaciations, and to sensitivity analyses, revealing that ice-albedo feedback can double radiative forcing responses in high latitudes. Despite their simplicity, EBMs exhibit limitations in capturing transient climate variability, regional patterns, and nonlinear processes like convection or biosphere interactions, as they aggregate fluxes without resolving spatial heterogeneity. Fitted diffusion coefficients tend to exceed observational estimates, leading to smoothed meridional gradients, and cloud-radiative feedbacks require empirical parameterizations prone to uncertainty. Validation against paleodata shows reasonable global means but divergences in polar amplification during ice ages. EBMs thus serve primarily as diagnostic tools for feedback mechanisms rather than predictive simulations, informing more complex models by isolating causal energy pathways.
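The time-dependent extension follows directly by adding the heat-capacity term. The sketch below integrates C \frac{dT}{dt} = (1 - a)\frac{S}{4} + F - \epsilon \sigma T^4 with forward Euler steps under a constant imposed forcing F; the heat capacity (roughly a 100 m ocean mixed layer) and the emissivity are illustrative assumptions, and because only the Planck response is represented, the resulting warming is about 1 K rather than the larger values obtained with feedbacks.

```python
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0           # solar constant, W m^-2
ALBEDO = 0.3         # Bond albedo
EPS = 0.612          # illustrative effective emissivity (tuned to give ~288 K)
C = 4.0e8            # heat capacity, J m^-2 K^-1; roughly a 100 m ocean mixed layer

def step(T, forcing, dt=86400.0):
    """One forward-Euler step of C dT/dt = (1 - a) S / 4 + forcing - eps*sigma*T^4."""
    net = (1.0 - ALBEDO) * S / 4.0 + forcing - EPS * SIGMA * T**4
    return T + dt * net / C

T = 288.0
for _ in range(365 * 50):         # 50 years of daily steps: ample spin-up,
    T = step(T, forcing=3.7)      # ~3.7 W m^-2, the canonical 2xCO2 forcing
# T relaxes (e-folding time C / (4*eps*sigma*T^3), a few years) to a new
# equilibrium ~1 K warmer -- the Planck-only response, with no feedbacks
```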

Radiative-Convective and One-Dimensional Models

Radiative-convective models compute the vertical temperature profile in a single atmospheric column by balancing radiative fluxes with convective heat transport, assuming horizontal homogeneity and neglecting advection. These one-dimensional models treat the atmosphere as layered slabs, solving the radiative transfer equation for longwave and shortwave radiation while parameterizing convection to prevent superadiabatic lapse rates. Pioneered by Manabe and Strickler in 1964, the approach used detailed band-model calculations for water vapor, carbon dioxide, and ozone absorption, achieving close agreement with observed mid-latitude temperature profiles when convective adjustment relaxed unstable layers to a 6.5 K/km lapse rate.
In radiative-convective equilibrium, net radiative cooling in upper layers is offset by upward convective fluxes from the surface, with surface temperatures determined by energy balance including solar input, albedo, and outgoing longwave radiation. Early implementations employed gray-gas approximations for simplicity but evolved to include line-by-line spectroscopy for accuracy, enabling sensitivity tests to greenhouse gas concentrations. Manabe and Wetherald extended the framework in 1967 by incorporating fixed relative humidity distributions, demonstrating roughly 2.3 K of global surface warming for doubled CO2, primarily from water vapor feedback amplifying the direct radiative effect. One-dimensional models facilitate first-order estimates of tropospheric stability and cloud forcing but overestimate tropical lapse rates without moist convection schemes, as convection moistens the atmosphere and reduces radiative cooling. Modern variants, such as those in radiative-convective equilibrium intercomparisons, prescribe sea surface temperatures or free-evolving surfaces to isolate convective organization and sensitivity, yielding equilibrium climate sensitivities of 2–4 K per CO2 doubling depending on cloud parameterization. Limitations include the absence of large-scale dynamics, restricting applicability to idealized cases rather than transient climate simulations.
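The convective-adjustment idea can be sketched as a simple algorithm: sweep the column, and wherever the lapse rate between adjacent layers exceeds the critical 6.5 K/km, reset the pair to the critical lapse rate while preserving its mean temperature. This is a toy version under the assumption of equal-mass layers; real schemes weight by layer pressure and conserve enthalpy exactly.

```python
def convective_adjustment(T, z, gamma_crit=6.5e-3, max_sweeps=10000):
    """Toy Manabe-Strickler-style convective adjustment.

    Relaxes any super-critical lapse rate (K/m) between adjacent layers to
    gamma_crit while conserving the pair's mean temperature. Assumes
    equal-mass layers; real schemes weight by pressure thickness and
    conserve column enthalpy exactly.
    """
    T = list(T)
    for _ in range(max_sweeps):
        changed = False
        for k in range(len(T) - 1):            # k is the lower layer
            dz = z[k + 1] - z[k]
            lapse = (T[k] - T[k + 1]) / dz
            if lapse > gamma_crit + 1e-12:
                mean = 0.5 * (T[k] + T[k + 1])
                half = 0.5 * gamma_crit * dz
                T[k], T[k + 1] = mean + half, mean - half
                changed = True
        if not changed:                        # column is everywhere stable
            break
    return T

z = [0.0, 1000.0, 2000.0, 3000.0]          # layer heights, m
T_unstable = [288.0, 278.0, 268.0, 258.0]  # 10 K/km everywhere: super-critical
T_adj = convective_adjustment(T_unstable, z)
# All lapse rates are now <= 6.5 K/km and the column-mean temperature is unchanged
```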

Intermediate Complexity Models

Intermediate complexity models, also known as Earth system models of intermediate complexity (EMICs), occupy a position in the hierarchy of climate models between simpler energy balance models and fully coupled general circulation or Earth system models. These models incorporate representations of multiple Earth system components, including atmosphere, ocean, sea ice, land surface, vegetation, and sometimes ice sheets or carbon cycles, but employ simplifications such as reduced spatial resolution, statistical-dynamical parameterizations, or zonal averaging to achieve computational efficiency. This allows simulations over millennial timescales or large ensembles that would be infeasible with higher-resolution models. Key characteristics include coarse grids (often 5-10 degrees latitude-longitude), diffusive or quasi-geostrophic atmospheric dynamics, and simplified ocean circulations like frictional geostrophic models, which prioritize essential feedbacks such as ocean heat uptake and ice-albedo effects over fine-scale processes like eddies. EMICs are particularly suited for investigating long-term climate commitments, paleoclimate reconstructions, and sensitivity to forcings like CO2 concentrations, as demonstrated in projections using eight EMICs for post-emission climate responses. Their reduced complexity facilitates uncertainty quantification by enabling rapid perturbation experiments, though this comes at the cost of limited regional fidelity and reliance on tuning to match observations. Prominent examples include LOVECLIM version 1.2, developed by the University of Louvain, which couples a quasi-geostrophic atmospheric model (ECBilt) with a primitive equation ocean (CLIO), dynamic-thermodynamic sea ice, and vegetation components (VECODE), enabling simulations of past climates like the last glacial maximum. 
CLIMBER models, such as CLIMBER-2 and the updated CLIMBER-X v1.0 (released in 2023), use statistical-dynamical approaches with 2D-3D ocean representations and explicit carbon cycle modules to study Earth system changes over thousands of years, including biosphere and ocean carbon feedbacks. Other instances are the UVic Earth System Climate Model (ESCM), emphasizing energy-moisture balance, and the MIT Earth System Model (MESM), which integrates intermediate ocean and atmospheric physics for carbon-constrained scenarios. Applications of EMICs extend to evaluating equilibrium climate sensitivity and transient responses, as in IPCC assessments where they bridge simple and complex models for long-term integrations. Recent developments, such as the DCESS II model (calibrated in 2025), focus on enhanced biogeochemical cycles for paleoclimate and future projections, highlighting their role in filling computational gaps despite known biases in processes like cloud feedbacks. Their efficiency supports probabilistic forecasts, but validation against paleodata reveals discrepancies in tipping elements like Atlantic meridional overturning circulation strength.

General Circulation and Earth System Models

General circulation models (GCMs) are three-dimensional numerical frameworks that simulate the physical processes governing atmospheric and oceanic circulation by discretizing the globe into a grid and solving the primitive equations of motion, including Navier-Stokes equations adapted for rotating spherical geometry, alongside thermodynamic and moisture equations. These models typically feature horizontal resolutions of 50 to 250 kilometers and 20 to 50 vertical levels, enabling representation of large-scale features like jet streams, trade winds, and ocean gyres through time-stepping integration over periods ranging from days to centuries. Early GCMs focused on atmospheric components alone, but modern implementations couple atmosphere-ocean general circulation models (AOGCMs) with sea ice and land surface schemes to capture interactions such as heat exchange and momentum transfer across interfaces. Earth system models (ESMs) extend GCMs by integrating biogeochemical and ecological processes, including interactive carbon, nitrogen, and aerosol cycles, which allow for dynamic feedbacks between physical climate and biospheric responses like vegetation growth and soil carbon storage. For instance, ESMs simulate how elevated atmospheric CO2 influences plant photosynthesis and transpiration, altering land-atmosphere fluxes that in turn affect regional precipitation and temperature patterns. Key components in ESMs encompass not only physical reservoirs (atmosphere, ocean, land, cryosphere) but also biochemical modules for ocean productivity, terrestrial ecosystems, and atmospheric chemistry, often parameterized due to unresolved scales. Examples include the Community Earth System Model (CESM), which couples the Community Atmosphere Model with ocean, land, and ice components plus biogeochemistry, and GFDL's ESM2M, incorporating prognostic ocean biogeochemistry. 
Both GCMs and ESMs rely on supercomputing resources for ensemble simulations, as in the Coupled Model Intercomparison Project (CMIP), where multiple models are run under standardized forcing scenarios to assess climate variability and projections. Parameterizations approximate sub-grid processes like convection, cloud formation, and turbulence, introducing uncertainties that are evaluated through hindcasts against observational data such as reanalyses from ERA5 or satellite measurements. While GCMs emphasize dynamical realism in fluid flows, ESMs prioritize holistic system interactions, though both face challenges in resolving mesoscale phenomena without excessive computational cost.

Historical Development

Early Theoretical Foundations (Pre-1960s)

The foundations of climate modeling prior to the 1960s were rooted in theoretical analyses of Earth's energy balance and the role of atmospheric gases in radiative transfer, rather than numerical simulations. In 1824, Joseph Fourier hypothesized that the atmosphere functions analogously to glass in a greenhouse by trapping outgoing terrestrial heat, explaining why Earth's surface temperature exceeds what would be expected from incoming solar radiation alone, based on comparisons of planetary temperatures and simple radiative equilibrium considerations. This insight established the conceptual basis for atmospheric retention of infrared radiation, though Fourier did not identify specific mechanisms or gases. Building on Fourier's ideas, John Tyndall conducted laboratory experiments from 1859 to 1861 demonstrating that certain atmospheric constituents, notably water vapor and carbon dioxide, selectively absorb heat rays (infrared radiation) while allowing visible sunlight to pass through. Tyndall's quantitative measurements using a spectroscope showed water vapor's strong absorption across infrared wavelengths and CO2's role in specific bands, attributing the atmosphere's heat-trapping capacity primarily to these "aqueous vapor" and minor gases rather than air itself, thus providing empirical evidence for selective radiative forcing. Svante Arrhenius advanced these concepts in 1896 by performing the first semi-quantitative calculations of CO2's climatic impact, estimating that halving atmospheric CO2 would lower global temperatures by 4–5°C, while doubling it would raise temperatures by 5–6°C, derived from radiative transfer equations incorporating absorption data and assuming logarithmic saturation effects. 
Arrhenius's one-layer model treated the atmosphere as a single slab emitting downward longwave radiation, balancing incoming solar energy (adjusted for albedo) against outgoing terrestrial flux via the Stefan-Boltzmann law, and he speculated on paleoclimatic implications like ice ages from CO2 variations, though his estimates assumed uniform global effects and neglected convection or water vapor feedbacks. In 1938, Guy Callendar synthesized observational data from 147 stations showing a 0.005°C per year land surface warming since the 1880s, attributing approximately half (about 0.003°C annually) to rising anthropogenic CO2 from fossil fuel combustion, which he calculated had increased concentrations by 6% over the prior 50 years. Callendar refined Arrhenius's sensitivity by factoring in empirical absorption overlaps and urban heat influences, proposing a simple energy balance where enhanced CO2 reduces outgoing longwave radiation, leading to disequilibrium and surface warming until restoration; his work emphasized verifiable trends over pure theory, countering skepticism about CO2 saturation. These pre-1960s developments provided the physical principles—radiative equilibrium, selective absorption, and sensitivity to trace gases—that later numerical models would parameterize and simulate dynamically.
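The logarithmic dependence Arrhenius assumed survives in the modern simplified expression for CO2 forcing, \Delta F \approx \alpha \ln(C/C_0) with \alpha \approx 5.35 W/m² — an empirical fit from later work, not Arrhenius's own coefficient. The sketch below evaluates it for a doubling and converts the forcing to a no-feedback (Planck-only) temperature response of roughly 1.2 K, before the water vapor amplification Arrhenius also invoked:

```python
import math

ALPHA = 5.35          # W m^-2; empirical coefficient of the logarithmic fit
PLANCK = 3.2          # W m^-2 K^-1; approximate no-feedback (Planck) response

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified logarithmic CO2 radiative forcing in W/m^2."""
    return ALPHA * math.log(c_ppm / c0_ppm)

f_doubling = co2_forcing(560.0)        # ~3.7 W m^-2 for doubled CO2
dT_no_feedback = f_doubling / PLANCK   # ~1.2 K before any feedbacks
```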

Emergence of Numerical Models (1960s-1980s)

The development of numerical climate models began in the early 1960s with the pioneering work of Joseph Smagorinsky at the Geophysical Fluid Dynamics Laboratory (GFDL), where the first general circulation model (GCM) based on primitive equations was implemented to simulate global atmospheric dynamics. This model discretized the Navier-Stokes equations on a grid over the sphere, incorporating subgrid-scale parameterizations for processes like turbulence and moist convection, though limited by computational constraints to coarse resolutions (e.g., effectively 100-200 km horizontal grid spacing) and short integration times. Smagorinsky's 1963 experiments demonstrated the feasibility of numerically solving for large-scale circulations, producing rudimentary simulations of zonal winds and Hadley cells, albeit with unrealistic equatorial precipitation biases due to inadequate moist physics. By the mid-1960s, Syukuro Manabe and collaborators at GFDL advanced these atmospheric GCMs (AGCMs) by integrating radiative transfer schemes that accounted for water vapor, ozone, and carbon dioxide absorption, enabling the first assessments of climatic equilibrium states. A landmark 1967 study by Manabe and Richard Wetherald used a one-dimensional radiative-convective extension to quantify CO2 doubling effects, predicting a global surface warming of about 2.3°C, which laid groundwork for three-dimensional applications. The 1969 coupling of an AGCM with a deep-ocean GCM by Manabe and Kirk Bryan represented a critical step, yielding the first interactive ocean-atmosphere simulations that captured meridional heat transport and poleward energy fluxes, though equilibrium states required flux adjustments to prevent drift. During the 1970s, multiple institutions expanded GCM capabilities, with the UK Met Office deploying its inaugural GCM in 1972, incorporating seasonal forcing and land-sea contrasts for improved realism in mid-latitude storm tracks. 
Refinements included better cloud parameterizations and hydrologic cycles, allowing multi-year integrations that revealed model sensitivities to boundary conditions, such as ice sheets. The 1979 Charney Report synthesized these advances, affirming GCMs' potential for projecting CO2-induced changes while noting uncertainties in cloud feedbacks and ocean dynamics. In the 1980s, computational upgrades—such as vector processors and spectral transform methods—facilitated higher resolutions (down to 4-5° latitude-longitude grids) and inclusion of components like sea ice and land surface schemes, enabling simulations of interannual variability. GFDL's transition to spectral cores improved efficiency for climate-length runs, while international efforts standardized diagnostics, though persistent biases in tropical convection and polar amplification highlighted parameterization limitations. These models underscored the causal role of greenhouse gases in driving radiative imbalances, validated against observational climatologies, yet required empirical tuning for stability.

Expansion and Standardization (1990s-2000s)

During the 1990s, climate modeling expanded significantly with the development of fully coupled atmosphere-ocean general circulation models (AOGCMs), which integrated dynamic interactions between atmospheric, oceanic, and sea ice components to simulate global climate variability more realistically than earlier uncoupled systems. These models incorporated additional processes such as aerosol effects and land surface feedbacks, driven by advances in computational power that enabled simulations on grids with horizontal resolutions around 250–300 km. In 1995, the Working Group on Coupled Modelling (WGCM) of the World Climate Research Programme (WCRP) established the Coupled Model Intercomparison Project (CMIP) to standardize evaluations of coupled models by providing a centralized database of simulations from multiple groups. Initial phases, CMIP1 and CMIP2, involved 18 general circulation models running standardized experiments, including pre-industrial control simulations and scenarios with 1% annual CO2 increase, facilitating comparisons of model performance and uncertainties. This effort supported the Intergovernmental Panel on Climate Change's (IPCC) Second Assessment Report (SAR) in 1995, which relied on ensemble outputs from emerging AOGCMs for equilibrium climate sensitivity estimates ranging from 1.5°C to 4.5°C. The 2000s saw further standardization through expanded CMIP phases and IPCC-driven protocols, with CMIP3 launched in 2005 encompassing 25 models and 12 experiments aligned with Special Report on Emissions Scenarios (SRES) forcings developed in 2000. These advancements allowed for multi-model ensembles in IPCC AR4 (2007), which analyzed projections from over 20 AOGCMs, highlighting common patterns in temperature and precipitation responses while quantifying spread due to structural differences.
Resolution improvements continued, with some models achieving ~100 km atmospheric grids by the mid-2000s, though parametrization of sub-grid processes like clouds remained a key challenge. This period marked a shift toward Earth system models (ESMs) by incorporating biogeochemical cycles, as seen in early coupled carbon-climate simulations.

Recent Advances (2010s-2025)

The Coupled Model Intercomparison Project Phase 6 (CMIP6), endorsed in 2016, marked a significant evolution in climate modeling by introducing Shared Socioeconomic Pathways (SSPs) for scenarios, enabling more comprehensive exploration of baseline emissions without policy interventions compared to CMIP5's Representative Concentration Pathways (RCPs). CMIP6 incorporated models with enhanced complexity, including more Earth System Models (ESMs) that simulate biogeochemical cycles like carbon and nitrogen, and improvements in physical process representations such as ocean biogeochemistry and atmospheric chemistry. These advancements allowed for better attribution of historical climate changes and projections supporting the IPCC Sixth Assessment Report, with some models showing refined simulations of precipitation patterns at various timescales. In the 2020s, efforts focused on increasing model resolution to kilometer scales, facilitated by supercomputing advances, to better capture extreme events like storms and urban heat islands. High-resolution regional climate models (RCMs) and convection-permitting models have improved depictions of local precipitation extremes, though challenges persist in fully resolving convective processes without excessive computational cost. Projects like the Climate Change Adaptation Digital Twin integrate high-resolution data for adaptation planning, providing detailed simulations of regional impacts. Machine learning (ML) integration emerged as a transformative approach, with emulators accelerating simulations and data-driven methods enhancing parametrizations. By 2025, ML-based atmosphere models demonstrated potential for sub-kilometer resolutions and accurate weather-to-climate predictions over extended periods, outperforming traditional physics-based models in specific tasks like extreme event forecasting. 
However, simpler ML architectures sometimes surpassed complex deep learning in capturing natural climate variability for local predictions. Improvements in cloud parametrization addressed longstanding biases, particularly in stratocumulus and Southern Ocean clouds, through refined microphysics and convection schemes in select models. These updates, tested in CMIP6 and beyond, enhanced mean-state simulations of clouds and precipitation, contributing to more reliable feedback estimates in warming scenarios. Overall, these developments have refined model ensembles for policy-relevant projections while highlighting ongoing needs for hybrid physics-ML frameworks to reduce uncertainties.

Validation Against Observations

Metrics for Assessing Model Skill

Climate models are evaluated using a suite of statistical metrics that quantify their ability to reproduce observed climate patterns, variability, and trends. These metrics typically compare simulated fields—such as surface temperature, precipitation, and atmospheric circulation—against observational datasets like reanalyses (e.g., ERA5) or instrumental records. Common approaches include assessing global means, regional patterns, and temporal evolution, with skill often deemed higher when models capture both amplitude and phase of variability. One foundational metric is the Pearson correlation coefficient, which measures linear similarity between model and observed spatial patterns, ranging from -1 to 1, where values near 1 indicate strong pattern agreement. For instance, correlations for annual-mean sea level pressure exceed 0.95 in many coupled models against observations. This metric emphasizes phase consistency but ignores amplitude differences, making it complementary to others. The root mean square error (RMSE) quantifies the average magnitude of differences, with centered RMSE focusing on deviations after removing mean biases to highlight pattern errors. Global RMSE for surface air temperature in CMIP5 models averaged around 1.5–2.0°C against 20th-century observations, varying by region and variable. Bias, a related metric, assesses systematic offsets, such as overestimation of tropical precipitation in some models by 0.5–1 mm/day. Taylor diagrams integrate multiple statistics—correlation, standard deviation ratio, and centered RMSE—into a polar plot for visual comparison of model performance against a reference (e.g., observations). The diagram's skill metric, derived from these, normalizes by observational variance, yielding scores where 1 indicates perfect agreement; median scores across CMIP projections for temperature time series reached 0.69 in evaluations of 17 models from 1970–2005 hindcasts. 
These diagrams reveal trade-offs, such as high correlation but underestimated variability in precipitation fields. Additional metrics address specific aspects, including trend correlation for long-term changes (e.g., matching observed ~0.20°C/decade warming since 1975) and variance ratios to evaluate simulated variability like ENSO amplitudes. For probabilistic skill, metrics like the continuous ranked probability score (CRPS) assess ensemble spread against observations. Evaluations often weight metrics by variable importance, though no single metric captures all fidelity dimensions, prompting multi-metric frameworks in intercomparisons like CMIP6.
Metric | Description | Typical Application
Pearson Correlation | Linear pattern similarity (−1 to 1) | Spatial fields like SLP or temperature
RMSE (Centered) | Error magnitude after bias removal | Pattern fidelity assessment
Bias | Mean systematic difference | Global/regional means (e.g., °C or mm/day)
Taylor Skill Score | Composite of correlation, std. dev. ratio, RMSE | Multi-variable diagrams for model ranking
Trend Correlation | Agreement in linear change rates | Time series like global warming trends
These metrics, applied in hindcast validations (e.g., 1850–present), underpin model weighting in ensembles, though challenges arise from observational uncertainties and sparse data in regions like the Arctic.
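The core metrics above can be computed in a few lines. The sketch below evaluates bias, pattern correlation, centered RMSE, and one common form of the Taylor skill score (with the attainable correlation R0 set to 1) on plain arrays of co-located values; operational evaluations additionally area-weight grid cells and handle missing observations.

```python
import numpy as np

def pattern_metrics(model, obs):
    """Bias, pattern correlation, centered RMSE, and a Taylor skill score.

    Sketch for co-located 1-D arrays; real evaluations area-weight grid
    cells and mask missing data.
    """
    bias = float(np.mean(model) - np.mean(obs))
    m = model - np.mean(model)            # centered anomalies
    o = obs - np.mean(obs)
    corr = float(np.sum(m * o) / np.sqrt(np.sum(m**2) * np.sum(o**2)))
    crmse = float(np.sqrt(np.mean((m - o) ** 2)))
    sigma_ratio = float(np.std(model) / np.std(obs))
    R0 = 1.0                              # attainable correlation ceiling
    skill = 4.0 * (1.0 + corr) / (
        (sigma_ratio + 1.0 / sigma_ratio) ** 2 * (1.0 + R0)
    )
    return {"bias": bias, "corr": corr, "crmse": crmse, "skill": skill}

obs = np.sin(np.linspace(0.0, 2.0 * np.pi, 50))
model = 0.9 * obs + 0.1                   # damped, offset synthetic "model"
stats = pattern_metrics(model, obs)
# corr is exactly 1 (model is linear in obs), bias ~0.1, skill just below 1
```

The synthetic example illustrates the trade-off noted above: a model can correlate perfectly with observations in pattern while still carrying a mean bias and underestimated variability, which is why the Taylor score combines all three statistics.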

Matches Between Predictions and Data

Climate models have demonstrated skill in projecting the broad-scale increase in global mean surface air temperature associated with anthropogenic greenhouse gas emissions. An evaluation of 17 projections from models published between 1970 and 2007 found that 10 were consistent with subsequent observations through 2017, with an average skill score of 0.69 when assessed against realized temperature changes; adjusting projections for discrepancies in estimated radiative forcings (such as overestimated CO2 concentrations in some early models) improved consistency to 14 out of 17 cases, confirming the models' ability to capture the temperature response to forcings. The predicted vertical structure of atmospheric temperature changes has also aligned with observations, particularly the pattern of tropospheric warming and stratospheric cooling that serves as a fingerprint of greenhouse gas-driven forcing. Satellite and radiosonde data from 1979 to 2018 show tropical tropospheric warming of 0.6–0.8 K and robust stratospheric cooling of 1–3 K over those four decades, matching multi-model ensemble simulations that attribute this differential heating to increased downward longwave radiation trapping heat in the lower atmosphere while enhancing radiative cooling aloft. Arctic amplification, the enhanced warming of high northern latitudes relative to global averages, represents another area of predictive success, with early general circulation models anticipating this phenomenon due to ice-albedo feedbacks and poleward heat transport changes; observations from 1970 to 2020 indicate annual mean amplification ratios exceeding 3.5 in recent decades, consistent in direction and magnitude with coupled model projections under rising CO2 scenarios. 
Projections of large-scale patterns, such as the overall decline in Northern Hemisphere sea ice extent during summer months, have tracked observed trends since the 1980s, with models capturing the accelerating loss linked to surface warming and thermodynamic processes, though exact timing and extent vary across ensembles.
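Trend comparisons of this kind reduce to fitting a linear rate to annual temperature anomalies and expressing it per decade. A minimal sketch, using synthetic illustrative series rather than real model or observational data:

```python
import numpy as np

def decadal_trend(years, anomalies):
    """Least-squares linear trend of a temperature series, in K per decade."""
    slope_per_year = np.polyfit(years, anomalies, 1)[0]
    return slope_per_year * 10.0

# Synthetic series for illustration only: a projection warming at
# 0.020 K/yr versus "observations" warming at 0.018 K/yr.
years = np.arange(1979, 2019)
model_series = 0.020 * (years - years[0])
obs_series = 0.018 * (years - years[0])

print(round(decadal_trend(years, model_series), 2))  # 0.2
print(round(decadal_trend(years, obs_series), 2))    # 0.18
```

Skill assessments like the one above then compare such fitted rates (and their ratio) across projection vintages and observational datasets.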

Persistent Discrepancies and Biases

Climate models, including those in the Coupled Model Intercomparison Project Phase 6 (CMIP6), exhibit persistent warm biases in simulated sea surface temperatures (SSTs), particularly in the Southern Ocean and during summertime in mid-latitudes, where observed trends are cooler than modeled responses to radiative forcing. These discrepancies arise partly from inadequate representation of ocean-atmosphere interactions and sea ice dynamics, leading to overestimated heat uptake in models compared to Argo float observations since 2004. For instance, CMIP6 ensembles display zonally asymmetric warm SST biases exceeding 2°C in the Southern Ocean's frontal zones, persisting across model generations despite refinements in resolution.

In the tropical troposphere, models systematically overpredict warming rates, with CMIP6 simulations showing amplification of surface trends by factors of 1.5–2.0 at mid-tropospheric levels (around 200–300 hPa), whereas satellite records from Microwave Sounding Units (MSUs) and radiosondes indicate near-surface-like or subdued trends since 1979. This mismatch, documented in independent analyses, implies overestimation of convective mixing and lapse rate feedbacks, contributing to inflated equilibrium climate sensitivity (ECS) values in models, often ranging 3–5°C per CO2 doubling, against empirical constraints from the instrumental era suggesting 1.5–3°C. Radiosonde data from Christy et al. confirm tropospheric warming lags model predictions by 0.1–0.2°C/decade globally, a gap widening in CMIP6 relative to CMIP5.

Precipitation biases compound these issues, with CMIP6 models overestimating extreme event frequencies and intensities in the tropics and mid-latitudes by 10–50% relative to station data, linked to deficient cloud microphysics and convective parametrization. Regional evaluations over China and Europe reveal cold winter biases and warm summer biases exceeding 1–3°C in multi-model means, distorting projections of heatwaves and droughts.
Such persistent errors, while acknowledged in IPCC AR6 assessments of model evaluation, stem from unresolved sub-grid processes like aerosol-cloud interactions, underscoring limitations in causal representations of feedbacks despite computational advances. Empirical critiques, including those from observational datasets prioritized over model tuning, highlight that these biases inflate projected warming and sensitivity, as models fitting historical surface trends poorly constrain future ECS.
| Bias Type | Example Region/Variable | Model Over/Underestimation | Observational Reference |
|---|---|---|---|
| Warm SST | Southern Ocean fronts | +1–2°C bias | Ship/buoy data |
| Tropospheric warming | Tropics (200 hPa) | +0.1–0.2°C/decade excess | MSU/radiosondes |
| Extreme precipitation | Global land | +10–50% intensity | Station networks |
| Summer temperature | Mid-latitudes | +1–3°C warm/dry | Reanalyses |
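The bias magnitudes summarized above come from differencing model climatologies against gridded observations, with grid cells weighted by area. A hedged sketch of that bookkeeping on a latitude-longitude grid, using cosine-of-latitude weights and entirely synthetic fields:

```python
import numpy as np

def area_weighted_mean_bias(model, obs, lats):
    """Global-mean (model - obs) bias on a lat-lon grid, weighted by
    cos(latitude) so that small polar cells do not dominate the average.
    Fields are 2-D arrays shaped (lat, lon); this is only the averaging
    step, not a full evaluation metric."""
    weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(model)
    return float(np.sum((model - obs) * weights) / np.sum(weights))

# Synthetic 3x4 grid with a uniform +1.5 K warm bias.
lats = np.array([-60.0, 0.0, 60.0])
obs = np.zeros((3, 4))
model = obs + 1.5
print(area_weighted_mean_bias(model, obs, lats))  # 1.5
```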

Limitations and Uncertainties

Parametrization Challenges

Parametrization refers to the approximation of subgrid-scale processes in climate models, including convection, cloud formation, turbulence, and boundary layer dynamics, which operate at scales smaller than the model's grid resolution, typically tens to hundreds of kilometers. These processes cannot be explicitly simulated due to computational limitations, necessitating heuristic or semi-empirical schemes based on simplified physical assumptions or statistical fits to observations. Imperfections in these schemes arise from incomplete understanding of underlying physics, leading to systematic biases and uncertainties that amplify across model components like radiation and hydrology.

A primary challenge is convective parametrization, where schemes must represent vertical transport of heat, moisture, and momentum in unresolved updrafts and downdrafts. Differences in trigger functions, closure assumptions, and entrainment rates among schemes, such as mass-flux versus plume-based approaches, produce divergent simulations of tropical precipitation and atmospheric stability. For example, perturbed physics ensembles perturbing 17 convective and cloud parameters in the NCAR CAM5 model identified high sensitivity in cloud fraction and precipitation efficiency, contributing to inter-model spread in global hydrological cycles. These uncertainties persist despite tuning to match present-day observations, as schemes often fail to generalize to perturbed climates, such as doubled CO2 scenarios, where convective mass fluxes can vary by factors of two across models.

Cloud parametrization introduces further difficulties, as clouds exert strong shortwave and longwave radiative forcings but exhibit multiscale organization defying simple closure. Models commonly overestimate low-level cloud cover in the subtropics or misrepresent diurnal cycles, with phase errors exceeding 3–6 hours compared to satellite observations like those from CERES.
Such biases stem from inadequate handling of subgrid variability in humidity and stability, leading to erroneous cloud feedbacks that account for over 50% of the range in equilibrium climate sensitivity (2–5 K) across CMIP6 models. Diagnostic studies link these errors to deficiencies in prognostic equations for cloud water and ice, which rely on assumptions about microphysical processes that diverge from high-resolution large-eddy simulations.

Turbulence and boundary layer parametrizations add complexity, particularly over heterogeneous surfaces like land-ocean interfaces, where subgrid variations in surface fluxes can alter energy partitioning and amplify regional biases in temperature and evaporation. Quantification efforts, including Bayesian calibration of parameters in idealized GCMs, reveal that structural uncertainties in these schemes exceed observational error bars, with convection-related parameters showing the largest posterior spreads. Overall, these challenges necessitate ongoing development, such as scale-aware or machine-learning augmented schemes, though traditional parametrizations remain prone to equifinality: multiple parameter sets yielding similar mean states but divergent variability.
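To make the idea of a convective parametrization concrete, the following is a deliberately minimal dry convective-adjustment scheme, a toy stand-in for the mass-flux and plume schemes discussed above. It assumes a column described only by potential temperature and layer masses; real schemes handle moisture, entrainment, and momentum and are far more elaborate.

```python
import numpy as np

def dry_convective_adjustment(theta, mass):
    """One toy convective-adjustment pass: wherever potential temperature
    decreases with height (statically unstable), mix adjacent layers to
    their mass-weighted mean, conserving the column's mass-weighted theta.
    Index k increases upward."""
    theta = np.asarray(theta, dtype=float).copy()
    changed = True
    while changed:
        changed = False
        for k in range(len(theta) - 1):
            if theta[k] > theta[k + 1]:  # unstable pair of layers
                m = mass[k] + mass[k + 1]
                mixed = (theta[k] * mass[k] + theta[k + 1] * mass[k + 1]) / m
                theta[k] = theta[k + 1] = mixed
                changed = True
    return theta

# Unstable toy column (K): theta drops from layer 0 to layer 1.
column = dry_convective_adjustment([305.0, 300.0, 310.0], np.ones(3))
print(column)  # statically stable, mass-weighted theta conserved
```

The "closure assumption" here is the crude rule that instability is removed instantly and completely; differences in exactly such rules drive the inter-scheme spread described above.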

Cloud and Feedback Representations

Clouds in general circulation models (GCMs) are represented through parametrizations due to their sub-grid-scale nature, typically unresolved by model grids spanning 50–250 km horizontally and multiple vertical layers. These parametrizations approximate processes like convection, microphysics, and radiative interactions using empirical or semi-empirical schemes, such as diagnostic cloud fraction based on relative humidity or prognostic equations for cloud water and ice. Challenges arise from incomplete knowledge of cloud dynamics, leading to biases in low-level stratocumulus, convective cumulonimbus, and mixed-phase clouds, where models often overestimate ice formation and fail to capture phase partitioning accurately.

Cloud feedbacks, which amplify or dampen global warming, depend on changes in cloud cover, altitude, and optical properties in response to temperature perturbations. Positive feedbacks dominate in models from increased high-altitude cirrus clouds trapping outgoing longwave radiation, while negative feedbacks stem from reduced low-level cloud cover admitting more solar radiation; net cloud feedback contributes 0.2–1.0 W/m²/°C to equilibrium climate sensitivity (ECS), the largest single source of uncertainty therein. Tropical low clouds exhibit particularly high inter-model spread, linked to climatological biases in subsidence and moisture, with some models predicting stronger positive feedbacks than observations suggest.

Empirical evaluations reveal persistent discrepancies, such as models underestimating observed decreases in high-cloud fraction amid warming, implying overstated positive longwave feedbacks and potentially inflated ECS estimates up to 4–5°C. Diurnal and regional biases, including excessive nighttime cloud cover over land, further highlight parametrization shortcomings against satellite data from instruments like MODIS and CERES.
Efforts to mitigate these include machine learning-based parametrizations and higher-resolution convection-permitting models, yet core uncertainties in aerosol-cloud interactions and turbulence persist, contributing to ECS ranges of 1.5–4.5°C in CMIP6 ensembles. Such limitations underscore the empirical tuning in many schemes, which may prioritize hindcasting over out-of-sample prediction, as critiqued in assessments of model fidelity.
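Cloud feedback values of the kind quoted above are commonly diagnosed as a regression slope of cloud-radiative-effect (CRE) anomalies against global-mean temperature anomalies. A hedged sketch of just that regression step, with synthetic numbers; real diagnoses additionally use radiative kernels to isolate the cloud contribution:

```python
import numpy as np

def cloud_feedback(dT, dCRE):
    """Cloud feedback in W m^-2 K^-1 as the least-squares slope of
    cloud-radiative-effect anomalies against global-mean temperature
    anomalies. Only the regression step of a real kernel analysis."""
    return float(np.polyfit(dT, dCRE, 1)[0])

# Synthetic anomalies constructed to imply a +0.5 W m^-2 K^-1 feedback.
dT = np.array([0.0, 0.2, 0.4, 0.6, 0.8])
dCRE = 0.5 * dT
print(round(cloud_feedback(dT, dCRE), 3))  # 0.5
```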

Computational and Scalability Issues

Global climate models (GCMs) require substantial computational resources to simulate coupled physical processes across atmospheric, oceanic, land, and cryospheric components, involving the numerical solution of nonlinear partial differential equations on three-dimensional grids. Typical horizontal resolutions in CMIP6 models range from 0.4° to 1° for the atmosphere (roughly 44–111 km at the equator) and finer for oceans at about 0.25° (around 25 km); each halving of grid spacing roughly quadruples the number of horizontal grid cells, so higher resolutions rapidly drive computing power toward prohibitive levels. This constraint necessitates parametrizations for sub-grid-scale processes like convection and turbulence, which cannot be explicitly resolved due to current hardware limits.

Simulations for CMIP6 experiments aggregated nearly 500,000 model years across 33 scenarios on 14 high-performance computing (HPC) systems, with core-hour costs positively correlating with model complexity, resolution, and coupling overhead (5–15% of total compute time). Institutions like NOAA's GFDL employ supercomputers with thousands of processors and petabytes of storage for such runs, where doubling resolution quadruples grid points and escalates demands for memory and parallel efficiency. Full-century simulations often span weeks to months, even on petascale machines, highlighting bottlenecks in data I/O, load balancing, and inter-component communication.

Scalability issues persist in parallel architectures: atmospheric dynamics scale effectively to exascale levels, but oceanic and biogeochemical modules suffer from poor weak scaling due to load imbalances and communication overheads, limiting ensemble sizes needed for uncertainty quantification. Energy consumption exacerbates these challenges; ECMWF projections indicate that advancing to 5 km resolutions for ensemble forecasts by 2025 would render current supercomputer designs energetically unsustainable without code portability and efficiency optimizations.
CMIP6 generated 40 petabytes of output data, underscoring storage and archival scalability strains alongside raw simulation costs. These computational barriers restrict model fidelity for regional phenomena and long-term projections, prompting ongoing shifts toward hybrid approaches like machine learning emulators to mitigate resource demands while preserving physical consistency.
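The resolution pressure described above can be captured in a back-of-envelope cost model: horizontal cells grow with the square of the refinement factor, and the CFL stability condition roughly shrinks the usable timestep in proportion to grid spacing, giving approximately cubic cost growth. The cubic exponent and the neglect of vertical levels, I/O, and load balancing are simplifying assumptions of this sketch:

```python
def relative_compute_cost(refinement_factor):
    """Rough cost multiplier for refining horizontal resolution by a
    given factor: cell count scales as refinement^2 and the CFL-limited
    timestep as 1/refinement, so cost ~ refinement^3. Ignores vertical
    levels, I/O, and load-balancing overheads."""
    return refinement_factor ** 3

# Going from 100 km to 25 km grid spacing (a 4x refinement):
print(relative_compute_cost(4))  # 64
```

This is why a doubling of resolution is far more than a doubling of expense, and why kilometer-scale global simulation remains an exascale ambition.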

Controversies and Empirical Critiques

Multiple studies have identified a systematic tendency for climate models in the Coupled Model Intercomparison Project Phase 6 (CMIP6) to overestimate warming trends in the troposphere when compared to satellite and radiosonde observations. In a 2020 analysis of 38 CMIP6 models, researchers found pervasive overprediction across lower- and mid-tropospheric layers both globally and in the tropics, with all models exceeding observed warming rates in every tested observational dataset analogue. This bias persists even after accounting for natural variability, suggesting structural issues in model physics rather than transient discrepancies. For instance, the models projected tropical mid-tropospheric warming rates approximately 1.5 to 2 times those observed from 1979 to 2014 in University of Alabama in Huntsville (UAH) satellite data.

In the tropical upper troposphere (200–300 hPa layer), CMIP6 models exhibit particularly pronounced overestimation, predicting enhanced warming amplification relative to the surface, a feature tied to moist convection and lapse rate feedbacks, that aligns poorly with empirical records spanning 1958–2017. Observations from radiosonde networks indicate warming rates closer to surface levels (about 1.1 times surface warming), whereas models forecast 1.5–2.0 times, contributing to the "hot model" problem where over a quarter of CMIP6 simulations imply equilibrium climate sensitivities exceeding 5°C for doubled CO2.

A 2024 study confirmed that most coupled models substantially overestimate these tropical tropospheric trends over the satellite era (1979–present), even after adjustments for multi-decadal variability and potential satellite biases, undermining confidence in projections reliant on unweighted ensembles.
This discrepancy has intensified from CMIP5 to CMIP6, with median model sensitivities rising from 3.0°C to 3.7°C, prompting calls to discount or exclude high-sensitivity models for more skillful historical hindcasts. Surface-level assessments reinforce these atmospheric findings, as the observed global warming rate of approximately 0.14°C per decade (UAH dataset, 1979–2023) falls below the central projections of most CMIP ensembles when normalized to equivalent forcings.

Analyses excluding "hot" models (those whose hindcasts overshoot observed historical warming) yield improved predictive skill for future trends, reducing projected warming spreads by up to 20% under high-emission scenarios. These biases are attributed to overstated positive feedbacks, such as water vapor and cloud responses, which amplify simulated sensitivities beyond paleoclimate and instrumental constraints. While some evaluations claim model accuracy by subsetting compliant simulations, full-ensemble comparisons highlight the need for refined parametrizations to align with causal drivers of observed variability.
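Ensemble-versus-observation tallies of the kind cited above reduce to counting how many members' simulated trends exceed the observed trend. A sketch of that bookkeeping, with invented trend values (K/decade) rather than actual CMIP output:

```python
import numpy as np

def fraction_running_hot(member_trends, observed_trend):
    """Fraction of ensemble members whose simulated trend exceeds the
    observed trend -- the tally behind 'models run hot' comparisons."""
    trends = np.asarray(member_trends, dtype=float)
    return float(np.mean(trends > observed_trend))

# Invented illustrative member trends versus an invented observed trend.
members = [0.28, 0.31, 0.25, 0.35, 0.22, 0.15]
print(fraction_running_hot(members, 0.17))  # 5 of 6 members exceed it
```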

Influence on Policy and Projections

Climate models' projections have profoundly shaped global policy responses to anticipated warming, serving as the primary scientific foundation for frameworks like the United Nations Framework Convention on Climate Change (UNFCCC) and the 2015 Paris Agreement, which aim to limit global temperature rise to well below 2°C above pre-industrial levels. The Intergovernmental Panel on Climate Change (IPCC), in its Sixth Assessment Report (AR6) released in 2021, integrates outputs from the Coupled Model Intercomparison Project Phase 6 (CMIP6) to generate scenarios under Shared Socioeconomic Pathways (SSPs), projecting median global warming of 2.0–4.4°C by 2100 depending on emissions trajectories, thereby justifying aggressive decarbonization targets, carbon pricing, and renewable energy subsidies adopted by nations accounting for over 90% of global GDP. These projections inform integrated assessment models (IAMs) like DICE and PAGE, which quantify purported economic damages from warming—estimated at 1–4% of global GDP per degree Celsius—to support cost-benefit analyses for policies such as the European Union's Green Deal (2019) and the U.S. Inflation Reduction Act (2022).

Critics argue that this influence amplifies policy stringency due to models' systematic tendency to overestimate historical warming, potentially inflating projected risks and costs. Evaluations of CMIP5 and CMIP6 ensembles against satellite-derived tropospheric temperatures from datasets like the University of Alabama in Huntsville (UAH) reveal that multi-model means have projected 1.5–2 times the observed warming rate of approximately 0.13°C per decade since 1979, with discrepancies widening in the tropical mid-troposphere where models exhibit root-mean-square errors exceeding 1°C.
A subset of CMIP6 models, termed "hot" models due to their equilibrium climate sensitivity (ECS) values above the IPCC's assessed likely range of 2.5–4.0°C, contribute disproportionately to ensemble means, resulting in end-of-century projections up to 0.7°C warmer than ensembles excluding them; this bias propagates into impact assessments, exaggerating sea-level rise, heatwave frequency, and agricultural yield losses cited in policy documents.

Such overestimations raise causal concerns for policy reliability, as higher ECS assumptions in models (often exceeding empirical paleoclimate and observational constraints of 1.5–3.0°C) drive scenarios emphasizing low-probability, high-impact outcomes like tipping points, despite limited evidence of their imminence in current observations. For example, AR6 projections under SSP2-4.5 informed the net-zero pledges of over 130 countries by 2050, yet retrospective validation shows that excluding hot models aligns projections more closely with the observed 0.8–1.0°C warming since 1850, suggesting policies may prioritize mitigation over adaptation or technological innovation without commensurate risk reduction.

Independent assessments recommend weighting or culling biased models to refine policy inputs, arguing that unadjusted ensembles undermine causal realism in linking emissions to outcomes and could lead to opportunity costs exceeding trillions in forgone economic growth. This debate underscores the need for policy frameworks to incorporate model uncertainty ranges explicitly, rather than defaulting to central tendencies that may embed parametrization errors in cloud feedbacks and aerosol effects.
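The model-weighting proposals mentioned above can be illustrated with a simple skill-based scheme in which a member's weight decays with its historical hindcast error. The Gaussian form and the sigma scale below are illustrative assumptions, loosely patterned on published performance-weighting methods rather than any specific operational scheme:

```python
import numpy as np

def performance_weights(hindcast_errors, sigma=0.1):
    """Normalized ensemble weights w_i proportional to exp(-(e_i/sigma)^2),
    down-weighting members with large historical errors. sigma sets how
    sharply poor hindcast skill is penalized (an illustrative choice)."""
    e = np.asarray(hindcast_errors, dtype=float)
    w = np.exp(-(e / sigma) ** 2)
    return w / w.sum()

# Invented hindcast trend errors (K/decade) for three hypothetical models.
w = performance_weights([0.02, 0.05, 0.15])
print(w.round(3))  # the low-error model receives the largest weight
```

Culling "hot" models is the limiting case of such a scheme, with offending members assigned zero weight outright.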

Alternative Modeling Approaches

Energy balance models (EBMs) represent a foundational alternative to complex general circulation models (GCMs), simplifying the climate system to zero- or one-dimensional frameworks that equate incoming solar radiation absorbed by Earth with outgoing longwave radiation. These models, developed since the 1970s, incorporate parameters for albedo, emissivity, and feedbacks like water vapor or lapse rate, enabling analytical solutions for equilibrium climate sensitivity (ECS). For instance, basic EBMs yield ECS estimates ranging from 1.0 to 4.0 K per CO2 doubling, depending on feedback assumptions, often lower than multi-model GCM means of 3.0–5.0 K reported in CMIP ensembles. EBMs have been applied to constrain ECS using observed energy imbalances and historical warming, such as in studies regressing radiative forcing against temperature changes from 1850–2011, producing ECS medians around 1.6–2.0 K, values critiqued for potential underestimation of long-term feedbacks absent in short records but praised for direct empirical grounding over GCM-derived projections. Such approaches highlight GCM limitations in reproducing observed tropospheric warming patterns or cloud feedbacks, where EBMs avoid parametrization uncertainties by tuning to satellite-era data.

Statistical and empirical modeling techniques offer another pathway, deriving regional projections from GCM outputs via regression or analog methods rather than resolving dynamics explicitly. These include perfect prognosis schemes mapping large-scale predictors to local variables using historical observations, effective for variables like precipitation where GCMs exhibit persistent biases exceeding 20–50% in mid-latitudes. Empirical models prioritize data fidelity over physical completeness, as in comparisons showing them outperforming physics-based simulations for near-term predictability by leveraging observed covariances.
Machine learning (ML) approaches are emerging as hybrids or standalone alternatives, emulating subgrid processes or forecasting directly from reanalysis data. Neural GCMs, trained on high-resolution simulations, have demonstrated medium-range weather prediction skill comparable to traditional models while reducing computational demands by orders of magnitude. In climate contexts, simpler ML architectures have surpassed deep learning baselines for temperature projections, achieving lower root-mean-square errors by avoiding overfitting to noisy training sets. Hybrid ML-physics models address GCM parametrization gaps, such as convection, but require validation against independent observations to mitigate risks of extrapolating beyond training regimes, where pure data-driven methods falter. These alternatives foster pluralism, complementing GCMs by emphasizing observability and computational efficiency amid ongoing debates over model tuning and structural biases.
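The zero-dimensional EBM described above is compact enough to write out directly: absorbed shortwave S(1 − α)/4 balances emitted longwave εσT⁴, where an effective emissivity ε stands in for the greenhouse effect. The value ε ≈ 0.61 below is a common textbook choice, not a figure from this article:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def ebm_equilibrium_temp(solar_const=1361.0, albedo=0.30, emissivity=0.61):
    """Zero-dimensional energy balance model: solve
    S(1 - albedo)/4 = emissivity * sigma * T^4 for the equilibrium
    surface temperature T. emissivity < 1 crudely represents the
    greenhouse effect (0.61 is a textbook effective value)."""
    absorbed = solar_const * (1.0 - albedo) / 4.0
    return (absorbed / (emissivity * SIGMA)) ** 0.25

print(round(ebm_equilibrium_temp(), 1))                 # ~288 K, near observed
print(round(ebm_equilibrium_temp(emissivity=1.0), 1))   # ~255 K, no greenhouse
```

Perturbing the radiative balance in such a model and re-solving for T is the analytical route to the EBM-based sensitivity estimates discussed above.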

Future Directions

Integration of AI and High-Resolution Techniques

Recent advancements in climate modeling have incorporated artificial intelligence (AI), particularly machine learning (ML) algorithms, to emulate complex physical processes and enable simulations at higher spatial resolutions, such as 4 km grids, which exceed the typical 100 km scales of traditional global climate models (GCMs). These AI-driven approaches, including neural networks and generative models, reduce computational demands by approximating sub-grid phenomena like convection and turbulence, allowing for hourly outputs over decades without prohibitive resource use. For instance, Google's NeuralGCM hybrid model integrates differentiable GCM physics with ML components to produce forecasts and projections that match or surpass conventional models in accuracy while running orders of magnitude faster.

High-resolution techniques enhanced by AI focus on downscaling coarser GCM outputs to finer scales relevant for regional impacts, such as urban heat or localized precipitation. Convolutional neural networks (CNNs) have been applied to downscale Coupled Model Intercomparison Project Phase 6 (CMIP6) Earth system GCMs from ~100 km to 0.1° (~10 km) resolution, improving representations of orographic effects and land-atmosphere interactions in targeted domains like Europe. Similarly, interpretable deep learning methods have demonstrated superior performance over statistical downscaling for historical rainfall patterns, capturing non-linear relationships in topography and climate variables. ML emulators also augment limited-area models by generating high-fidelity projections for convection-permitting scales below 4 km, potentially enabling kilometer-scale global simulations that resolve mesoscale dynamics previously reliant on coarse parametrizations.
Despite these gains, AI integration faces challenges in physical consistency and interpretability, as ML models often function as black boxes that may amplify biases in training data derived from imperfect historical observations or low-resolution simulations. Validation remains critical; while AI can accelerate processing of vast datasets for pattern recognition in extremes like droughts, natural variability in climate signals can lead deep learning models to underperform simpler physics-based alternatives in long-term predictions. Moreover, the energy-intensive training of large AI models raises concerns about net computational efficiency, potentially offsetting gains in model scalability unless mitigated by optimized hardware or hybrid designs. Ongoing efforts emphasize hybrid AI-physics frameworks to ensure causal fidelity, with datasets like ClimateSet facilitating ML benchmarking against established models.
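As a stripped-down analogue of the learned downscalers described above, the following fits a linear least-squares map from flattened coarse-grid fields to fine-grid fields. It is a toy stand-in for the nonlinear CNN emulators, trained on entirely synthetic data in which the fine field really is a linear function of the coarse one:

```python
import numpy as np

def fit_linear_downscaler(coarse_fields, fine_fields):
    """Least-squares linear map (plus bias term) from flattened coarse
    fields to flattened fine fields. Illustrative only: real downscaling
    emulators are nonlinear and vastly larger."""
    X = np.hstack([coarse_fields, np.ones((coarse_fields.shape[0], 1))])
    weights, *_ = np.linalg.lstsq(X, fine_fields, rcond=None)
    return weights

def downscale(weights, coarse_fields):
    X = np.hstack([coarse_fields, np.ones((coarse_fields.shape[0], 1))])
    return X @ weights

# Synthetic training pairs: fine field is a fixed linear map of the coarse.
rng = np.random.default_rng(0)
coarse = rng.normal(size=(200, 4))     # 200 samples, 4 coarse cells
true_map = rng.normal(size=(4, 16))    # maps onto 16 fine cells
fine = coarse @ true_map + 1.0
W = fit_linear_downscaler(coarse, fine)
err = np.max(np.abs(downscale(W, coarse) - fine))
print(err < 1e-8)  # True: the linear toy recovers the linear truth exactly
```

The extrapolation caveat in the text applies even here: a map fitted to one regime carries no guarantee outside the range of its training fields.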

Enhancing Uncertainty and Regional Fidelity

Efforts to enhance uncertainty quantification in climate models have increasingly relied on large multi-model ensembles, such as those from CMIP6, which sample structural differences across models to estimate projection spreads, though model selection remains critical to avoid amplifying biases in regional applications. Stochastic parametrizations address subgrid-scale uncertainties by introducing randomness in unresolved processes like convection, improving representation of model error and forecast skill compared to deterministic schemes, as demonstrated in idealized systems and operational weather models. Perturbed-parameter ensembles vary physical parameters to sample parametric uncertainty, but they inadequately represent structural uncertainties from parameterization choices, necessitating hybrid approaches with stochastic elements for more robust probabilistic outputs.

Regional fidelity improvements stem from downscaling techniques that refine coarse global climate model (GCM) outputs—typically at 50–250 km resolution—to finer scales suitable for impacts, with dynamical downscaling using nested regional climate models (RCMs) simulating mesoscale processes but requiring high computational resources and inheriting GCM boundary biases. Statistical downscaling establishes empirical relationships between large-scale GCM predictors and local observations, offering efficiency for ensembles, though its stationarity assumption falters under non-stationary future climates, as evidenced by persistent biases in precipitation extremes. Machine learning advancements, including generative models, enable hybrid dynamical-generative downscaling to produce high-resolution ensembles with reduced computational cost and better uncertainty estimates, outperforming traditional methods in capturing localized variability like tropical cyclone risks.
Despite these enhancements, regional projections retain substantial uncertainties from model disagreements on feedbacks and initial conditions, with CMIP6 ensembles showing amplified spreads in extremes compared to global means, underscoring the need for observational constraints to narrow credible ranges without over-relying on potentially biased model tuning. Integration of AI-driven uncertainty quantification, such as probabilistic deep learning, promises further gains by emulating subgrid processes and propagating errors spatio-temporally, but validation against independent data remains essential to mitigate overfitting risks in diverse climates.
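Stochastic parametrizations of the kind mentioned above are often implemented in the spirit of SPPT (stochastically perturbed parametrization tendencies): the deterministic physics tendency is rescaled by a random factor. A minimal sketch; the noise amplitude and clipping bounds are illustrative assumptions, and real SPPT uses spatially and temporally correlated noise fields rather than the independent draws here:

```python
import numpy as np

def sppt_perturb(tendency, rng, noise_std=0.3, clip=2.0):
    """SPPT-style perturbation: rescale a physics tendency by (1 + r),
    with r ~ N(0, noise_std^2) clipped for numerical stability. This toy
    draws independent noise per point; operational schemes use smooth
    correlated noise patterns."""
    r = np.clip(rng.normal(0.0, noise_std, size=np.shape(tendency)),
                -clip, clip)
    return tendency * (1.0 + r)

rng = np.random.default_rng(42)
heating = np.full(5, 2.0)            # uniform heating tendency, K/day
print(sppt_perturb(heating, rng))    # one perturbed ensemble member
```

Running the model many times with different noise seeds yields an ensemble whose spread reflects, in part, parametrization uncertainty.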

Coordination Efforts and International Projects

The World Climate Research Programme (WCRP), established in 1980 and co-sponsored by the World Meteorological Organization, the International Science Council, and the Intergovernmental Oceanographic Commission, coordinates global climate research efforts, including standardized modeling activities to advance understanding of Earth system variability and change. Its Working Group on Coupled Modelling (WGCM) oversees key initiatives that facilitate international collaboration among modeling centers.

The flagship effort, the Coupled Model Intercomparison Project (CMIP), initiated in 1995, provides a framework for diverse international modeling groups to conduct coordinated simulations of past, present, and future climate conditions. CMIP standardizes experimental protocols, data output formats, and diagnostics, enabling systematic comparison of model results against observations and across models to identify strengths, biases, and uncertainties in projections. By 2020, CMIP6 involved contributions from over 30 institutions worldwide, producing petabytes of data used in assessments like the IPCC Sixth Assessment Report, with emphasis on high-resolution simulations and scenario-based forcings such as Shared Socioeconomic Pathways. Planning for CMIP7, announced in development as of 2023, aims to address emerging priorities like extreme event attribution, compound risks, and integration with observational networks, while enhancing computational efficiency and model diversity through community input. Complementary community tools under WCRP auspices, such as the Earth System Model Evaluation Tool (ESMValTool), support model validation by providing standardized software for benchmarking against empirical data from satellites and reanalyses.
For regional applications, the Coordinated Regional Climate Downscaling Experiment (CORDEX), launched in 2009, coordinates downscaling of CMIP global outputs to finer grids (typically 12–50 km resolution) across continental domains, involving partnerships from over 50 countries. CORDEX has generated multi-model ensembles for domains like Africa, Europe (EURO-CORDEX), and North America (NA-CORDEX), with Phase 2 simulations aligned to CMIP6 forcings completed by 2023, facilitating sector-specific impact studies while propagating global model uncertainties to local scales. These initiatives promote data interoperability via the Earth System Grid Federation, but reliance on participating nations' resources highlights disparities in modeling capacity among developing regions.

References

  1. [1]
    Climate Models | NOAA Climate.gov
    Nov 21, 2014 · Climate models are based on well-documented physical processes to simulate the transfer of energy and materials through the climate system.
  2. [2]
    Climate Modeling - an overview | ScienceDirect Topics
    A global climate model is a complex mathematical representation of the climate system and its atmosphere, land surface, ocean, and sea ice components, and their ...
  3. [3]
    Quantifying uncertainty in climate change science through empirical ...
    An information metric to quantify AOS model errors in the climate is proposed here which incorporates both coarse-grained mean model errors as well as ...
  4. [4]
    Long‐Term Experimental Evaluation of a High‐Resolution ...
    Feb 6, 2024 · The reproducibility of Atmospheric General Circulation Model (AGCM) ... From climate model ensembles to climate change impacts and ...
  5. [5]
    Use of 'too hot' climate models exaggerates impacts of global warming
    May 4, 2022 · Hot mess. Because of problems rendering clouds, some climate models run hot. Researchers say it's better to use select models than an average.
  6. [6]
    Are climate models overestimating warming?
    Aug 12, 2024 · In reality, the average warming in the CMIP6 models exceeds observed warming over 63% of the area of the Earth. This is larger than our expected ...Missing: controversies | Show results with:controversies
  7. [7]
    Global Warming: Observations vs. Climate Models
    Jan 24, 2024 · The observed rate of global warming over the past 50 years has been weaker than that predicted by almost all computerized climate models.Missing: controversies | Show results with:controversies
  8. [8]
    Stable climate simulations using a realistic general circulation model ...
    May 16, 2022 · Stable climate simulations using a realistic general circulation model ... climate model SP-CCSM4, J. Adv. Model. Earth Syst., 6, 1185–1204 ...
  9. [9]
    The futures of climate modeling | npj Climate and Atmospheric Science
    Mar 12, 2025 · The effects of doubling the CO2 concentration on the climate of a general circulation model. J. Atmos. Sci. 32, 3–15 (1975). Article CAS Google ...
  10. [10]
    Climate Modeling - Geophysical Fluid Dynamics Laboratory - NOAA
    Climate models are tools for understanding climate behavior, using a mathematical representation of the atmosphere, land, ocean, and sea ice, divided into a 3D ...
  11. [11]
    [PDF] Building a Climate Model
    Climate models use physical variables like temperature, pressure, and wind, and prognostic variables such as temperature, dry-air mass, and water substance.
  12. [12]
    Study Confirms Climate Models are Getting Future Warming ...
    Jan 9, 2020 · Scientists use climate models to better understand how Earth's climate changed in the past, how it is changing now and to predict future climate ...
  13. [13]
    [PDF] Frequently Asked Questions
    Climate models are important tools for understanding past, present and future climate change. They are sophisticated computer programs that are based on ...
  14. [14]
    [PDF] Evaluation of Climate Models
    This chapter evaluates climate models, including their characteristics, model types, and performance using techniques for assessment.
  15. [15]
    Q&A: How do climate models work? - Carbon Brief
    Jan 15, 2018 · In a climate model, scientists set the ground rules based on the physics of the Earth system, but it is the model itself that creates the storms ...
  16. [16]
    [PDF] 2 The equations governing atmospheric flow. - Staff
    The equations governing atmospheric flow are Mass Conservation, Momentum Conservation, and Energy Conservation.
  17. [17]
    [PDF] Climate Modeling 101: - Geophysical Fluid Dynamics Laboratory
    Mar 10, 2008 · A numerical model of the climate system can be thought of as a very large set of mathematical representations of our understanding of the way ...
  18. [18]
    What is a GCM? - IPCC Data Distribution Centre
    Instead, their known properties must be averaged over the larger scale in a technique known as parameterization. This is one source of uncertainty in GCM-based ...
  19. [19]
    Parameterization of the planetary boundary layer in atmospheric ...
    This paper reviews various parameterization techniques used in current general circulation models and suggests the need for a sensitivity test to find the best ...
  20. [20]
    Temperatures from energy balance models: the effective heat ... - ESD
    Dec 16, 2020 · Energy balance models (EBMs) are highly simplified models of the climate system, providing admissible conceptual tools for understanding ...
  21. [21]
    Simple Climate Models | METEO 469: From Meteorology to Mitigation
    Simple climate models, like Energy Balance Models (EBMs), focus on energetics and thermodynamics. The Zero Dimensional EBM treats Earth as a point, balancing ...
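The zero-dimensional EBM this entry describes fits in a few lines. The sketch below is a generic textbook version, not code from the cited course page; the solar constant, planetary albedo, and Stefan-Boltzmann values are standard reference numbers.

```python
# Zero-dimensional energy balance model: Earth treated as a single point,
# with absorbed shortwave flux balancing emitted longwave (Stefan-Boltzmann).
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.3       # planetary albedo (dimensionless)

def equilibrium_temperature(s0=S0, albedo=ALBEDO):
    """Solve (1 - a) * S0 / 4 = sigma * T^4 for the emission temperature T."""
    return ((1.0 - albedo) * s0 / (4.0 * SIGMA)) ** 0.25

print(round(equilibrium_temperature(), 1))  # ~255 K; the observed ~288 K
                                            # surface mean exceeds this because
                                            # of the greenhouse effect
```

Adding a temperature-dependent albedo, as entry 22 notes, turns this single balance into a model with multiple equilibria and a runaway ice-albedo transition.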
  22. [22]
    4. Theory – Introduction to Climate Science
    Our simple energy balance model 2 from above can be modified to include a temperature dependency of the albedo, which can exhibit a runaway transition to a ...
  23. [23]
    A Global Climatic Model Based on the Energy Balance of the Earth ...
    A relatively simple numerical model of the energy balance of the earth-atmosphere is set up and applied. The dependent variable is the average annual sea level ...
  24. [24]
    [PDF] On the Budyko-Sellers energy balance climate model with ice line ...
    We focus here on the seminal zonal average EBM introduced independently by M. Budyko [5] and W. Sellers [33] in 1969 to investigate ice-albedo feedback in the ...
  25. [25]
    [PDF] Energy Balance Climate Models
    Each term in the equation was written in terms of the sea level temperature field. In doing so they distilled the climate problem into a one-dimensional ...
  26. [26]
    Energy balance climate models - North - 1981 - AGU Publications
    Sellers, W. D., A climate model based on the energy balance of the earth-atmosphere system, J. Appl. Meteorol., 8, 392–400, 1969.
  27. [27]
    Energy balance models in climate science - ScienceDirect.com
    This class of climate models is mainly based upon the balance of streams of energy from the sun and the emission of energy to space.
  28. [28]
    Re-Examining the First Climate Models - AMS Journals
    We revisit clear-sky one-dimensional radiative–convective equilibrium (1D-RCE) and determine its equilibrium climate sensitivity to a CO2 doubling (ECS) and ...
  29. [29]
    [PDF] Climate Modeling Through Radiative-Convective Models
    This review describes the role of radiative-convective models in the theory of climate and climate change. One of the basic objectives of climate models is ...
  30. [30]
    Thermal Equilibrium of the Atmosphere with a Convective ...
    The thermal equilibrium state in the absence of solar insolation is computed by setting the temperature of the earth's surface at the observed polar value. In ...
  31. [31]
    Radiative–convective equilibrium model intercomparison project
    RCE is an idealization of the climate system in which there is a balance between radiative cooling of the atmosphere and heating by convection. The scientific ...
  32. [32]
    Comparison of radiative-convective models with constant ... - Tellus
    In the current generation of radiative-convective models a constant value for the critical atmospheric lapse rate, generally 6.5 K/km, is used.
  33. [33]
    Description of the Earth system model of intermediate complexity ...
    LOVECLIM 1.2 includes representations of the atmosphere, the ocean and sea ice, the land surface (including vegetation), the ice sheets, the icebergs and the ...
  34. [34]
    7. Models – Introduction to Climate Science
    The range of models ordered with respect to complexity is called the hierarchy of climate models. ... Intermediate complexity models consist of 2D-EBMs, which are ...
  35. [35]
    8.8.3 Earth System Models of Intermediate Complexity - AR4 WGI ...
    Earth System Models of Intermediate Complexity that explicitly simulate the interactions between atmosphere, ocean and land surface were forced by a ...
  36. [36]
    Long-Term Climate Commitments Projected with Climate–Carbon ...
    Eight earth system models of intermediate complexity (EMICs) are used to project climate change commitments for the recent Intergovernmental Panel on Climate ...
  37. [37]
    What Are Climate Models and How Accurate Are They?
    May 18, 2018 · Despite a small amount of uncertainty, scientists find climate models of the 21st century to be pretty accurate because they are based on well- ...
  38. [38]
    Peer review - The Earth system model CLIMBER-X v1.0 – Part 2
    Jun 27, 2023 · This model is one of the few models of intermediate complexity to simulate Earth system changes over many thousands of years. The model ...
  39. [39]
    [PDF] CLIMBER-2: a climate system model of intermediate complexity. Part II
    Abstract A set of sensitivity experiments with the climate system model of intermediate complexity CLIMBER-2 was performed to compare its sensitivity.
  40. [40]
    Description and Evaluation of the MIT Earth System Model (MESM)
    Jul 25, 2018 · The MESM belongs to the class of Earth system models of intermediate complexity, which occupy a place between simple conceptual models and ...
  41. [41]
    Presentation, calibration and testing of the DCESS II Earth ... - GMD
    Apr 8, 2025 · Peer review · Metrics ... Presentation, calibration and testing of the DCESS II Earth system model of intermediate complexity (version 1.0).
  42. [42]
    Are general circulation models obsolete? - PNAS
    Nov 14, 2022 · The general circulation model, or GCM, is a mainstay of research into the evolving state of the Earth system over a range of timescales. The ...
  43. [43]
    So What Is in an Earth System Model? - AGU Journals - Wiley
    Dec 18, 2019 · A particular class of numerical model—the General Circulation Model (GCM)—is the most complex of numerical models used within climate and ...
  44. [44]
    Two decades of Earth system modeling with an emphasis on Model ...
    Oct 20, 2020 · Earth system models are climate models incorporating biogeochemical processes such as the carbon cycle.
  45. [45]
    Earth System Model - Geophysical Fluid Dynamics Laboratory - NOAA
    The climate system consists of five interacting components: the atmosphere, the hydrosphere, the cryosphere, the land surface and the biosphere. Scenarios of ...
  46. [46]
    DOE Explains...Earth System and Climate Models
    These reduced complexity models provide lower resolution climate information but are easier and faster to run. This makes them perfect for research questions ...
  47. [47]
    Community Earth System Model: Home
    What We Offer. Fully coupled numerical simulations of the Earth system consisting of atmospheric, ocean, ice, land surface, carbon cycle, and other components.
  48. [48]
    Climate and Ecosystems Comprehensive Earth System Models
    Earth System Models (ESMs) include physical climate model components plus bio-geo-chemistry simulation, used to assess climate and ecosystem changes.
  49. [49]
    Joseph Fourier, the 'greenhouse effect', and the quest for a universal ...
    Joseph Fourier, the 'greenhouse effect', and the quest for a universal theory ... The English translation of Fourier's 1824 article, by ...
  50. [50]
    The Carbon Dioxide Greenhouse Effect - American Institute of Physics
    The talk inspired Arrhenius to take a deep look. In 1896 Arrhenius completed a laborious numerical computation which suggested that cutting the amount of CO2 ...
  51. [51]
    I. The Bakerian Lecture.—On the absorption and radiation of heat by ...
    Tyndall John. 1861I. The Bakerian Lecture.—On the absorption and radiation of heat by gases and vapours, and on the physical connexion of radiation ...
  52. [52]
    John Tyndall's discovery of the 'greenhouse effect', - RSC ECG
    During 1860 and 1861 Tyndall developed an experimental arrangement in the basement laboratory of the Royal Institution to measure heat transfer and absorption ...
  53. [53]
    [PDF] On the Influence of Carbonic Acid in the Air upon the Temperature of ...
    Arrhenius's paper is the first to quantify the contribution of carbon dioxide to the greenhouse effect (Sections I-IV) and to speculate about.
  54. [54]
    Arrhenius 1896: First Calculation of Global Warming
    Sep 6, 2024 · The expected temperature rise from a doubling of carbon dioxide is called the Equilibrium Climate Sensitivity (ECS). Arrhenius's estimate ...
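The ECS concept named in entry 54 (warming per CO2 doubling) is commonly computed from the standard logarithmic forcing fit of Myhre et al. (1998), ΔF = 5.35 ln(C/C0) W m^-2; the sketch below is generic, and the sensitivity parameter value is purely illustrative, not taken from the cited page.

```python
import math

def co2_forcing(c, c0=280.0):
    """Radiative forcing (W m^-2) from the standard logarithmic fit
    Delta_F = 5.35 * ln(C / C0) of Myhre et al. (1998)."""
    return 5.35 * math.log(c / c0)

# Doubling CO2 (280 -> 560 ppm) gives ~3.7 W m^-2 of forcing. Equilibrium
# warming is lambda * Delta_F for a climate sensitivity parameter lambda
# (K per W m^-2); lambda = 0.8 below is illustrative only.
f2x = co2_forcing(560.0)   # ~3.7 W m^-2
ecs = 0.8 * f2x            # ~3 K for this illustrative lambda
print(round(f2x, 2), round(ecs, 1))
```

Arrhenius's 1896 hand calculation addressed the same quantity, though with a far cruder absorption model than modern line-by-line codes.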
  55. [55]
    The artificial production of carbon dioxide and its influence on ...
    The increase in mean temperature, due to the artificial production of carbon dioxide, is estimated to be at the rate of 0.003°C. per year at the present time.
  56. [56]
    [PDF] Callendar G.S., 1938 The artificial production of carbon dioxide and ...
    Small changes of atmospheric carbon dioxide do not affect the amount of sun heat which reaches the surface, because the CO2 absorption bands lie well outside ...
  57. [57]
    [PDF] On increasing global temperatures: 75 years after Callendar
    In his 1938 paper, Guy Callendar used data from 147 weather stations to estimate an observed increase of global temperatures of around 0.3°C over the previous ...
  58. [58]
    GENERAL CIRCULATION EXPERIMENTS WITH THE PRIMITIVE ...
    GENERAL CIRCULATION EXPERIMENTS WITH THE PRIMITIVE EQUATIONS. I. THE BASIC EXPERIMENT. J. SMAGORINSKY.
  59. [59]
    Brief History of Global Atmospheric Modeling at GFDL
    Joseph Smagorinsky and Syukuro Manabe pioneered the development of numerical models of the atmosphere suitable for studying the Earth's climate in the 1950's ...
  60. [60]
    Timeline: The history of climate modelling - Carbon Brief
    Jan 16, 2018 · Norman Phillips' first general circulation model in 1956. The establishment of a modelling group at the National Center for Atmospheric Research ...
  61. [61]
    Climate Calculations with a Combined Ocean-Atmosphere Model in
    For the full details the reader is referred to Manabe (1969) and Bryan (1969). The numerical model of the atmosphere is very similar to that described by Manabe ...
  62. [62]
    (PDF) Forty years of numerical climate modeling - ResearchGate
    Aug 5, 2025 · This paper discusses some of the important developments during the first 40 years of climate modelling from the first models of the global ...
  63. [63]
    [PDF] History of climate modeling - University of Michigan Library
    The history of climate modeling begins with conceptual models, followed in the. 19th century by mathematical models of energy balance and radiative transfer ...
  64. [64]
    [PDF] Climate Models An Assessment of Strengths and Limitations
    ... simulated time, as required of a climate model used for studying anthropogenic climate change. ... SIS general circulation model. International J. Climatology ...
  65. [65]
    CMIP Overview - Coupled Model Intercomparison Project
    In 1995, the WGCM established the Coupled Model Intercomparison Project (CMIP) to provide climate scientists with a database of coupled global model simulations ...
  66. [66]
    CMIP5 - Coupled Model Intercomparison Project Phase 5 - Overview
    CMIP5 is meant to provide a framework for coordinated climate change experiments for the next five years and thus includes simulations for assessment in the AR ...
  67. [67]
    [PDF] Climate Models and Their Evaluation
    climate model (Knutson and Tuleya, 1999; Walsh et al., 2004). Projections ... general circulation model. In: Investigations on the Model System of the ...
  68. [68]
    CMIP6: the next generation of climate models explained - Carbon Brief
    Dec 2, 2019 · One major improvement to CMIP6 scenarios is a better exploration of possible baseline “no climate policy” outcomes. The prior generation of ...
  69. [69]
    CMIP Phase 6 (CMIP6) - Coupled Model Intercomparison Project
    This CMIP6 overview paper presents the background and rationale for the new structure of CMIP, providing a detailed description of the CMIP Diagnostic, ...
  70. [70]
    Enhanced performance of CMIP6 climate models in simulating ...
    May 6, 2024 · This study thoroughly compares CMIP5 and CMIP6 climate models in simulating historical precipitation at various time scales for the study region.
  71. [71]
    GISS and NCCS Contribute to CMIP6 International Climate Model ...
    Apr 30, 2021 · Impact: The model simulations GISS contributed to CMIP6 play an essential role in attributing climate changes in the historical period, ...
  72. [72]
    Extreme events: New stride forward in high-resolution climate ...
    Apr 23, 2025 · “Running high-resolution simulations with a fully coupled climate model is one of the best ways to study how extreme events and the climate ...
  73. [73]
    Improving extreme rainfall predictions: the limits of high-resolution ...
    Jul 18, 2024 · In the new study, researchers explored whether higher-resolution models could more accurately simulate large storms across the western United States.
  74. [74]
    Climate Change Adaptation Digital Twin: a window to the future of ...
    Apr 30, 2024 · The Climate DT is a step change in climate information, providing high-resolution data for adaptation, and a digital twin that integrates  ...
  75. [75]
    A Deep Learning Earth System Model for Efficient Simulation of the ...
    Aug 25, 2025 · Machine learning (ML) based atmosphere models have recently demonstrated the ability to accurately predict weather over 10 days, and have proven ...
  76. [76]
    The rise of machine learning in climate modelling - Jordan - 2025
    Apr 24, 2025 · ML could provide a step-change in climate modelling. For example, with ML, global models representing Earth's processes at resolutions finer than 1km with great ...
  77. [77]
    Simpler models can outperform deep learning at climate prediction
    Aug 26, 2025 · New research shows the natural variability in climate data can cause AI models to struggle at predicting local temperature and rainfall. Adam ...
  78. [78]
    [PDF] Significant improvement of cloud representation in the global climate ...
    Jul 12, 2019 · The improvement is particularly pronounced over the Southern Ocean. Trenberth and Fasullo (2010) showed that a significant lack of clouds over ...
  79. [79]
    Impact of Microphysics and Convection Schemes on the Mean‐State ...
    Aug 12, 2025 · These improvements allow for a better representation of clouds and precipitation in model simulations of the present-day climate but also allow ...
  80. [80]
    Understanding CMIP6: Key Insights and Implications for Climate ...
    Feb 25, 2025 · While CMIP6 shows overall improvements, the advancements don't render CMIP5 obsolete; the choice between them depends on specific use cases.
  81. [81]
    Evaluating the Performance of Past Climate Model Projections
    Dec 4, 2019 · The first time series projections of future temperatures were computed using simple energy balance models in the early 1970s, most of which ...
  82. [82]
    [PDF] Taylor Diagram Primer Karl E. Taylor - PCMDI
    Taylor diagrams (Taylor, 2001) provide a way of graphically summarizing how closely a pattern. (or a set of patterns) matches observations.
  83. [83]
    Taylor Diagrams - Climate Data Guide
    Taylor diagrams provide a concise statistical summary of how well patterns match each other in terms of their correlation, their root-mean-square difference ...
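The three statistics entries 82–83 describe, pattern correlation R, the two standard deviations, and the centered RMS difference E', are tied together by the law-of-cosines identity E'^2 = s_m^2 + s_o^2 - 2 s_m s_o R, which is what lets one polar plot display all of them. A minimal sketch with hypothetical toy data standing in for a model field and observations:

```python
import math

def taylor_stats(model, obs):
    """Compute the quantities a Taylor diagram (Taylor, 2001) displays:
    correlation r, std devs of model and obs, centered RMS difference e."""
    n = len(model)
    mm = sum(model) / n
    mo = sum(obs) / n
    sm = math.sqrt(sum((x - mm) ** 2 for x in model) / n)
    so = math.sqrt(sum((y - mo) ** 2 for y in obs) / n)
    r = sum((x - mm) * (y - mo) for x, y in zip(model, obs)) / (n * sm * so)
    e = math.sqrt(sum(((x - mm) - (y - mo)) ** 2
                      for x, y in zip(model, obs)) / n)
    return r, sm, so, e

# Hypothetical toy fields, not data from any cited study.
model = [1.0, 2.0, 3.5, 4.0, 5.5]
obs = [1.2, 1.9, 3.0, 4.3, 5.0]
r, sm, so, e = taylor_stats(model, obs)
# Law-of-cosines identity underlying the diagram geometry:
assert abs(e ** 2 - (sm ** 2 + so ** 2 - 2 * sm * so * r)) < 1e-12
```

On the diagram itself, each model is a point at radius s_m and angle arccos(r), so its distance from the observation point at (s_o, 0) is exactly e.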
  84. [84]
    Assessing sensitivities of climate model weighting to multiple ... - ESD
    Feb 3, 2023 · This project assesses the sensitivities of climate model weighting strategies and their resulting ensemble means to multiple components.
  85. [85]
    A Quantitative Method to Evaluate the Performance of Climate ...
    Aug 8, 2021 · A recent study proposed a method for quantitative evaluation of climate model simulations of TC track characteristics in a specific basin.
  86. [86]
    Comparison of indicators to evaluate the performance of climate ...
    Jul 11, 2023 · This study examines the behavior of six indicators, considering spatial correlation, distribution mean, variance, and shape.
  87. [87]
    [PDF] Taking climate model evaluation to the next level - OSTI.GOV
    Traditionally, many climate projections are shown as multi-model averages in the peer-reviewed literature and IPCC reports, with the spread across models ...
  88. [88]
    [PDF] Comparison of Climate Model Large Ensembles With Observations ...
    This tends to result in observational predictions that are still fairly consistent with tradi- tional climate model evaluation metrics like RMSE. Now we can ...
  89. [89]
    Observed Temperature Changes in the Troposphere and ...
    The results show a robust cooling of the stratosphere of about 1–3 K, and a robust warming of the troposphere of about 0.6–0.8 K over the last four decades ( ...
  90. [90]
    Coupled chemistry climate model simulations of stratospheric ...
    Jul 14, 2009 · Overall, the models agree better with observations than in previous assessments, primarily because of corrections in the observed temperatures.
  91. [91]
    Annual Mean Arctic Amplification 1970–2020: Observed and ...
    Jun 25, 2022 · Annual mean Arctic Amplification (AA) within the period 1970–2020 changed in steep steps around 1986 and 1999. It reached values over 4.0 ...
  92. [92]
    A Review of Arctic Sea Ice Climate Predictability in Large-Scale ...
    Apr 7, 2022 · We provide a high-level review of sea ice models used for climate studies and of the recent advances made with these models to understand sea ice ...
  93. [93]
    Persistent Discrepancies between Observed and Modeled Trends in ...
    It is shown that the latest generation of models persist in not reproducing the observations-based SST trends as a response to radiative forcing.
  94. [94]
    Origins of Southern Ocean warm sea surface temperature bias in ...
    Aug 24, 2023 · Here we find that the warm SST bias in the SO features a zonally oriented non-uniform pattern mainly located between the northern and southern fronts of the ...
  95. [95]
    Biases in Climate Model Global Warming Trends Related to ...
    Apr 4, 2025 · Here we estimate the impact of this discrepancy on radiative feedback and global temperature trends, finding that the difference in sea ice ...
  96. [96]
    The problem of tropospheric temperature trend measurements for ...
    The problem is that the troposphere is not warming as fast as predicted by climate models, according to observations by John Christy and colleagues.
  97. [97]
    New confirmation that climate models overstate atmospheric warming
    Aug 25, 2020 · Two new peer-reviewed papers from independent teams confirm that climate models overstate atmospheric warming and the problem has gotten worse ...
  98. [98]
    Past warming trend constrains future warming in CMIP6 models - PMC
    Mar 18, 2020 · We show that projected future warming is correlated with the simulated warming trend during recent decades across CMIP5 and CMIP6 models.
  99. [99]
    Multi-decadal climate variability and satellite biases have amplified ...
    Jun 21, 2024 · Coupled model simulations, however, distinctly overestimate satellite-observed warming trends, and moreover fail to capture the cooling or muted ...
  100. [100]
    Understanding the Biases in Daily Extreme Precipitation ...
    Jun 17, 2025 · As a primary tool for understanding the past and predicting the future, climate models still exhibit significant biases in extreme precipitation ...
  101. [101]
    Evaluation of CMIP6 for historical temperature and precipitation over ...
    At the seasonal scale, most models exhibited a warm temperature bias in summer and a cold bias in winter. The CMIP6 MME displayed a higher reproducibility of ...
  102. [102]
    Understanding and Reducing Warm and Dry Summer Biases in the ...
    Abstract. Most climate models in phase 6 of the Coupled Model Intercomparison Project (CMIP6) still suffer pronounced warm and dry summer biases in the ...
  103. [103]
    Chapter 4 | Climate Change 2021: The Physical Science Basis
    This chapter assesses simulations of future global climate change, spanning time horizons from the near term (2021–2040), mid-term (2041–2060), and long term ( ...
  104. [104]
  105. [105]
    (PDF) Deep learning to represent subgrid processes in climate models
    The representation of nonlinear subgrid processes, especially clouds, has been a major source of uncertainty in climate models for decades.
  106. [106]
    Optimizing climate models with process knowledge, resolution, and ...
    Jun 19, 2024 · Accelerated progress in climate modeling is urgently needed for proactive and effective climate change adaptation.
  107. [107]
    Uncertainty in parameterized convection remains a key obstacle for ...
    Jun 9, 2023 · We find that differences in modeled column-average CO2 are strongly correlated with the differences in the models' convection.
  108. [108]
    Uncertainty quantification based cloud parameterization sensitivity ...
    Oct 15, 2020 · Using uncertainty quantification techniques, we carry out a sensitivity analysis of a large number (17) of parameters used in the NCAR CAM5 cloud ...
  109. [109]
    Calibration and Uncertainty Quantification of Convective Parameters ...
    Aug 19, 2021 · A primary source of uncertainties in climate models comes from representation of small-scale processes such as moist convection. Parameters in ...
  110. [110]
    Diurnal cloud cycle biases in climate models | Nature Communications
    Dec 22, 2017 · Here we quantify the mean, amplitude, and phase of the DCC in climate models and compare them with satellite observations and reanalysis data.
  111. [111]
    A Machine Learning Parameterization of Clouds in a Coarse ...
    Mar 4, 2024 · Cloud feedbacks on climate change are the largest driver of uncertainty in climate sensitivity to greenhouse gas increases (Caldwell et al., ...
  112. [112]
    Systematic Errors in Weather and Climate Models - AMS Journals
    The workshop brought together a wide range of experts on simulating the Earth system to advance the understanding of the root causes of systematic model errors ...
  113. [113]
    Climate impacts of parameterizing subgrid variation and partitioning ...
    Jan 4, 2023 · This study highlights the importance of subgrid surface energy variations and partitioning to the atmosphere in simulating the hydrological and energy cycles ...
  114. [114]
    Parameter Uncertainty Quantification in an Idealized GCM With a ...
    Jan 14, 2022 · We use time-averaged climate statistics to calibrate and quantify uncertainty of model parameters in an idealized general circulation model ...
  115. [115]
    Stable machine-learning parameterization of subgrid processes for ...
    Jul 3, 2020 · The parameterization leads to stable simulations at coarse resolution that replicate the climate of the high-resolution simulation. Retraining ...
  116. [116]
    BREAKING THE CLOUD PARAMETERIZATION DEADLOCK
    The cloud parameterization problem is "deadlocked" due to slow progress, and a new strategy is needed to represent smaller-scale processes in large-scale  ...
  117. [117]
    (PDF) Cloudiness Parameterization for Use in Atmospheric Models
    Jun 7, 2023 · This paper begins by providing a review of the parameterization of cloudiness that has been used for numerical weather predictions and climate ...
  118. [118]
    Realistic representation of mixed-phase clouds increases projected ...
    Jul 20, 2024 · Because cloud parameterizations assume homogeneously mixed ice and liquid particles, cloud parameterizations often grow too much ice at the ...
  119. [119]
    A turbulence-informed parameterization of phase partitioning in ...
    Jun 26, 2025 · In this study, we present a new cloud phase partitioning parameterization developed for the ICOLMDZ atmospheric model.
  120. [120]
    Tropical Cloud Feedbacks Estimated from Observed Multidecadal ...
    Cloud feedbacks are the largest sources of uncertainty in estimates of equilibrium climate sensitivity (Zelinka et al. 2020). Long-standing difficulties in ...
  121. [121]
    Relationship Between Tropical Cloud Feedback and Climatological ...
    Dec 13, 2024 · Global climate model (GCM) projections of future climate are uncertain largely due to a persistent spread in cloud feedback.
  122. [122]
    (PDF) An analytical framework reduces cloud feedback uncertainty ...
    Oct 5, 2025 · Cloud radiative feedback remains the largest source of uncertainty in future climate projections, but current constraints are insufficient. Here ...
  123. [123]
    Climate Models Underestimate Global Decreases in High‐Cloud ...
    Apr 9, 2025 · Here, we find that the response of high clouds to warming in many climate models is not supported by observational evidence.
  124. [124]
    Diurnal cloud cycle biases in climate models - PMC
    Dec 22, 2017 · Here we quantify the mean, amplitude, and phase of the DCC in climate models and compare them with satellite observations and reanalysis data.
  125. [125]
    Stable Machine‐Learning Parameterization of Subgrid Processes in ...
    Jul 10, 2025 · Accurately representing cloud and convection has long been a major challenge for accurate climate model simulations due to the relatively coarse ...
  126. [126]
    (PDF) The computational and energy cost of simulation and storage ...
    Oct 23, 2023 · This paper shows the main results obtained from the collection of performance metrics done for CMIP6 (CPMIP). The document provides the list of ...
  127. [127]
    STAR‐ESDM: A Generalizable Approach to Generating High ...
    Jul 23, 2024 · The horizontal spatial resolution of GCMs has increased significantly over the past few decades, with grid cells for CMIP6 models typically ...
  128. [128]
    The Limits of Climate Modeling - Yale E360
    Jun 16, 2008 · Some climate modelers say that even with the extraordinary supercomputing power now available, the answer is no. That, by being lured into ...
  129. [129]
    The computational and energy cost of simulation and storage ... - GMD
    Apr 19, 2024 · This paper presents the main findings obtained from the CPMIP (the Computational Performance Model Intercomparison Project), a collection of a common set of ...
  130. [130]
    Supercomputers can take months to simulate the climate – but my ...
    May 1, 2024 · Ocean models especially suffer from such poor “scaling”. Ten times faster. This is where the new computer algorithm that I've developed and ...
  131. [131]
    Exascale Computing and Data Handling - AMS Journals
    This paper describes key technical and budgetary challenges, identifies gaps and ways to address them, and makes a number of recommendations.
  132. [132]
    Scalability - ECMWF
    Supercomputer energy consumption at ECMWF would have to increase unviably if the more complex forecasting systems of the future were to be run on the current ...
  133. [133]
    Fast, accurate climate modeling with NeuralGCM - Google Research
    Jul 22, 2024 · NeuralGCM presents a new approach to building climate models that could be faster, less computationally costly, and more accurate than existing models.
  134. [134]
    Pervasive Warming Bias in CMIP6 Tropospheric Layers - 2020
    Jul 15, 2020 · The tendency of climate models to overstate warming in the tropical troposphere has long been noted. Here we examine individual runs from 38 ...
  135. [135]
    Variations of Tropical Lapse Rates in Climate Models and their ...
    In a nutshell, according to most radiosonde products, CMIP6 models overestimate upper tropospheric warming, and according to all radiosonde products on ...
  136. [136]
    Climate simulations: recognize the 'hot model' problem - Nature
    May 4, 2022 · In CMIP6, more than one-quarter of models have sensitivities that are greater than this, and around one-fifth show warming of at least 5 °C in ...
  137. [137]
    To What Extent Does Discounting 'Hot' Climate Models Improve the ...
    Oct 1, 2024 · However, a few of the most recent generation climate models 'run hot' in the historical period, widening the spread of future global warming.
  138. [138]
    The Impact of “Hot Models” on a CMIP6 Ensemble Used by Climate ...
    CMIP6 includes several “hot” climate models whose sensitivity to greenhouse gas forcings exceeds the likely range inferred from multiple lines of evidence.
  139. [139]
    Guest post: How climate scientists should handle 'hot models'
    May 4, 2022 · The latest “CMIP6” generation of climate models includes a subset of “hot models” that point towards much greater warming than expected.
  140. [140]
    A Study of Climate Sensitivity Using a Simple Energy Balance Model ...
    The results of simple zonal energy balance climate models are rather sensitive to the parameterizations used to calculate the fluxes of solar radiation ...
  141. [141]
    A new energy-balance approach to linear filtering for estimating ...
    Sep 17, 2020 · This study proposes a new linear-filtering method for estimating historical radiative forcing from time series of global mean surface ...
  142. [142]
    Misdiagnosis of Earth climate sensitivity based on energy balance ...
    A recent paper, M15 [1], applies a simple energy balance model (EBM) in order to estimate climate response. Compared with other studies using a similar approach ...
  143. [143]
    Review of approaches for selection and ensembling of GCMs
    May 13, 2020 · Abstract. Global climate models (GCMs) are developed to simulate past climate and produce projections of climate in future.
  144. [144]
    Empirical: Modeling v. Observations | OSS Foundation
    Empirical data is based on observation, while models are capable of being verified by observation. Observations are the reality, and models are models verified ...
  145. [145]
    Neural general circulation models for weather and climate - Nature
    Jul 22, 2024 · In the hybrid model approach, a machine-learning component replaces or corrects the traditional physical parameterizations of a GCM. Until now, ...
  146. [146]
    For a Pluralism of Climate Modeling Strategies in - AMS Journals
    The continued development of general circulation models (GCMs) toward increasing resolution and complexity is a predominantly chosen strategy to advance ...
  147. [147]
    High-resolution meteorology with climate change impacts from ...
    Apr 9, 2024 · Here we present open-source generative machine learning methods that produce meteorological data at a nominal spatial resolution of 4 km at an hourly frequency.
  148. [148]
    High-resolution downscaling of CMIP6 Earth system and global ...
    Jan 12, 2024 · Four convolutional neural network (CNN) architectures were evaluated for their ability to downscale, to a resolution of 0.1°, seven CMIP6 ESGCMs over the ...
  149. [149]
    High-resolution downscaling with interpretable deep learning
    Here we test the extent to which developments in deep learning can out-perform existing statistical approaches for downscaling historical rainfall.
  150. [150]
    Potential for Machine Learning Emulators to Augment Regional ...
    In this paper, we explore the potential of machine learning (ML) to augment high-resolution climate projections from both RCMs and CPMs, with the aim of ...
  151. [151]
    Advancements and challenges of artificial intelligence in climate ...
    May 19, 2025 · Artificial intelligence (AI) has transformed climate modeling by improving predictive accuracy, processing efficiency, and data integration. ...
  152. [152]
    Advances and challenges in energy and climate alignment of AI ...
    The rapid growth of artificial intelligence (AI) infrastructure deployment presents significant challenges for global energy systems and climate goals.
  153. [153]
    ClimateSet | ClimateSet - A Dataset of Climate Models for Machine ...
    Climate models are critical tools for analyzing climate change and projecting its future impact. The machine learning (ML) community has taken an increased ...
  154. [154]
    Uncertainty-informed selection of CMIP6 Earth system model ... - ESD
    Oct 15, 2024 · In this work, we present a method to select a subset of the latest phase, CMIP6, featuring models for use as inputs to a sectoral impact or multisector ...
  155. [155]
    Stochastic Parameterization: The Importance of Nonlocality and ...
    Aug 31, 2025 · In the weather paradigm, the primary advantage of stochastic parameterizations is that they allow quantification of uncertainty due to model error in ...
  156. [156]
    Stochastic and Perturbed Parameter Representations of Model ...
    In particular, perturbed parameter ensembles are unable to represent structural uncertainty owing to the choices made when developing the parameterization ...
  157. [157]
    Regional climate models: 30 years of dynamical downscaling
    This paper describes the major achievements of RCMs, critically reviewing the main issues and limitations that have been featured in the literature.
  158. [158]
    Twenty-First-Century Challenges in Regional Climate Modeling in
    Aug 1, 2015 · Nevertheless, there are open-ended issues such as the best design for a regional climate model ensemble, the choice of GCMs and RCMs, possible ...
  159. [159]
    Climate Model Downscaling | Climate Data User Guide - EPRI
    Mar 26, 2025 · Downscaling refers to a set of methods used to improve the spatial and temporal resolution of information from GCMs.
  160. [160]
    Dynamical-generative downscaling of climate model ensembles
    We propose an approach combining dynamical downscaling with generative AI to reduce the cost and improve the uncertainty estimates of downscaled climate ...
  161. [161]
    Enhancing Regional Climate Downscaling through Advances in ...
    The primary hurdle is their high computational cost, which results in most CORDEX-type experiments today being performed at a “gray zone” resolution (12–25 km), ...
  162. [162]
    Quantifying CMIP6 model uncertainties in extreme precipitation ...
    Over these regions, the cohort of CMIP6 models does not provide robust evidence that global warming will intensify extreme precipitation. While the small ...
  163. [163]
    New Potential to Reduce Uncertainty in Regional Climate ...
    Jul 18, 2023 · Combining new constraints on future socio-economic trajectories and the climate system's response to emissions can substantially reduce the projection ...
  164. [164]
    Evaluating Probabilistic Deep Learning Methods for Uncertainty ...
    Climate models often exhibit biases in their precipitation predictions, particularly underestimating high-intensity events and overestimating low precipitation.
  165. [165]
    World Climate Research Programme (WCRP)
    The aim of WCRP is to facilitate analysis and prediction of Earth System variability and change for use in an increasing range of practical applications of ...
  166. [166]
    CMIP6 - Coupled Model Intercomparison Project Phase 6 - PCMDI
    The WCRP Working Group on Coupled Modelling (WGCM) oversees the Coupled Model Intercomparison Project, which is now in its 6th phase.
  167. [167]
    Coupled Model Intercomparison Project: CMIP
    CMIP is a project of the World Climate Research Programme (WCRP) providing climate projections to understand past, present and future climate changes.
  168. [168]
    Overview: Climate Model Intercomparison Project (CMIP)
    The CMIP is a standard experimental framework for studying the output of coupled atmosphere-ocean general circulation models.
  169. [169]
    An evolving Coupled Model Intercomparison Project phase 7 ... - GMD
    Oct 1, 2025 · The Coupled Model Intercomparison Project (CMIP) coordinates community-based efforts to answer key and timely climate science questions, ...
  170. [170]
    ESMO: bridging climate modelling and observations communities ...
    ESMO coordinates, advances, and facilitates all modelling, data assimilation and observational activities within WCRP, working jointly with all other WCRP ...
  171. [171]
    Cordex – Coordinated Regional Climate Downscaling Experiment
    The CORDEX vision is to advance and coordinate the science and application of regional climate downscaling through global partnerships.
  172. [172]
    CORDEX - Coordinated Regional Climate Downscaling Experiment
    Mar 3, 2021 · Summary: The Coordinated Regional Downscaling Experiment (CORDEX) is a CMIP6 diagnostic MIP requesting specific CMIP6 output for regional ...
  173. [173]
    NA-CORDEX: Home
    Regional climate change scenario data and guidance for North America, for use in impacts, decision-making, and climate science.