
Global surface temperature

denotes the spatially averaged temperature of Earth's near-surface air over land and sea surface over oceans, serving as a primary indicator of planetary thermal state. Instrumental records, commencing reliably around 1850, derive from thermometers in weather stations, ship measurements, buoys, and floats, with post-1979 satellite microwave data aiding sea surface components after calibration to in-situ observations. Multiple independent datasets, including NASA's GISTEMP, NOAA's GlobalTemp, the UK's HadCRUT, Japan's JMA, and Berkeley Earth's analysis, converge on a long-term warming trend of approximately 1.1 to 1.3 °C from the 1850–1900 baseline to the present, with 2024 registering as the warmest year on record at about 1.55 ± 0.13 °C above pre-industrial levels per ensemble assessments. This upward trajectory exhibits decadal variability influenced by phenomena such as the El Niño-Southern Oscillation, volcanic eruptions, and solar cycles, superimposed on a prevailing linear increase, though recent analyses detect no statistically significant acceleration beyond the post-1970s rate in most series. Uncertainties in these reconstructions stem from sparse early coverage, particularly in the polar regions and Southern Hemisphere oceans, station relocations, and homogenization adjustments for non-climatic biases like urban heat islands, with error margins narrowing from ~0.2 °C pre-1900 to ~0.05 °C in recent decades across datasets. While peer-reviewed syntheses affirm the robustness of the observed warming, debates persist over the magnitude of adjustments—some amplifying trends—and the integration of proxy data for pre-instrumental context, revealing that current levels, though elevated relative to the pre-industrial era, align within variability bounds when reconstructed from ice cores and sediments.
These records underpin attributions linking much of the post-1950 rise to anthropogenic greenhouse gas emissions, tempered by natural forcings, yet underscore the need for ongoing scrutiny of measurement fidelity amid institutional tendencies toward trend-favoring methodologies.

Definition and Fundamentals

Conceptual Definition

The global mean surface temperature (GMST), often referred to as global surface air temperature, is the area-weighted average of near-surface air temperatures over continental land masses and sea surface temperatures (SST) over oceanic areas, yielding an integrated metric of the planet's near-surface thermal state. This composite index reflects the thermodynamic temperature at the atmosphere-surface interface, where land components derive from standardized air temperature readings and ocean components from water-skin or shallow-depth measurements, combined in proportion to Earth's surface coverage (roughly 29% land and 71% ocean). GMST is typically analyzed as temporal anomalies relative to a reference period—such as 1850–1900 to approximate pre-industrial conditions or 1951–1980 for instrumental-era baselines—to isolate deviations from historical norms while accounting for incomplete spatial coverage in early records. Near-surface air temperature over land is measured at heights of 1.5 to 2 meters above ground level using thermometers enclosed in ventilated shelters, such as Stevenson screens, to shield against direct sunlight, precipitation, and ground conduction while capturing representative atmospheric conditions. Sea surface temperature, by contrast, captures the temperature of the ocean's upper layer—ranging from the infinitesimal skin depth (influenced by evaporative cooling and infrared radiation) to bulk depths of 3–10 meters via ship engine intakes or moored buoys—introducing methodological variances that require harmonization for global averaging. These measurements are aggregated via grid-based averaging (e.g., 5° × 5° cells), with area weighting applied using latitude-dependent cosine factors to prevent polar overrepresentation, though the resulting GMST remains a statistical rather than a uniform physical observable, susceptible to interpolation uncertainty in remote regions like the Arctic.
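The cosine-weighted averaging described above can be sketched in a few lines of Python. The grid representation and function name here are illustrative assumptions, not taken from any dataset's actual code.

```python
import math

def global_mean_anomaly(grid):
    """Area-weighted mean of gridded anomalies.

    `grid` maps (lat_center, lon_center) -> anomaly in degrees C for
    fixed-width cells; cells with missing data are simply absent.
    Weights are proportional to cos(latitude), the relative area of a
    latitude band of fixed angular width.
    """
    wsum, total = 0.0, 0.0
    for (lat, _lon), anom in grid.items():
        w = math.cos(math.radians(lat))
        wsum += w * anom
        total += w
    return wsum / total

# Toy grid: a warm tropical cell and a cold polar cell with equal anomaly
# magnitudes; the tropical cell dominates because it covers more area.
cells = {(2.5, 2.5): 1.0, (82.5, 2.5): -1.0}
print(round(global_mean_anomaly(cells), 3))
```

An unweighted average of these two cells would be exactly zero; the cosine weighting pulls the result toward the tropical value, which is why neglecting it would overrepresent polar grid cells.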
Conceptually, GMST serves as a diagnostic for Earth's radiative imbalance, where sustained positive anomalies indicate net energy accumulation at the surface, but it does not equate to total planetary heat content, which includes deeper ocean layers and cryospheric storage. Natural forcings like solar variations and volcanic aerosols, alongside internal variability (e.g., El Niño-Southern Oscillation), superimpose short-term fluctuations on any underlying trend, necessitating multi-decadal averaging—often 30-year periods—for robust trend assessment. Despite its utility, the metric's reliance on heterogeneous observing networks underscores inherent uncertainties in pre-1950 estimates, where station density was sparse and ship-based SST dominated.

Measurement Principles and Challenges

Global surface temperature combines near-surface air temperatures over land, measured approximately 2 meters above the ground, with sea surface temperatures representing the upper ocean layer, typically the top 10 meters. Land measurements rely on thermometers housed in Stevenson screens, ventilated enclosures with louvered sides that shield instruments from direct sunlight, precipitation, and ground radiation while permitting free airflow to capture representative air temperatures. These stations, part of networks like the Global Historical Climatology Network (GHCN), number around 7,000 to 10,000 active sites globally, though coverage varies by region and era. Sea surface temperatures historically derived from ship-based methods, including canvas buckets (pre-1940s, prone to evaporative cooling) and engine intake water (post-1940s, potentially warmer due to depth), have transitioned to moored and drifting buoys since the 1970s for greater consistency. To derive a global metric, datasets compute temperature anomalies relative to a baseline period (e.g., 1951–1980), interpolate values onto a latitude-longitude grid (typically 5° × 5° or finer), apply area weighting to account for shrinking grid cell sizes toward the poles, and average land (30%) and ocean (70%) components. Key challenges include sparse spatial coverage, particularly in the Southern Hemisphere, polar regions, and remote oceans before the mid-20th century, where data voids necessitate interpolation and amplify uncertainties—estimated at ±0.2°C for 19th-century global means versus ±0.05°C recently. Instrumental inhomogeneities arise from transitions in equipment and observing practices, such as shifts from non-standard exposures to Stevenson screens in the late 19th century or bucket-to-intake methods, introducing biases up to 0.3°C that require post-hoc corrections. Urban heat island effects, where station proximity to expanding cities elevates readings by 0.1–1°C locally, pose ongoing issues; while adjustments aim to mitigate this, some analyses indicate residual contributions to land trends, potentially 20–25% in U.S. records and smaller globally, with debates over adjustment efficacy due to increasing urbanization near stations. Additional complications stem from time-of-observation biases, elevation variations, and siting changes (e.g., airport relocations), which homogenization algorithms address but cannot fully eliminate given metadata gaps, leading to slightly divergent trends across datasets like NOAA, GISTEMP, and HadCRUT. These factors underscore the reliance on statistical infilling and the inherent limits of surface networks in capturing a truly global, homogeneous signal.
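As a minimal sketch of the anomaly-and-blend procedure just described, the following Python snippet converts absolute readings to baseline-relative anomalies and combines land and ocean components by surface fraction. The station values are hypothetical, and `anomalies` and `blend` are illustrative names, not functions from any real dataset's pipeline.

```python
def anomalies(series, baseline_years):
    """Convert (year, value) pairs to anomalies vs the baseline-period mean."""
    base = [v for y, v in series if y in baseline_years]
    ref = sum(base) / len(base)
    return [(y, v - ref) for y, v in series]

def blend(land_anom, ocean_anom, land_frac=0.29):
    """Combine land and ocean anomalies by surface fraction (~29% land)."""
    return land_frac * land_anom + (1 - land_frac) * ocean_anom

# Hypothetical annual land averages in degrees C (illustrative only).
land = [(1951, 8.0), (1980, 8.2), (2020, 9.4)]
baseline = set(range(1951, 1981))
land_anoms = anomalies(land, baseline)
print(round(dict(land_anoms)[2020], 2))   # 2020 anomaly vs 1951-1980 mean
print(round(blend(1.3, 0.6), 3))          # blended land-ocean anomaly
```

Working in anomalies rather than absolute temperatures is what lets stations at different elevations and climates be averaged together: the baseline subtraction removes each site's local mean state.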

Data Sources and Methods

Instrumental Surface Records (1850-Present)

Instrumental surface temperature records comprise in-situ measurements of near-surface air temperatures over land and sea surface temperatures (SST), with systematic global compilation beginning in 1850. These records draw from readings at land weather stations and ship-based observations, initially concentrated in the Northern Hemisphere's mid-latitudes. Coverage in 1850 encompassed approximately 57% of Earth's surface, primarily Europe and North America, expanding to over 75% by later decades through additional stations and maritime routes. Land measurements utilize thermometers housed in standardized shelters, such as Stevenson screens, recording daily maximum and minimum temperatures converted to monthly averages. The Global Historical Climatology Network (GHCN) aggregates data from thousands of stations, with early years featuring fewer than 1,000 reporting sites worldwide, rising to over 2,000 by 1900 and exceeding 5,000 from the 1950s onward. SST sampling prior to the mid-20th century relied on ships collecting water samples via buckets, subject to potential underestimation from evaporative cooling during hauling, shifting after the 1940s to engine room intake measurements that could overestimate due to warmth from the ship's interior. Prominent datasets include HadCRUT from the UK Met Office and the Climatic Research Unit, NOAA GlobalTemp, NASA's GISTEMP, and Berkeley Earth, each integrating land and ocean components to produce gridded anomalies relative to baselines like 1850–1900 or 1961–1990. Berkeley Earth, for instance, processes over 1.6 billion temperature reports from multiple archives into a land-ocean product spanning 1850 to present. These records exhibit close agreement on long-term trends despite methodological differences, though early sparse sampling, particularly in the Southern Hemisphere and remote oceans, necessitates interpolation and introduces higher uncertainty in pre-1900 estimates.

Ocean and Buoy Measurements

Ocean surface temperatures, which cover approximately 71% of Earth's surface, are primarily measured using ship-based observations, moored and drifting buoys, and profiling floats such as those from the Argo array. Historical ship measurements relied on methods like canvas or wooden buckets hauled from depth, uninsulated buckets, and later engine room intakes (ERIs), each introducing systematic biases due to heat exchange with air or hulls. Bucket methods typically recorded cooler temperatures than ERIs because of evaporative cooling during hauling, with differences up to 0.5°C depending on insulation and wind conditions. Modern buoy measurements, including moored buoys from networks like the National Data Buoy Center (NDBC) and drifting buoys, use hull-mounted or subsurface sensors that minimize exposure biases, providing more consistent skin-layer or bulk temperature readings. Collated comparisons show ship observations averaging 0.12°C warmer than nearby buoys during dawn conditions, attributed to residual heat from ship structures or measurement timing. The Argo program, deploying over 3,900 autonomous profiling floats since 2000, contributes surface data during float ascents, enhancing spatial coverage in remote areas, though it primarily excels in subsurface profiles down to 2,000 meters. Argo data have refined estimates of upper-ocean heat content, indicating accelerated warming rates post-2004 at approximately 0.4–0.6 W/m² globally. Datasets such as NOAA's Extended Reconstructed SST (ERSST) and the Hadley Centre's HadSST apply adjustments to homogenize these sources, correcting for method transitions like the shift from buckets to engine intakes in the mid-20th century and the increasing buoy fraction since the 1980s. In ERSST version 4, buoy data are adjusted upward by about 0.1°C to align with ship-based ERI records, based on paired observations, to mitigate underestimation of warming from the relative cool bias of buoys.
However, these adjustments assume ship data as a baseline despite known ship warm biases, potentially amplifying long-term trends; independent buoy-only analyses still confirm post-1980 warming of 0.7–0.8°C in adjusted records. Recent evaluations highlight uncertainties in pre-1940 data, with evidence of a cold bias in early ship records leading to underestimated early-20th-century warming by up to 0.2°C after statistical corrections. The transition to buoy dominance, now comprising over 50% of observations in some regions, improves data quality by reducing method-specific errors but introduces challenges in blending with sparser historical ship data, affecting long-term reconstructions. Peer-reviewed assessments affirm that post-adjustment SST trends from 1979–2015 align across datasets at 0.08–0.10°C per decade, with buoys and Argo floats supporting robust ocean warming signals amid natural variability like El Niño.
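The paired-observation logic behind such buoy-to-ship alignment can be illustrated as follows. The collocated readings and function names are synthetic assumptions in the spirit of the ERSST-style adjustment described above, not the operational code.

```python
def paired_offset(pairs):
    """Mean (ship - buoy) difference from collocated observation pairs."""
    diffs = [s - b for s, b in pairs]
    return sum(diffs) / len(diffs)

def adjust_buoys(buoy_obs, offset):
    """Shift buoy SSTs onto the ship reference frame."""
    return [t + offset for t in buoy_obs]

# Synthetic collocated (ship, buoy) pairs: ships read ~0.1 C warmer.
pairs = [(15.6, 15.5), (18.1, 18.0), (20.2, 20.1)]
off = paired_offset(pairs)
adjusted = adjust_buoys([15.5, 18.0], off)
print([round(t, 1) for t in adjusted])  # [15.6, 18.1]
```

The choice of reference frame matters: aligning buoys upward to ships (rather than ships downward to buoys) leaves the long-term anomaly trend unchanged in principle, but, as the text notes, it inherits whatever warm bias the ship record carries.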

Satellite and Upper-Air Observations

Satellite-based measurements of atmospheric temperatures utilize microwave radiometers, such as the Microwave Sounding Unit (MSU) and Advanced MSU (AMSU), deployed on NOAA polar-orbiting satellites starting with TIROS-N in late 1978. These instruments detect thermal emissions from atmospheric oxygen at specific frequencies, enabling the derivation of brightness temperatures for distinct vertical layers: the lower troposphere (TLT, weighted from near-surface to approximately 8 km altitude), the mid-troposphere (TMT), and the total troposphere (TTT). The TLT product serves as a bulk indicator of lower atmospheric warming, influenced by surface conditions but extending aloft, and requires corrections for orbital decay, sensor drift, and diurnal sampling biases. Principal datasets include the University of Alabama in Huntsville (UAH) version 6.0/6.1, processed by Roy Spencer and John Christy, and Remote Sensing Systems (RSS) version 4.0, with a third from the Center for Satellite Applications and Research (STAR). From January 1979 to December 2024, UAH reports a global TLT trend of +0.15 °C per decade (+0.22 °C per decade over land, +0.14 °C per decade over oceans). RSS estimates a steeper trend of approximately +0.21 °C per decade over the same period. These rates are derived after merging data across satellite platforms and applying empirical adjustments, though methodological differences—such as handling of NOAA-14 instrument degradation and tropical hot spot amplification—contribute to variances between datasets. Comparisons with surface records reveal tropospheric trends generally lower than 2-meter air increases (around +0.18 °C per decade in datasets like NOAA or HadCRUT5), especially in the tropics, where greenhouse forcing predicts enhanced warming aloft due to moist adiabatic amplification. This discrepancy persists in UAH data, showing near-zero tropical TLT trends in some analyses (+0.08 °C per decade through 2004), while RSS aligns more closely with surface amplification. Independent validations, including reanalyses like ERA5, support the warming signals but highlight uncertainties in pre-1990s data from satellite handover artifacts.
Upper-air observations complement satellites via radiosondes—instruments attached to weather balloons that ascend to 30–40 km, measuring temperature, humidity, and pressure at high vertical resolution. The Integrated Global Radiosonde Archive (IGRA) compiles data from over 2,800 stations worldwide, with consistent coverage since the 1950s, though global averages carry land biases. Homogenization is essential to correct for instrument shifts (e.g., from carbon hypsometers to thermocouples), time-of-day changes, and urbanization effects, as raw radiosonde records often exhibit spurious cooling trends from equipment upgrades. Homogenized datasets, such as RAOBCORE and RICH, indicate global tropospheric warming of +0.1 to +0.2 °C per decade since 1979, broadly consistent with satellite TLT after adjustments. For instance, lower tropospheric trends from select tropical stations run +0.08 to +0.15 °C per decade through 2004, aligning with UAH but below RSS. Stratospheric cooling (about -0.3 °C per decade) contrasts with tropospheric warming, as expected from greenhouse gas increases and ozone depletion. Critics, including analyses of adjustment methodologies, note that corrections frequently amplify post-1979 warming without fully independent metadata verification, potentially mirroring surface record issues where algorithmic homogenization assumes error patterns favoring recent increases. Integration of satellite and radiosonde data enhances global coverage, particularly over oceans and remote regions where surface stations are sparse, but vertical weighting differences preclude direct equivalence to surface air metrics. Radiosonde comparisons validate satellite layer assignments, with post-2000 convergence in trends amid improved instrumentation, though pre-satellite records (1958–1979) show muted warming (+0.3 °C globally in the 850–300 hPa layer through 1987), underscoring adjustment sensitivities. Ongoing efforts, like merged reanalyses, aim to resolve residual uncertainties for attribution studies.
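The trends quoted throughout this section in °C per decade are ordinary least-squares slopes fit to anomaly time series. A self-contained sketch, using a synthetic series and illustrative names rather than any dataset's real values:

```python
def trend_per_decade(years, anoms):
    """Ordinary least-squares slope of anomaly vs year, scaled to C/decade."""
    n = len(years)
    my = sum(years) / n
    ma = sum(anoms) / n
    num = sum((y - my) * (a - ma) for y, a in zip(years, anoms))
    den = sum((y - my) ** 2 for y in years)
    return 10.0 * num / den

# Synthetic series warming at exactly 0.015 C/yr (0.15 C/decade),
# matching the order of magnitude of the satellite-era trends above.
yrs = list(range(1979, 2025))
an = [0.015 * (y - 1979) for y in yrs]
print(round(trend_per_decade(yrs, an), 3))  # 0.15
```

Real series carry ENSO and volcanic noise on top of the trend, which is why short windows (such as 1998-2013) can yield slopes far from the multi-decadal rate even when the underlying forcing is unchanged.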

Proxy-Based Reconstructions

Proxy-based reconstructions estimate past surface temperatures using indirect environmental indicators preserved in geological and biological archives, extending records beyond the instrumental era beginning around 1850. These proxies include tree-ring widths and maximum latewood density for temperature-sensitive growth in high-latitude forests, oxygen-18 isotope ratios in ice cores reflecting air temperature at precipitation sites, Mg/Ca ratios in foraminiferal shells from ocean sediments indicating sea surface temperatures, and borehole thermometry from heat diffusion in the subsurface. Each proxy type responds to temperature with varying sensitivity, seasonal bias, and potential confounding influences from precipitation or CO2 fertilization, necessitating calibration against overlapping instrumental data via transfer functions derived from regression or Bayesian approaches. Statistical methods aggregate multiple proxies into hemispheric or global means, often employing principal component analysis to identify common signals amid noise, or optimal information extraction techniques to weight proxies by reliability. Multi-proxy ensembles, such as those from the PAGES 2k Consortium's database of 692 records across 648 sites covering continents and oceans, aim to enhance robustness by diversifying sources and reducing type-specific errors. Reconstructions for the past two millennia typically show Northern Hemisphere temperatures varying within ±0.5°C of the 1850-1900 baseline during the Medieval Warm Period (roughly 950-1250 CE) and cooling during the Little Ice Age (1450-1850 CE), with 20th-century warming exceeding these fluctuations by 0.6-1.0°C in median estimates. However, regional proxy evidence indicates the Medieval Warm Period approached or matched modern warmth in parts of the North Atlantic and Asia, challenging claims of global uniformity in pre-industrial variability.
A persistent challenge is the divergence problem in dendrochronological proxies, where tree-ring indices from high-latitude sites fail to capture post-1960 warming despite tracking earlier 20th-century rises, potentially stemming from increased winter warming, drought stress, or elevated CO2 suppressing temperature signals. This discrepancy has prompted truncation of calibration periods or exclusion of post-1960 data in some reconstructions, raising concerns over inflated pre-calibration variability or underestimated modern anomalies; critics argue it undermines the reliability of tree rings as centennial-scale thermometers without independent validation. Spatial coverage remains skewed toward the Northern Hemisphere and land areas, with sparse Southern Hemisphere and ocean data amplifying uncertainties in global means, estimated at ±0.2-0.5°C for millennium-scale reconstructions. Methodological sensitivities, including proxy selection and network density, can alter the amplitude of past variations by up to 50%, as demonstrated in ensemble tests. Over the Holocene epoch (last 11,700 years), proxy syntheses reveal a Holocene Thermal Maximum peaking 9,000-5,000 years ago, with summer temperatures 0.5-2°C above late 20th-century levels in continental interiors, driven by orbital forcing and ice-sheet retreat, followed by Neoglacial cooling toward the Little Ice Age. Global annual mean reconstructions using data assimilation of pollen, chironomid, and other non-marine proxies indicate peak warmth was regionally asynchronous, with mid-Holocene averages comparable to or exceeding recent decades in some mid- and high-latitude regions, though trends differ due to seasonal and orbital influences. Uncertainties in Holocene records stem from proxy-specific seasonal biases—e.g., tree rings favoring summer, ice cores winter—and dating imprecision exceeding 100 years in sediments, complicating precise global synchrony assessments. Despite advances in multiproxy integration, debates persist over whether current warming rates or levels are unprecedented, given evidence of comparable Holocene excursions under natural forcings alone.
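The calibration step described earlier, fitting a transfer function over the proxy/instrumental overlap and then applying it to pre-instrumental proxy values, can be sketched as a simple least-squares regression. The ring-width indices and temperatures below are hypothetical, and `calibrate` is an illustrative name.

```python
def calibrate(proxy, temps):
    """Least-squares transfer function temp ~ a * proxy + b,
    fit over the overlap of proxy and instrumental records."""
    n = len(proxy)
    mp = sum(proxy) / n
    mt = sum(temps) / n
    a = (sum((p - mp) * (t - mt) for p, t in zip(proxy, temps))
         / sum((p - mp) ** 2 for p in proxy))
    b = mt - a * mp
    return a, b

# Hypothetical ring-width indices vs overlapping instrumental anomalies.
widths = [0.8, 1.0, 1.2, 1.4]
temps = [-0.2, 0.0, 0.2, 0.4]
a, b = calibrate(widths, temps)
# Reconstruct a pre-instrumental temperature from a proxy value alone:
print(round(a * 0.9 + b, 2))  # -0.1
```

The divergence problem discussed above is precisely a failure of this assumption: if the fitted relationship breaks down after 1960, a transfer function calibrated on the overlap cannot be trusted to hold centuries earlier either.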

Data Processing and Potential Biases

Homogenization Techniques

Homogenization techniques detect and adjust for artificial breaks in station temperature records caused by non-climatic factors, such as instrument replacements, station relocations, or changes in observation times, to ensure the series reflects climatic variability alone. These methods generally rely on relative comparisons with neighboring stations, assuming spatial coherence in the true climate signal, and proceed in two phases: changepoint detection via statistical tests on difference series and adjustment by shifting segments to align with references. Absolute methods, referencing theoretical standards, are less common for surface air temperatures due to sparse historical baselines. The pairwise homogenization algorithm (PHA) exemplifies relative methods, constructing difference time series between a candidate station and multiple nearby references, then identifying breaks where the mean difference shifts significantly using a two-phase test. Potential breaks are ranked by a quality metric balancing detection power and spatial coherence, with adjustments applied iteratively from largest to smallest to avoid propagating errors; this handles large networks efficiently, as validated in tests reducing trend biases under simulated inhomogeneities. NOAA applies PHA to monthly temperatures in the U.S. Historical Climatology Network (USHCN version 2.5) and upstream for Global Historical Climatology Network (GHCN) contributions. The Standard Normal Homogeneity Test (SNHT) provides an alternative detection tool, normalizing the series to zero mean and unit variance, then computing a test statistic from the cumulative sum of deviations to flag shifts, with critical values adjusted for sample size and break location (e.g., central breaks yield higher power). SNHT excels for additive offsets but assumes Gaussian residuals and may miss gradual changes; it has been applied to long-running station networks for monthly means from 1901–1990.
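A minimal implementation of the SNHT statistic on a candidate-minus-reference difference series, under the simplifying assumption of a single additive step (the function name and synthetic data are illustrative):

```python
def snht(series):
    """Standard Normal Homogeneity Test scan over candidate break points.

    Returns (best_k, T_max): the index after which a mean shift is most
    likely, and the maximum test statistic, to be compared against a
    critical value that depends on series length.
    """
    n = len(series)
    m = sum(series) / n
    sd = (sum((x - m) ** 2 for x in series) / n) ** 0.5
    z = [(x - m) / sd for x in series]  # standardize to zero mean, unit variance
    best_k, t_max = 0, 0.0
    for k in range(1, n):
        z1 = sum(z[:k]) / k          # mean before the candidate break
        z2 = sum(z[k:]) / (n - k)    # mean after it
        t = k * z1 * z1 + (n - k) * z2 * z2
        if t > t_max:
            best_k, t_max = k, t
    return best_k, t_max

# Difference series with an artificial +1.0 step halfway through:
diff = [0.0] * 10 + [1.0] * 10
k, t = snht(diff)
print(k, round(t, 1))  # 10 20.0
```

Because the test runs on a difference series against neighbors, a genuine regional warming trend cancels out and only the local artifact produces a large statistic, which is the core assumption of relative homogenization.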
Berkeley Earth integrates homogenization into a Bayesian framework, fitting each station's series with piecewise linear trends and changepoints estimated via comparison against hundreds of neighbors, simultaneously modeling the regional signal to derive spatially informed adjustments without sequential break processing. This approach, detailed in their averaging process, processes over 39,000 stations and yields trends aligning closely with other major datasets after corrections. NASA's GISTEMP relies on NOAA's pre-homogenized GHCN and USHCN land data without further station-level corrections, instead smoothing anomalies over 1200 km radii and applying fixed urban-rural offsets derived from pairwise rural comparisons to mitigate site-specific biases. HadCRUT5, combining CRUTEM5 land and HadSST4 sea data, incorporates homogenization performed by national services prior to submission, with central steps emphasizing statistical infilling and ensemble uncertainty estimation rather than raw adjustments.

Adjustments for Station Issues and Urban Heat

Adjustments for non-climatic station issues, such as relocations, instrumentation changes, and variations in observation timing, are applied through homogenization algorithms that detect and correct discontinuities in temperature records. NOAA's Pairwise Homogenization Algorithm (PHA), used in the Global Historical Climatology Network monthly (GHCNm) dataset, identifies abrupt shifts by pairwise comparisons between a target station and highly correlated neighbors, assuming shared regional climate signals while isolating local artifacts. Detected breaks are adjusted to minimize differences across the network, with the algorithm favoring corrections that align earlier periods to the most recent instrumentation and location standards; for instance, in GHCNm version 3.2.0, this process reduced artificial cooling biases from undocumented changes. Similar techniques in the U.S. Historical Climatology Network (USHCN) address time-of-observation biases, which can inflate daily maximum temperatures by up to 0.3°C if readings shift from morning to afternoon without correction. Urban heat island (UHI) effects, arising from localized warming due to impervious surfaces, reduced vegetation, and waste heat in populated areas, introduce positive biases estimated at 0.1–1.0°C in urban stations relative to rural ones. Homogenization partially mitigates UHI by regressing urban records toward rural neighbors, as urban discontinuities manifest as step changes detectable against less-affected stations. Berkeley Earth's methodology explicitly models UHI by classifying stations via nighttime lights data and satellite-derived urbanity indicators, reconstructing global land trends with rural subsets that yield warming rates within 0.01°C per decade of full-network results, indicating UHI contributes negligibly (<0.05°C total since 1950) to global averages due to the dominance of rural and oceanic coverage.
Critics contend that homogenization can inadvertently blend UHI signals into rural records when urban stations comprise a significant fraction of neighbors, a phenomenon termed "urban blending" observed in analyses of over 800 European GHCN stations where post-adjustment rural trends aligned more closely with urban ones than raw data suggested. Independent validations, however, demonstrate convergence between adjusted land trends and satellite-derived lower-troposphere records, with U.S. rural stations post-1895 showing summer warming of 0.7–1.0°C after UHI corrections, comparable to national averages. These adjustments, while peer-reviewed and iteratively refined, remain debated for their reliance on statistical assumptions over metadata, with fewer than 20% of NOAA shifts corresponding to documented station events in some audits.
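The neighbor-differencing idea behind pairwise homogenization can be sketched as follows: shared regional climate cancels in the candidate-minus-neighbors series, leaving the local step, which is then removed by aligning the earlier segment to the later one. The synthetic data and helper names are illustrative, not NOAA's implementation.

```python
def difference_series(candidate, neighbors):
    """Candidate minus the neighbor average; the shared climate signal
    cancels, leaving local artifacts such as a station move or UHI step."""
    return [c - sum(ns) / len(ns) for c, ns in zip(candidate, zip(*neighbors))]

def step_adjustment(diff, k):
    """Offset aligning the pre-break segment to the post-break level, as
    pairwise homogenization aligns history to current station conditions."""
    pre = sum(diff[:k]) / k
    post = sum(diff[k:]) / (len(diff) - k)
    return post - pre

# Candidate picks up a +0.4 C artifact at index 5; two neighbors do not.
neighbor_a = [0.1 * i for i in range(10)]
neighbor_b = [0.1 * i + 0.05 for i in range(10)]
candidate = [0.1 * i + (0.4 if i >= 5 else 0.0) for i in range(10)]
d = difference_series(candidate, [neighbor_a, neighbor_b])
print(round(step_adjustment(d, 5), 2))  # 0.4
```

The "urban blending" critique maps directly onto this sketch: if the neighbors themselves carry a correlated warming artifact, it no longer cancels in the difference series, and the derived adjustment spreads that artifact into the candidate record instead of removing it.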

Criticisms of Adjustment Methodologies

Critics contend that homogenization and adjustment procedures applied to surface temperature records, intended to correct for non-climatic inhomogeneities such as station relocations, instrument changes, and time-of-observation biases, often fail to eliminate systematic errors or introduce new ones that exaggerate warming trends. These methods, employed by datasets like NOAA's Global Historical Climatology Network (GHCN) and NASA's GISS, rely on algorithms such as pairwise comparison, which assume nearby stations share similar climatic signals; however, when neighboring stations are influenced by correlated anthropogenic factors like urbanization, corrections can propagate biases across the network. A prominent critique focuses on inadequate mitigation of the urban heat island (UHI) effect, where urban development artificially elevates local temperatures. In their 2007 analysis of gridded global land temperatures from 1979 to 2002, Ross McKitrick and Patrick Michaels regressed temperature anomalies against socioeconomic variables proxying for development (e.g., GDP per capita and energy consumption); they found these factors accounted for roughly 50% of the observed warming trend after adjustments, indicating residual contamination not addressed by standard homogenization. This suggests that adjustments, by blending urban and rural records without sufficient isolation of UHI signals, underestimate the non-climatic contribution to reported trends. Further evidence of methodological flaws emerges from examinations of the "urban blending" phenomenon in homogenization. A 2023 study by Ronan Connolly and colleagues applied synthetic data tests to U.S. and global records, demonstrating that pairwise algorithms inadvertently transfer UHI-induced warming from urban breakpoints to rural stations during offset calculations, creating spurious upward trends in homogenized series by up to 0.2–0.5°C per century in affected regions. 
The authors argued this arises because algorithms prioritize spatial proximity over land-use differences, failing to distinguish climatic from developmental breaks, and recommended incorporating explicit UHI diagnostics absent in current NOAA and Hadley Centre procedures. Station siting quality represents another vector of criticism, with many legacy networks featuring sensors in suboptimal locations (e.g., near asphalt, air conditioning exhausts, or buildings), introducing warm biases estimated at 0.1–1.0°C locally. Evaluations of the U.S. Historical Climatology Network revealed that over 80% of stations prior to 2010 rated poorly (Class 3–5) under NOAA's own siting criteria, yet adjustments primarily target temporal breaks rather than spatial exposure issues. A 2021 assessment concluded that this uncorrected siting bias inflates U.S. temperature estimates by approximately 0.3–0.5°C since 1950, with homogenization exacerbating rather than resolving the discrepancy when compared to pristine networks like the U.S. Climate Reference Network (USCRN), which records lower warming rates post-2005. Critics, including those analyzing USHCN version upgrades, note that iterative adjustments have progressively cooled early-20th-century U.S. records (e.g., by 0.4°F in maximum temperatures from 1930–2000), enhancing apparent century-scale warming by up to 40% relative to raw data in some analyses, though agencies maintain such changes align with independent validations. These concerns are compounded by limited independent auditing and reliance on automated algorithms without routine raw-versus-adjusted trend comparisons against unadjusted high-quality subsets. While proponents cite pairwise method benchmarks showing reduced error variance, detractors argue real-world tests against reference networks like USCRN reveal persistent divergences, underscoring the need for greater transparency in adjustment metadata and breakpoint rationales to verify claims of trend enhancement.

Overall Temperature Anomalies Since 1850

Instrumental records of global surface temperature anomalies, primarily from land stations and sea surface temperatures, begin reliably around 1850 and reveal a net warming of about 1.5°C from the 1850-1900 baseline to 2024. This estimate draws from independent analyses by NASA GISTEMP, HadCRUT, NOAA GlobalTemp, and Berkeley Earth, which converge on a linear trend of roughly 0.06-0.08°C per decade over the full period, with acceleration to 0.18-0.20°C per decade since 1970. Early 20th-century anomalies fluctuated around -0.3 to -0.4°C relative to pre-industrial levels, influenced by natural variability including volcanic eruptions and ocean cycles, before rising steadily post-1950. Spatial coverage was sparse before 1900, concentrated in Europe and North America, leading to larger uncertainties estimated at ±0.1-0.2°C for global means in that era. Datasets employ infilling techniques for unsampled regions, such as oceans and polar areas, which constitute over 70% of Earth's surface; these methods assume anomalies propagate from observed points, introducing potential smoothing of extremes. Despite variations in processing—NASA and NOAA apply homogenization for station relocations and urban effects, while Berkeley Earth uses more stations with minimal adjustments—the post-1950 trends align closely, supporting a robust signal of multi-decadal warming. Annual anomalies reached +1.54°C above 1850-1900 in 2023, the first year to surpass the 1.5°C threshold, with 2024 exceeding it further. Decadal smoothing highlights the progression: 1850-1900 averaged near zero by definition, 1901-1950 showed minimal net change (~+0.1°C), 1951-2000 warmed by ~0.5°C, and 2001-2020 by another ~0.5°C, culminating in the 2020s as the warmest decade. Interannual variability, driven by ENSO, overlays the trend, with peaks in 1998, 2016, and 2023-2024 amplifying recent highs.
Uncertainties diminish over time with denser networks, but pre-1880 estimates carry higher error bars due to fewer than 1,000 stations globally by 1850. Cross-validation with independent buoy and ship data post-1980 confirms the upward trajectory without systematic divergence.
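The decadal and multi-decadal averages cited above amount to simple windowed means over annual anomalies. A sketch with a synthetic, linearly warming series (the helper name and numbers are illustrative):

```python
def period_means(annual, periods):
    """Average (year, anomaly) pairs over inclusive (start, end) windows."""
    out = {}
    for start, end in periods:
        vals = [a for y, a in annual if start <= y <= end]
        out[(start, end)] = sum(vals) / len(vals)
    return out

# Synthetic series warming linearly by 0.01 C/yr from a 1950 zero point.
annual = [(y, 0.01 * (y - 1950)) for y in range(1951, 2021)]
means = period_means(annual, [(1951, 2000), (2001, 2020)])
print(round(means[(1951, 2000)], 3), round(means[(2001, 2020)], 3))
```

Even for a perfectly linear trend, successive window means step upward (here 0.255 then 0.605), which is why comparing period averages understates nothing but also reveals nothing about acceleration on its own.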

Recent Developments (Post-2000 Including 2024-2025)

Global surface temperature anomalies have risen steadily since 2000, with major datasets reporting an average increase of approximately 0.18°C per decade from 2000 to 2020, accelerating in the 2010s and early 2020s. This trend reflects contributions from greenhouse gas forcings amid natural variability, including El Niño-Southern Oscillation (ENSO) cycles. The 1998-2013 interval exhibited slower surface warming rates (about 0.05°C per decade in some records), prompting debate over a potential "hiatus" in the rate of rise relative to prior decades; analyses attributed this to internal variability such as enhanced ocean heat uptake in deeper layers rather than a cessation of anthropogenic warming, with subsequent data confirming resumed acceleration. The decade 2015-2024 marked the warmest on record across datasets from NASA, NOAA, the Met Office, Berkeley Earth, and Copernicus, surpassing the 2011-2020 period by 0.2-0.3°C on average. Eighteen of the nineteen warmest years since 1850 have occurred since 2000, with 2024 confirmed as the hottest globally at 1.28°C above the 1951-1980 baseline (NASA GISTEMP) or approximately 1.55°C above pre-industrial levels (WMO aggregate). This record was influenced by a strong El Niño event peaking in 2023-2024, which amplified surface temperatures, alongside persistent anthropogenic forcings; 24% of Earth's surface set local annual records in 2024 per Berkeley Earth. In 2025, through August, global anomalies ranked second-warmest on record behind 2024, with January-March averaging among the highest starts to a year. April 2025 recorded 1.49°C above the 1850-1900 baseline (Berkeley Earth), the second-highest for that month. A transition to La Niña conditions by mid-2025 moderated extremes, yet anomalies remained elevated above the 1991-2020 average across 91% of the globe as of early 2025.
These developments align across independent datasets, though discrepancies arise from differing baseline periods and adjustment protocols; for instance, Copernicus ERA5 reanalysis shows 2024 at 1.6°C above 1850-1900, slightly higher than land-ocean index estimates.
Year             Anomaly (°C, relative to 1850–1900, Berkeley Earth example)   Notes
2024             ~1.62                                                         Warmest on record; El Niño peak
2023             ~1.54                                                         Second-warmest; record prior to 2024
2020             ~1.02                                                         Third-warmest pre-2023 streak
2025 (Jan–Aug)   ~1.50 (preliminary)                                           Second to 2024; La Niña onset
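The anomaly figures above differ mainly by reference baseline (1951–1980 for GISTEMP, 1850–1900 for pre-industrial comparisons, 1991–2020 for recent climatology). Converting between baselines is simple arithmetic; the sketch below uses the offset implied by the text's own numbers (1.55 − 1.28 ≈ 0.27°C between the 1951–1980 and 1850–1900 means), which is illustrative rather than an official constant.

```python
def rebase(anomaly_c: float, old_baseline_mean_c: float, new_baseline_mean_c: float) -> float:
    """Re-express an anomaly relative to a different baseline.

    The two baseline means must be given on a common absolute scale;
    the anomaly shifts by the difference between them.
    """
    return anomaly_c + (old_baseline_mean_c - new_baseline_mean_c)

# Illustrative: if the 1951-1980 mean sits ~0.27 C above the 1850-1900 mean,
# a +1.28 C anomaly (vs 1951-1980) corresponds to ~+1.55 C (vs 1850-1900).
print(round(rebase(1.28, old_baseline_mean_c=0.27, new_baseline_mean_c=0.0), 2))  # 1.55
```

This is why identical physical conditions can be reported as 1.28°C by one center and ~1.55°C by another without any disagreement about the data.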

Warmest Years, Decades, and Periods

According to multiple independent datasets, including those from NOAA, NASA, and Berkeley Earth, 2024 was the warmest year on record since instrumental measurements began in 1850, with a global surface temperature anomaly of approximately 1.29°C above the 20th-century average. This surpassed the previous record set in 2023, which had an anomaly of about 1.17°C. The third warmest year was 2020, followed by 2016, which was influenced by a strong El Niño event. The ten warmest years in the instrumental record have all occurred within the last decade (2015–2024), reflecting accelerated warming in recent times. This ranking holds across major datasets such as NOAA's GlobalTemp, NASA's GISTEMP, and Berkeley Earth's global temperature series, despite minor variations in exact anomalies due to differences in baseline periods and coverage. For instance, NOAA reports the ranking as 2024, 2023, 2020, 2016, 2022, 2019, 2015, 2017, 2010, and 2005.
Rank   Year   Anomaly (°C, relative to 20th-century average, NOAA)
1      2024   +1.29
2      2023   +1.17
3      2020   +1.02
4      2016   +0.99
5      2022   +0.89
6      2019   +0.86
7      2015   +0.84
8      2017   +0.83
9      2010   +0.72
10     2005   +0.68
The most recent decade, 2015–2024, is the warmest on record, with an average anomaly exceeding previous decades by a significant margin. Earlier decades, such as 2005–2014, ranked second warmest, while pre-1980 decades show much lower anomalies. This decadal warming trend is consistent across the major datasets, which incorporate both land and ocean measurements. Longer periods, such as the ongoing warming phase since the early 2010s, exhibit sustained high temperatures, punctuated by interannual variability from phenomena like El Niño. As of mid-2025, the year is projected to rank among the top three warmest, based on January–July data showing the second-highest anomalies in NOAA's record.
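The year rankings and decadal averages discussed above reduce to straightforward aggregation over the NOAA-style anomaly values listed. A minimal sketch using the table's own numbers:

```python
# Ranking and averaging the NOAA anomaly values from the table above
# (degrees C relative to the 20th-century mean).
noaa_top10 = {2024: 1.29, 2023: 1.17, 2020: 1.02, 2016: 0.99, 2022: 0.89,
              2019: 0.86, 2015: 0.84, 2017: 0.83, 2010: 0.72, 2005: 0.68}

# Sort by anomaly, warmest first
ranked = sorted(noaa_top10.items(), key=lambda kv: kv[1], reverse=True)
print(ranked[0])  # (2024, 1.29) -- warmest on record

# Mean anomaly of the listed 2015-2024 years. Note this is not a full
# decadal mean: the table omits years outside its top ten (e.g., 2018, 2021).
recent = [v for y, v in noaa_top10.items() if 2015 <= y <= 2024]
print(round(sum(recent) / len(recent), 3))  # 0.986
```

The caveat in the comment matters: a true decadal average needs every year of the decade, not just those that make a top-ten list.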

Influencing Factors and Attribution

Natural Climate Variability

Natural climate variability encompasses fluctuations in global surface temperature arising from internal dynamics of the climate system, such as ocean-atmosphere interactions, and external natural forcings like variations in solar irradiance and volcanic aerosol emissions. These processes generate short-term to multidecadal variations superimposed on longer-term trends. The El Niño-Southern Oscillation (ENSO) represents a primary mode of interannual variability, with El Niño phases typically elevating global mean surface temperatures by 0.1 to 0.25 °C relative to neutral conditions, while La Niña phases produce cooling of similar magnitude. For instance, the strong 2023 El Niño contributed to a rapid 0.29 ± 0.04 K rise in global-mean surface temperature from 2022 to 2023. ENSO influences arise from altered heat redistribution in the Pacific Ocean, affecting atmospheric circulation and global energy balance. On decadal to multidecadal timescales, oscillations such as the Atlantic Multidecadal Oscillation (AMO) and the Pacific Decadal Oscillation (PDO) drive low-frequency variations in sea surface temperatures that partially imprint on global averages. The AMO, characterized by North Atlantic SST anomalies, and the PDO, involving Pacific basin-wide patterns, exhibit phases lasting 20-60 years, with positive phases associated with hemispheric warming influences that can contribute up to 0.1-0.2 °C to global temperature anomalies during their peaks. These modes reflect internal ocean dynamics and do not exhibit a net trend over the instrumental period, unlike the observed century-scale warming. Solar variability, primarily through the 11-year sunspot cycle, modulates total solar irradiance by approximately 1 W/m², corresponding to global temperature perturbations of about 0.1 °C. However, reconstructions indicate that solar forcing has been stable or slightly declining since the mid-20th century, exerting minimal influence on recent warming.
Volcanic eruptions provide episodic cooling through stratospheric sulfate aerosols that reflect sunlight, with major events like the 1991 Mount Pinatubo eruption inducing a global temperature drop of roughly 0.5 °C persisting for 1-3 years. Such effects are transient, as aerosols settle out within a few years, and do not offset multidecadal trends.
Variability Mode          Timescale                Approximate Global Temperature Impact
ENSO                      Interannual              ±0.1–0.3 °C
PDO/AMO                   Decadal–Multidecadal     ~0.1–0.2 °C, phase-dependent
Solar Cycle               ~11 years                ~0.1 °C
Major Volcanic Eruption   Episodic (1–3 years)     −0.5 °C
Collectively, these natural factors account for much of the observed year-to-year and decadal-scale fluctuations in global surface temperature records but fail to explain the sustained multi-decadal rise when considered in isolation from anthropogenic forcings.
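Separating the ENSO imprint from the forced trend, as the attribution studies above do, is commonly approached with lagged linear regression of global temperature on an ENSO index. A minimal sketch with synthetic data (the index, lag, and coefficients are illustrative assumptions, not values from any real dataset):

```python
# Sketch: removing ENSO's imprint from a temperature series by lagged
# regression on an ENSO index. All series here are synthetic stand-ins;
# real analyses use an observed index (e.g., Nino 3.4) lagged ~3-6 months.
import numpy as np

rng = np.random.default_rng(0)
n = 240  # 20 years of monthly values
enso = np.sin(np.arange(n) * 2 * np.pi / 48) + 0.3 * rng.standard_normal(n)
trend = (0.02 / 12) * np.arange(n)        # ~0.2 C/decade forced warming
temp = trend.copy()
temp[4:] += 0.15 * enso[:-4]              # ENSO leads temperature by 4 months
temp += 0.05 * rng.standard_normal(n)     # measurement/weather noise

lag = 4
x, y = enso[:-lag], temp[lag:]
slope = np.polyfit(x, y, 1)[0]            # C per unit ENSO index
adjusted = y - slope * x                  # ENSO-adjusted series
print(0.08 < slope < 0.22)                # regression recovers the ~0.15 coefficient
```

The adjusted residual still contains the underlying trend, which is the point: ENSO modulates individual years (like 2023–2024) without being able to produce the century-scale rise.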

Anthropogenic and Solar Forcings

Anthropogenic forcings primarily arise from emissions of well-mixed greenhouse gases (GHGs) such as carbon dioxide (CO₂), methane (CH₄), and nitrous oxide (N₂O), along with tropospheric ozone increases and aerosol effects. The effective radiative forcing (ERF) from CO₂ alone reached approximately 2.16 W/m² from 1750 to 2019, calculated via the logarithmic dependence of absorption on concentration, with atmospheric CO₂ rising from about 280 ppm to over 410 ppm by 2019 due to fossil fuel combustion, deforestation, and cement production. Total ERF from all well-mixed GHGs is estimated at around 3.24 W/m² over the same period, partially offset by a negative aerosol ERF of -1.3 [-2.0 to -0.6] W/m² from sulfate and black carbon particles that reflect sunlight and alter cloud properties. Net anthropogenic ERF thus totals about 2.72 [1.96 to 3.48] W/m² since pre-industrial times, with GHGs dominating the positive component and exerting a sustained influence on the energy budget imbalance. Solar forcings stem from variations in total solar irradiance (TSI), the primary measure of incoming solar energy at Earth's top-of-atmosphere, which fluctuates on cycles of about 11 years due to sunspot activity and longer-term trends. Reconstructions of TSI since 1850 indicate an increase of roughly 0.1 to 0.2 W/m² from the late 19th century to the mid-20th century, driven by rising sunspot numbers during that era, but with minimal net change or slight decline thereafter; for instance, satellite measurements since 1978 show TSI varying by about 1 W/m² over solar cycles but no significant upward trend, and a negative decadal trend of -0.15 W/m² since 1980. The ERF from these solar changes is small, typically 0.05 to 0.17 W/m² over the full instrumental period, as TSI variations are on the order of 0.1% and primarily affect the shortwave spectrum without strong feedback amplification in standard models. 
Indirect solar influences, such as ultraviolet variations affecting stratospheric ozone or cosmic ray modulation of clouds, remain debated and unquantified in mainstream ERF assessments due to insufficient empirical support for significant global surface impacts. Attribution analyses using detection and attribution methods, which compare observed temperature changes to model simulations with and without specific forcings, indicate that anthropogenic GHGs explain the majority of post-1950 global surface warming, with natural forcings including solar contributing negligibly, or even providing a slight cooling offset, in recent decades. Simulations isolating solar forcing alone yield temperature responses of less than 0.1°C since 1850, far below the observed ~1.1°C rise, while including anthropogenic forcings matches the trend, including the divergence after the 1950s when solar activity plateaued amid rising GHG concentrations. Peer-reviewed studies consistently attribute over 100% of recent warming to anthropogenic factors when accounting for natural variability, as solar and volcanic forcings combined would have produced modest cooling without human influences. However, uncertainties persist in aerosol forcing sign and magnitude, and some reconstructions suggest early 20th-century warming aligned more closely with solar peaks, highlighting potential underestimation of solar sensitivity in certain models, though empirical TSI data limits its overall causal role.
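The "logarithmic dependence of absorption on concentration" mentioned above is often written in the simplified Myhre et al. (1998) form, ΔF = 5.35 ln(C/C₀). The AR6 figure of ~2.16 W/m² uses newer expressions with additional band-overlap terms, so the classic approximation comes in slightly lower; a sketch:

```python
import math

def co2_forcing_simple(c_ppm: float, c0_ppm: float = 280.0) -> float:
    """Simplified logarithmic CO2 radiative forcing (Myhre et al. 1998),
    in W/m^2. Updated AR6-era expressions add band-overlap corrections
    and yield slightly larger values (~2.16 W/m^2 for 1750-2019)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

print(round(co2_forcing_simple(410.0), 2))  # ~2.04 with this approximation
```

The logarithm is why each successive CO₂ increment adds less forcing than the last: going from 280 to 560 ppm adds the same forcing as going from 560 to 1120 ppm.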

Uncertainties in Causal Mechanisms

Equilibrium climate sensitivity (ECS), defined as the long-term global surface temperature response to a doubling of atmospheric CO2 concentration, remains a primary source of uncertainty in projecting future warming, with a long-standing assessed range of 1.5°C to 4.5°C narrowed to a likely range of 2.5°C to 4°C in IPCC AR6, though some recent analyses suggest the lower bound may be underestimated due to methodological biases in paleoclimate proxies and model tuning. Transient climate response (TCR), the shorter-term response under ongoing forcing, exhibits similar ranges, complicating attribution of observed trends to specific forcings as models diverge in simulating historical warming patterns. These uncertainties arise partly from incomplete constraints on radiative feedbacks, where water vapor and lapse rate feedbacks are relatively well-understood as positive but modest contributors (around 1-2 W/m² per degree), while cloud feedbacks introduce larger variability, potentially amplifying or dampening warming by 0.5-2 W/m² per degree depending on low-cloud responses to sea surface temperature changes. Aerosol radiative forcing, particularly from sulfate particles, exerts a cooling effect that masks anthropogenic greenhouse gas warming, but estimates of its magnitude vary widely, with assessed ranges spanning roughly -0.6 to -2.0 W/m², and indirect cloud-aerosol interactions add further ambiguity as models struggle to replicate observed aerosol-cloud precipitation suppression. This forcing uncertainty propagates into attribution studies, where reducing aerosol emissions could unmask additional warming of 0.5-1°C by 2100 under moderate scenarios, yet observational constraints remain sparse due to regional variability and short aerosol lifetimes.
Natural forcings, such as solar irradiance variations, contribute uncertainties of ±0.1 W/m² over decadal scales, with reconstructions differing by up to 0.2 W/m² due to proxy limitations like sunspot records, potentially explaining 10-20% of early 20th-century warming but less in recent decades. Internal climate variability, including ENSO and multi-decadal oscillations like the AMO, introduces noise that can account for 0.1-0.3°C fluctuations over decades, masking or exaggerating anthropogenic signals in short records and leading to attribution confidence levels below 90% for sub-centennial trends in some regions. Attribution models often underestimate this variability, as evidenced by CMIP6 ensembles failing to capture observed multidecadal oscillations, which inflates the attributed fraction of warming to greenhouse gases (typically 80-100% since 1950) while downplaying natural contributions. Ocean heat uptake efficiency adds another layer, with diffusion models implying slower surface warming than abrupt adjustments, but parametric uncertainties in mixing processes yield TCR spreads of 20-30%. Overall, these mechanisms underscore that while anthropogenic forcings dominate centennial-scale trends, precise partitioning remains hindered by model-observation discrepancies and forcing estimate errors, with emergent constraints from satellite data suggesting ECS values toward the lower half of prior ranges in some analyses.
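Why feedback uncertainty translates into a wide ECS range can be seen in the simplest energy-balance relation, ECS = F₂ₓ/λ, where F₂ₓ is the forcing from doubled CO₂ (~3.9 W/m² in AR6-era estimates) and λ the net feedback parameter. The λ values below are illustrative, not assessed figures:

```python
# Sketch: a modest spread in the net feedback parameter lambda
# (W/m^2 per K of warming) maps into a large spread in ECS,
# because ECS = F_2x / lambda. Values are illustrative.
f_2x = 3.9  # W/m^2, approximate forcing for doubled CO2

for lam in (1.6, 1.3, 1.0, 0.8):  # stronger to weaker net radiative damping
    print(f"lambda={lam:.1f} W/m^2/K -> ECS={f_2x / lam:.1f} C")
```

A ~2x spread in λ produces a ~2x spread in ECS, which is why cloud-feedback uncertainty dominates the sensitivity debate described above.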

Dataset Comparisons and Discrepancies

Consistency Across Major Records

Major global surface temperature datasets, including HadCRUT from the UK Met Office and University of East Anglia, NASA's GISTEMP, NOAA's GlobalTemp, Berkeley Earth's land-ocean record, and Japan's JMA dataset, demonstrate strong consistency in their depiction of a long-term warming trend since 1850, with anomalies rising approximately 1.1–1.2°C relative to pre-industrial baselines. This alignment persists despite variations in data sources, such as the use of weather station networks, ship and buoy measurements for sea surface temperatures, and differing interpolation methods for sparse regions like the Arctic. All datasets concur on the identification of recent years as the warmest in the instrumental record, with the World Meteorological Organization confirming 2024 as the hottest year based on six international analyses, exceeding 2023 by a margin of about 0.1°C on average. The past decade (2015–2024) ranks as the warmest, with decadal averages showing anomalies 0.5–0.6°C above the 20th-century mean across records. Methodological differences contribute to minor variations in absolute anomalies, typically within 0.05–0.1°C for overlapping periods, but normalized time series track closely over multi-decadal scales. For instance, Berkeley Earth employs kriging to estimate polar gaps, resulting in slightly higher recent anomalies compared to HadCRUT's earlier versions, which avoided infilling and thus underestimated Arctic amplification. Drawing from over 39,000 station records with minimal adjustments, Berkeley Earth independently validates the trend while showing modestly greater 21st-century warming due to enhanced Arctic coverage. Short-term fluctuations, such as the 1998 El Niño peak or the 2023–2024 warming spike, are captured similarly across datasets, though rankings of pre-2010 years occasionally differ (e.g., 1998 versus 2005 as warmest in some older analyses). These do not undermine the shared upward trajectory, as confirmed by ensemble comparisons showing correlation coefficients exceeding 0.99 for post-1950 anomalies.
Reanalyses like ERA5 and JRA-55 further corroborate the observations, reinforcing consistency between surface records.
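The cross-dataset correlation check mentioned above is insensitive to baseline offsets, which is why records on different baselines can still correlate at >0.99. A sketch with synthetic stand-ins for two dataset versions (the trend and noise levels are illustrative):

```python
# Sketch: Pearson correlation between two anomaly series that share a
# common climate signal but differ in noise realization and baseline.
# Both series are synthetic; numbers are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1950, 2025)
signal = 0.018 * (years - 1950)                       # ~0.18 C/decade shared trend
ds_a = signal + 0.05 * rng.standard_normal(years.size)
ds_b = signal + 0.05 * rng.standard_normal(years.size) + 0.1  # baseline offset

r = np.corrcoef(ds_a, ds_b)[0, 1]
print(r > 0.95)  # the constant 0.1 offset does not affect correlation
```

Correlation measures co-variation, not agreement in absolute level, so it complements rather than replaces the baseline-normalized comparisons discussed earlier.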

Surface vs. Satellite/Tropospheric Differences

Surface temperature records, compiled from thermometers at land stations measuring air temperatures approximately 2 meters above ground and from sea surface temperature (SST) observations via ships, buoys, and Argo floats, provide global near-surface anomaly estimates. The major surface datasets report linear warming trends of approximately 0.18–0.20°C per decade over the satellite era (1979–present). Satellite datasets, derived from Microwave Sounding Unit (MSU) and Advanced MSU (AMSU) instruments aboard NOAA polar-orbiting satellites, infer tropospheric temperatures by detecting microwave emissions from atmospheric oxygen, with channels weighted to represent bulk air temperatures in specific layers. The lower troposphere (TLT) channel, used for global comparisons, averages temperatures from near the surface up to about 6–8 km in mid-latitudes and higher in the tropics, emphasizing the free troposphere over the boundary layer. The University of Alabama in Huntsville (UAH) dataset yields a global TLT trend of +0.15°C per decade from January 1979 to January 2025, while the Remote Sensing Systems (RSS) dataset indicates a higher +0.184°C per decade (through recent updates). These methodological differences contribute to observed divergences: surface records capture boundary-layer effects like urban heat islands and SST variability, whereas satellite TLT integrates a thicker atmospheric column less influenced by local surface features but sensitive to convective processes aloft. Globally, UAH trends lag surface records by about 20–25%, though RSS aligns more closely; discrepancies widen in the tropics, where surface warming exceeds TLT rates in UAH data. Climate models project enhanced tropospheric warming relative to the surface under greenhouse gas forcing, particularly in the tropical mid-troposphere, due to moist adiabatic lapse rate adjustments and water vapor amplification, yielding expected amplification factors of 1.2–1.5 for TLT versus surface trends.
Radiosonde and satellite observations, however, reveal weaker amplification, with tropical TLT trends often matching or falling below surface rates (e.g., amplification factors near 1.0 or less in adjusted datasets). This "tropical tropospheric discrepancy" persists post-adjustments for satellite orbital decay, diurnal drift, and radiosonde inhomogeneities, prompting explanations ranging from internal variability (e.g., Pacific Decadal Oscillation phases suppressing upper-air trends) to potential model overestimation of feedbacks or forcing responses. UAH and RSS divergences stem from processing choices, including UAH's emphasis on structural uncertainty and conservative drift corrections versus RSS's inclusion of additional calibration data, leading UAH to report systematically lower trends. Independent validations via reanalyses (e.g., ERA5) and balloon data partially bridge gaps but confirm slower mid-tropospheric rates than multimodel means since 2000. These differences underscore challenges in vertical consistency testing for attribution, as surface-troposphere alignment is a fingerprint of radiative forcing, yet unresolved mismatches fuel debate over equilibrium climate sensitivity estimates derived from models.
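The per-decade trends and the amplification factor discussed above come from ordinary least-squares fits of anomaly against time. A sketch with illustrative (not official) trend values:

```python
# Sketch: least-squares decadal trend and the TLT/surface amplification
# ratio. The 0.019 and 0.015 C/yr slopes are illustrative stand-ins,
# not values taken from any published dataset.
import numpy as np

def trend_per_decade(years, anomalies):
    """OLS slope of anomaly vs year, expressed per decade."""
    slope_per_year = np.polyfit(years, anomalies, 1)[0]
    return slope_per_year * 10.0

years = np.arange(1979, 2025)
surface = 0.019 * (years - 1979)   # ~0.19 C/decade surface series
tlt = 0.015 * (years - 1979)       # ~0.15 C/decade lower-troposphere series

s, t = trend_per_decade(years, surface), trend_per_decade(years, tlt)
print(round(t / s, 2))  # amplification factor ~0.79 in this toy case,
                        # below the 1.2-1.5 models project for the tropics
```

Real comparisons also require matching the time periods and handling autocorrelation in the residuals when quoting trend uncertainties.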

Implications of Divergences

The divergences observed between surface temperature records and satellite-based tropospheric measurements, as well as among surface datasets themselves, widen the uncertainty range for global warming trends and challenge certain assumptions in climate modeling. Satellite records, such as the University of Alabama in Huntsville (UAH) dataset, indicate a lower tropospheric warming rate of approximately 0.13 °C per decade from 1979 through recent years, compared to surface estimates from NOAA and NASA GISS averaging 0.16–0.18 °C per decade over the same period. These differences arise partly from measurement methodologies—surface data relying on ground stations prone to urban heat island effects and adjustments, versus satellites capturing bulk atmospheric temperatures less influenced by local biases—but persist even after intercomparisons. Such variability implies that the consensus warming rate may be overstated in surface-centric analyses, affecting estimates of total warming since pre-industrial times by up to 20–30% depending on the dataset selected. A key implication concerns the tropical troposphere, where general circulation models predict warming amplification (1.5–2 times surface rates) due to enhanced convection under greenhouse forcing, yet satellite observations show rates closer to or below surface levels, with models exhibiting roughly twice the observed trend since 1979. This discrepancy has prompted debates over equilibrium climate sensitivity (ECS), with some analyses suggesting it points to lower ECS values (around 1.5–2.5 °C per CO2 doubling) than the 3 °C or higher implied by many models, potentially halving projected end-of-century warming under moderate emissions scenarios. 
Mainstream studies, attributing the gap to natural multidecadal variability such as Pacific Decadal Oscillation phases or to uncorrected satellite orbital decay, argue the mismatch does not undermine anthropogenic attribution but highlights transient internal dynamics masking forced signals. Persistent unresolved differences, however, underscore limitations in model physics, particularly lapse rate feedbacks and cloud responses, eroding confidence in high-sensitivity projections used for impact assessments. From an attribution standpoint, slower tropospheric trends relative to surfaces raise questions about the completeness of greenhouse gas dominance, as they align more closely with solar or oceanic variability influences in unadjusted data, though homogenized records minimize such signals through infilling and corrections. Datasets like RSS, after version 4 updates in 2017, show trends converging toward surface rates (about 5% higher warming than earlier estimates), but UAH's lower values continue to fuel arguments that surface adjustments systematically amplify trends by 10–20% post-1950. These inconsistencies imply that overreliance on any single record could bias detection of acceleration claims, such as the post-2010 warming surge, which satellites attribute more to El Niño transients than secular forcing. Policy-wise, the implications favor empirical caution over model-dependent alarmism, as divergent datasets expand plausible future scenarios and question the justification for interventions predicated on uniform, rapid warming. For example, if satellite-indicated rates prevail, the likelihood of exceeding 2 °C by 2100 under business-as-usual emissions drops significantly, reducing projected economic damages and adaptation needs.
Analyses emphasizing these uncertainties argue against policies like net-zero transitions that assume high-confidence ECS and neglect natural variability's role, potentially diverting resources from verifiable benefits such as technological innovation. Conversely, reconciling efforts maintain that core warming signals cohere across records, supporting mitigation urgency, though the empirical breadth of divergences necessitates prioritized investment in observational improvements over premature regulatory commitments.

Long-Term Historical Context

Proxy Data from the Last Millennium

Proxy reconstructions of global surface temperatures over the last millennium rely on indirect indicators such as tree-ring widths, ice-core oxygen isotopes, coral growth bands, lake and marine sediments, and borehole thermometry, which provide estimates of past climate variability prior to widespread instrumental records beginning around 1850. These proxies exhibit varying sensitivities to local conditions, seasonal biases, and non-temperature factors like precipitation or solar activity, necessitating statistical methods to aggregate them into hemispheric or global series with associated uncertainty ranges that widen toward earlier centuries. Major multiproxy efforts, such as the PAGES 2k Consortium's database encompassing over 600 records, indicate that global mean temperatures from 1000 to 1850 AD fluctuated within a range of approximately 0.5–1°C, with cooler conditions during the Little Ice Age (LIA, roughly 1450–1850 AD) featuring global anomalies of -0.3 to -0.6°C relative to the 19th-century baseline, linked to reduced solar irradiance, volcanic eruptions, and possibly ocean circulation changes. The preceding Medieval Warm Period (MWP, circa 950–1250 AD) shows evidence of warmer-than-LIA conditions in many Northern Hemisphere regions, with proxy-derived anomalies up to +0.2°C globally in some syntheses, though spatial incoherence and limited Southern Hemisphere coverage prevent consensus on a synchronous global peak exceeding pre-industrial levels. Reconstructions like those from Mann et al. (2009) and PAGES 2k (2017) emphasize low pre-1850 variability and portray recent warming as unprecedented in rate and magnitude over the millennium, with global temperatures since 2000 AD diverging sharply upward by 0.8–1°C above the 1000–1850 mean.
However, methodological critiques highlight issues such as proxy screening biases, principal component analysis artifacts producing "hockey stick" shapes even from red-noise inputs, and the post-1960 divergence in tree-ring data failing to track observed warming, suggesting potential underestimation of past variability or over-reliance on temperature-sensitive proxies. Alternative analyses incorporating borehole and non-tree-ring proxies yield higher medieval warmth, with some estimating MWP global means comparable to or exceeding mid-20th-century levels, underscoring uncertainties in spatial extrapolation and calibration. Comparisons across datasets reveal consistency in LIA cooling but divergence on MWP amplitude, with Northern Hemisphere series often showing stronger medieval peaks than global averages due to sparse tropical and Southern data; error bars in early periods can exceed ±0.5°C, complicating claims of modern uniqueness. These proxies affirm centennial-scale oscillations driven by natural forcings but exhibit limitations in resolving decadal extremes or global synchroneity, informing debates on the baseline for attributing 20th–21st-century changes.
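One widely used aggregation approach behind such multiproxy series is composite-plus-scale (CPS): standardize each proxy, average into a composite, then scale the composite to match the instrumental overlap. A toy sketch with synthetic proxies (all series and parameters are illustrative, not real data):

```python
# Sketch of composite-plus-scale (CPS) reconstruction with synthetic data.
# Real applications weight proxies, screen for temperature sensitivity,
# and propagate calibration uncertainty -- all omitted here.
import numpy as np

rng = np.random.default_rng(2)
n_years, n_proxies = 1000, 8
target = np.cumsum(0.01 * rng.standard_normal(n_years))  # "true" climate history
# Each proxy records the target with its own gain plus noise
proxies = target[None, :] * rng.uniform(0.5, 2.0, (n_proxies, 1)) \
          + 0.3 * rng.standard_normal((n_proxies, n_years))

# 1) Standardize each proxy, 2) composite by averaging
z = (proxies - proxies.mean(axis=1, keepdims=True)) / proxies.std(axis=1, keepdims=True)
composite = z.mean(axis=0)

# 3) Scale composite to the "instrumental" overlap (last 150 years)
cal = slice(-150, None)
scale = target[cal].std() / composite[cal].std()
offset = target[cal].mean() - scale * composite[cal].mean()
recon = scale * composite + offset

corr = np.corrcoef(recon, target)[0, 1]
print(corr > 0.5)  # composite tracks the underlying signal
```

The sketch also illustrates the critiques above: if proxies lose sensitivity outside the calibration window (as in the tree-ring divergence problem), the scaling step can understate past variance.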

Holocene and Quaternary Variations

The Quaternary Period, from approximately 2.58 million years ago to the present, features cyclic glacial-interglacial oscillations driven primarily by Milankovitch orbital forcings, with amplifying feedbacks from greenhouse gases, ice sheet dynamics, and vegetation changes. Global mean surface temperatures during these cycles varied by roughly 4–6 °C, with interglacial peaks comparable to the pre-industrial Holocene and glacial maxima, such as the Last Glacial Maximum around 21,000 years ago, being about 5 °C cooler on average. Antarctic ice core records, including the EPICA Dome C core spanning the last 800,000 years, reveal at least eight full glacial-interglacial transitions, with deuterium isotope proxies indicating temperature swings of up to 10 °C at the site, corresponding to global changes scaled by polar amplification factors of about 2:1. The Holocene epoch, initiating around 11,700 years before present after rapid deglaciation from the Pleistocene, exhibited initial warming followed by the Holocene Thermal Maximum (HTM) roughly 9,000–5,000 years ago, during which multi-proxy reconstructions estimate global mean surface temperatures were approximately 0.5–0.7 °C warmer than the late 19th-century baseline, with enhanced warmth at northern high latitudes due to boreal summer insolation peaks. This HTM phase supported expanded forests and reduced sea ice extent in proxy-indicated regions, though global estimates carry uncertainties from uneven proxy distributions and calibration variances across marine and terrestrial records. Post-HTM, Holocene temperatures followed a gradual cooling trajectory of about −0.08 °C per millennium on average, attributed to declining orbital insolation and feedback responses, culminating in cooler conditions during the Neoglacial advance around 4,000–2,000 years ago and further into the Little Ice Age.
Reconstructions like those from 73 globally distributed proxies confirm this trend until the instrumental era, with Holocene global means generally within 1 °C of pre-industrial levels, highlighting the epoch's relative stability amid natural forcings but also underscoring proxy resolution limits for precise global averaging. Despite debates over the exact amplitude of HTM warmth—some analyses suggesting regional rather than uniform global peaks—the empirical proxy ensemble supports a mid-Holocene thermal optimum milder than often portrayed in earlier narratives, without exceeding recent anthropogenic-driven rates of change.
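The cited post-HTM cooling rate can be sanity-checked with simple arithmetic: sustained over the roughly five millennia from the HTM's end to the pre-industrial era, it accounts for the cooler Neoglacial/LIA conditions described above.

```python
# Worked arithmetic for the post-HTM cooling cited above: an average
# rate of about -0.08 C per millennium over ~5,000 years.
rate_per_millennium = -0.08  # C per 1,000 years
millennia = 5.0              # ~5,000 years from HTM end to pre-industrial
total_cooling = rate_per_millennium * millennia
print(round(total_cooling, 2))  # ~-0.4 C cumulative
```

A ~0.4 °C cumulative decline is broadly consistent with dropping from HTM warmth of +0.5–0.7 °C back toward the late-19th-century baseline.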

Deep-Time Geological Evidence

Deep-time geological evidence for global surface temperatures is derived from paleothermometric proxies embedded in sedimentary archives, including marine microfossils, carbonates, and organic biomarkers. Stable oxygen isotope ratios (δ¹⁸O) in biogenic apatite and calcite, such as from conodonts, brachiopods, and foraminifera, serve as a foundational proxy, with δ¹⁸O inversely related to formation temperature via fractionation equations, typically calibrated as 0.22‰ decrease per 1°C warming assuming constant seawater δ¹⁸O. Inorganic proxies like clumped isotopes (Δ₄₇) in carbonates provide absolute temperatures independent of seawater composition by measuring ¹³C-¹⁸O bonding preferences. Organic tracers, including the TEX₈₆ index from thaumarchaeotal GDGT lipids and alkenone Uᴷ'₃₇ ratios from haptophyte algae, yield sea surface temperature estimates via empirical calibrations (e.g., the low-latitude linear TEX₈₆ calibration T ≈ 56.2 × TEX₈₆ − 10.8 °C). The PhanSST compilation aggregates 150,691 records from these five proxy types across over 1,600 sites, spanning the Phanerozoic (539 Ma–present), facilitating Bayesian or model-assisted reconstructions while metadata flags diagenetic risks. Phanerozoic GMST reconstructions, integrating proxy ensembles with Earth system models via data assimilation (e.g., PhanDA), reveal fluctuations from 11°C in icehouse phases to 36°C in hothouse intervals over 485 Ma, a broader range than δ¹⁸O-only estimates (14–34°C) due to refined calibrations and ice volume corrections. Paleozoic evidence shows Early Cambrian–Ordovician warmth (∼25–35°C GMST) from equatorial δ¹⁸O and widespread reef carbonates, punctuated by Late Ordovician glaciation (∼445 Ma, GMST ∼15°C) evidenced by tillites and δ¹⁸O spikes despite elevated CO₂ (>3,000 ppm). Late Paleozoic cooling (Carboniferous–Permian, ∼300–250 Ma, GMST ∼12–18°C) is indicated by Karoo glaciation proxies, including dropstones and high-latitude coals, reflecting Gondwanan ice sheets.
Mesozoic greenhouse conditions (∼200–66 Ma, GMST ∼20–30°C) emerge from low δ¹⁸O in belemnites and rudist bivalves, with TEX₈₆ confirming tropical SSTs >30°C and minimal polar ice via sedimentological absence of glaciomarine deposits. Cenozoic records document a secular cooling trend from Paleogene hothouse (Eocene, ∼50 Ma, GMST ∼23–27°C) to Neogene icehouse, with Paleocene-Eocene Thermal Maximum (PETM, 56 Ma) δ¹⁸O excursions signaling +5–8°C transients amid benthic crises. Oligocene (∼34–23 Ma) Antarctic ice expansion is traced via benthic foraminiferal δ¹⁸O shifts (+1‰, implying 4–5°C deep-water cooling) and seismic evidence of ice-rafted debris, while Pliocene warmth (∼3–5 Ma, GMST ∼2–3°C above pre-industrial) features Mg/Ca-inferred SSTs 2–4°C warmer than today, correlating with higher sea levels (∼20 m). These patterns are reconstructed from deep-sea cores (e.g., ODP/IODP sites) and outcrop data, with spatial emphasis on Atlantic and Tethyan margins. Proxy limitations introduce systematic errors: δ¹⁸O assumes negligible ice volume and stable seawater δ¹⁸O, yet evidence of Paleozoic and Mesozoic glaciations (e.g., Andean-Saharan ice caps) implies uncorrected records underestimate pre-Quaternary warmth by 2–5°C; vital effects, local seawater composition variations, and diagenetic recrystallization further bias estimates toward cooler values. Low-latitude dominance in archives (>70% of PhanSST data) may mask polar signals, while organic proxies like TEX₈₆ exhibit regional calibration variances (±2–3°C). Discrepancies arise in CO₂-temperature scaling, with some analyses attributing mismatches to non-greenhouse forcings like cosmic ray flux modulating cloud cover, yielding empirically lower climate sensitivity (∼1.7°C per CO₂ doubling) than model-derived values (3–8°C). Multi-proxy averaging mitigates single-method flaws, but temporal gaps (>10 Ma in some intervals) and proxy-model mismatches underscore the need for causal validation beyond correlations.
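Two of the calibration relations above can be expressed as simple conversion functions. These are deliberately simplified: the TEX₈₆ form is the Kim et al. (2008) low-latitude linear calibration, and the δ¹⁸O conversion uses the ~0.22‰ per °C sensitivity with seawater δ¹⁸O held constant; real reconstructions use Bayesian calibrations and correct for ice volume.

```python
# Sketch: simplified paleotemperature conversions. Not suitable for
# actual reconstruction work -- no seawater-composition, ice-volume,
# or calibration-uncertainty handling.

def sst_from_tex86_linear(tex86: float) -> float:
    """Low-latitude linear TEX86 calibration (Kim et al. 2008 form):
    SST ~= 56.2 * TEX86 - 10.8 (deg C)."""
    return 56.2 * tex86 - 10.8

def delta_t_from_d18o(d18o_sample: float, d18o_reference: float) -> float:
    """Warming implied by a carbonate d18O shift, at ~0.22 permil
    decrease per 1 C of warming (constant seawater d18O assumed)."""
    return (d18o_reference - d18o_sample) / 0.22

print(round(sst_from_tex86_linear(0.7), 1))    # ~28.5 C
print(round(delta_t_from_d18o(-1.1, 0.0), 1))  # ~+5.0 C warming
```

The second function makes the PETM-scale arithmetic concrete: a −1‰ to −1.8‰ δ¹⁸O excursion maps to roughly the +5–8 °C transient cited above, if seawater composition is unchanged.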

Robustness, Debates, and Alternative Interpretations

Strengths of the Evidence Base

The convergence of independent datasets constitutes a primary strength, as analyses from disparate teams employing distinct methodologies yield highly consistent long-term trends. Berkeley Earth's land/ocean temperature record, developed using over 39,000 station series and novel statistical methods for spatial interpolation, aligns closely with NASA's GISTEMP, NOAA's GlobalTemp, and the UK Met Office's HadCRUT5, all reporting approximately 1.1–1.2 °C of warming from 1850–1900 to recent decades. This agreement holds despite Berkeley Earth originating from skeptic-funded efforts to scrutinize prior records, incorporating records from underrepresented regions and avoiding assumptions of prior homogeneity that could bias results upward. Homogenization techniques, essential for addressing non-climatic artifacts like station moves or instrument changes, have undergone rigorous validation against synthetic datasets embedding known discontinuities, confirming their efficacy in detecting breaks with minimal trend distortion—typically recovering adjustments within 0.1–0.2 °C of truth in controlled tests. Pairwise comparison algorithms, as implemented in datasets like HadCRUT and Berkeley Earth, leverage neighboring records to identify and correct inhomogeneities statistically, with empirical studies demonstrating reduced variance in homogenized series compared to raw series while preserving underlying climatic signals. Adjustments for urban heat island (UHI) effects further bolster reliability, as rural-station subsets in Berkeley Earth and NOAA analyses exhibit warming rates comparable to all-station averages (around 0.7 °C per century over land since 1900), indicating that urbanization contributes negligibly to global-scale trends after correction—estimated at less than 0.05 °C per decade globally.
Standardized instrumentation, such as Stevenson screens protecting thermometers from direct sunlight and precipitation since the mid-19th century, ensures measurement consistency grounded in thermodynamic principles, with networks like GHCN documenting over 7,000 long-term stations worldwide for robust spatial coverage. Integration of ocean data from ship measurements, buoys, and floats since the 1980s provides complementary evidence, with blended land-ocean indices showing synchronized decadal variability and trends across datasets, validated by cross-checks against independent proxies like borehole temperatures that corroborate 20th-century warming without reliance on surface records alone. This multi-proxy consistency and the sheer volume of underlying observations—millions of daily readings archived in repositories like GHCN—underpin the evidence base's resilience to isolated data gaps or regional biases.
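Blending station and ocean grids into a single global index relies on area weighting, which can be sketched as follows. This is a minimal illustration assuming one anomaly value per grid cell; operational datasets use full latitude-longitude grids, coverage masks, and uncertainty models.

```python
# Sketch: area-weighted global mean from gridded anomalies (illustrative).
# Grid-cell area scales with cos(latitude), so polar cells get less weight.
import math

def global_mean(anomalies_by_lat):
    """anomalies_by_lat: list of (latitude_deg, anomaly_C) grid-cell values."""
    num = sum(a * math.cos(math.radians(lat)) for lat, a in anomalies_by_lat)
    den = sum(math.cos(math.radians(lat)) for lat, _ in anomalies_by_lat)
    return num / den

# A 3-degree-warm Arctic cell moves the global mean far less than an
# equally warm tropical cell would:
cells = [(80.0, 3.0), (0.0, 1.0), (-40.0, 1.0)]
print(round(global_mean(cells), 3))  # ~1.179, close to the low-latitude cells
```

This weighting is also why limited polar coverage produces the small cold bias noted for some analyses: missing high-latitude cells are effectively assigned the (cooler) average of the sampled region.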

Major Controversies and Skeptical Perspectives

The Surface Stations Project, initiated by Anthony Watts in 2007, surveyed over 80% of U.S. Historical Climatology Network (USHCN) stations and determined that 89% failed to meet NOAA's siting standards, which require sensors to be at least 100 feet from artificial heating or radiating surfaces and 30 feet from asphalt. Poorly sited stations, often relocated near urban infrastructure like air-conditioning exhausts or tarmac, were argued to introduce spurious warming biases exceeding 2°C in daily maximum temperatures at some locations. A subsequent analysis by Watts and co-authors, using station exposure ratings, found that trends from well-sited rural stations (Class 1 and 2) showed 0.155°C per decade warming from 1979–2008, compared to 0.248°C per decade for poorly sited (Class 3–5) stations, suggesting uncorrected siting issues inflate U.S. land trends by approximately 50%. Urban heat island (UHI) effects, where localized development traps heat and elevates readings, have been quantified in peer-reviewed work as contributing significantly to gridded temperature trends. McKitrick and Michaels (2007) regressed national temperature trends against socioeconomic variables as proxies for non-climatic contamination and found that such factors explained about half of the post-1979 land warming signal in global datasets, with GDP per capita correlating positively with trend magnitudes after controlling for climatic variables. Rural-only subsets exhibited lower trends, implying that blending urban and rural data without full correction overstates global land warming by 0.05–0.1°C per decade. Critics of mainstream adjustments, including Katata, Connolly, and O'Neill (2023), identified "urban blending" in homogenized records, where algorithms inadvertently average UHI biases from neighboring stations, reducing urban-rural trend differentials by up to 37% and propagating non-climatic signals into rural data.
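The per-decade rates quoted for the siting classes are ordinary least-squares trends. A minimal sketch of that computation, using synthetic data with a known slope rather than any real station series:

```python
# Sketch: least-squares trend in degrees C per decade from annual anomalies
# (illustrative of how per-decade rates for station subsets are computed).

def trend_per_decade(years, anomalies):
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    var = sum((x - mean_x) ** 2 for x in years)
    return 10.0 * cov / var  # slope in C/yr, scaled to C/decade

# Synthetic 1979-2008 series warming at exactly 0.02 C/yr (0.2 C/decade):
years = list(range(1979, 2009))
anoms = [0.02 * (y - 1979) for y in years]
print(round(trend_per_decade(years, anoms), 3))  # 0.2
```

Comparing such trends between station subsets (e.g. well-sited versus poorly sited) is the core of the siting-class analyses described above; the disputed part is not the arithmetic but the classification and adjustment of the input series.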
Homogenization procedures applied by NOAA and other agencies to raw records—intended to correct for station moves, instrument changes, and time-of-observation biases—have faced scrutiny for systematically cooling pre-1950 data and warming recent decades. Connolly et al. (2021) examined NOAA's Global Historical Climatology Network (GHCN) adjustments and found inconsistencies, such as over-correction of rural stations relative to urban ones, leading to exaggerated 20th-century warming in adjusted U.S. series by up to 0.6°C compared to raw data. Skeptics argue these pairwise comparisons rely on sparse networks prone to error propagation, particularly in data-poor regions, and fail to fully excise UHI contamination, as evidenced by persistent urban-rural disparities post-adjustment in independent audits. Such practices, while defended as necessary corrections for instrumental inconsistencies, are contended to align records with model expectations rather than empirical fidelity, amplifying perceived warming signals amid natural decadal variability like the 1940–1970 plateau.

Empirical Tests of Record Reliability

Independent analyses of surface temperature records, including those from NASA GISS, NOAA, HadCRUT, and Berkeley Earth, exhibit strong agreement in long-term warming trends of approximately 0.8–1.0 °C per century since the late 19th century, despite differences in data sources, homogenization methods, and infilling techniques. This consistency serves as an empirical test of reliability, as datasets developed by separate teams using varied approaches converge on similar results, suggesting robustness to methodological choices. Berkeley Earth's analysis, initially funded to scrutinize potential biases in existing records, incorporated over 37,000 stations and tested subsets excluding urban areas, finding rural-only trends closely matching overall estimates at about 0.9 °C per century from 1950–2011, indicating minimal contamination from urban heat island (UHI) effects after adjustments. However, peer-reviewed critiques highlight limitations in homogenization processes, such as "urban blending," where algorithms inadvertently incorporate urban station data into rural adjustments, potentially inflating warming trends in datasets like GHCN for regions including Japan and the United States. Benchmarking studies using simulated temperature series with known inhomogeneities demonstrate that pairwise homogenization algorithms, employed in major datasets, recover true signals effectively, reducing errors in trend estimates by identifying and correcting non-climatic breaks like station relocations. Yet, evaluations of homogenizations applied to European stations in GHCN data raise concerns, revealing that adjustments sometimes amplify rather than mitigate biases, with homogenized versions diverging from raw records in ways that question their fidelity for regional trend assessments.
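The rural-subset test described above can be sketched as a simple comparison of mean trends. All station values here are hypothetical, with urban stations deliberately assigned a small surplus so the logic of the test is visible.

```python
# Sketch: rural-subset test (illustrative). If urbanization inflated the
# network average, a rural-only mean trend should be notably lower than
# the all-station mean; the gap is a crude UHI-contribution estimate.

def mean_trend(trends):
    return sum(trends) / len(trends)

# Hypothetical per-station trends (C/decade); urban stations carry a
# small assumed UHI surplus in this toy example.
rural = [0.18, 0.20, 0.19, 0.21, 0.17]
urban = [0.24, 0.26, 0.23]

all_stations = rural + urban
uhi_estimate = mean_trend(all_stations) - mean_trend(rural)
print(round(mean_trend(rural), 3), round(uhi_estimate, 3))  # 0.19 0.02
```

In published analyses the same comparison is done on thousands of stations with formal uncertainty estimates; the urban-blending critique is that homogenization can shrink this gap artificially by mixing urban signals into the rural subset before the test is run.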
For ocean surface temperatures, which comprise about 70% of global coverage, validations against independent buoy measurements and reanalyses confirm that updated datasets like ERSSTv5 align closely with instrumentally homogeneous records, supporting reliability in sea surface temperature (SST) components post-adjustments for historical measurement changes such as canvas bucket corrections. Conversely, statistical analyses indicate early-20th-century estimates (1900–1930) in major records suffer from cold biases due to uncorrected sampling errors, potentially understating historical variability and affecting long-term trend baselines. UHI impact assessments, through paired urban-rural station comparisons, estimate its contribution to global land trends at 0.05–0.14 °C per decade at some national scales, though global adjustments in datasets like NOAA's are claimed to neutralize this, with rural subsets showing comparable warming. Skeptical examinations, including U.S.-focused audits, argue that poor station siting near artificial heat sources contaminates records, with unadjusted data exhibiting lower trends, underscoring ongoing debates over adjustment efficacy despite simulation successes. Overall, while cross-validation bolsters confidence in broad trends, unresolved discrepancies in homogenization and bias propagation highlight needs for enhanced transparency and raw-data preservation in testing record integrity.
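The ship-buoy adjustment works by estimating a mean offset from collocated measurement pairs and shifting one platform onto the other's reference. A toy sketch with synthetic values (not ERSST's actual procedure, which also accounts for spatial and temporal separation between paired measurements):

```python
# Sketch: removing a mean offset between collocated ship and buoy SSTs
# (illustrative of the ~0.1 C ship-buoy adjustment; values are synthetic).

def collocated_offset(ship_sst, buoy_sst):
    """Mean ship-minus-buoy difference over collocated pairs (degrees C)."""
    diffs = [s - b for s, b in zip(ship_sst, buoy_sst)]
    return sum(diffs) / len(diffs)

ship = [18.32, 20.11, 15.48, 22.05]
buoy = [18.20, 20.02, 15.37, 21.97]
offset = collocated_offset(ship, buoy)
adjusted_buoy = [b + offset for b in buoy]  # raise buoys to the ship reference
print(round(offset, 3))  # 0.1
```

Which platform is adjusted toward the other is a convention, not a physics choice: raising buoys to the ship reference (as in ERSSTv4/v5) and lowering ships to the buoy reference yield the same anomaly trends, since anomalies are insensitive to a constant baseline shift.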

  30. [30]
    Data Overview - Berkeley Earth
    Global Monthly Averages (1850 – Recent). Berkeley Earth combines our land data with a modified version of the HadSST ocean temperature data set. The result is a ...Temperature City List · Temperature Country List · Asia · Results by Region
  31. [31]
    Early-twentieth-century cold bias in ocean surface ... - Nature
    Nov 20, 2024 · Here we show that existing estimates of ocean temperatures in the early twentieth century (1900–1930) are too cold, based on independent statistical ...
  32. [32]
    Reassessing biases and other uncertainties in sea surface ...
    Jul 22, 2011 · A number of studies have attempted to quantify the measurement errors in observations made by ships, drifting buoys and moored buoys. Typically, ...
  33. [33]
    Systematic Differences in Bucket Sea Surface Temperature ...
    In this study, a systematic examination of differences in collocated bucket SST measurements from ICOADS3.0 is undertaken using a linear-mixed-effect model ...
  34. [34]
    [PDF] Influence of NDBC Buoy Design on Sea Surface Temperature ...
    Kennedy et al. (2011) found that ship measurements taken with collocated buoy measurements at dawn had a mean difference of +0.12°C; this study attempted to ...
  35. [35]
    After Two Decades, Argo at PMEL, Looks to the Future
    Oct 30, 2023 · Argo data have remarkably improved the accuracy of the rate of ocean warming (Johnson et al., 2016). We have shown that Argo and top-of-the- ...Missing: buoys peer
  36. [36]
    An Ensemble Data Set of Sea Surface Temperature Change From ...
    Jun 20, 2019 · We describe the construction of HadSST.4.0.0.0, a climate data set of sea surface temperature change from 1850 to 2018 A range of bias ...
  37. [37]
    A Buoy-Only Sea Surface Temperature Record Supports NOAA's ...
    Nov 27, 2015 · They identify a bias of around 0.1 C between buoys and ERIs and remove it by adjusting buoy records up to match ERI records in ERSST v4, as well ...
  38. [38]
    Assessing recent warming using instrumentally homogeneous sea ...
    Jan 4, 2017 · Sea surface temperature (SST) records are subject to potential biases due to changing instrumentation and measurement practices.
  39. [39]
    Global warming hiatus disproved — again | University of California
    Jan 4, 2017 · Nowadays, buoys cover much of the ocean and that data is beginning to supplant ship data. But the buoys report slightly cooler temperatures ...<|separator|>
  40. [40]
    Correcting Observational Biases in Sea Surface Temperature ...
    To account for physical separation between paired measurements, we first remove climatological differences expected from geographical and temporal displacement.
  41. [41]
    Upper Air Temperature - Remote Sensing Systems
    There are several methods available for the measurement of the upper air. Radiosondes (commonly called weather balloons).These are small instruments lifted ...Introduction · RSS Sounding Products · Uncertainty
  42. [42]
    Observed Temperature Changes in the Troposphere and ...
    Several merged satellite-based datasets have been constructed for providing continuous climate records in the stratosphere from 1979 to the present (McLandress ...
  43. [43]
    UAH v6.1 Global Temperature Update for December, 2024
    Jan 3, 2025 · The Version 6.1 global area-averaged temperature trend (January 1979 through December 2024) remains at +0.15 deg/ C/decade (+0.22 C/decade over ...
  44. [44]
    Historical Comparison of TLT Trends - Remote Sensing Systems
    Jul 6, 2017 · UAH and RSS agree better than they ever have, but only through about 2000. After that, they diverge fairly rapidly. Sometime in 2009, the ...
  45. [45]
    Tropospheric temperature change since 1979 from tropical ...
    Mar 16, 2007 · At the 58 sites the UAH data indicate a trend of +0.08 K decade−1, the RSS data, +0.15. When the largest discontinuities in the sondes are ...
  46. [46]
    Explainer: how surface and satellite temperature records compare
    Feb 1, 2016 · “[Satellites] do not directly measure temperatures, and are subject to large systemic biases due to orbital decay, diurnal sampling drifts, ...
  47. [47]
    Integrated Global Radiosonde Archive (IGRA)
    The Integrated Global Radiosonde Archive (IGRA) consists of radiosonde and pilot balloon observations from more than 2800 globally distributed stations.
  48. [48]
    A New Approach to Homogenize Global Subdaily Radiosonde ...
    ABSTRACT: This study develops an innovative approach to homogenize discontinuities in both mean and variance in global subdaily radiosonde temperature data ...
  49. [49]
    [PDF] Tropospheric temperature trends: history of an ongoing controversy
    In a 1990 paper, Spencer and Christy3 claimed that since the start of routine satellite temperature observations in 1979 there had been no tropospheric warming, ...
  50. [50]
    Upper-Air Temperature Trends over the Globe, 1958–1989 in
    New time series of the hemispheric and global mean temperature anomalies in the troposphere and lower stratosphere are presented for the period May 1958 ...
  51. [51]
    An Appraisal of the Progress in Utilizing Radiosondes and Satellites ...
    Considerable progress (Figure 2) has been made over the last few decades in using in situ RS measurements for upper air temperature observations.
  52. [52]
    Mapped: How 'proxy' data reveals the climate of the Earth's distant past
    Mar 29, 2021 · Proxy data provides an insight into the climate before dedicated records. It forms a fundamental part of the study of past climates, known as “ ...Proxy Map · Coral And Sponges · Marine Sediment
  53. [53]
    Proxy-based reconstructions of hemispheric and global surface ...
    Many previous proxy data studies have emphasized hemispheric or global mean temperature (3–14), although some studies have also attempted to reconstruct the ...
  54. [54]
    Proxy-Based Northern Hemisphere Surface Temperature ...
    The factors investigated include 1) the method used to assimilate proxy data into a climate reconstruction, 2) the proxy data network used, 3) the target season ...
  55. [55]
    Robustness of proxy‐based climate field reconstruction methods
    Jun 23, 2007 · We present results from continued investigations into the fidelity of covariance-based climate field reconstruction (CFR) approaches used in ...
  56. [56]
    A global multiproxy database for temperature reconstructions of the ...
    Jul 11, 2017 · The database gathers 692 records from 648 locations, including all continental regions and major ocean basins.
  57. [57]
    Chapter: 11 Large-Scale Multiproxy Reconstruction Techniques
    These large-scale multiproxy-based surface temperature reconstructions offer a quantitative assessment of large-scale surface temperature variations. They ...
  58. [58]
    PAGES2k: Advances in climate field reconstructions | PAGES
    A major outcome of the PAGES2k synthesis was the creation of continental-scale reconstructions of mean regional temperatures that span the last millennium or ...
  59. [59]
    Holocene global mean surface temperature, a multi-method ...
    Jun 30, 2020 · Reconstructing past temperature from proxy data relies on important assumptions that differ among proxy types and the methods used to convert ...
  60. [60]
    A review of the tree-ring evidence and possible causes - ScienceDirect
    The divergence problem has potentially significant implications for large-scale patterns of forest growth, the development of paleoclimatic reconstructions ...
  61. [61]
    Tree-ring proxies and the divergence problem - Skeptical Science
    May 1, 2024 · Tree-ring width and tree-ring density, both indicators of tree growth, serve as useful proxies for temperature.
  62. [62]
    Challenges and perspectives for large‐scale temperature ...
    Dec 13, 2016 · Here we critically assess some of the many challenges related to large-scale multiproxy temperature reconstructions.
  63. [63]
    The influence of decision-making in tree ring-based climate ... - Nature
    Jun 7, 2021 · Recent investigations suggest that methodology-induced challenges of proxy-target calibration, proxy network size, end-effects in time-series ...
  64. [64]
    Reconstructing Holocene temperatures in time and space using ...
    Dec 15, 2022 · We use paleoclimate data assimilation to create a spatially complete reconstruction of temperature over the past 12 000 years using proxy data.
  65. [65]
    Complex spatio-temporal structure of the Holocene Thermal Maximum
    Oct 3, 2022 · Here, we analyze a global database of Holocene paleotemperature records to investigate the spatiotemporal structure of the HTM.Fig. 7. Median Htm Ages And... · Fig. 8. Holocene Forcing · Complex Holocene Temperature...
  66. [66]
    Seasonality in Holocene Temperature Reconstructions in ...
    Dec 18, 2020 · Our reconstructed Holocene temperature record displays a steady long-term trend without distinct cooling or warming changes.
  67. [67]
    The Holocene temperature conundrum - PNAS
    Aug 11, 2014 · Here, we show that climate models simulate a robust global annual mean warming in the Holocene, mainly in response to rising CO 2 and the retreat of ice sheets.The Holocene Temperature... · Sign Up For Pnas Alerts · Model ExperimentsMissing: peer | Show results with:peer<|control11|><|separator|>
  68. [68]
    The seasonal temperature conundrum for the Holocene - Science
    Apr 25, 2025 · Our study highlights the current uncertainty in seasonal temperature reconstructions in the Holocene, with an implication that the MAT ...
  69. [69]
    A frequency-optimised temperature record for the Holocene
    Abstract. Existing global mean surface temperature reconstructions for the Holocene lack high-frequency variability that is essential for contextualising recent ...
  70. [70]
    Homogenization of Temperature Series via Pairwise Comparisons in
    An automated homogenization algorithm based on the pairwise comparison of monthly temperature series is described.
  71. [71]
    [PDF] Homogenization of Temperature Series via Pairwise Comparisons
    Apr 1, 2009 · The algorithm works by forming pairwise difference series between serial monthly temperature values from a network of observing stations.
  72. [72]
    Benchmarking the performance of pairwise homogenization of ...
    Mar 8, 2012 · Results indicate that all randomized versions of the algorithm consistently produce homogenized data closer to the true climate signal in the ...
  73. [73]
    U.S. Historical Climatology Network (USHCN)
    ... pairwise homogenization algorithm (PHA) that improve its overall efficiency. ... The temperature data were last homogenized by the PHA algorithm in May 2008.
  74. [74]
    On the critical values of the standard normal homogeneity test (SNHT)
    Nov 7, 2006 · The objective of this paper is to improve the critical values of the SNHT and extend them to large sample sizes.
  75. [75]
    Homogenization Techniques for European Monthly Mean Surface ...
    The third method is the standard normal homogeneity test (SNHT) described in Alexandersson (1986) and Alexandersson and Moberg (1997), which will also be ...<|separator|>
  76. [76]
    [PDF] Berkeley Earth Temperature Averaging Process - SciTechnol
    Mar 5, 2013 · We find that the global land mean temperature increased by 0.89 ± 0.06°C in the difference of the Jan 2000-Dec 2009 average from the Jan 1950- ...<|separator|>
  77. [77]
    Frequently Asked Questions (FAQ) - NASA GISS
    Apr 3, 2025 · homogenization applied by NCEI to any station (single or multiple records), whereas none was applied by GISS.
  78. [78]
    The Global Historical Climatology Network Monthly Temperature ...
    The outcome of running the PHA with different parameter settings is a set of detected shifts and adjustments for each of the GHCN series that span the possible ...
  79. [79]
    [PDF] FAQs on the Update to Global Historical Climatology Network ...
    In this way, the algorithm seeks to adjust all earlier measurement eras in a station's history to conform to the latest location and instrumentation. The.
  80. [80]
    [PDF] Quantifying the Effect of Urbanization on U.S. Historical Climatology ...
    Nov 15, 2012 · [1988] developed a specific adjustment to control for the apparent contribution of the urban heat island signal in USHCN temperature data.<|separator|>
  81. [81]
    [PDF] influence-of-urban-heating-on-global-temperature-land ... - ProCon
    The study found that urban warming does not unduly bias global temperature estimates, and its contribution to global warming is much smaller than observed.
  82. [82]
    Evidence of Urban Blending in Homogenized Temperature Records ...
    We show that the homogenization process unintentionally introduces new nonclimatic biases into the data as a result of an “urban blending” problem.
  83. [83]
    Major problems identified in data adjustments applied to a widely ...
    Feb 22, 2022 · They found that less than 20% of the adjustments NOAA had been applying corresponded to any event noted by the station observers - such as a ...
  84. [84]
    Quantifying the influence of anthropogenic surface processes and ...
    Aug 10, 2025 · Local land surface modification and variations in data quality affect temperature trends in surface-measured data.
  85. [85]
    Global Warming: Temperature Data Quality - Ross McKitrick
    Subsequently McKitrick and Michaels (2007) concluded that about half the reported warming trend in global-average land surface air temperature in 1980–2002 ...Missing: homogenization | Show results with:homogenization
  86. [86]
    (PDF) Has poor station quality biased U.S. temperature estimates?
    Mar 4, 2021 · When time-of-observation adjustments were applied to the records, this increased temperature trends by about 39%, and so the relative fraction ...
  87. [87]
    The Great Adjustment Debate: Is Climate Data Really Reliable?
    Sep 12, 2024 · One of the key criticisms is that many of the adjustments result in past temperatures being lowered, while more recent temperatures remain ...
  88. [88]
    Evaluating the impact of U.S. Historical Climatology Network ...
    Feb 5, 2016 · This means that adjusted (and raw) USHCN stations generally have a lower maximum temperature trend than their nearby USCRN pairs, similar to ...
  89. [89]
    NOAAGlobalTemp - National Centers for Environmental Information
    Yin, H. -M. Zhang, 2020: Uncertainty estimates for sea surface temperature and land surface air temperature in NOAAGlobalTemp version 5. J.<|control11|><|separator|>
  90. [90]
    HadCRUT5 - Met Office Hadley Centre observations datasets
    Sep 17, 2025 · HadCRUT5 is a gridded dataset of global historical surface temperature anomalies relative to a 1961-1990 reference period.Missing: homogenization | Show results with:homogenization
  91. [91]
    Global Temperature Report for 2023 - Berkeley Earth
    Jan 12, 2024 · We conclude that 2023 was the warmest year on Earth since 1850, exceeding the previous record set in 2016 by a clear and definitive margin.Annual Temperature Anomaly · Causes Of Warmth In 2023 · Comparisons With Other...
  92. [92]
    Global Temperature Anomalies from 1880 to 2024 - NASA SVS
    Jan 10, 2025 · Earth's global surface temperatures in 2024 were the warmest on record -- 1.28 degrees Celsius (2.30 degrees Fahrenheit) above the agency's 20th ...Missing: empirical | Show results with:empirical<|separator|>
  93. [93]
    Climate change: global temperature
    May 29, 2025 · Earth's temperature has risen by an average of 0.11° Fahrenheit (0.06° Celsius) per decade since 1850, or about 2° F in total.Missing: empirical | Show results with:empirical
  94. [94]
    World of Change: Global Temperatures - NASA Earth Observatory
    But the global temperature mainly depends on how much energy the planet receives from the Sun and how much it radiates back into space. The energy coming from ...
  95. [95]
    The global warming hiatus: Slowdown or redistribution? - PMC
    Global mean surface temperatures (GMST) exhibited a smaller rate of warming during 1998–2013, compared to the warming in the latter half of the 20th Century.
  96. [96]
    Why did Earth's surface temperature stop rising in the past decade?
    Jul 23, 2013 · The slowdown in the rate of average global surface warming that took place from 1998–2012 (relative to the preceding 30 years) has unequivocally ended.
  97. [97]
    The 2000–2012 Global Warming Hiatus More Likely With a Low ...
    Apr 18, 2021 · The probability of the observed 2000–2012 hiatus period to arise from internal variability driven by white noise is larger if climate sensitivity is low.Missing: debate | Show results with:debate
  98. [98]
    2024 was the world's warmest year on record - NOAA
    Jan 10, 2025 · 2024 was the planet's warmest year on record, according to an analysis by scientists from NOAA's National Centers for Environmental Information (NCEI).
  99. [99]
    Annual Global Temperature Records - NASA Earth Observatory
    The past five years have been the warmest years in the modern record, and 18 of the 19 warmest years have occurred since 2000. Published Feb 6, 2019. Image ...
  100. [100]
    Temperatures Rising: NASA Confirms 2024 Warmest Year on Record
    Jan 10, 2025 · Earth's average surface temperature in 2024 was the warmest on record, according to an analysis led by NASA scientists.
  101. [101]
    Global Climate Highlights 2024 | Copernicus
    Jan 10, 2025 · In 2024, annual surface air temperatures were above the 1991–2020 average across most of the globe (91%). Temperatures exceeded the average by ...Missing: empirical | Show results with:empirical
  102. [102]
    Assessing the Global Temperature and Precipitation Analysis in ...
    Sep 11, 2025 · The January–August global surface temperature was the second-highest on record behind 2024. According to NCEI's Global Annual Temperature ...
  103. [103]
    State of the climate: 2025 close behind 2024 as the hottest start to a ...
    Apr 24, 2025 · Global temperatures in the first quarter of 2025 were the second warmest on record, extending a remarkable run of exceptional warmth that began in July 2023.
  104. [104]
    April 2025 Temperature Update - Berkeley Earth
    May 13, 2025 · The April 2025 anomaly average measured 1.49 ± 0.12 °C (2.67 ± 0.22 °F) above the 1850-1900 average, making it the second warmest April on record.
  105. [105]
    Indicators of Global Climate Change 2024: annual update of key ...
    Jun 19, 2025 · The 2024-observed best estimate of global surface temperature (1.52 °C) is well above the best estimate of human-caused warming (1.36 °C).
  106. [106]
    Monthly Climate Reports | Global Climate Report | Annual 2024
    The year 2024 was the warmest year since global records began in 1850 at 1.29°C (2.32°F) above the 20th century average of 13.9°C (57.0°F).
  107. [107]
    2024 Was the Warmest Year on Record - NASA Earth Observatory
    Jan 10, 2025 · Global temperatures in 2024 were 1.28 degrees Celsius (2.30 degrees Fahrenheit) above the agency's 20th-century baseline (1951–1980), which tops ...
  108. [108]
    Eight warmest years on record witness upsurge in climate change ...
    2015 to 2022 are likely to be the eight warmest years on record. La Niña conditions have dominated since late 2020 and are expected to continue until the end ...
  109. [109]
    Global temperature | Climate Dashboard
    Four different data sets are shown - HadCRUT, NOAAGlobalTemp, GISTEMP, and Berkeley Earth - as well as two reanalyses - ERA5 and JRA-55. There is good agreement ...
  110. [110]
    Monthly Climate Reports | Global Climate Report | July 2025
    The global surface temperature for January–July 2025 was the second-highest in NOAA's 176-year record, at 1.18°C (2.12°F) above the 20th-century average.
  111. [111]
    State of the climate: 2025 on track to be second or third warmest ...
    Jul 29, 2025 · As it passes its midway point, 2025 is on track to be the second or third warmest year on record, Carbon Brief analysis shows.
  112. [112]
    Climate change: evidence and causes | Royal Society
    Scientists have determined that, when all human and natural factors are considered, Earth's climate balance has been altered towards warming.
  113. [113]
    A global-scale multidecadal variability driven by Atlantic ...
    Global sea-surface temperature (SST) shows two dominant decadal variabilities: interdecadal Pacific oscillation (IPO) and Atlantic multidecadal oscillation (AMO) ...
  114. [114]
    1.4.2.5 El Niño-Southern Oscillation (ENSO) - Global Tipping Points
    Additionally, an extreme El Niño causes an increase in global mean surface temperature of up to 0.25°C (Hu and Fedorov 2017), contributing to the prevalence of ...
  115. [115]
    The 2023 global warming spike was driven by the El Niño–Southern ...
    Oct 10, 2024 · Global-mean surface temperature rapidly increased 0.29 ± 0.04 K from 2022 to 2023. Such a large interannual global warming spike is not ...
  116. [116]
    El Niño Southern Oscillation (ENSO)
    Nov 9, 2023 · Any transition from a La Niña to an El Niño phase will likely produce a rise in global average surface temperature, with warming tendencies ...
  117. [117]
    Atlantic Multi-decadal Oscillation (AMO) - Climate Data Guide
    Zhang and Delworth (2007) found that the AMO can contribute to the Pacific Decadal Oscillation (PDO) and the associated Pacific/North America (PNA) pattern at ...
  118. [118]
    The Pacific Decadal Oscillation (PDO) is not causing global warming
    Mar 31, 2024 · PDO as an oscillation between positive and negative values shows no long term trend, while temperature shows a long term warming trend.
  119. [119]
    Climate Change: Incoming Sunlight | NOAA Climate.gov
    During strong solar cycles, the Sun's total average brightness varies by up to 1 Watt per square meter; this variation affects global average temperature by 0.1 ...
  120. [120]
    4. What role has the Sun played in climate change in recent decades?
    The Sun provides the primary source of energy driving Earth's climate system, but its variations have played very little role in the climate changes observed ...
  121. [121]
    The Effect of Volcanoes on the Earth's Temperature
    Major volcanoes can have a significant effect on the earth's temperature. An eruption carries ash and gases into the upper atmosphere.
  122. [122]
    Guest post: Investigating how volcanic eruptions can affect climate ...
    Jun 23, 2025 · Eruptions produce temporary cooling lasting just a few years. They do not alter the underlying warming trend driven by human emissions. Our ...
  123. [123]
    Chapter 7: The Earth's Energy Budget, Climate Feedbacks, and ...
    Table 7.8 | Summary table of effective radiative forcing (ERF) estimates for AR6 and comparison with the four previous IPCC assessment reports. Prior to AR5 ...
  124. [124]
    Decomposing the effective radiative forcing of anthropogenic ... - ACP
    Jul 10, 2024 · These estimates are in good agreement with the IPCC AR6 WGI ERF assessment of −1.3 (−2.0 to −0.6) W m−2 for 1750–2014 using multiple ...
  125. [125]
    Negative trend in total solar irradiance over the satellite era - PMC
    Mar 10, 2025 · BTSI shows a declining trend in TSI since 1980 of −0.15 [−0.17,−0.13] W/m2 per decade. A focus of this study, discussed further below, is that ...
  126. [126]
    Time evolutions of various radiative forcings for the past 150 years ...
    Oct 7, 2006 · Global distributions of instantaneous radiative forcing from 1850–1854 to 1996–2000 means under all-sky condition due to LLGHGs, O3, and ...
  127. [127]
    Solar Contribution to Global Warming - Skeptical Science
    The studies find a relatively small solar contribution to global warming, less than 40% (0.3°C) of the observed warming over the past century.
  128. [128]
    Sensitivity of Attribution of Anthropogenic Near-Surface Warming to ...
    The impact of including comprehensive estimates of observational uncertainties on a detection and attribution analysis of twentieth-century near-surface ...
  129. [129]
  130. [130]
    The role of the sun in climate forcing - ScienceDirect.com
    Jan 1, 2000 · In fact a non-linear regression model to separate natural and anthropogenic forcing since 1850 is consistent with a solar contribution of about ...
  131. [131]
    The Effect of Different Climate Sensitivity Priors on Projected Climate ...
    May 10, 2025 · Uncertainty in equilibrium climate sensitivity distributions propagates through to future temperature projections. Prior distributions that ...
  132. [132]
    Observational evidence that cloud feedback amplifies global warming
    Jul 19, 2021 · This constraint supports that cloud feedback will amplify global warming, making it very unlikely that climate sensitivity is smaller than 2 °C.
  133. [133]
    Large uncertainty in future warming due to aerosol forcing - Nature
    Nov 14, 2022 · Here we highlight the stark disparity that different aerosol forcing can imply for future warming projections.
  134. [134]
    Reducing Aerosol Forcing Uncertainty by Combining Models With ...
    May 3, 2023 · Aerosol forcing uncertainty represents the largest climate forcing uncertainty overall. Its magnitude has remained virtually undiminished ...
  135. [135]
    Uncertainties in the attribution of greenhouse gas warming and ...
    Jun 13, 2016 · Some of the uncertainties could be reduced in future by having more model data to better quantify the simulated estimates of the signals and ...
  136. [136]
    Detection, attribution, and modeling of climate change: Key open ...
    May 13, 2025 · This paper discusses a number of key open issues in climate science. It argues that global climate models still fail on natural variability at all scales.
  137. [137]
    Relative Importance of Internal Climate Variability versus ...
    At the multidecadal scale, the ICV and anthropogenic climate change (ACC) are the two main components that influence the overall climate change (Hulme et al. ...
  138. [138]
    Addressing misconceptions about Climate Sensitivity research
    Aug 13, 2025 · In 2020, Steven Sherwood and twenty four co-authors published a comprehensive assessment of Earth's climate sensitivity (S20) that claimed to ...
  139. [139]
    Opinion: Can uncertainty in climate sensitivity be narrowed further?
    Feb 29, 2024 · Some studies have reaffirmed the recent assessments. For example, Zhu et al. (2021) confirm that a GCM with a very high ECS level predicts ...
  140. [140]
    BEST: Berkeley Earth Surface Temperatures - Climate Data Guide
    The Berkeley Earth Surface Temperatures (BEST) are a set of data products, originally a gridded reconstruction of land surface air temperature records ...
  141. [141]
    Global surface temperature change analysis based on MODIS data ...
    Jan 15, 2017 · For example, GISS and NCDC indicate 2005 as the warmest year in their analyses, while HadCRUT has 1998 as the warmest year. The differences ...
  142. [142]
    Temperature - Copernicus Climate Change
    Apr 15, 2025 · The average rate of temperature increase, according to ERA5, is 0.21°C per decade from 1979 to 2024, with a 95% confidence interval of ±0.03°C. ...
  143. [143]
    UAH v6.1 Global Temperature Update for January, 2025: +0.46 deg. C
    Feb 4, 2025 · The Version 6.1 global area-averaged temperature trend (January 1979 through January 2025) remains at +0.15 deg. C/decade (+0.22 C/decade over ...
  144. [144]
    Comparing Tropospheric Warming in Climate Models and Satellite ...
    We assess the validity of two highly publicized claims: that modeled tropospheric warming is a factor of 3–4 larger than in satellite and radiosonde ...
  145. [145]
    Explaining Differences Between Recent Model and Satellite ...
    Aug 1, 2019 · Here, we analyze the role played by tropical sea surface temperature (SST) variability in recent decades in setting TTT.
  146. [146]
    Multi-decadal climate variability and satellite biases have amplified ...
    Jun 21, 2024 · Most coupled model simulations substantially overestimate tropical tropospheric warming trends over the satellite era, undermining the ...
  147. [147]
    Natural variability contributes to model–satellite differences in ...
    Mar 22, 2021 · Our results indicate that multidecadal variability can explain current model–observational differences in the rate of tropical tropospheric warming.
  148. [148]
    Revisiting the controversial issue of tropical tropospheric ...
    Apr 15, 2013 · Controversy remains over a discrepancy between modeled and observed tropical upper tropospheric temperature trends. This discrepancy is reassessed using ...
  149. [149]
    55. Tropical tropospheric warming revisited: Part 2
    Jan 20, 2015 · Regarding the hotspot controversy, it is important to me to divide this into two distinct problems. 1) Can models simulate observed tropospheric ...
  150. [150]
    Study: Why troposphere warming differs between models and ...
    Jun 21, 2017 · Since around the start of the 21st century, the tropospheric warming recorded by satellites has been slower than the rate projected by climate ...
  151. [151]
    Latest Global Temps « Roy Spencer, PhD
    Latest Global Average Tropospheric Temperatures. Since 1979, NOAA satellites have been carrying instruments which measure the natural microwave thermal ...
  152. [152]
    Explainer: how surface and satellite temperature records compare
    Feb 18, 2016 · The trend in the satellite data is 0.11C per decade since 1979, compared to 0.16C per decade in the surface record.
  153. [153]
    Consequential differences in satellite-era sea surface temperature ...
    These large discrepancies are perplexing given the agreement between global temperature datasets and the fact that 70% of the surface of the Earth is covered ...
  154. [154]
    Internal variability and forcing influence model–satellite differences ...
    Climate-model simulations exhibit approximately two times more tropical tropospheric warming than satellite observations since 1979.
  155. [155]
    Variations of Tropical Lapse Rates in Climate Models and Their ...
    Variations of Tropical Lapse Rates in Climate Models and Their Implications for Upper-Tropospheric Warming ... differences in tropical tropospheric warming.
  156. [156]
    Major correction to satellite data shows 140% faster warming since ...
    Jun 30, 2017 · While the new RSS v4 record shows about 5% more warming than surface records since 1979, this behavior would to some extent be expected. Climate ...
  157. [157]
    New Study Underscores the Importance of Comparing Global ...
    Jul 16, 2025 · While all datasets clearly show significant warming, the differences between them raise questions about how we interpret records of global warming.
  158. [158]
    The Unreliability of Current Global Temperature and Solar Activity ...
    Dec 11, 2024 · The IPCC claims to have “settled the science” around the detection and attribution of climate change, but this debate has yet to be satisfactorily resolved.
  159. [159]
    Surface Temperature Reconstructions for the Last 2000 Years (2006)
    Surface temperature reconstructions use proxy data like tree rings, corals, and ice cores, often combining data from different locations to study past climate ...
  160. [160]
    [PDF] Continental-scale temperature variability during the last two millennia
    Reconstruction domains for the PAGES 2k regions reported here encompass 36% of the Earth's surface (Fig. 1). Although the regions largely coincide with ...
  161. [161]
  162. [162]
    [PDF] What is the 'Hockey Stick' Debate About?
    Apr 4, 2005 · We found that meaningless red noise could yield hockey stick-like proxy PCs. This allowed us to generate a “Monte Carlo” benchmark for ...
  163. [163]
    (PDF) How Reliable Are Global Temperature Reconstructions of the ...
    Oct 14, 2025 · In this analysis, we compared seven prominent hemispheric and global temperature reconstructions for the past 2000 years (T2k) which differed ...
  164. [164]
    How Reliable Are Global Temperature Reconstructions of ... - MDPI
    Mar 3, 2022 · In this analysis, we compared seven prominent hemispheric and global temperature reconstructions for the past 2000 years (T2k) which differed from each other ...
  165. [165]
    Progress and uncertainties in global and hemispheric temperature ...
    Jun 15, 2022 · To address this, the PAGES 2k Consortium (2019) median reconstruction highlighted in AR6 (Fig. 1d) incorporates the result of seven different ...
  166. [166]
    Researchers compare global temperature variability in glacial and ...
    Feb 5, 2018 · From the height of the last glacial period 21,000 years ago to our current interglacial period, the Earth has warmed by an average of 5 degrees ...
  167. [167]
    A global perspective on Last Glacial Maximum to Holocene climate ...
    Glacial–interglacial temperature increases range from 0.4 to 17 °C (Fig. 5). The magnitude of temperature changes increases poleward in both hemispheres, with ...
  168. [168]
    Ice core basics - Antarctic Glaciers
    Ice core records allow us to generate continuous reconstructions of past climate, going back at least 800,000 years[2]. By looking at past concentrations of ...
  169. [169]
    Holocene global mean surface temperature, a multi-method ... - Nature
    Jun 30, 2020 · For the mid Holocene (6.5–5.5 ka), Marcott et al.'s (ref.) reconstruction shows a GMST approximately 0.6 °C warmer than the 19th Century (Fig. 6) ...
  170. [170]
    A reconstruction of regional and global temperature for the past ...
    Mar 8, 2013 · Here we provide a broader perspective by reconstructing regional and global temperature anomalies for the past 11,300 years from 73 globally ...
  171. [171]
    Revisiting the Holocene global temperature conundrum - Nature
    Feb 15, 2023 · From a large variety of available evidence, we find support for a relatively mild millennial-scale global thermal maximum during the mid- ...
  172. [172]
    The PhanSST global database of Phanerozoic sea surface ... - Nature
    Dec 6, 2022 · Here, we present PhanSST, a database containing over 150,000 data points from five proxy systems that can be used to estimate past sea surface ...
  173. [173]
    Deep‐Time Paleoclimate Proxies - Macdonald - 2020 - AGU Journals
    Sep 11, 2020 · These estimates are hampered by uncertainties in the δ18O composition of seawater, ice volume, vital effects, local temperature gradients, and ...
  174. [174]
    Spatial biases in oxygen-based Phanerozoic seawater temperature ...
    Aug 1, 2025 · This work has limitations, with δ18Osw-salinity relationships being less reliable in both low-latitude epicontinental settings and high-latitude ...
  175. [175]
    The Phanerozoic climate - Shaviv - 2023
    Nov 3, 2022 · We review the long-term climate variations during the last 540 million years (Phanerozoic Eon). We first summarize the geological and geochemical datasets ...
  176. [176]
    [PDF] The Berkeley Earth Land/Ocean Temperature Record
    Dec 17, 2020 · It provides a global mean temperature record quite similar to records from Hadley's HadCRUT4, NASA's GISTEMP, NOAA's GlobalTemp, and Cowtan and ...
  177. [177]
    Validation metrics of homogenization techniques on artificially ...
    Sep 14, 2021 · In a homogenization study performed at the Poznan meteorological station in Poland, after a correction of 0.5–0.6 °C was applied, the ...
  178. [178]
    Review and discussion of homogenisation methods for climate data
    Aug 9, 2025 · To date, the development and use of homogenization methods has focused mainly on temperature and precipitation time series and on monthly rather ...
  179. [179]
    [PDF] UC Berkeley - eScholarship
    4) adjusted version 2 data with the GISS UHI correction (TOB + pairwise homogenization + GISS UHI adjustments; v2+step2). Each of these variants is described ...
  180. [180]
    Global Surface Temperature Dataset Updated | News
    Feb 14, 2024 · NOAAGlobalTemp is the authoritative dataset used to assess observed global climate change. NOAAGlobalTemp combines long-term sea surface (water) ...
  181. [181]
    [PDF] Is the U.S. Surface Temperature Record Reliable?
    This report, by meteorologist Anthony Watts, presents the results of the first-ever comprehensive review of the quality of data coming from the National ...
  182. [182]
    Watts et al.: Temperature station siting matters - Climate Etc.
    Dec 17, 2015 · The study focuses on finding trend differences between well sited and poorly sited weather stations, based on a WMO approved metric for ...
  183. [183]
    Quantifying the influence of anthropogenic surface processes and ...
    Dec 14, 2007 · In the work by McKitrick and Michaels [2004] we regressed the spatial pattern of trends from 93 countries on a matrix of local climatic ...
  184. [184]
  185. [185]
    A new merge of global surface temperature datasets since the start ...
    Nov 6, 2019 · Several global ST datasets have been developed by different groups in NOAA NCEI, NASA GISS, UK Met Office Hadley Centre & UEA CRU, and Berkeley ...
  186. [186]
    Are surface temperature records reliable? - Skeptical Science
    1) The accuracy of the land surface temperature record was confirmed; · 2) The BEST study used more data than previous studies but came to essentially the same ...
  187. [187]
    (PDF) Evidence of Urban Blending in Homogenized Temperature ...
    Aug 9, 2025 · Evidence of Urban Blending in Homogenized Temperature Records in Japan and in the United States: Implications for the Reliability of Global Land ...
  188. [188]
    Benchmarking the performance of pairwise homogenization of ...
    Aug 10, 2025 · Results indicate that all randomized versions of the algorithm consistently produce homogenized data closer to the true climate signal in the ...
  189. [189]
    Evaluation of the Homogenization Adjustments Applied to European ...
    For Version 2 of the GHCN, NOAA applied very different homogenization techniques to both components. The USHCN component was homogenized using the station ...
  190. [190]
    Early-twentieth-century cold bias in ocean surface temperature ...
    Nov 20, 2024 · Here our objective is to evaluate the consistency of the early LSAT and SST record at the global scale. We develop GMST reconstructions using ...
  191. [191]
    Disentangling the trend in the warming of urban areas into global ...
    We present an approach that applies the concept of common trends to extract the global contributions to observed warming in cities.
  192. [192]
    On the reliability of the U.S. surface temperature record | Request PDF
    Aug 10, 2025 · In 2010, a group of scientists released a study where they had researched the reliability of the US surface temperature record; the authors ...