Celsius

The Celsius scale is a temperature scale in which the freezing point of water under standard atmospheric pressure is defined as 0 °C and the boiling point as 100 °C, dividing the interval between these points into 100 equal degrees. In the International System of Units (SI), the degree Celsius (°C) is a derived unit equal in magnitude to the kelvin (K), with 0 °C exactly equivalent to 273.15 K, making it suitable for both absolute thermodynamic measurements and practical applications. Named after Swedish astronomer Anders Celsius (1701–1744), the scale originated from his 1742 proposal presented to the Royal Society of Sciences in Uppsala, where he described experiments using mercury thermometers to establish fixed points based on water's phase changes. Celsius's original version inverted the modern convention, assigning 0 °C to water's boiling point and 100 °C to its freezing point; this was reversed shortly after his death in 1744 by colleagues such as Carl Linnaeus, establishing the direct scale still used today. Initially known as the centigrade scale, reflecting its division into 100 grades, the unit was officially renamed in 1948 by the 9th General Conference on Weights and Measures, which selected "degree Celsius" from the proposed names to standardize the unit. The scale is the standard for temperature measurement in the SI system and is employed globally in science, industry, and daily life, with water's triple point at 0.01 °C long serving as a precise reference for calibrating thermometers. It facilitates intuitive readings for human-perceived temperatures, such as body temperature around 37 °C, and is used in most countries for weather reporting and everyday purposes, though the Fahrenheit scale persists in the United States for non-scientific contexts. This widespread adoption underscores Celsius's role in standardizing temperature measurement across disciplines, from climate monitoring to chemical reactions.

Fundamentals

Definition

The Celsius scale, denoted by the symbol °C, is the International System of Units (SI) derived unit for expressing Celsius temperature, a measure defined as an offset from the Kelvin scale. It is defined by the equation t/^\circ\mathrm{C} = T/\mathrm{K} - 273.15, where t is the Celsius temperature and T is the thermodynamic temperature in kelvins, establishing 0 °C as equivalent to 273.15 K. This definition ensures that absolute zero, the theoretical lowest temperature at which thermal motion ceases, corresponds to exactly -273.15 °C. The interval of 1 °C is equal in magnitude to 1 K, meaning temperature differences are identical on both scales, with the Celsius scale divided into 100 equal parts between the approximate freezing point (0 °C) and boiling point (100 °C) of water at standard atmospheric pressure. Prior to the 2019 SI revision, the scale was anchored to the triple point of water (the temperature at which water's solid, liquid, and gas phases coexist in equilibrium) at exactly 0.01 °C, providing a fixed reference for calibration. Since the 26th General Conference on Weights and Measures in 2018 (effective 20 May 2019), the Celsius scale is tied to the kelvin through a definition based on fundamental physical constants, independent of water's properties, for greater universality and precision. The kelvin is now defined by fixing the Boltzmann constant k at exactly 1.380649 \times 10^{-23} J/K, where the joule is the SI unit of energy, ensuring the scale's stability without reliance on material artifacts or phase changes. Under this framework, the triple point of water is measured to be approximately 0.01 °C, maintaining practical continuity with prior calibrations. The degree Celsius is the primary unit for everyday and scientific temperature measurement in nearly all countries worldwide. The United States is the only major country where the Fahrenheit scale predominates for everyday non-scientific purposes; a few other countries and territories use Fahrenheit alongside Celsius.
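
As a minimal illustration of this definition, the following Python sketch (the helper names are ours, not from any standard library) converts between the two scales and rejects temperatures below absolute zero:

```python
def celsius_to_kelvin(t_celsius: float) -> float:
    """Apply T/K = t/degC + 273.15."""
    t_kelvin = t_celsius + 273.15
    if t_kelvin < 0:
        raise ValueError("below absolute zero (-273.15 degC)")
    return t_kelvin

def kelvin_to_celsius(t_kelvin: float) -> float:
    """Apply t/degC = T/K - 273.15."""
    if t_kelvin < 0:
        raise ValueError("thermodynamic temperature cannot be negative")
    return t_kelvin - 273.15

print(celsius_to_kelvin(0.0))     # 273.15 (freezing point of water)
print(kelvin_to_celsius(273.16))  # ~0.01 (triple point of water, up to float rounding)
```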

Scale Properties

The Celsius scale is an interval scale, characterized by its linear and additive nature, where temperature differences remain constant regardless of the absolute level. This stems from its definition as t/°C = T/K - 273.15, where T is the thermodynamic temperature in kelvins, ensuring that equal intervals represent equal changes in thermodynamic temperature across the scale. For instance, the physical magnitude of a 10 °C rise from 15 °C to 25 °C is identical to that from -5 °C to 5 °C, allowing reliable comparisons of variations without dependence on the reference point. A key property of the Celsius scale is the equality of its intervals with those on the Kelvin scale, as the degree Celsius (°C) is by definition equal in magnitude to the kelvin (K). Consequently, the numerical value of any temperature difference or interval is the same when expressed in either unit; for example, a change of 5 °C corresponds exactly to 5 K. This equivalence permits direct addition and subtraction of temperature intervals for computations, such as calculating temperature gradients or heat flows, without requiring unit conversion, which is particularly advantageous in scientific and engineering applications. Negative temperatures on the Celsius scale, denoted by values below 0 °C, represent conditions colder than the freezing point of water under standard atmospheric pressure, equivalent to thermodynamic temperatures below 273.15 K. These readings indicate sub-freezing states, such as those encountered in winter climates or refrigeration processes, where water transitions to ice and other substances exhibit reduced molecular motion compared to the reference point. The scale's ability to accommodate negative values facilitates the description of such low-temperature phenomena without altering the interval structure. For precise measurements, the Celsius scale employs decimal subdivisions, enabling fine-grained reporting such as 18.7 °C, which aligns with the metric system's decimal framework for consistency and ease of calculation. This precision is essential in meteorology, where the World Meteorological Organization specifies resolutions of 0.1 °C for air temperature observations to ensure accurate interval comparisons in reports and data. In scientific contexts, such as calorimetry and thermometry, the scale's decimal structure supports uncertainties as low as 0.1 K, promoting standardized and reproducible interval reporting across global research efforts.
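
The interval equality can be demonstrated in a few lines of Python; this is an illustrative sketch, not a library API:

```python
def interval_c(t1_c: float, t2_c: float) -> float:
    """Temperature interval (difference) in degrees Celsius."""
    return t2_c - t1_c

# The +273.15 offset cancels in any difference, so the same interval
# computed on the Kelvin scale has the identical numerical value.
delta_c = interval_c(15.0, 25.0)             # 10.0 degC
delta_k = (25.0 + 273.15) - (15.0 + 273.15)  # 10.0 K (up to float rounding)
print(delta_c, round(delta_k, 10))           # 10.0 10.0
```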

Historical Development

Invention and Early Proposals

The development of the Celsius temperature scale built upon earlier efforts to standardize thermometry, particularly the work of Danish astronomer Ole Rømer, who in 1701 proposed one of the first practical scales using a liquid-in-glass thermometer. Rømer set zero at the freezing point of a brine solution, with the boiling point of water at 60 degrees and pure water's freezing point at 7.5 degrees, establishing a framework of fixed reference points based on observable physical changes rather than arbitrary markers. This approach influenced subsequent scientists by emphasizing reproducible intervals, though Rømer's scale used different references and a smaller range compared to later centesimal systems. In 1742, Swedish astronomer Anders Celsius presented his proposal to the Royal Society of Sciences in Uppsala, introducing a centesimal scale divided into 100 equal parts between two fixed points: the boiling point of water at 0 degrees and the freezing point at 100 degrees. Celsius based this on meticulous experiments using a mercury thermometer, demonstrating that the freezing point remained consistent across varying atmospheric pressures and latitudes, while the boiling point varied with pressure but could be standardized to sea-level conditions, thus aiming for an international standard to replace the proliferation of inconsistent scales like those of Réaumur and Delisle. He was particularly influenced by Joseph-Nicolas Delisle's reversed scale, which placed zero at the boiling point of water, and chose the inversion (higher numbers for colder temperatures) to minimize negative readings in typical conditions, reflecting a practical consideration for scientific measurements. Celsius's scale gained early traction in Sweden following its publication in the paper "Observations of two persistent degrees on a thermometer," with adoption by 1744 through instruments like those crafted by local makers such as Ekström and Strömer, and notably by botanist Carl Linnaeus for precise environmental recordings. Celsius himself conducted supporting tests, including measurements of variations in boiling points under different pressures, which aligned closely with modern values (e.g., a decrease of approximately 1 °C for every 300 m increase in altitude) and underscored the scale's reliability for meteorological applications. This initial implementation in Swedish scientific circles laid the groundwork for further refinements, though the inversion persisted until shortly after Celsius's death in 1744.

Naming Conventions and Adoption

The inverted form of the temperature scale, with 0° as the freezing point of water and 100° as the boiling point, was independently proposed in 1743 by French physicist Jean-Pierre Christin and adopted in Uppsala in 1744 by botanist Carl Linnaeus shortly after Anders Celsius's death, establishing the configuration that became the international standard. Originally termed the "centigrade" scale (from the Latin words for "hundred" and "steps") to reflect its division into 100 equal intervals between the reference points, the name led to confusion with the centigrade unit of angular measurement (one-hundredth of a right angle, the grad or gon). This ambiguity prompted the 9th General Conference on Weights and Measures (CGPM) to officially rename it the "Celsius" scale in 1948, honoring the original proposer while preserving the system's emphasis on clarity. By the 19th century, the centigrade (later Celsius) scale had become the de facto standard for temperature measurement in scientific literature and experiments across Europe, facilitating consistent international collaboration in fields like physics and chemistry. Its adoption accelerated globally through 20th-century metrication efforts, with the United Kingdom formally endorsing the metric system, including Celsius, in 1965 via government policy, leading to widespread implementation in education, industry, and weather reporting by the 1970s. Australia followed suit, with the Bureau of Meteorology switching weather forecasts from Fahrenheit to Celsius on September 1, 1972, as part of broader metric conversion. By the 1980s, Celsius had achieved near-universal integration as the standard unit for everyday temperature in over 190 countries, supporting standardized scientific and trade practices worldwide. In the United States, Fahrenheit persists alongside Celsius due to entrenched cultural familiarity and the inertia of the customary measurement system, which was inherited from colonial practices and never fully supplanted despite metrication initiatives in the 1970s. Dual usage continues in sectors like medicine, where Fahrenheit's finer gradations (180 divisions versus Celsius's 100 between water's phase transitions) aid precise monitoring of body temperatures, such as the normal range around 98.6 °F.

Notation and Representation

Symbol Usage

The standard symbol for the Celsius temperature unit is °C, consisting of the degree sign (°) immediately followed by a capital letter C with no intervening space, as defined in the International System of Units (SI). This notation reflects the unit's status as the degree Celsius, a special name for a temperature interval equal in magnitude to the kelvin. According to SI recommendations, a space must separate the numerical value from the symbol, as in 23 °C, to ensure clarity in expressing Celsius temperatures. In typeset text, the symbol °C is rendered in roman (upright) type, while any associated quantity symbols (such as t for Celsius temperature) are italicized; the degree sign itself is positioned as a superscript-like mark for proper alignment. In plain text or non-technical contexts, the degree sign may appear at full size rather than raised. Formal SI usage advises against redundancy, such as writing "degrees °C," preferring simply the symbol or the full unit name "degree Celsius" (with lowercase "degree" and uppercase "Celsius"). Contextual applications of the °C symbol vary slightly by domain. In programming and digital encoding, it is often represented using Unicode as the degree sign (U+00B0) followed by "C," or the combined glyph ℃ (U+2103). Engineering drawings and technical specifications adhere to SI spacing rules, placing the symbol after values like 100 °C for boiling point annotations. Weather reports and meteorological contexts universally employ °C with a space before it, such as 15 °C, to denote ambient temperatures. Common errors in °C usage include employing a lowercase "c" instead of capital "C," inserting a space between the degree sign and "C" (e.g., ° C), or omitting the required space between the number and symbol (e.g., 20°C). Another frequent mistake is confusing the temperature symbol with the angular degree (also °), though context typically distinguishes them; additionally, the obsolete "degree centigrade" should be avoided in favor of "degree Celsius."

Typesetting and Unicode

The Celsius degree symbol is typically represented in Unicode as the combination of the degree sign U+00B0 (°) followed by the Latin capital letter C (U+0043), forming °C. Although a precomposed character exists at U+2103 (℃), the Unicode Standard recommends using the sequence U+00B0 U+0043 in normal text for better compatibility and semantic clarity, with fonts encouraged to render this ligature-like combination tightly for optimal appearance. The degree sign U+00B0 was introduced in Unicode version 1.1 in June 1993, within the Latin-1 Supplement block. In mathematical and scientific typesetting, the degree sign is rendered as a small superscript circle immediately preceding the 'C', ensuring precise alignment and no space between the elements. For example, in LaTeX, this is achieved with the notation ^{\circ}\mathrm{C}, where the degree sign is raised and the 'C' is set in upright roman font to distinguish it as a unit symbol. Kerning adjustments are often applied in professional typesetting to optically center the superscript degree relative to the 'C', compensating for the degree sign's smaller width and preventing visual gaps, particularly in proportional fonts. Legacy systems lacking full Unicode support may approximate the symbol using ASCII text such as "deg C" or "C", which avoids encoding issues but sacrifices typographic precision. In HTML, the degree sign is commonly inserted via the named entity &deg; or the numeric entity &#176;, followed by 'C', as in &deg;C, ensuring consistent display in web rendering. Subsequent Unicode versions, such as 4.0 (2003) and later, have improved font and rendering support for these characters through enhanced character properties and compatibility decompositions, facilitating consistent display across platforms.
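
The distinction between the two encodings can be verified with Python's standard unicodedata module; a short illustrative sketch:

```python
import unicodedata

seq = "\u00b0C"    # DEGREE SIGN (U+00B0) followed by LATIN CAPITAL LETTER C
single = "\u2103"  # DEGREE CELSIUS (U+2103), a compatibility character

print(seq, single)    # may render nearly identically on screen
print(seq == single)  # False: distinct code point sequences

# NFKC normalization maps the compatibility character to the recommended
# two-character sequence, reflecting U+2103's compatibility decomposition.
print(unicodedata.normalize("NFKC", single) == seq)  # True
```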

Conversions and Equivalences

Relation to Kelvin Scale

The Celsius and Kelvin scales are directly linked through the International System of Units (SI), where the kelvin (K) serves as the base unit for thermodynamic temperature, and the degree Celsius (°C) is a derived unit equal in magnitude to the kelvin but offset for practical reference. The conversion between the two is given by the formula T / \text{K} = t / ^\circ\text{C} + 273.15, where T is the thermodynamic temperature in kelvins and t is the Celsius temperature; this relation holds exactly, as the numerical value of a temperature interval is identical in both units. In physics and chemistry, adding 273.15 to a Celsius value shifts measurements to the absolute scale without altering the size of temperature intervals, enabling applications like thermodynamic calculations where negative temperatures are invalid. Prior to the 2019 SI redefinition, the kelvin was defined as exactly 1/273.16 of the thermodynamic temperature at the triple point of water (assigned 273.16 K, corresponding to 0.01 °C), which established the 273.15 K offset between 0 °C and absolute zero through the ice point reference. The 2019 revision, effective May 20, fixed the Boltzmann constant at exactly 1.380649 \times 10^{-23} J/K to define the kelvin, decoupling it from the water triple point for greater precision and universality while preserving the exact Celsius-Kelvin relation. In scientific practice, the kelvin is preferred for absolute thermodynamic quantities, such as in the ideal gas law PV = nRT, where temperature must be on an absolute scale to ensure physical consistency. Conversely, Celsius is commonly used for temperature differences and everyday measurements, such as weather reporting, due to its alignment with familiar points like water's freezing and boiling. This coexistence supports versatile applications across disciplines, from chemistry (kelvins for reaction kinetics) to meteorology (Celsius for climate data).
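
As a sketch of why the absolute scale matters in PV = nRT, the following Python example (function name and values are illustrative) converts a Celsius input to kelvins before applying the gas law:

```python
R = 8.314462618  # molar gas constant, J/(mol*K)

def ideal_gas_pressure(n_mol: float, t_celsius: float, volume_m3: float) -> float:
    """Pressure from PV = nRT; the Celsius input must first be shifted to
    kelvins, since the gas law requires an absolute temperature scale."""
    t_kelvin = t_celsius + 273.15
    return n_mol * R * t_kelvin / volume_m3

# 1 mol of an ideal gas at 25 degC in a 24.8 L (0.0248 m^3) vessel:
print(round(ideal_gas_pressure(1.0, 25.0, 0.0248)))  # ~99958 Pa, near 1 atm (101325 Pa)
```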

Relation to Fahrenheit Scale

The Fahrenheit scale, proposed by Daniel Gabriel Fahrenheit in 1724, differs from the Celsius scale in both its reference points and interval size, so conversions must account for an offset as well as a scaling factor. Fahrenheit initially defined 0 °F as the freezing point of a brine mixture of equal parts ice, water, and ammonium chloride (or salt in some accounts), and later set 32 °F as the freezing point of pure water and 96 °F as the approximate normal human body temperature, with the boiling point of water at 212 °F; this created 180 divisions between freezing and boiling, compared to the 100 divisions in Celsius. These arbitrary zero points, rooted in Fahrenheit's early thermometry experiments rather than a universal standard like water's phase transitions, necessitate an offset in conversions, unlike the direct interval scaling between Celsius and Kelvin. The standard conversion formulas account for this offset and the interval ratio. To convert from Fahrenheit to Celsius: t(^\circ \text{C}) = \frac{t(^\circ \text{F}) - 32}{1.8} = (t(^\circ \text{F}) - 32) \times \frac{5}{9}. This derives from aligning the scales: subtracting 32 °F shifts the Fahrenheit zero to match Celsius's 0 °C (water's freezing point), and dividing by 1.8 (or multiplying by 5/9) adjusts for the finer Fahrenheit intervals, as 180 °F spans the same range as 100 °C. Conversely, to convert from Celsius to Fahrenheit: t(^\circ \text{F}) = t(^\circ \text{C}) \times 1.8 + 32 = t(^\circ \text{C}) \times \frac{9}{5} + 32. Here, multiplying by 9/5 expands the smaller Celsius intervals to Fahrenheit's scale, and adding 32 realigns the zero point. For temperature differences (intervals), the conversion is linear and offset-free: a change of 1 °C equals a change of 1.8 °F, so differences in Celsius are simply multiplied by 9/5 to obtain Fahrenheit equivalents. In the United States, where Fahrenheit remains the predominant scale for everyday use, many weather applications and forecasts display both scales simultaneously to assist users familiar with Celsius, such as in scientific contexts or for international audiences. A common mental approximation for converting Fahrenheit to Celsius is to subtract 30 from the Fahrenheit value and divide by 2, providing reasonable accuracy for typical room and weather temperatures (e.g., 86 °F ≈ (86 - 30)/2 = 28 °C, close to the exact 30 °C).
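
These formulas, including the mental shortcut, translate directly into code; the function names in this Python sketch are illustrative:

```python
def fahrenheit_to_celsius(t_f: float) -> float:
    """Exact conversion: t(degC) = (t(degF) - 32) * 5/9."""
    return (t_f - 32.0) * 5.0 / 9.0

def celsius_to_fahrenheit(t_c: float) -> float:
    """Exact conversion: t(degF) = t(degC) * 9/5 + 32."""
    return t_c * 9.0 / 5.0 + 32.0

def approx_fahrenheit_to_celsius(t_f: float) -> float:
    """Mental shortcut: subtract 30, halve; adequate near everyday temperatures."""
    return (t_f - 30.0) / 2.0

print(fahrenheit_to_celsius(86.0))         # 30.0 (exact)
print(approx_fahrenheit_to_celsius(86.0))  # 28.0 (the shortcut from the text)
print(celsius_to_fahrenheit(100.0))        # 212.0 (boiling point of water)
```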

Key Reference Points

Water's Phase Transitions

The Celsius scale was historically defined using the phase transitions of water as fixed reference points, with the freezing point of pure water set at 0 °C and the boiling point at approximately 100 °C under standard atmospheric pressure of 1 atm (101.325 kPa). These points provided the anchors for the scale, dividing the interval between them into 100 equal degrees. However, precise measurements show that the boiling point of pure water (using Vienna Standard Mean Ocean Water, VSMOW) at exactly 1 atm is 99.9839 °C, a value very close to the nominal 100 °C used for calibration purposes. In the modern SI system, adopted in 2019, these water-based points are secondary, as the Celsius scale is now derived directly from the Kelvin scale through fixed constants, ensuring the freezing point remains exactly 0 °C by definition while aligning with fundamental physical properties. A more precise reference for the scale prior to 2019 was the triple point of water, where solid, liquid, and vapor phases coexist in equilibrium at 0.01 °C and a pressure of 611.657 Pa. This point served as the primary reproducible standard for calibrating thermometers, offering higher accuracy than the freezing or boiling points, which can be affected by slight variations in pressure or purity. The triple point temperature is exactly 273.16 K on the Kelvin scale, linking Celsius directly to absolute temperature measurements. The sublimation curve of water, representing the transition from ice to vapor without melting, extends downward to approach absolute zero at -273.15 °C, where molecular motion theoretically ceases, though practical sublimation rates diminish exponentially at low temperatures. Environmental factors influence these phase transitions, altering the reference points from ideal conditions. Impurities, such as salts in seawater with a salinity of about 35 parts per thousand, depress the freezing point through colligative properties, lowering it to approximately -1.8 °C at 1 atm, which prevents surface oceans from freezing solid in polar regions. Similarly, the boiling point decreases with altitude due to reduced atmospheric pressure; at 2000 m, where pressure is roughly 0.8 atm, water boils at about 93 °C, affecting cooking and sterilization processes in high-elevation areas. These variations highlight the scale's reliance on standardized conditions for consistency.
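
A rough Python sketch of the altitude effect, using the approximate 1 °C-per-300 m rule cited earlier in this article (a linear rule of thumb, not a physical model):

```python
def boiling_point_celsius(altitude_m: float) -> float:
    """Approximate boiling point of water versus altitude, using the
    rule-of-thumb decrease of about 1 degC per 300 m of elevation gain."""
    return 100.0 - altitude_m / 300.0

print(boiling_point_celsius(0))     # 100.0 degC at sea level
print(boiling_point_celsius(2000))  # ~93.3 degC, consistent with the ~93 degC cited above
```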

Common Everyday Temperatures

In human physiology, the normal core body temperature is 37.0 °C, which maintains optimal metabolic function and homeostasis. A fever, indicating an immune response to infection or inflammation, is typically defined as a temperature exceeding 38 °C. Conversely, hypothermia sets in when core temperature falls below 35 °C, impairing bodily functions and requiring immediate medical intervention. Everyday environmental temperatures on the Celsius scale span a wide range relevant to comfort and activity. Room temperature, often considered the comfort zone for indoor living and working, falls between 20–25 °C, balancing comfort with energy efficiency in occupied spaces. Boiling water for cooking or sterilization reaches 100 °C at standard atmospheric pressure, a fixed reference point for heat-based processes. At the theoretical extreme, absolute zero represents the coldest possible temperature at -273.15 °C, where molecular motion ceases, foundational to thermodynamics and cryogenics. Weather extremes highlight the Celsius scale's utility in meteorology. The hottest surface air temperature officially recorded by the WMO is 56.7 °C in Death Valley, California, on July 10, 1913, though recent studies (as of 2025) have questioned its accuracy; the reading illustrates hyper-arid conditions that challenge human survival. The coldest verified reading is -89.2 °C at Vostok Station in Antarctica on July 21, 1983, demonstrating polar night's radiative cooling in a high-elevation environment. For broader context, Earth's average global surface temperature is approximately 15 °C, serving as a baseline for climate monitoring and variability assessments. In food preparation and storage, the Celsius scale standardizes processes for safety and quality. Household freezers are set to -18 °C to inhibit microbial growth and preserve perishables long-term. Typical baking in ovens occurs at 180 °C, allowing even heat distribution for items like cakes and breads without excessive drying or burning. These values, roughly equivalent to 0 °F for freezing and 350 °F for baking, underscore the scale's practicality in daily applications.

Temperature Intervals

Measuring Differences

Temperature intervals on the Celsius scale are calculated by direct subtraction of two temperature readings, expressed as Δt = t₂ - t₁ in degrees Celsius (°C), where the interval size remains identical to that on the Kelvin scale since one Celsius degree equals one kelvin. This method focuses on changes rather than absolute values, making it suitable for quantifying thermal variations without reference to the scale's zero point. These intervals hold significant physical meaning, representing proportional changes in thermal energy; for instance, a 1 °C temperature rise in water corresponds to an energy input of approximately 4.184 J per gram, based on its specific heat capacity under standard conditions. In applications like thermal expansion, the change in length of a material is given by ΔL = α L Δt, where α is the linear expansion coefficient and Δt is the Celsius interval, illustrating how such differences drive dimensional adjustments in engineering contexts. In climate analysis, Celsius intervals are essential for tracking trends, such as annual warming rates or diurnal variations, providing a basis for assessing environmental impacts without scale-dependent biases. Representative examples include typical daily air temperature fluctuations of 10–15 °C in many mid-latitude regions, reflecting the diurnal cycle driven by solar heating, as observed in global meteorological datasets. In human physiology, a fever often involves a 2 °C rise above the normal body temperature of approximately 37 °C, signaling immune response without implying proximity to absolute limits. These differences are reference-independent, avoiding complications from absolute zero, as the interval magnitude depends solely on the scale's uniformity rather than its origin. When reporting Celsius intervals, precision should align with the measurement instrument's accuracy to ensure reliability; in laboratory settings, this often means resolutions of ±0.1 °C for high-precision thermometers calibrated against reference standards.
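
A short Python sketch of the ΔL = αLΔt relation, assuming a typical linear expansion coefficient for steel of about 12 × 10⁻⁶ per °C (an illustrative, not authoritative, value):

```python
def thermal_expansion_m(alpha_per_c: float, length_m: float, delta_t_c: float) -> float:
    """Change in length dL = alpha * L * dt for a Celsius (equivalently kelvin) interval."""
    return alpha_per_c * length_m * delta_t_c

# A 100 m steel span warming by 15 degC, roughly a mid-latitude diurnal swing:
ALPHA_STEEL = 12e-6  # linear expansion coefficient, 1/degC (typical textbook value)
print(thermal_expansion_m(ALPHA_STEEL, 100.0, 15.0))  # 0.018 m, i.e. about 18 mm
```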

Practical Applications of Intervals

In climate science, Celsius intervals are essential for quantifying and communicating temperature anomalies and trends: according to the World Meteorological Organization (WMO), 2024 was the first calendar year in which the annual average temperature exceeded 1.5 °C above pre-industrial levels (1850–1900), making it the warmest year on record. This interval-based reporting allows scientists to track changes like the accelerated warming rate of 0.2 °C per decade since 1970, facilitating comparisons across datasets and regions without reliance on absolute values. For instance, projections under high-emissions scenarios indicate potential warming of up to 5 °C by the end of the century, underscoring the scale of climate impacts in long-term and seasonal outlooks. In engineering, Celsius intervals guide the design and testing of systems under thermal stress, particularly in assessing durability through controlled temperature differentials. For example, in aerospace and automotive applications, tests often involve cycling materials over 40 °C intervals, such as from 20 °C to 60 °C, to evaluate fatigue and dimensional stability, ensuring compliance with standards like those from the American Society for Testing and Materials (ASTM). In heating, ventilation, and air conditioning (HVAC) systems, engineers target comfort differentials of around 10–15 °C between indoor (typically 20–24 °C) and outdoor conditions to optimize energy use and occupant well-being, as outlined in ASHRAE Standard 55 for thermal environmental conditions. Medical applications leverage Celsius intervals for precise monitoring and therapeutic control of body temperature variations. In hyperthermia treatments for cancer, clinicians induce controlled rises of about 1 °C per hour to reach tumor temperatures of 40–43 °C, enhancing radiotherapy efficacy while minimizing risks to healthy tissue, as demonstrated in clinical protocols. Similarly, in cooking and food processing, intervals inform time adjustments; for every 10 °C increase in temperature, cooking duration can be roughly halved to achieve equivalent doneness, based on principles validated in food-science research. In climate policy, Celsius intervals define critical thresholds for international agreements, emphasizing differences from baseline periods rather than fixed points. The Paris Agreement, adopted under the United Nations Framework Convention on Climate Change, commits nations to limit global warming to well below 2 °C above pre-industrial levels, with efforts to cap it at 1.5 °C, using these intervals to frame emission reduction targets and strategies. This approach enables projections of interval-based impacts, such as a 2 °C rise triggering irreversible sea-level rise, informing policy decisions on mitigation without absolute temperature references.