The Celsius scale is a temperature measurement system in which the freezing point of water under standard atmospheric pressure is defined as 0 °C and the boiling point as 100 °C, dividing the interval between these points into 100 equal degrees.[1] In the International System of Units (SI), the degree Celsius (°C) is a derived unit equal in magnitude to the kelvin (K), with 0 °C exactly equivalent to 273.15 K, making it suitable for both absolute thermodynamic measurements and practical applications.[1]

Named after Swedish astronomer Anders Celsius (1701–1744), the scale originated from his 1742 proposal presented to the Royal Society of Sciences in Uppsala, where he described experiments using mercury thermometers to establish fixed points based on water's phase changes.[2] Celsius's original version inverted the modern convention, assigning 0 °C to water's boiling point and 100 °C to its freezing point; this orientation was reversed shortly after his death in 1744 by colleagues such as Carl Linnaeus, establishing the convention still used today.[2] Initially known as the centigrade scale, reflecting its division into 100 grades, the scale was officially renamed in 1948 by the 9th General Conference on Weights and Measures, which selected "degree Celsius" from the proposed names to standardize the unit.[3]

The Celsius scale is the standard for temperature in the SI system and is employed globally in science, industry, and daily life, with water's triple point at 0.01 °C serving as a precise calibration reference for thermometers.[1] It facilitates intuitive readings for human-perceived temperatures, such as body temperature around 37 °C, and is used in most countries for weather reporting and education, though the Fahrenheit scale persists in the United States for non-scientific contexts.[4] This widespread adoption underscores Celsius's role in standardizing temperature across disciplines, from climate monitoring to chemical reactions.
Fundamentals
Definition
The Celsius scale, denoted by the symbol °C, is the International System of Units (SI) derived unit for expressing Celsius temperature, a measure of thermodynamic temperature offset from the Kelvin scale. It is defined by the equation

t/{}^\circ\mathrm{C} = T/\mathrm{K} - 273.15,

where t is the Celsius temperature and T is the thermodynamic temperature in kelvins, establishing 0 °C as equivalent to 273.15 K.[5][6] This definition ensures that absolute zero, the theoretical lowest temperature at which thermal motion ceases, corresponds to exactly -273.15 °C.[5]

The interval of 1 °C is equal in magnitude to 1 K, meaning temperature differences are identical on both scales, with the Celsius scale divided into 100 equal parts between the approximate freezing point (0 °C) and boiling point (100 °C) of water at standard atmospheric pressure.[5][6] Prior to the 2019 SI revision, the scale was anchored to the triple point of water (the temperature at which water's solid, liquid, and gas phases coexist in equilibrium) at exactly 0.01 °C, providing a fixed reference for calibration.[5]

Since the 26th General Conference on Weights and Measures in 2018 (effective 20 May 2019), the Celsius scale has been tied to the kelvin through a definition based on fundamental physical constants, independent of water's properties, for greater universality and precision. The kelvin is now defined by fixing the Boltzmann constant k at exactly $1.380\,649 \times 10^{-23}$ J/K, where the joule is the SI unit of energy, ensuring the scale's stability without reliance on material artifacts or phase changes.[5][6] Under this framework, the triple point of water is measured to be approximately 0.01 °C, maintaining practical continuity with prior calibrations.[5]

The Celsius scale is the primary temperature unit for everyday and scientific use in nearly all countries worldwide. The United States is the only major country where the Fahrenheit scale predominates for everyday non-scientific purposes; a few other countries and territories, such as the Bahamas, Belize, the Cayman Islands, Palau, and Liberia, use Fahrenheit alongside Celsius.[7]
Scale Properties
The Celsius scale is an interval scale, characterized by its linear and additive nature, where temperature differences remain constant regardless of the absolute temperature level. This linearity stems from its definition as Celsius temperature t = T - 273.15, with T the thermodynamic temperature in kelvins, ensuring that equal intervals represent equal changes in thermal energy across the scale. For instance, the physical magnitude of a 10 °C rise from 15 °C to 25 °C is identical to that from -5 °C to 5 °C, allowing reliable comparisons of temperature variations without dependence on the reference point.[5]

A key property of the Celsius scale is the equality of its intervals with those on the Kelvin scale, as the degree Celsius (°C) is by definition equal in magnitude to the kelvin (K). Consequently, the numerical value of any temperature difference or interval is the same when expressed in either unit; for example, a change of 5 °C corresponds exactly to 5 K. This equivalence permits direct addition and subtraction of temperature intervals for computations, such as calculating heat transfer or thermal expansion, without requiring unit conversion, which is particularly advantageous in scientific and engineering applications (see the sketch at the end of this section).[5][8]

Negative temperatures on the Celsius scale, denoted by values below 0 °C, represent conditions colder than the freezing point of water under standard atmospheric pressure, equivalent to thermodynamic temperatures below 273.15 K. These readings indicate sub-freezing states, such as those encountered in winter climates or refrigeration processes, where water freezes to ice and substances exhibit reduced molecular motion compared with the triple point reference. The scale's ability to accommodate negative values facilitates the description of such low-temperature phenomena without altering the interval structure.[5][8]

For precise measurements, the Celsius scale employs decimal subdivisions, enabling fine-grained reporting such as 18.7 °C, which aligns with the metric system's decimal framework for consistency and ease of calculation. This precision is essential in meteorology, where the World Meteorological Organization specifies resolutions of 0.1 °C for air temperature observations to ensure accurate interval comparisons in weather reports and climate data. In scientific contexts, such as thermodynamics and calibration, the scale's decimal structure supports uncertainties as low as 0.1 K, promoting standardized and reproducible interval reporting across global research efforts.[8][9]
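The numerical equality of Celsius and Kelvin intervals described above can be demonstrated directly; the following Python sketch (the helper name is illustrative) shows that the 273.15 offset cancels when a difference is taken:

```python
# A minimal sketch showing that a temperature *interval* has the same
# numerical value in degrees Celsius and in kelvins: the 273.15 offset
# cancels when two readings are subtracted.
def to_kelvin(t_celsius: float) -> float:
    return t_celsius + 273.15

t1_c, t2_c = 15.0, 25.0
delta_celsius = t2_c - t1_c                       # 10.0 °C
delta_kelvin = to_kelvin(t2_c) - to_kelvin(t1_c)  # 10.0 K

print(delta_celsius, delta_kelvin)                # 10.0 10.0 (up to float rounding)
print(abs(delta_celsius - delta_kelvin) < 1e-9)   # True
```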
Historical Development
Invention and Early Proposals
The development of the Celsius temperature scale built upon earlier efforts to standardize thermometry, particularly the work of Danish astronomer Ole Rømer, who in 1701 proposed one of the first practical scales using a liquid-in-glass thermometer. Rømer set zero at the freezing point of a brine solution, with the boiling point of water at 60 degrees and pure water's freezing point at 7.5 degrees, establishing a framework of fixed reference points based on observable physical changes rather than arbitrary markers.[2] This approach influenced subsequent scientists by emphasizing reproducible intervals, though Rømer's scale used different references and a smaller range compared to later centesimal systems.[10]

In 1742, Swedish astronomer Anders Celsius presented his proposal to the Swedish Royal Society of Sciences in Uppsala, introducing a centesimal scale divided into 100 equal parts between two fixed points: the boiling point of water at 0 degrees and the freezing point at 100 degrees. Celsius based this on meticulous experiments using a mercury thermometer, demonstrating that the freezing point remained consistent across varying atmospheric pressures and latitudes, while the boiling point varied with pressure but could be standardized to sea-level conditions, thus aiming for an international standard to replace the proliferation of inconsistent scales such as those of Réaumur and Delisle.[2] He was particularly influenced by Joseph-Nicolas Delisle's reversed scale, which placed zero at the boiling point, and chose the inversion, with higher numbers for colder temperatures, to minimize negative readings in typical conditions, a practical consideration for scientific measurements.[10]

Celsius's scale gained early traction in Sweden following its publication in the paper "Observations of two persistent degrees on a thermometer", with adoption by 1744 through instruments crafted by local makers such as Ekström and Strömer, and notably by botanist Carl Linnaeus for precise environmental recordings.[2] Celsius himself conducted supporting tests, including measurements of variations in boiling points under different pressures, which aligned closely with modern values (e.g., a decrease of approximately 1 °C for every 300 m increase in altitude) and underscored the scale's reliability for meteorological applications.[2] This initial implementation in Swedish scientific circles laid the groundwork for further refinements, though the inverted orientation persisted until shortly after Celsius's death in 1744.
Naming Conventions and Adoption
The modern orientation of the scale, with 0° at the freezing point of water and 100° at the boiling point, was independently proposed in 1743 by French physicist Jean-Pierre Christin and adopted in Sweden in 1744 by botanist Carl Linnaeus shortly after Anders Celsius's death, establishing the configuration that became the international standard. The scale was originally termed "centigrade", from the Latin words for "hundred" and "steps", reflecting its division into 100 equal intervals between the reference points; the name, however, led to confusion with the centigrade unit of angular measurement (one-hundredth of a right angle, or gradian).[11] This ambiguity prompted the 9th General Conference on Weights and Measures (CGPM) to officially rename it the "Celsius" scale in 1948, honoring the original proposer while removing the ambiguity in unit names.[12][13]

By the 19th century, the centigrade (later Celsius) scale had become the de facto standard for temperature measurement in scientific literature and experiments across Europe, facilitating consistent international collaboration in fields like physics and chemistry.[14] Its adoption accelerated globally through 20th-century metrication efforts, with the United Kingdom formally endorsing the metric system, including Celsius, in 1965 via government policy, leading to widespread implementation in education, industry, and weather reporting by the 1970s.[15] Australia followed suit, with the Bureau of Meteorology switching weather forecasts from Fahrenheit to Celsius on September 1, 1972, as part of broader metric conversion.[16] By the 1980s, Celsius had achieved near-universal integration as the SI unit for temperature in over 190 countries, supporting standardized scientific and trade practices worldwide.[17]

In the United States, Fahrenheit persists alongside Celsius due to entrenched cultural familiarity and the inertia of the customary measurement system, which was inherited from British colonial practices and never fully supplanted despite federal metrication initiatives in the 1970s.[4] Dual usage continues in sectors like medicine, where Fahrenheit's finer gradations (180 divisions versus Celsius's 100 between water's phase transitions) aid precise monitoring of human body temperatures, such as the normal value around 98.6 °F.[18]
Notation and Representation
Symbol Usage
The standard symbol for the Celsius temperature unit is °C, consisting of the degree sign (°) immediately followed by a capital letter C with no intervening space, as defined in the International System of Units (SI).[5] This notation reflects the unit's status as the degree Celsius, a special name for a temperature interval equal in magnitude to the kelvin.[8]

According to SI recommendations, a space must separate the numerical value from the symbol, as in 23 °C, to ensure clarity in expressing Celsius temperatures.[8] In scientific writing, the symbol °C is rendered in roman (upright) type, while any associated quantity symbols (such as t for temperature) are italicized; the degree sign itself is positioned as a superscript-like glyph for proper alignment.[5] In plain text or non-technical contexts, the degree sign may appear at full size rather than raised. Formal SI usage advises against redundancy, such as writing "degrees °C", preferring simply the symbol or the full unit name "degree Celsius" (with lowercase "degree" and uppercase "Celsius").[8]

Contextual applications of the °C symbol vary slightly by domain. In programming and digital encoding, it is often represented using Unicode as the degree sign (U+00B0) followed by "C", or the combined glyph ℃ (U+2103). Engineering drawings and technical specifications adhere to SI spacing rules, placing the symbol after values like 100 °C for boiling point annotations. Weather reports and meteorological contexts universally employ °C with a space before it, such as 15 °C, to denote ambient temperatures.[8]

Common errors in °C usage include employing a lowercase "c" instead of capital "C", inserting a space between the degree sign and "C" (e.g., ° C), or omitting the required space between the number and symbol (e.g., 20°C). Another frequent mistake is confusing the temperature symbol with the angular degree (also °), though context typically distinguishes them; additionally, the obsolete "degree centigrade" should be avoided in favor of "degree Celsius".[8]
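As a brief illustration of these conventions, the following Python sketch (the variable names are purely illustrative) formats a reading with a space between the number and the symbol and with the degree sign attached directly to the capital C:

```python
# A minimal sketch of formatting a Celsius reading per the SI conventions
# described above: a space between number and symbol, no space inside "°C".
temperature = 23.0
reading = f"{temperature:.1f} \u00b0C"  # U+00B0 DEGREE SIGN followed by "C"
print(reading)                          # prints: 23.0 °C
```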
Typesetting and Unicode
The Celsius degree symbol is typically represented in Unicode as the combination of the degree sign U+00B0 (°) followed by the Latin capital letter C (U+0043), forming °C.[19] Although a precomposed character exists at U+2103 (℃), the Unicode Standard recommends using the sequence U+00B0 U+0043 in normal text for better compatibility and semantic clarity, with fonts encouraged to render this ligature-like combination tightly for optimal appearance.[19] The degree sign U+00B0 was introduced in Unicode version 1.1 in June 1993, within the Latin-1 Supplement block.

In mathematical and scientific typesetting, the degree symbol is rendered as a small superscript circle immediately preceding the 'C', with no space between the elements. For example, in LaTeX this is achieved with the notation ^{\circ}\mathrm{C}, where the degree is raised and the 'C' is set in upright roman font to mark it as a unit symbol.[20] Kerning adjustments are often applied in professional typography to optically center the superscript degree relative to the 'C', compensating for the degree's smaller width and preventing visual gaps, particularly in sans-serif fonts.[21]

Legacy systems lacking full Unicode support may approximate the symbol using ASCII text such as "deg C" or "C", which avoids encoding issues but sacrifices precision.[22] In HTML, the degree sign is commonly inserted via the named entity &deg; or the numeric entity &#176;, followed by 'C', as in &deg;C, ensuring backward compatibility in web rendering. Subsequent Unicode versions, such as 4.0 (2003) and later, have improved font and rendering support for these characters through enhanced normalization and compatibility decompositions, facilitating consistent display across platforms.[19]
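The following Python sketch, using only the standard unicodedata module, illustrates the two representations discussed above and their relationship under compatibility normalization:

```python
# A minimal sketch of the two common Unicode representations of the Celsius
# symbol and how they compare after compatibility (NFKC) normalization.
import unicodedata

sequence = "\u00b0C"    # DEGREE SIGN + LATIN CAPITAL LETTER C (recommended form)
precomposed = "\u2103"  # DEGREE CELSIUS, a single precomposed character

print(sequence, precomposed)                    # °C ℃
print(unicodedata.name("\u00b0"))               # DEGREE SIGN
print(unicodedata.name("\u2103"))               # DEGREE CELSIUS

# NFKC normalization decomposes U+2103 into the recommended two-character form.
print(unicodedata.normalize("NFKC", precomposed) == sequence)  # True
```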
Conversions and Equivalences
Relation to Kelvin Scale
The Celsius and Kelvin scales are directly linked through the International System of Units (SI), where the kelvin (K) serves as the base unit for thermodynamic temperature, and the degree Celsius (°C) is a derived unit equal in magnitude to the kelvin but offset for practical reference.[5] The conversion between the two is given by the formula

T/\mathrm{K} = t/{}^\circ\mathrm{C} + 273.15,

where T is the thermodynamic temperature in kelvins and t is the Celsius temperature; this relation holds exactly, as the numerical value of a temperature interval is identical in both units.[1] In physics and engineering, adding 273.15 to a Celsius value shifts measurements to the absolute Kelvin scale without altering the size of temperature intervals, enabling applications like thermodynamic calculations, where negative absolute temperatures are not physically meaningful.[5]

Prior to the 2019 SI redefinition, the kelvin was defined as exactly 1/273.16 of the thermodynamic temperature at the triple point of water (assigned 273.16 K, corresponding to 0.01 °C), which established the 273.15 K offset between 0 °C and absolute zero through the ice point reference.[23] The 2019 revision, effective May 20, fixed the Boltzmann constant at exactly $1.380\,649 \times 10^{-23}$ J/K to define the kelvin, decoupling it from the water triple point for greater precision and universality while preserving the exact Celsius-Kelvin relation.[5]

In scientific practice, the kelvin is preferred for absolute thermodynamic quantities, such as in the ideal gas law PV = nRT, where temperature must be on an absolute scale to ensure physical consistency.[24] Conversely, Celsius is commonly used for temperature differences and environmental monitoring, such as weather reporting, due to its alignment with familiar reference points like water's freezing and boiling.[1] This coexistence supports versatile applications across disciplines, from chemistry (kelvin for reaction kinetics) to meteorology (Celsius for climate data).[5]
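The exact relation above can be written as a short Python sketch; the function names are illustrative, not drawn from any particular library:

```python
# A minimal sketch of the exact SI relation T/K = t/°C + 273.15.
def celsius_to_kelvin(t_celsius: float) -> float:
    """Convert a Celsius temperature to thermodynamic temperature in kelvins."""
    return t_celsius + 273.15

def kelvin_to_celsius(t_kelvin: float) -> float:
    """Convert a thermodynamic temperature in kelvins to Celsius."""
    return t_kelvin - 273.15

print(celsius_to_kelvin(0.0))   # 273.15 (freezing point of water)
print(celsius_to_kelvin(0.01))  # ≈ 273.16 (triple point of water)
print(kelvin_to_celsius(0.0))   # -273.15 (absolute zero)
```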
Relation to Fahrenheit Scale
The Fahrenheit scale, proposed by Daniel Gabriel Fahrenheit in 1724, differs from the Celsius scale in both its reference points and its degree size, so conversions must account for an offset as well as a scaling factor.[25] Fahrenheit initially defined 0 °F as the freezing point of a brine mixture of equal parts ice, water, and ammonium chloride (or salt in some accounts), and later set 32 °F as the freezing point of pure water and 96 °F as the approximate normal human body temperature, with the boiling point of water at 212 °F; this created 180 divisions between freezing and boiling, compared with the 100 divisions in Celsius.[25] These arbitrary zero points, rooted in Fahrenheit's early thermometry experiments rather than a universal standard like water's triple point, necessitate an offset in conversions, unlike the direct interval scaling between Celsius and Kelvin.[25]

The standard conversion formulas account for this offset and the interval ratio. To convert from Fahrenheit to Celsius:

t(^\circ\mathrm{C}) = \frac{t(^\circ\mathrm{F}) - 32}{1.8} = \left(t(^\circ\mathrm{F}) - 32\right) \times \frac{5}{9}

This derives from aligning the scales: subtracting 32 °F shifts the Fahrenheit zero to match Celsius's 0 °C (water freezing), and dividing by 1.8 (or multiplying by 5/9) accounts for the smaller Fahrenheit degree, since 180 °F spans the same range as 100 °C.[1] Conversely, to convert from Celsius to Fahrenheit:

t(^\circ\mathrm{F}) = t(^\circ\mathrm{C}) \times 1.8 + 32 = t(^\circ\mathrm{C}) \times \frac{9}{5} + 32

Here, multiplying by 9/5 converts each Celsius degree into its 1.8 °F equivalent, and adding 32 realigns the zero point.[1]

For temperature differences (intervals), the conversion requires no offset: a change of 1 °C equals a change of 1.8 °F, so differences in Celsius are simply multiplied by 9/5 to obtain Fahrenheit equivalents.[1] In the United States, where Fahrenheit remains the predominant scale for everyday use, many weather applications and forecasts display both scales simultaneously to assist users familiar with Celsius, such as in scientific contexts or for international audiences.[26] A common mental approximation for converting Fahrenheit to Celsius is to subtract 30 from the Fahrenheit value and divide by 2, which gives reasonable accuracy for typical room and weather temperatures (e.g., 86 °F ≈ (86 - 30)/2 = 28 °C, close to the exact 30 °C).[27]
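The exact formulas and the mental approximation described above can be expressed as a short Python sketch (function names are illustrative):

```python
# A minimal sketch of the exact Celsius/Fahrenheit conversions and of the
# rough "subtract 30, then halve" mental shortcut described above.
def fahrenheit_to_celsius(t_f: float) -> float:
    return (t_f - 32.0) * 5.0 / 9.0

def celsius_to_fahrenheit(t_c: float) -> float:
    return t_c * 9.0 / 5.0 + 32.0

def approx_fahrenheit_to_celsius(t_f: float) -> float:
    # Mental shortcut: subtract 30, then divide by 2.
    return (t_f - 30.0) / 2.0

print(fahrenheit_to_celsius(212.0))        # ≈ 100.0 (boiling point of water)
print(celsius_to_fahrenheit(37.0))         # ≈ 98.6 (typical body temperature)
print(approx_fahrenheit_to_celsius(86.0))  # 28.0, versus the exact 30.0
```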
Key Reference Points
Water's Phase Transitions
The Celsius scale was historically defined using the phase transitions of water as fixed reference points, with the freezing point of pure water set at 0 °C and the boiling point at approximately 100 °C under standard atmospheric pressure of 1 atm (101.325 kPa).[1] These points provided the anchors for the scale, dividing the interval between them into 100 equal degrees. However, precise measurements show that the boiling point of pure water (using Vienna Standard Mean Ocean Water, VSMOW) at exactly 1 atm is 99.9839 °C, a value very close to the nominal 100 °C used for calibration purposes.[28] In the modern SI system, adopted in 2019, these water-based points are secondary: the Celsius scale is now derived directly from the kelvin, which is defined through fixed physical constants, so the relation 0 °C = 273.15 K remains exact by definition while water's freezing point is a measured value lying very close to 0 °C.[29]

A more precise reference for the scale prior to 2019 was the triple point of water, where solid, liquid, and vapor phases coexist in equilibrium at 0.01 °C and a pressure of 611.657 Pa.[28] This point served as the primary reproducible standard for calibrating thermometers, offering higher accuracy than the freezing or boiling points, which can be affected by slight variations in pressure or purity. The triple point temperature is 273.16 K on the Kelvin scale, linking Celsius directly to absolute temperature measurements. The sublimation curve of water, representing the transition from ice to vapor without melting, extends downward toward absolute zero at -273.15 °C, where molecular motion theoretically ceases, though practical sublimation rates diminish exponentially at low temperatures.[28]

Environmental factors influence these phase transitions, altering the reference points from ideal conditions. Impurities, such as salts in seawater with a salinity of about 35 parts per thousand, depress the freezing point through colligative properties, lowering it to approximately -1.8 °C at 1 atm, which prevents surface oceans from freezing solid in polar regions.[30] Similarly, the boiling point decreases with altitude due to reduced atmospheric pressure; at 2000 m, where the pressure is roughly 0.8 atm, water boils at about 93 °C, affecting cooking and sterilization processes in high-elevation areas.[31] These variations highlight the scale's reliance on standardized conditions for consistency.
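As an illustration of the pressure dependence of the boiling point, the following Python sketch estimates it from the Antoine equation using commonly tabulated constants for water valid between roughly 1 °C and 100 °C; the constants and the helper name are assumptions for illustration, not a metrological standard:

```python
# A hedged sketch estimating water's boiling point at reduced pressure with
# the Antoine equation, log10(P) = A - B / (C + T), P in mmHg and T in °C.
import math

A, B, C = 8.07131, 1730.63, 233.426  # commonly tabulated constants for water

def boiling_point_celsius(pressure_atm: float) -> float:
    """Temperature at which water's vapor pressure equals the given pressure."""
    p_mmhg = pressure_atm * 760.0
    return B / (A - math.log10(p_mmhg)) - C

print(round(boiling_point_celsius(1.0), 1))  # ≈ 100.0 °C at sea level
print(round(boiling_point_celsius(0.8), 1))  # ≈ 93-94 °C, near the ~2000 m value cited above
```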
Common Everyday Temperatures
In human physiology, the normal core body temperature is approximately 37.0 °C, which maintains optimal metabolic function and homeostasis.[32] A fever, indicating an immune response to infection or inflammation, is typically defined as a temperature exceeding 38 °C.[33] Conversely, hypothermia sets in when core temperature falls below 35 °C, impairing bodily functions and requiring immediate medical intervention.[34]

Everyday environmental temperatures on the Celsius scale span a wide range relevant to human comfort and activity. Room temperature, often considered the comfort zone for indoor living and working, falls between 20 and 25 °C, balancing thermal comfort with energy efficiency in occupied spaces. Boiling water for cooking or sterilization reaches 100 °C at standard atmospheric pressure, a fixed reference point for heat-based processes. At the theoretical extreme, absolute zero represents the coldest possible temperature at -273.15 °C, where molecular motion ceases, foundational to thermodynamics and cryogenics.[1][35]

Weather extremes highlight the Celsius scale's utility in meteorology. The hottest surface air temperature officially recognized by the WMO is 56.7 °C, recorded in Death Valley, California, on July 10, 1913, though recent 2025 studies have questioned its accuracy; it illustrates hyper-arid conditions that challenge human survival.[36][37] The coldest verified reading is -89.2 °C at Vostok Station in Antarctica on July 21, 1983, demonstrating the radiative cooling of the polar night in a high-elevation ice sheet environment.[36] For broader context, Earth's average global surface temperature is approximately 15 °C, serving as a baseline for climate monitoring and variability assessments.[38]

In food preparation and storage, the Celsius scale standardizes processes for safety and quality. Household freezers are set to -18 °C to inhibit bacterial growth and preserve perishables long-term.[39] Typical baking in ovens occurs at around 180 °C, allowing even heat distribution for items like cakes and breads without excessive drying or burning.[40] These values, approximately equivalent to 0 °F for freezing and 350 °F for baking, underscore the scale's practicality in daily applications.
Temperature Intervals
Measuring Differences
Temperature intervals on the Celsius scale are calculated by direct subtraction of two temperature readings, expressed as Δt = t₂ - t₁ in degrees Celsius (°C), where the interval size remains identical to that on the Kelvin scale since an interval of one degree Celsius equals one kelvin.[1] This method focuses on changes rather than absolute values, making it suitable for quantifying thermal variations without reference to the scale's zero point.[1]

These intervals hold significant physical meaning, representing proportional changes in thermal energy; for instance, a 1 °C temperature rise in water corresponds to an energy input of approximately 4.184 J per gram, based on its specific heat capacity under standard conditions.[41] In applications like thermal expansion, the change in length of a material is given by ΔL = α L Δt, where α is the linear coefficient and Δt is the Celsius interval, illustrating how such differences drive dimensional adjustments in engineering contexts (see the sketch at the end of this section).[42] In climate analysis, Celsius intervals are essential for tracking trends, such as annual warming rates or diurnal variations, providing a basis for assessing environmental impacts without scale-dependent biases.[43]

Representative examples include typical daily air temperature fluctuations of 10–15 °C in many mid-latitude regions, reflecting the diurnal cycle driven by solar heating, as observed in global meteorological datasets.[44] In human physiology, a fever often involves a 2 °C rise above the normal body temperature of approximately 37 °C, signaling immune response without implying proximity to absolute limits.[45] These differences are reference-independent, avoiding complications from absolute zero, as the interval magnitude depends solely on the scale's uniformity rather than its origin.[1]

When reporting Celsius intervals, precision should align with the measurement instrument's accuracy to ensure reliability; in laboratory settings, this often means resolutions of ±0.1 °C for high-precision thermometers calibrated against standards.[46]
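A minimal Python sketch of the thermal-expansion relation ΔL = α L Δt discussed above; the coefficient used is a typical literature value for aluminium, and the function name is illustrative:

```python
# A minimal sketch of using a Celsius interval in a thermal-expansion
# calculation, ΔL = α · L · Δt. The coefficient is a typical literature
# value for aluminium (~23e-6 per °C), chosen purely for illustration.
def length_change(alpha_per_c: float, length_m: float, t1_c: float, t2_c: float) -> float:
    """Return the change in length for a temperature change from t1 to t2 (°C)."""
    delta_t = t2_c - t1_c  # interval in °C, numerically equal to kelvins
    return alpha_per_c * length_m * delta_t

# A 10 m aluminium beam warming from -5 °C to 35 °C (a 40 °C interval):
print(length_change(23e-6, 10.0, -5.0, 35.0))  # ≈ 0.0092 m, about 9 mm
```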
Practical Applications of Intervals
In meteorology, Celsius intervals are essential for quantifying and communicating temperature anomalies and trends; according to the World Meteorological Organization (WMO), 2024 was the first calendar year in which the annual average global surface temperature exceeded 1.5 °C above pre-industrial levels (1850–1900), making it the warmest year on record.[47] This interval-based reporting allows scientists to track changes like the accelerated warming rate of about 0.2 °C per decade since 1970, facilitating comparisons across datasets and regions without reliance on absolute values. For instance, projections under high-emissions scenarios indicate potential global warming of up to 5 °C by the end of the century, underscoring the scale of climate impacts in weather forecasting and seasonal outlooks.

In engineering, Celsius intervals guide the design and testing of systems under thermal stress, particularly in assessing material durability through controlled temperature differentials. For example, in aerospace and automotive applications, stress tests often cycle materials over 40 °C intervals, such as from 20 °C to 60 °C, to evaluate creep and fatigue, ensuring compliance with standards like those from the American Society for Testing and Materials (ASTM). In heating, ventilation, and air conditioning (HVAC) systems, engineers target comfort differentials of around 10–15 °C between indoor (typically 20–24 °C) and outdoor conditions to optimize energy efficiency and occupant well-being, as outlined in ASHRAE Standard 55 for thermal environmental conditions.

Medical applications leverage Celsius intervals for precise monitoring and therapeutic control of body temperature variations. In hyperthermia treatments for cancer, clinicians induce controlled rises of about 1 °C per hour to reach tumor temperatures of 40–43 °C, enhancing radiotherapy efficacy while minimizing risks to healthy tissue, as demonstrated in clinical protocols from the National Cancer Institute. Similarly, in cooking and food safety, intervals inform time adjustments; for every 10 °C increase in oven temperature, cooking duration can be roughly halved to achieve equivalent doneness, based on heat transfer principles validated in food science research.

In climate policy, Celsius intervals define critical thresholds for international agreements, emphasizing differences from baseline periods rather than fixed points. The Paris Agreement, adopted under the United Nations Framework Convention on Climate Change, commits nations to limit global warming to well below 2 °C above pre-industrial levels, with efforts to cap it at 1.5 °C, using these intervals to frame emission reduction targets and adaptation strategies. This approach enables projections of interval-based impacts, such as a 2 °C threshold triggering irreversible sea-level rise, informing policy decisions on mitigation without absolute temperature references.[48]