Resistance thermometer
A resistance thermometer is a temperature sensor that measures temperature by correlating the electrical resistance of a sensing element (typically a metal wire or coil) with temperature, based on the principle that the resistance of most metals increases predictably as temperature rises.[1] These devices, often referred to as resistance temperature detectors (RTDs), provide high accuracy and stability, making them essential for precise thermometry in scientific, industrial, and calibration applications.[2] The operating principle relies on passing a small, constant electrical current through the sensing element and measuring the resulting voltage drop, which varies with resistance and thus temperature, following standardized relationships such as those defined for platinum elements.[1]

Platinum is the most common element material owing to its wide temperature range (-200°C to 850°C), chemical inertness, and near-linear resistance-temperature characteristic, though other metals such as nickel and copper are used for specific applications.[1] Resistance thermometers are categorized into types such as standard platinum resistance thermometers (SPRTs) for primary standards and industrial RTDs for practical use; the former are calibrated at fixed points such as the triple point of water (0.01°C) to achieve uncertainties as low as ±0.001°C.[2] In the International Temperature Scale of 1990 (ITS-90), platinum resistance thermometers serve as the defining instruments between 13.8033 K (triple point of equilibrium hydrogen) and 1234.93 K (freezing point of silver), using specified interpolation procedures to approximate the thermodynamic scale with high reproducibility.[2]

Key advantages include excellent long-term stability, low hysteresis, and compatibility with digital readout systems, though RTDs respond more slowly than thermocouples and require protection from mechanical shock and contamination.[1] Applications span laboratory research, food processing, aerospace, and environmental monitoring, where accuracies of ±0.2°C or better must be held over extended periods.[3] While metallic RTDs dominate, semiconductor-based resistance thermometers known as thermistors offer higher sensitivity over narrower ranges (-50°C to 200°C) but suffer from nonlinearity and poorer stability, limiting them to less demanding tasks such as data logging.[1] Calibration typically involves comparison to reference standards at fixed points or against traceable sensors, ensuring compliance with international guidelines for reliability.[1]
Fundamentals
Resistance-Temperature Relationship
Resistance thermometers operate on the principle that the electrical resistance of certain metals increases with temperature, a behavior known as a positive temperature coefficient (PTC).[4] In pure metals such as platinum, this relationship is approximately linear over a wide range of temperatures, making them suitable for precise temperature sensing.[5] The PTC arises from increased scattering of conduction electrons by lattice vibrations as temperature rises, which impedes current flow and raises resistance.[4]

The resistance-temperature relationship is modeled by the Callendar-Van Dusen equation, which accounts for the slight non-linearity observed in platinum resistance thermometers.[6] For temperatures above 0°C, the simplified linear form is

R_t = R_0 (1 + \alpha t)

where R_t is the resistance at temperature t in °C, R_0 is the resistance at 0°C, and \alpha is the temperature coefficient of resistance, typically 0.00385 °C⁻¹ for standard platinum elements.[5] For higher accuracy across broader ranges, especially below 0°C, the full equation is

R_t = R_0 [1 + A t + B t^2 + C (t - 100) t^3] \quad (t < 0^\circ \text{C})
R_t = R_0 [1 + A t + B t^2] \quad (t \geq 0^\circ \text{C})

with coefficients A = 3.9083 \times 10^{-3}, B = -5.775 \times 10^{-7}, and C = -4.183 \times 10^{-12} for IEC-standard platinum RTDs.[7] Here, \alpha represents the average fractional change in resistance per degree Celsius between 0°C and 100°C, derived from the slope of the curve.[6]

Several factors influence the precision and stability of this relationship in platinum-based sensors. Material purity is critical: impurities such as iron or nickel reduce the temperature coefficient and introduce non-reversible resistance changes, while higher purity yields a more consistent \alpha value close to 0.003851 °C⁻¹.[8] Strain from mechanical stress or thermal-expansion mismatch can shift the resistance baseline and cause hysteresis, with supported-wire configurations showing drifts of up to 0.1% of the temperature span.[8] Annealing at elevated temperatures, such as above 250°C, mitigates these effects by relieving internal stresses and stabilizing the microstructure, limiting long-term drift to below 10 mK in well-processed elements.[8]

For a standard platinum resistance thermometer with R_0 = 100 \, \Omega at 0°C and \alpha = 0.00385 \, ^\circ \text{C}^{-1}, the resistance increases to approximately 138.5 Ω at 100°C, illustrating the predictable PTC behavior central to temperature measurement.[5]
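To illustrate, a minimal Python sketch of this resistance calculation, assuming an IEC-grade Pt100 element (R_0 = 100 Ω) and the coefficients quoted above:

```python
# Sketch: Callendar-Van Dusen resistance of an IEC 60751 Pt100 element.
# Coefficients are the standard values quoted above; R0 = 100 ohms is assumed.

A = 3.9083e-3   # 1/degC
B = -5.775e-7   # 1/degC^2
C = -4.183e-12  # 1/degC^4, used only below 0 degC

def pt_resistance(t_c: float, r0: float = 100.0) -> float:
    """Resistance in ohms of a platinum RTD at temperature t_c (degC)."""
    if t_c >= 0.0:
        return r0 * (1 + A * t_c + B * t_c**2)
    return r0 * (1 + A * t_c + B * t_c**2 + C * (t_c - 100.0) * t_c**3)

print(pt_resistance(100.0))  # ~138.51 ohms, matching the example above
print(pt_resistance(-50.0))  # ~80.31 ohms
```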
Operating Principle
A resistance thermometer, or resistance temperature detector (RTD), operates by applying a small, constant excitation current to the sensing element, typically a coiled wire or thin film of metal such as platinum. This current generates a voltage drop across the element that is directly proportional to its electrical resistance, in accordance with Ohm's law, V = I \times R, where V is the voltage, I is the current, and R is the resistance.[9] The measured resistance is then converted to temperature using a predefined calibration curve or equation, exploiting the fact that the resistance of metals increases nearly linearly with temperature.[9]

For enhanced precision, RTDs are commonly integrated into bridge circuits such as the Wheatstone bridge, which consists of four resistors arranged in a diamond configuration with the RTD as one arm. A voltage is applied across the bridge, and the output voltage across the midpoint detects any imbalance caused by changes in the RTD's resistance: when balanced, the output is zero, while temperature-induced resistance shifts unbalance the bridge and produce a differential voltage proportional to the temperature change.[10]

The raw resistance signal from the RTD is typically too small for direct use in instrumentation, so signal conditioning converts it into a more usable form, such as a voltage or current output compatible with analog-to-digital converters or control systems. This often involves amplifying the bridge output and linearizing the resistance-temperature curve to ensure accurate readout across the operating range.[9]

Practical challenges include lead-wire resistance, which adds extraneous resistance to the measurement and introduces errors, especially over long distances, and self-heating of the sensing element caused by the excitation current's power dissipation (P = I^2 R), which artificially elevates the local temperature.[10] These issues are mitigated by employing low excitation currents, generally 1 mA or less, to hold self-heating errors at negligible levels (e.g., below 0.01°C), and by circuit designs that compensate for lead resistance regardless of wire length.[9]
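The self-heating estimate can be made concrete with a short sketch; the dissipation constant used here is a hypothetical datasheet-style value for a stirred bath, not a figure fixed by any standard:

```python
# Sketch: excitation, Ohm's-law readout, and self-heating estimate for a Pt100.
# The dissipation constant (mW per degC) is an assumed, datasheet-style value;
# real values depend on the probe construction and the surrounding medium.

I_EXC = 1.0e-3               # excitation current in amperes (1 mA, as suggested above)
R_RTD = 138.51               # sensor resistance in ohms (Pt100 near 100 degC)
DISSIPATION_MW_PER_C = 50.0  # assumed dissipation constant in a stirred bath

v_drop = I_EXC * R_RTD                        # measured voltage, V = I * R
p_mw = (I_EXC**2 * R_RTD) * 1e3               # self-heating power, P = I^2 * R, in mW
self_heat_err = p_mw / DISSIPATION_MW_PER_C   # resulting temperature rise in degC

print(f"V = {v_drop*1e3:.3f} mV, P = {p_mw:.4f} mW, "
      f"self-heating error ~ {self_heat_err:.5f} degC")
```

With 1 mA excitation the dissipation is about 0.14 mW, giving a self-heating error of a few millikelvin under the assumed conditions, consistent with the sub-0.01°C figure above.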
Types and Materials
Element Types
Resistance thermometers, also known as resistance temperature detectors (RTDs), use sensing elements made from metals whose electrical resistance varies predictably with temperature. The most common element materials are platinum, nickel, and copper, each selected for its temperature coefficient of resistance (TCR), stability, and operating range. Platinum is favored for its high accuracy and long-term stability, with standard types such as Pt100 (100 Ω at 0°C) and Pt1000 (1000 Ω at 0°C) conforming to IEC 60751.[11] Nickel offers cost-effectiveness and good sensitivity, as in Ni120 elements, while copper provides a highly linear response but limited stability due to oxidation. Less common materials include nickel-iron alloys, used in specialized applications requiring moderate stability at lower cost.[12]

The performance characteristics of these materials vary significantly, influencing their suitability for different environments. Platinum elements exhibit low hysteresis and excellent stability, operating from -200°C to 850°C, which makes them ideal for precision measurements in laboratory and industrial settings. Nickel elements, with a TCR of approximately 0.00672 Ω/Ω/°C, function from -80°C to 260°C but show moderate hysteresis. Copper, with a TCR of 0.00427 Ω/Ω/°C, covers -100°C to 260°C with low hysteresis, though its sensitivity (0.039 Ω/°C from 0-100°C) is lower than nickel's (0.806 Ω/°C). For comparison:

| Material | Temperature Range (°C) | TCR (Ω/Ω/°C) | Stability | Hysteresis | Cost |
|---|---|---|---|---|---|
| Platinum | -200 to 850 | 0.00385–0.00392 | Best | Low | Higher |
| Nickel | -80 to 260 | 0.00672 | Moderate | Moderate | Low |
| Copper | -100 to 260 | 0.00427 | Good | Low | Low |
| Nickel-Iron | -100 to 204 | 0.00518–0.00527 | Moderate | Moderate | Low |
Classifications of RTDs
Resistance temperature detectors (RTDs) are classified according to international standards that define their accuracy and performance, primarily IEC 60751 for industrial platinum resistance thermometers. The standard defines tolerance classes based on the permitted deviation in resistance from the nominal value at specified temperatures, ensuring consistency across applications. The classes are AA, A, B, and C, with Class AA offering the highest precision at ±0.10°C at 0°C, Class A at ±0.15°C, Class B at ±0.30°C, and Class C at ±0.60°C at 0°C.[13][14] Additionally, precision subclasses such as 1/10 DIN (equivalent to ±0.03°C at 0°C) and 1/3 DIN (±0.10°C at 0°C) serve applications requiring accuracy beyond the standard classes.[15]

Classifications also extend to usage contexts, distinguishing RTDs by their suitability for different environments. Industrial RTDs are typically rugged, featuring sheathed probes encased in metal tubes to withstand vibration, moisture, and mechanical stress in manufacturing or process-control settings.[16] In contrast, laboratory RTDs prioritize high precision, with bare or minimally protected elements that allow sensitive measurements in controlled environments where accuracy outweighs durability.[17] Within these, construction types include wire-wound RTDs, which use coiled platinum wire for superior stability and accuracy in laboratory use, and thin-film RTDs, which deposit a thin platinum layer on a substrate for compact, cost-effective integration in original equipment manufacturer (OEM) applications such as automotive or consumer electronics.[18][19]

Standard nominal resistances form another key classification: Pt100 RTDs are defined as having 100 Ω at 0°C and Pt1000 variants 1000 Ω at 0°C, both with a temperature coefficient of 0.00385 Ω/Ω/°C per IEC 60751.[20][14] These come with tolerance bands tied to the accuracy classes; for example, a Class A Pt100 has a tolerance of ±0.06 Ω (±0.15°C) at 0°C, widening with temperature according to the standard's formulas.[21]

Recent developments include smart RTDs with integrated digital outputs and wireless capabilities for Internet of Things (IoT) applications, enabling remote monitoring in industrial sites without physical wiring. These incorporate microcontrollers for signal conditioning and wireless transmission protocols such as Bluetooth or Wi-Fi, improving data accessibility in distributed sensor networks.[22]
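A minimal sketch of the class tolerance formulas quoted above (the per-class validity ranges that the standard also imposes are not enforced here):

```python
# Sketch: IEC 60751 tolerance (in degC) as a function of temperature.
# Each class is defined as +/-(base + slope * |t|) degC, per the values above.

TOLERANCE_CLASSES = {
    "AA": (0.10, 0.0017),
    "A":  (0.15, 0.0020),
    "B":  (0.30, 0.0050),
    "C":  (0.60, 0.0100),
}

def tolerance_c(t_c: float, cls: str) -> float:
    """Permitted deviation in degC for a platinum RTD of the given class."""
    base, slope = TOLERANCE_CLASSES[cls]
    return base + slope * abs(t_c)

print(tolerance_c(0.0, "A"))    # 0.15 degC at 0 degC
print(tolerance_c(100.0, "B"))  # 0.80 degC at 100 degC
```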
Design and Configuration
Construction
Resistance thermometers, also known as resistance temperature detectors (RTDs), are constructed around a core sensing element, typically platinum wire or a thin platinum film, whose resistance changes predictably with temperature.[8] The sensing element is supported by a structure such as a ceramic bobbin, alumina substrate, or glass core to maintain stability and minimize mechanical strain.[12] Insulation materials such as glass frit, epoxy, or ceramic surround the element to prevent electrical shorts and protect against environmental contaminants.[12] The entire assembly is encased in a protective sheath, often stainless steel or Inconel, to shield the internal components from corrosion and physical damage in industrial settings.[8]

Assembly begins with fabrication of the sensing element, either by wire winding, in which fine platinum wire (20-50 μm diameter) is helically wound around a mandrel or bobbin, or by thin-film deposition, in which platinum is sputtered or screen-printed onto a ceramic substrate and laser-trimmed to a precise resistance such as 100 Ω at 0 °C.[8] For wire-wound elements, the coil is often embedded in alumina powder or fused with glass for support, while thin-film elements receive protective coatings.[12] Hermetic sealing, using glass-to-metal or epoxy bonds, encapsulates the element and prevents ingress of moisture or gas that could alter performance.[8] These methods preserve the device's integrity across a wide temperature range, from -200 °C to 850 °C.[23]

To withstand harsh environments, resistance thermometers incorporate features for vibration resistance, such as mineral-insulated construction or solid epoxy encapsulation, which secure the sensing element against the mechanical shocks common in industrial applications.[12] Moisture-proofing is achieved through hermetic seals and non-porous sheaths, preventing condensation or corrosive ingress that could degrade the platinum element.[8] Pressure tolerance is provided by robust sheaths and thermowells, enabling operation at up to 100 bar in process vessels and pipelines.[24] These protections maintain accuracy and longevity in demanding conditions such as chemical processing or high-vibration machinery.

Sizes range from miniature probes with diameters as small as 1 mm for thin-film elements, suited to surface measurements with response times under 1 second, to immersion probes up to 300 mm long and 10 mm in diameter for deep-tank applications.[8] Smaller designs, such as those with 6 mm inserts, respond faster thermally because of their reduced mass, while extended lengths accommodate varying immersion depths without compromising stability.[23] These configurations balance sensitivity, durability, and installation requirements across diverse uses.[12]
Wiring Configurations
Wiring configurations in resistance thermometers are designed primarily to compensate for the resistance of the lead wires connecting the sensing element to the measurement instrument, which can significantly affect accuracy, particularly over longer distances.[25] These errors arise because lead wires, typically copper, have a resistance that varies with temperature and adds to the measured resistance of the element, producing inaccuracies proportional to wire length and environmental conditions.[26] Employing additional wires allows these extraneous resistances to be subtracted or isolated, so that the measured value more closely reflects the true temperature-dependent resistance of the element.[27]

The general principles balance simplicity, accuracy, and practicality. A two-wire configuration suits basic applications with minimal lead lengths: it measures the total resistance directly, without compensation, and so carries the full lead error.[27] Three-wire and four-wire arrangements achieve higher precision by addressing lead imbalances; the three-wire method assumes equal resistance in paired leads to cancel most of the error, while the four-wire approach eliminates lead effects entirely through separate current-carrying and voltage-sensing paths.[25] These multi-wire methods are essential when lead resistances become comparable to the element's resistance, as in platinum thermometers where small changes (about 0.385 Ω per °C for a nominal 100 Ω Pt100) must be resolved accurately.[26]

Key error sources mitigated by the advanced configurations include the temperature coefficient of the lead-wire resistance, which can cause drifts of several degrees Celsius over extended cables (e.g., up to 4°C error for 12 feet of 28 AWG wire in a two-wire setup at 80°C), and additional resistances from junctions or connections that amplify imbalances.[25] Self-heating from excitation current through the leads can also contribute marginally but is generally secondary to lead resistance in configuration choices.[26]

Selection depends on cable length, required accuracy, and cost: two-wire is suitable for short runs under 10 meters where errors remain below 0.5°C, offering the lowest cost and simplest installation; three-wire provides a cost-effective compromise for industrial settings with moderate precision needs over distances up to 100 meters; and four-wire is preferred for laboratory or high-stakes applications demanding sub-0.1°C accuracy, despite higher material and instrumentation expense.[27]
Two-Wire Configuration
The two-wire configuration is the simplest wiring setup for a resistance thermometer: the sensing element is connected directly to the measurement device with a single pair of lead wires. The excitation current flows through both the sensor and the leads, so the measured resistance is the sum of the sensor's resistance and that of the lead wires. The lead-wire resistance therefore introduces an additive offset to the temperature reading that cannot be distinguished from the sensor's resistance in this setup.[27][28]

The primary limitation of the two-wire configuration is this uncompensated lead resistance, which also varies with the temperature of the wires and the ambient conditions along the cable run. For copper leads, the temperature coefficient of resistance is approximately 0.0039 Ω/Ω/°C, so the lead resistance itself drifts with ambient conditions, compounding the error over longer distances. For instance, a total lead resistance of 1 Ω causes an error of about 2.5°C in a standard Pt100 resistance temperature detector (RTD) at 0°C, since the sensor's sensitivity is roughly 0.385 Ω/°C. Over a 100 m cable run with typical 20 AWG copper wire (approximately 0.033 Ω/m per conductor), the round-trip lead resistance adds 6-7 Ω, corresponding to errors of roughly 15-18°C depending on the exact gauge and temperature gradients, as the sketch below illustrates.[27][28]

This configuration is suitable for short lead lengths, typically under 10 m, where the added resistance remains negligible, or for low-precision industrial monitoring such as basic process control or non-critical environmental sensing.[27][29][30] To mitigate these errors without more complex wiring, low-resistance leads, such as thicker-gauge copper, can minimize the offset, or software corrections can be applied if the lead resistance is measured and characterized separately. Such mitigations are approximate and do not eliminate temperature-induced variations in lead resistance.[27][31]
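A short sketch of the uncompensated error, using the per-metre resistance and Pt100 sensitivity quoted above:

```python
# Sketch: uncompensated lead error in a two-wire Pt100 run, using the
# per-metre copper resistance quoted above (20 AWG ~ 0.033 ohm/m per conductor).

PT100_SENSITIVITY = 0.385   # ohm per degC near 0 degC
OHM_PER_M = 0.033           # one conductor, 20 AWG copper (approximate)

def two_wire_error_c(cable_m: float) -> float:
    """Apparent temperature offset from round-trip lead resistance."""
    lead_ohms = 2 * cable_m * OHM_PER_M   # current flows out and back
    return lead_ohms / PT100_SENSITIVITY

print(f"{two_wire_error_c(100):.1f} degC")  # ~17 degC for a 100 m run
```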
Three-Wire Configuration
The three-wire configuration uses two leads to carry the excitation current through the sensing element and a third lead, connected to one end of the RTD, to sense the voltage close to the sensor, on the assumption that all three lead resistances are equal. This forms a compensation loop in which the resistance of the current-carrying leads is measured separately and subtracted from the total, isolating the RTD's true resistance.[29][32]

The compensation typically operates within a Wheatstone bridge circuit, where the third lead places lead resistance in the bridge arm opposite the RTD arm, so that balancing the bridge effectively subtracts the lead resistance from the measurement. With matched leads, this cancels the lead-induced error, reducing temperature inaccuracies to less than 0.1°C for lead lengths up to 100 m of standard copper wiring.[33][34]

In practice, the three-wire configuration is widely used in industrial temperature transmitters and control systems, with equal-length leads maintaining the balance so that compensation is reliable without additional calibration. It builds on the simplicity of two-wire setups while addressing lead errors that can reach several degrees Celsius over longer distances, making it suitable for moderate-precision applications in the process industries.[29][33]

If the lead resistances are unequal, whether from differences in length, material, or temperature exposure, a residual error remains, since the compensation accounts only for the matched portion. For instance, a 0.327 Ω imbalance in a Class A platinum RTD introduces an error of approximately 0.85°C at 0°C (see the sketch below), limiting overall accuracy to around ±0.5°C in typical unbalanced scenarios. The configuration is therefore recommended where better than ±1°C precision is required but full elimination of lead effects is not.[32][34]
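The residual error from lead mismatch follows directly from the sensor's sensitivity; a minimal sketch using the figures above:

```python
# Sketch: residual error in a three-wire hookup when lead resistances differ.
# Three-wire compensation cancels only the matched part of the leads, so any
# imbalance appears directly in the measured resistance.

PT100_SENSITIVITY = 0.385  # ohm per degC near 0 degC

def three_wire_residual_c(imbalance_ohms: float) -> float:
    """Error in degC caused by a lead-resistance mismatch."""
    return imbalance_ohms / PT100_SENSITIVITY

print(f"{three_wire_residual_c(0.327):.2f} degC")  # ~0.85 degC, as cited above
```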
Four-Wire Configuration
The four-wire configuration, also known as Kelvin or four-terminal sensing, employs two pairs of leads connected to the resistance temperature detector (RTD): one pair supplies a precise excitation current to the sensor, while the other pair measures the voltage drop directly across the element.[6] The current-carrying (force) leads are thus separated from the voltage-sensing (sense) leads, with the sense leads connected as close as possible to the RTD terminals to isolate the measurement from external influences.[35]

Error elimination relies on the principle of Kelvin sensing: negligible current flows through the high-impedance sense leads, so no significant voltage drops develop along them, and the measured voltage reflects only the RTD's intrinsic resistance. This fully compensates for lead lengths up to several hundred meters without introducing offsets, and it achieves the highest precision among RTD wiring methods, enabling accuracies better than 0.01°C in laboratory-grade systems, independent of lead-resistance variations.[6][35][36]

Four-wire RTDs are primarily applied as laboratory calibration references and in high-end process control requiring ultra-precise temperature monitoring, such as semiconductor fabrication or pharmaceutical manufacturing, where a stable, low-noise current source is essential for maintaining performance.[37][28] The setup does incur higher cost from the additional wiring and connectors, greater installation and maintenance complexity, and potential susceptibility to contact noise at the sense points if not properly shielded.[6][38]
Calibration and Standards
Calibration Procedures
Calibration of resistance thermometers, also known as resistance temperature detectors (RTDs), verifies and adjusts their resistance-temperature relationship to ensure accurate measurements traceable to international standards. Primary methods include fixed-point calibration at stable reference temperatures such as the ice point (0 °C) and the boiling point of water (100 °C at standard atmospheric pressure), comparison calibration against reference standards, and the use of automated temperature baths for controlled immersion.[8][39][40]

In fixed-point calibration, the RTD is first immersed in an ice bath of crushed ice and distilled water, allowed to stabilize for several minutes at 0 °C, and its resistance is measured with a precise ohmmeter, typically in a four-wire configuration to eliminate lead-resistance errors. The process is repeated at the boiling point in vigorously boiling distilled water at 100 °C, with the probe kept clear of the container to avoid conduction errors. For comparison calibration, the RTD under test (unit under test, UUT) is placed alongside a calibrated reference standard, such as a standard platinum resistance thermometer (SPRT), in a stirred liquid bath or dry-block calibrator held at multiple temperatures between -80 °C and 450 °C; resistances are measured at each point and deviations recorded. Automated baths provide uniform temperature control and automated data logging, often integrating with software for real-time adjustments.[8][41][40]

Following measurement, corrections are applied by fitting the data to the Callendar-Van Dusen equation, which models the non-linear resistance-temperature behavior as

R(t) = R_0 [1 + A t + B t^2 + C (t - 100) t^3] \quad (t < 0^\circ \text{C})

where R_0 is the resistance at 0 °C and A, B, C are coefficients determined from the calibration points to minimize residuals, typically achieving fits with deviations under 0.01 °C; the cubic term is omitted at and above 0 °C. This equation, standardized in IEC 60751, allows interpolation across the operating range (see the fitting sketch below).

Calibration addresses key error sources: hysteresis from mechanical strain or oxidation, quantified by thermal cycling between temperature extremes and limited to less than 0.003 °C through annealing; insulation-resistance degradation from moisture or contamination, which must stay above 100 MΩ and is checked by high-voltage testing; and stem-conduction heat loss, mitigated by immersing the probe to at least 10-20 times its sheath diameter plus the length of the sensing element. All procedures ensure traceability to the International Temperature Scale of 1990 (ITS-90) through reference standards calibrated at national metrology institutes.[6][8][42]

Industrial RTDs are typically calibrated annually to maintain accuracy, with IEC 60751 specifying the tolerance classes to be verified (e.g., Class A: ±(0.15 + 0.002|t|) °C). Uncertainty budgets include contributions from the reference standards, immersion errors, and self-heating (limited by measurement currents below 1 mA), often achieving expanded uncertainties of ±0.05 °C or better at k = 2 for ranges up to 450 °C.[8][42][39]
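As an illustration of the coefficient fit, a minimal least-squares sketch; the (t, R) pairs are nominal IEC-curve values used for illustration, not real calibration data:

```python
# Sketch: least-squares fit of the above-0 degC Callendar-Van Dusen
# coefficients (R0, A, B) from comparison-calibration points. The sample
# (t, R) pairs are illustrative nominal values, not a real calibration run.
import numpy as np

t = np.array([0.0, 50.0, 100.0, 200.0, 300.0])          # bath setpoints, degC
r = np.array([100.00, 119.40, 138.51, 175.86, 212.05])  # measured ohms (UUT)

# Model: R(t) = R0 + (R0*A) t + (R0*B) t^2, which is linear in [1, t, t^2].
X = np.column_stack([np.ones_like(t), t, t**2])
c0, c1, c2 = np.linalg.lstsq(X, r, rcond=None)[0]

r0, a, b = c0, c1 / c0, c2 / c0
print(f"R0 = {r0:.4f} ohm, A = {a:.6e} /degC, B = {b:.4e} /degC^2")
# Expect A ~ 3.9083e-3 and B ~ -5.775e-7 for an IEC-grade element.
```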
Standard Resistance Data
Standard resistance data for resistance thermometers, particularly platinum and nickel types, are defined by international and national standards to ensure consistency in manufacturing and application. IEC 60751 specifies the nominal resistance-temperature relationship for platinum RTDs such as Pt100 and Pt1000 over the range -200°C to 850°C, with Pt100 having a base resistance of 100.00 Ω at 0°C.[43] This standard aligns closely with ASTM E1137, which provides equivalent specifications for industrial platinum resistance thermometers in the United States, facilitating global interoperability.[44] For nickel RTDs such as Ni100, DIN 43760 defines the characteristics, with a base resistance of 100.00 Ω at 0°C and an operational range typically up to 300°C. These nominal values serve as benchmarks for sensor verification, allowing manufacturers and users to confirm compliance with tolerance classes without relying on full continuous curves.[45]

The following table presents selected nominal resistance values for Pt100 sensors according to IEC 60751, along with tolerance limits for Classes A and B. At 0°C these are ±0.06 Ω (±0.15°C) for Class A and ±0.12 Ω (±0.3°C) for Class B, widening with temperature according to the scaling formulas in the standard.[43]

| Temperature (°C) | Nominal Resistance (Ω) | Class A Tolerance (Ω) | Class B Tolerance (Ω) |
|---|---|---|---|
| -200 | 18.52 | ±0.24 | ±0.56 |
| 0 | 100.00 | ±0.06 | ±0.12 |
| 50 | 119.40 | ±0.10 | ±0.21 |
| 100 | 138.51 | ±0.13 | ±0.30 |
| 200 | 175.86 | ±0.20 | ±0.48 |
| 300 | 212.05 | ±0.27 | ±0.64 |
| 400 | 247.09 | ±0.33 | ±0.79 |
| 500 | 280.98 | ±0.38 | ±0.93 |
| 600 | 313.71 | ±0.43 | ±1.06 |
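The ohm-valued tolerances above follow from the class formulas in °C multiplied by the local slope dR/dt of the nominal curve; a minimal sketch of that conversion (t ≥ 0 °C branch only):

```python
# Sketch: converting an IEC 60751 class tolerance from degC into ohms using
# the local slope dR/dt of the Callendar-Van Dusen curve (t >= 0 degC shown).

A = 3.9083e-3
B = -5.775e-7
R0 = 100.0

def slope_ohm_per_c(t_c: float) -> float:
    """dR/dt of a Pt100 above 0 degC."""
    return R0 * (A + 2.0 * B * t_c)

def class_tol_ohms(t_c: float, base: float, slope: float) -> float:
    """Tolerance in ohms from a class formula +/-(base + slope*|t|) degC."""
    return (base + slope * abs(t_c)) * slope_ohm_per_c(t_c)

print(f"{class_tol_ohms(100.0, 0.15, 0.002):.2f}")  # Class A at 100 degC ~ 0.13 ohm
```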
Selected nominal values for Ni100 sensors per DIN 43760, computed from the nominal TCR as a linear approximation, are:

| Temperature (°C) | Nominal Resistance (Ω) |
|---|---|
| -50 | 69.10 |
| 0 | 100.00 |
| 50 | 130.90 |
| 100 | 161.80 |
| 150 | 192.70 |
| 200 | 223.60 |
| 250 | 254.50 |
Temperature-Resistance Characteristics
The temperature-resistance relationship in resistance thermometers is inherently non-linear, particularly below 0°C, and is modeled with polynomial approximations for accurate conversion between resistance and temperature. For platinum RTDs such as the Pt100, which has a nominal resistance of 100 Ω at 0°C, IEC 60751 defines the Callendar-Van Dusen equation over the operational range -200°C to 850°C. For temperatures at or above 0°C, the resistance R(t) at temperature t (in °C) is

R(t) = 100 \left[ 1 + 3.9083 \times 10^{-3} t - 5.775 \times 10^{-7} t^2 \right]

For temperatures below 0°C, an additional term is included:

R(t) = 100 \left[ 1 + 3.9083 \times 10^{-3} t - 5.775 \times 10^{-7} t^2 + \left( -4.183 \times 10^{-12} \right) (t - 100) t^3 \right]

This approximation captures the slight curvature of the resistance-temperature curve, with non-linearity becoming more pronounced at sub-zero temperatures and deviations from linearity reaching about 0.5% below -100°C.[6]

Platinum RTDs exhibit a temperature coefficient of resistance (TCR) of approximately 0.00385 °C⁻¹ and high stability, with typical long-term drift below 0.05°C per year under standard conditions, making them suitable for precision applications over wide ranges. Nickel RTDs, standardized under DIN 43760 with a nominal 100 Ω at 0°C and a TCR of 0.00618 °C⁻¹, offer higher sensitivity (about 60% greater resistance change per degree than platinum) but are limited to a narrower range of -80°C to 260°C by thermal instability, with drift often exceeding 0.1°C per year. Copper RTDs, with a TCR of 0.00427 °C⁻¹ and nominal resistances such as 10 Ω at 0°C, show the most linear behavior across -100°C to 200°C, but rapid oxidation above 150°C curtails their practical range, causing sensitivity loss and drift of up to 0.2°C per year.[46][45][47]

The following table gives representative resistance values for these materials at selected temperatures, highlighting the progressive increase and relative linearity (calculated per IEC 60751 for platinum and as linear TCR approximations for nickel and copper; actual curves show minor deviations, especially below 0°C). A sketch of the inverse conversion follows the table.

| Temperature (°C) | Pt100 (Ω) | Ni100 (Ω) | Cu10 (Ω) |
|---|---|---|---|
| -50 | 80.31 | 69.10 | 7.86 |
| 0 | 100.00 | 100.00 | 10.00 |
| 50 | 119.40 | 130.90 | 12.14 |
| 100 | 138.51 | 161.78 | 14.27 |
| 150 | 157.33 | 192.67 | 16.40 |
| 200 | 175.86 | 223.55 | 18.54 |
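For readout, the quadratic form above can be inverted directly for t ≥ 0 °C; a minimal sketch, assuming an IEC-grade Pt100:

```python
# Sketch: converting a measured Pt100 resistance back to temperature for
# t >= 0 degC by inverting the quadratic Callendar-Van Dusen form above.
import math

A = 3.9083e-3
B = -5.775e-7
R0 = 100.0

def pt100_temperature(r_ohm: float) -> float:
    """Temperature in degC from resistance (valid for r_ohm >= R0)."""
    # Solve R = R0 (1 + A t + B t^2) for t with the quadratic formula.
    disc = A * A - 4.0 * B * (1.0 - r_ohm / R0)
    return (-A + math.sqrt(disc)) / (2.0 * B)

print(f"{pt100_temperature(138.51):.2f} degC")  # ~100.00 degC
```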