An infrared detector is an electro-optical device that converts incident infrared radiation, typically in the wavelength range of 0.75 to 1000 micrometers, into an electrical signal proportional to the radiation's intensity, enabling the detection of heat or invisible light emissions from objects above absolute zero.[1] These detectors operate on two primary principles: thermal detection, where absorbed infrared energy causes a measurable temperature rise in a material (such as a change in resistance or voltage), and photon (quantum) detection, where individual photons excite electron-hole pairs in a semiconductor, generating a photocurrent or photovoltage.[2] Thermal detectors, including bolometers, pyroelectric sensors, and thermopiles, function at or near room temperature with broadband spectral response but slower response times exceeding milliseconds, while photon detectors, such as those based on mercury cadmium telluride (HgCdTe), indium antimonide (InSb), or indium gallium arsenide (InGaAs), often require cryogenic cooling for optimal performance and offer response times below a nanosecond with wavelength-specific sensitivity.[3] Key performance metrics for infrared detectors include responsivity (output signal per unit input power), noise-equivalent power (NEP, the minimum detectable power), and normalized detectivity (D*, a figure of merit accounting for detector area and bandwidth), which together determine their sensitivity and suitability for low-light or high-speed applications.[2] Historically, early infrared detectors emerged in the 1930s using materials like lead sulfide for basic photodetection, with significant advancements in the 1950s–1970s driven by semiconductor technologies like InSb and HgCdTe, primarily for military night vision systems.[2] Infrared detectors are essential across diverse fields due to their ability to sense thermal signatures invisible to the human eye, with applications spanning military surveillance (e.g., missile tracking
and thermal imaging), medical diagnostics (e.g., thermography for fever detection), environmental monitoring (e.g., gas analyzers for CO₂ at 4.3 μm), astronomy (e.g., detecting distant celestial heat sources), and industrial uses (e.g., non-contact thermometers and flame monitors).[1] Uncooled microbolometer arrays have revolutionized consumer and automotive sectors, enabling forward-looking infrared (FLIR) systems in vehicles for pedestrian detection, while cooled photon detectors dominate high-precision scientific instruments like Fourier-transform infrared (FTIR) spectrometers for molecular analysis.[3] Ongoing challenges include reducing cooling requirements for photon detectors to improve portability and cost, mitigating noise from background fluctuations, and enhancing quantum efficiency (the fraction of photons converted to signal) in emerging materials like quantum dots or type-II superlattices; recent advancements as of 2025 include room-temperature mid-infrared photodetectors using novel nanomaterials such as graphene and quantum dots, enabling broader adoption in portable devices.[2][4] These advancements continue to expand infrared technology's role in telecommunications (e.g., fiber-optic signal monitoring at 1.3–1.55 μm) and remote sensing, underscoring their interdisciplinary impact.[1]
History
Early discoveries
In 1800, British astronomer William Herschel conducted an experiment dispersing sunlight through a prism and measuring the temperature across the visible spectrum using a thermometer, discovering that the highest temperatures occurred beyond the red end, in an invisible region he termed "calorific rays," later identified as infrared radiation.[5][6] This finding established infrared as a form of heat radiation extending the electromagnetic spectrum.[7] In 1821, Thomas Johann Seebeck observed the thermoelectric effect, where a temperature difference between junctions of dissimilar metals generates a voltage, laying the groundwork for devices to detect infrared radiation.[8] Building on this, in the 1820s, scientists developed thermopiles—series of thermocouples—that converted infrared-induced heat into measurable electrical signals, enabling quantitative detection of thermal radiation.[9] During the 1830s, Italian physicist Macedonio Melloni advanced infrared detection by refining the thermopile into the more sensitive thermomultiplier, which amplified signals from weak heat sources and allowed detection of infrared rays through various media, including over distances and obstacles.[10][11] Melloni's instrument demonstrated that infrared radiation propagates like light, marking a key step toward practical thermal sensing.[12] These 19th-century innovations provided the foundational tools for infrared detection that influenced 20th-century technological developments.
20th-century developments
The development of infrared detectors accelerated during World War II, driven by military demands for night vision and targeting systems. In Germany, lead sulfide (PbS) detectors, initially explored in the 1930s by E.W. Kutzscher, were mass-produced starting in 1943 for applications such as the Kiel IV night vision system, offering sensitivity up to approximately 3 μm in the near-infrared range.[13] In the United States, Robert J. Cashman advanced lead telluride (PbTe) detectors after 1944, improving performance for similar military uses.[13] These lead salt photon detectors marked the shift toward practical semiconductor-based infrared sensing, though they required cooling for optimal operation.[14] Post-war advancements in the late 1940s and 1950s built on these foundations, with the refinement of thermal detectors like bolometers—originally invented by Samuel P. Langley in the 19th century but enhanced for greater sensitivity—and the integration of photomultiplier tubes for image intensification.[13] By 1945, image converter tubes, such as the RCA 1P25 developed during the war, were adapted for broader infrared viewing applications.[13] The 1950s saw the emergence of cooled photon detectors, including early mercury cadmium telluride (HgCdTe) alloys discovered by W.D. Lawson in 1959, which allowed tunable bandgaps for mid-wave infrared (MWIR, 3–5 μm) detection, and indium antimonide (InSb) devices operational by the early 1960s in systems like the first TNO thermal imager.[13][15] These innovations were propelled by military funding during the Cold War and the space race, which supported developments like PbS and PbTe seeker heads by 1955 and later the transfer of technology to astronomical applications.[13] Infrared detector systems evolved through distinct generations in the latter half of the 20th century, reflecting advances in array technology and integration.
First-generation systems, from the 1950s to 1960s, relied on scanning mechanisms with single-element or linear arrays of photoconductive detectors, such as PbS, InSb, or early HgCdTe, often cooled to 80 K, as seen in devices like the AGA Thermografiesystem 660 in 1965.[13][16] Second-generation systems emerged in the 1970s, introducing staring focal plane arrays (FPAs) with two-dimensional photovoltaic arrays of HgCdTe, InSb, or platinum silicide (PtSi), enabled by charge-coupled devices (CCDs) invented in the late 1960s and hybrid integration with silicon readout circuits; the U.S. Common Module standard in 1975 exemplified this shift toward higher resolution and reduced scanning.[13][14] By the 1990s, third-generation systems incorporated multicolor and dual-band capabilities, with quantum well infrared photodetectors (QWIPs) invented in the mid-1980s and first demonstrated in 1987, allowing GaAs-based detection across multiple infrared bands without requiring tunable narrow-gap semiconductors.[13][17] Military and space programs, including high-volume production in the 1970s, were instrumental in these generational leaps, prioritizing performance for forward-looking infrared (FLIR) and reconnaissance applications.[13] Parallel to these photon detector advancements, the 1990s saw breakthroughs in uncooled thermal detectors, particularly microbolometer FPAs, which operated at room temperature without cryogenic cooling. In 1994, Honeywell patented a vanadium oxide (VOx)-based microbolometer under the U.S. government's High-Density Array Development (HIDAD) program, enabling sensitive long-wave infrared (LWIR, 8-12 μm) imaging in compact, low-cost formats. This technology was licensed to companies like Raytheon, leading to the production of high-resolution arrays, such as 320×240 pixel devices by the early 2000s, and facilitating the transition of infrared imaging from specialized military use to broader commercial, automotive, and consumer applications.[18]
Fundamentals
Definition and overview
Infrared detectors are specialized devices designed to detect infrared radiation, which spans wavelengths from approximately 0.7 to 1000 micrometers (μm), and convert this energy into measurable electrical signals or visual images.[19] These detectors play a crucial role in capturing electromagnetic waves beyond the visible spectrum, enabling applications that rely on thermal signatures rather than reflected light. The discovery of infrared radiation itself dates back to 1800, when astronomer William Herschel identified it through experiments measuring temperature variations in the solar spectrum.[20] The infrared spectrum is commonly divided into several bands based on wavelength ranges, each associated with distinct detection challenges and applications: near-infrared (NIR, 0.7–1.4 μm), short-wave infrared (SWIR, 1.4–3 μm), mid-wave infrared (MWIR, 3–8 μm), long-wave infrared (LWIR, 8–15 μm), and far-infrared (FIR, 15–1000 μm).[21][22] These divisions reflect the varying atmospheric transmission windows and energy levels of the radiation, influencing detector design and sensitivity. Unlike visible light detectors, which rely on reflected photons for imaging, infrared detectors primarily sense thermal radiation emitted by objects at temperatures above absolute zero, allowing detection in low-light or no-light conditions such as night vision and remote sensing.[23] This capability stems from the fact that all matter with a temperature above 0 K emits infrared radiation according to blackbody principles.
Key advantages include non-contact temperature measurement, which enables safe assessment of high-heat or hazardous environments, and the invisibility of infrared to the human eye, facilitating covert operations.[24][25] As of 2025, infrared detectors appear in everything from consumer electronics like remote controls and smartphones to industrial and military systems, underscoring their pervasive integration across sectors.[26] The supporting market for these technologies is valued at approximately $0.65 billion in 2025, reflecting robust demand driven by advancements in imaging and sensing.[26]
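The band divisions above can be sketched as a small classifier; the band edges follow the ranges cited in this section, and the helper name is purely illustrative:

```python
def ir_band(wavelength_um: float) -> str:
    """Classify a wavelength (in micrometers) into the IR bands defined above."""
    bands = [
        (0.7, 1.4, "NIR"),     # near-infrared
        (1.4, 3.0, "SWIR"),    # short-wave infrared
        (3.0, 8.0, "MWIR"),    # mid-wave infrared
        (8.0, 15.0, "LWIR"),   # long-wave infrared
        (15.0, 1000.0, "FIR"), # far-infrared
    ]
    for lo, hi, name in bands:
        if lo <= wavelength_um < hi:
            return name
    raise ValueError("outside the 0.7-1000 um infrared range")

print(ir_band(4.3))   # CO2 absorption band -> MWIR
print(ir_band(10.0))  # thermal-imaging band -> LWIR
```

Note that other sources draw the MWIR/LWIR boundaries slightly differently (e.g., 3–5 μm and 8–12 μm atmospheric windows), so band edges should be treated as conventions rather than physical constants.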
Operating principles
Infrared detectors operate by converting incoming infrared radiation into a measurable electrical signal, primarily through two distinct physical mechanisms: thermal and quantum processes. In thermal detection, infrared photons are absorbed by the detector material, leading to a rise in temperature that induces physical changes, such as variations in electrical resistance or mechanical properties, without direct conversion of photons to electrons. This heating effect relies on the detector's sensitivity to temperature-induced modifications in its structure or electrical characteristics. Quantum detection, in contrast, involves the direct interaction of infrared photons with the detector's electronic structure, where photons with sufficient energy excite electrons from the valence band to the conduction band across the material's bandgap, generating a photocurrent or photovoltage. The energy of an infrared photon is given by E = \frac{hc}{\lambda}, where h is Planck's constant, c is the speed of light, and \lambda is the wavelength; for infrared wavelengths typically ranging from 0.78 to 1000 μm, this energy falls between approximately 0.001 and 1.8 eV, necessitating materials with narrow bandgaps to enable such transitions. The detection of infrared radiation is fundamentally tied to blackbody emission principles, as most natural and artificial sources approximate blackbodies whose spectral radiance follows Planck's law: B(\lambda, T) = \frac{2hc^2}{\lambda^5} \frac{1}{e^{hc / \lambda k T} - 1}, where k is Boltzmann's constant and T is the temperature in Kelvin; this equation describes how the intensity of infrared emission increases with temperature, particularly in the longer wavelengths relevant to thermal imaging.
Quantum detectors often require cryogenic cooling, such as to temperatures below 77 K using liquid nitrogen, to suppress thermal noise generated by random electron excitations that could mimic photon-induced signals, thereby enhancing signal-to-noise ratios.
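The two relations above can be checked numerically; a minimal sketch using CODATA constant values (function names are illustrative):

```python
import math

H = 6.62607015e-34    # Planck's constant, J*s
C = 2.99792458e8      # speed of light, m/s
K = 1.380649e-23      # Boltzmann's constant, J/K
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_um: float) -> float:
    """E = hc / lambda, returned in electronvolts."""
    lam_m = wavelength_um * 1e-6
    return H * C / lam_m / EV

def planck_radiance(wavelength_um: float, temp_k: float) -> float:
    """Spectral radiance B(lambda, T) from Planck's law, in W * sr^-1 * m^-3."""
    lam_m = wavelength_um * 1e-6
    return (2 * H * C**2 / lam_m**5) / math.expm1(H * C / (lam_m * K * temp_k))

# A 10 um (LWIR) photon carries only ~0.124 eV, which is why LWIR quantum
# detectors need narrow-bandgap semiconductors.
print(round(photon_energy_ev(10.0), 3))  # 0.124
```

The `planck_radiance` helper also illustrates the text's point that emission rises with temperature: evaluating it at 10 μm for 310 K versus 300 K shows the increase a thermal imager exploits when separating a warm body from its background.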
Detector types
Thermal detectors
Thermal detectors convert incident infrared radiation into heat, which induces a measurable change in the detector's physical properties, such as resistance, polarization, or mechanical displacement. This indirect detection mechanism allows them to operate without cryogenic cooling, making them suitable for a wide range of applications.[27][28] Bolometers represent a primary type of thermal detector, where absorbed infrared radiation raises the temperature of a sensing element, causing a corresponding change in its electrical resistance. The resistance variation is typically measured using a bias current or voltage, with the temperature coefficient of resistance (TCR) determining sensitivity. Microbolometers, a compact evolution of this technology, employ materials like vanadium oxide (VOx) or amorphous silicon (a-Si) and operate uncooled at room temperature. These devices are integral to forward-looking infrared (FLIR) cameras, enabling thermal imaging in consumer and industrial settings with array formats such as 320 × 240 pixels.[27][29][28] Pyroelectric detectors exploit the temperature-dependent spontaneous polarization in ferroelectric materials to generate a voltage or current signal. When infrared radiation heats the material, a change in polarization produces a measurable charge proportional to the rate of temperature variation. Lithium tantalate (LiTaO3) is a widely used material due to its high pyroelectric coefficient (~230 μC/m²K) and low dielectric loss, enabling uncooled operation with responsivities exceeding 7 kV/W. These detectors are often configured in arrays for imaging applications, offering fast response times on the order of milliseconds.[30][27] Thermopiles consist of multiple thermocouples connected in series, where the absorbed infrared radiation creates a temperature gradient across the junctions, producing a voltage output via the Seebeck effect.
Commonly constructed using materials like bismuth and antimony or thin-film polysilicon, thermopiles operate uncooled and provide broadband response from near-IR to far-IR (approximately 2–20 μm). They exhibit typical voltage responsivities of 50–500 V/W and are extensively used in non-contact thermometers, such as ear and forehead devices, as well as in nondispersive infrared (NDIR) gas sensors for detecting species like CO₂.[31] Golay cells provide another historical approach to thermal detection, utilizing the pneumatic expansion of a gas within a sealed chamber heated by absorbed infrared radiation. This expansion deflects a flexible membrane, which is optically monitored to produce an electrical signal via a photodiode. While effective with responsivities around 1.5 × 10^5 V/W, Golay cells are now less common due to their bulkier design compared to solid-state alternatives, though they remain relevant for specialized broadband measurements.[27] A key advantage of thermal detectors is their ability to function at room temperature without cooling, facilitating compact and cost-effective systems. They exhibit broadband spectral response covering approximately 0.1–100 μm, largely insensitive to photon wavelength, which contrasts with narrower-band quantum detectors. Common configurations include uncooled focal plane arrays, powering consumer devices like handheld thermal cameras for security and maintenance tasks.[27][28][29]
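The thermopile output described above follows directly from the Seebeck effect: N junction pairs in series each contribute (Seebeck coefficient difference) × (temperature difference). A minimal sketch, with illustrative values assumed rather than taken from a datasheet:

```python
def thermopile_voltage(n_junctions: int, seebeck_diff_uv_per_k: float,
                       delta_t_k: float) -> float:
    """Series thermopile output in microvolts: V = N * dS * dT (Seebeck effect)."""
    return n_junctions * seebeck_diff_uv_per_k * delta_t_k

# Assumed example: 100 junction pairs, ~100 uV/K Seebeck difference for a
# bismuth-antimony couple, and a 0.1 K radiation-induced gradient.
v_uv = thermopile_voltage(100, 100.0, 0.1)
print(v_uv)  # on the order of 1000 uV, i.e. ~1 mV
```

Even a sub-kelvin absorber temperature rise thus yields a millivolt-scale signal, which is why thermopiles work uncooled with simple readout electronics.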
Quantum detectors
Quantum detectors, also referred to as photon detectors, function by absorbing infrared photons that excite electrons across the bandgap or between quantized energy levels, generating a measurable electrical signal through direct photon-electron interactions. This contrasts with thermal detectors by offering higher sensitivity and faster response times, particularly in narrow spectral bands, though often requiring cryogenic cooling to suppress dark current and thermal generation.[32] Photoconductive detectors operate on the principle of increased electrical conductivity when infrared photons generate free charge carriers that enhance current flow under an applied bias voltage. Lead sulfide (PbS) detectors are widely used for near-infrared detection in the 1–3 μm range, while lead selenide (PbSe) extends sensitivity to 1.5–4.7 μm, both benefiting from polycrystalline thin-film structures that allow room-temperature operation in some configurations.[33][34] Photovoltaic detectors generate a photocurrent or photovoltage without external bias, leveraging p-n junction diodes where photon absorption creates electron-hole pairs separated by the built-in electric field. Indium antimonide (InSb) photovoltaic detectors are a benchmark for mid-wave infrared applications, offering high quantum efficiency and detectivity up to 5.5 μm, typically requiring cooling to 77 K for optimal performance. Similarly, mercury cadmium telluride (HgCdTe) photovoltaic detectors are widely used for both mid-wave (3–5 μm) and long-wave (8–12 μm) infrared detection, with compositionally tunable cutoffs extending to over 20 μm, achieving background-limited detectivities exceeding 10^{10} cm Hz^{1/2}/W when cooled to 77 K.[35][36][37] Quantum well infrared photodetectors (QWIPs) exploit inter-subband transitions in artificially confined quantum wells, where electrons are excited from bound ground states to higher energy levels within the conduction band of semiconductor heterostructures.
Fabricated using mature gallium arsenide (GaAs)/aluminum gallium arsenide (AlGaAs) materials, QWIPs enable tunable detection in the mid- to long-wave infrared (4–20 μm) with high uniformity in focal plane arrays, though they require grating structures for light coupling due to polarization selection rules.[38][39] Quantum detectors are frequently designed for operation in atmospheric transmission windows to maximize signal propagation, such as the mid-wave infrared band (3–5 μm) and long-wave infrared band (8–12 μm), where water vapor and carbon dioxide absorption is minimal. While cooled modes, exemplified by mercury cadmium telluride (HgCdTe) devices at 77 K, achieve background-limited performance with detectivities exceeding 10^{10} cm Hz^{1/2}/W, uncooled variants like certain PbS/PbSe photoconductors offer portability at the cost of lower sensitivity.[40][37]
Materials
Materials for thermal detectors
Thermal detectors, such as bolometers and pyroelectrics, convert incident infrared radiation into measurable temperature-induced changes in material properties like electrical resistance or polarization. The choice of materials is critical for achieving high sensitivity, low noise, and compatibility with uncooled operation, with selection based on figures of merit including the temperature coefficient of resistance (TCR) for resistive sensors or the pyroelectric coefficient for polarization-based devices.[30] Vanadium oxide (VOx), particularly in its semiconducting forms like VO2 or mixed stoichiometries, is a preferred thermistor material for microbolometers due to its high TCR of approximately 2–3%/K at room temperature, which enables efficient conversion of thermal energy to electrical signals. This property supports sensitive imaging in the long-wave infrared (LWIR) band of 8–14 μm, where atmospheric transparency is optimal for applications like thermal imaging. VOx films are typically deposited via sputtering or reactive evaporation to achieve uniform resistivity (around 0.1–1 Ω·cm) and stability, though controlling oxygen content during fabrication is essential to maintain optimal TCR without phase transitions that could degrade performance.[41][42][43] Amorphous silicon (a-Si) serves as a cost-effective alternative to VOx in uncooled microbolometer focal plane arrays, offering a TCR of about 2%/K alongside compatibility with standard CMOS processes for large-scale integration. Its lower processing temperature (below 400°C) reduces thermal stress on supporting structures, making it suitable for high-volume production of arrays with pixel pitches as small as 17 μm.
While a-Si's TCR is slightly inferior to VOx, its electrical noise is often lower, and hydrogen passivation during plasma-enhanced chemical vapor deposition (PECVD) enhances stability and uniformity across arrays exceeding 1 megapixel.[44][45] For pyroelectric detectors, ferroelectric materials such as barium titanate (BaTiO3) and lead zirconate titanate (PZT) are employed, leveraging their high pyroelectric coefficients—up to 40 nC/cm²·K for optimized PZT compositions—to generate charge from temperature fluctuations induced by modulated IR radiation. BaTiO3 provides lead-free options with coefficients around 10–50 nC/cm²·K depending on poling and doping, while PZT variants near the morphotropic phase boundary achieve superior values through sol-gel or sputtering deposition, enabling detectivities (D*) exceeding 10^9 cm·Hz^{1/2}/W in thin-film forms. These materials require precise poling to align domains and minimize dielectric losses (tan δ < 0.02), ensuring responsive operation at frequencies up to 100 Hz without cryogenic cooling.[30][46][47] Superconducting transition edge sensors (TES) utilize materials like niobium (Nb) or aluminum (Al), often in bilayer configurations (e.g., Al-Mn or Nb-Al), to provide ultra-sensitive bolometry through a sharp resistance transition at cryogenic temperatures (around 100–500 mK). Nb offers high critical temperature (Tc ~9 K) for wiring and ground planes, while Al enables low-Tc sensing elements with steep dR/dT slopes exceeding 10^8 Ω/K, achieving noise-equivalent powers (NEP) below 10^{-18} W/√Hz for far-infrared detection.
These sensors demand dilution refrigeration but excel in photon-noise-limited applications like astronomy.[48][49] Emerging two-dimensional materials, such as graphene, are being explored for thermal detectors due to their exceptional thermal sensitivity and low noise, enabling bolometers with NEP as low as 10^{-19} W/√Hz at room temperature as of 2025.[50] Fabrication of thermal detector arrays presents significant challenges in MEMS integration, particularly for scaling to micro-scale pixels while maintaining thermal isolation and material integrity. Depositing VOx or a-Si thermistors on suspended membranes requires precise control of film stress (to avoid buckling) and etch selectivity during surface micromachining with sacrificial layers like amorphous silicon or polyimide, often leading to yield issues from non-uniformity or contamination. For pyroelectrics and TES, integrating ferroelectric or superconducting layers involves high-vacuum lithography and low-temperature annealing to prevent degradation, with vacuum packaging essential to minimize convective losses—yet achieving hermetic seals at wafer-scale remains a key bottleneck for commercial viability.[42][51][52]
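The TCR figures quoted above translate directly into the resistance change a readout circuit must resolve. A minimal sketch using the standard exponential thermistor model (the nominal resistance and temperature rise below are assumed values, not from a specific device):

```python
import math

def bolometer_resistance(r0_ohm: float, tcr_per_k: float, delta_t_k: float) -> float:
    """Resistance after a temperature rise delta_t_k.

    TCR is defined as (1/R) dR/dT, so integrating gives
    R(T0 + dT) = R0 * exp(TCR * dT); for small dT this is ~ R0 * (1 + TCR * dT).
    """
    return r0_ohm * math.exp(tcr_per_k * delta_t_k)

# Assumed VOx pixel: 100 kOhm nominal, TCR = -2 %/K (resistance drops as the
# element warms), 20 mK of scene-induced heating.
r = bolometer_resistance(100e3, -0.02, 0.02)
print(round(100e3 - r, 1))  # resistance drop of roughly 40 ohm
```

A 20 mK rise thus changes the resistance by only ~0.04%, which is why low-noise readout integrated circuits and good thermal isolation of the suspended membrane are both essential.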
Materials for quantum detectors
Quantum infrared detectors rely on semiconductor materials with precisely engineered bandgaps to enable photon absorption and carrier generation at specific infrared wavelengths. Among these, mercury cadmium telluride (HgCdTe, often abbreviated as MCT) stands out as a cornerstone material due to its direct bandgap that can be continuously tuned from approximately 0.1 eV to 1.5 eV by adjusting the cadmium composition x in the formula Hg_{1-x}Cd_xTe.[53] This compositional flexibility allows HgCdTe to be optimized for key atmospheric windows, making it the dominant material for mid-wave infrared (MWIR, 3–5 μm) and long-wave infrared (LWIR, 8–12 μm) detection in photovoltaic configurations.[54] For instance, compositions with x \approx 0.3 yield bandgaps suitable for MWIR applications, while lower x values around 0.2 target LWIR performance.[55] Another established material is indium antimonide (InSb), a narrow-bandgap semiconductor with a fixed bandgap of approximately 0.17 eV at room temperature, which corresponds to sensitivity in the 3–5 μm MWIR range.[56] InSb detectors benefit from high electron mobility and achieve quantum efficiencies exceeding 80%, often approaching 100% internally with appropriate antireflection coatings, enabling efficient photon-to-electron conversion.[57] Its use is particularly prevalent in cooled focal plane arrays for high-performance imaging. To address limitations in traditional bulk materials, such as high dark currents in LWIR detection, type-II superlattices like InAs/GaSb have emerged as advanced alternatives.
These heterostructures feature spatially separated electron and hole wavefunctions, resulting in an effective bandgap tunable for LWIR (8–12 μm and beyond) while suppressing Auger recombination rates compared to equivalent bulk HgCdTe.[58] This reduction in non-radiative recombination enhances operating temperature and uniformity in large-format arrays.[59] Lead salt compounds, including lead sulfide (PbS) and lead selenide (PbSe), offer additional options for shorter-wavelength quantum detection. PbS, with sensitivity in the near-infrared (up to ~3 μm), and PbSe, extending into the mid-infrared (up to ~7 μm), both exhibit high absorption coefficients on the order of 10^4–10^5 cm^{-1}, allowing thin films to achieve strong photon absorption.[60] These polycrystalline materials are valued for their compatibility with photoconductive modes and cost-effective deposition. In recent developments, colloidal quantum dots (QDs), particularly PbS QDs, represent an emerging class of materials for quantum infrared detectors. These solution-processable nanocrystals enable bandgap tuning via quantum confinement, supporting room-temperature operation and integration into flexible substrates for wearable or conformable sensing applications.[61] Their versatility has led to broadband detectors spanning near- to mid-infrared with improved stability through ligand engineering.[62] Additionally, two-dimensional materials such as black phosphorus and transition metal dichalcogenides (e.g., MoS2) are gaining attention for quantum IR detection due to tunable bandgaps and high carrier mobilities, enabling SWIR to MWIR sensitivity in thin, flexible devices as of 2025.[63]
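The compositional tuning of HgCdTe described above is commonly modeled with the empirical Hansen–Schmit–Casselman relation; the sketch below uses that relation as an assumed model (coefficients in eV, temperature in kelvin) together with the cutoff rule λ_c = hc/E_g:

```python
def hgcdte_bandgap_ev(x: float, temp_k: float = 77.0) -> float:
    """Empirical Hg_{1-x}Cd_xTe bandgap (Hansen-Schmit-Casselman form), in eV."""
    return (-0.302 + 1.93 * x - 0.810 * x**2 + 0.832 * x**3
            + 5.35e-4 * temp_k * (1 - 2 * x))

def cutoff_wavelength_um(bandgap_ev: float) -> float:
    """Cutoff wavelength lambda_c = hc / Eg, using hc ~ 1.24 eV*um."""
    return 1.24 / bandgap_ev

for x in (0.22, 0.30):
    eg = hgcdte_bandgap_ev(x, 77.0)
    print(f"x = {x}: Eg ~ {eg:.3f} eV, cutoff ~ {cutoff_wavelength_um(eg):.1f} um")
```

At 77 K this reproduces the text's design rule: x near 0.3 gives a cutoff around 5 μm (MWIR), while x near 0.22 pushes the cutoff past 10 μm (LWIR).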
Performance characteristics
Key metrics
The performance of infrared detectors is evaluated through several key metrics that quantify their sensitivity, efficiency, and speed. Detectivity, denoted as D^*, serves as a primary figure of merit for sensitivity, independent of detector area and bandwidth. It is defined as D^* = \frac{\sqrt{A \Delta f}}{NEP}, where A is the detector area in cm², \Delta f is the bandwidth in Hz, and NEP is the noise equivalent power in W; the units are cm √Hz / W (or Jones).[64][65] For cooled mercury cadmium telluride (MCT) detectors, typical D^* values range from 10^{10} to 10^{12} cm √Hz / W, reflecting their high sensitivity in background-limited conditions.[66] Uncooled thermal detectors, such as microbolometers, typically achieve D^* values of 10^8 to 10^9 cm √Hz / W.[3] Noise equivalent power (NEP) represents the minimum detectable incident power that produces a signal-to-noise ratio of 1, typically normalized to 1 Hz bandwidth, with units W/√Hz. It is calculated as the RMS noise divided by the responsivity.
Lower NEP values indicate better performance, with quantum detectors often achieving NEP on the order of 10^{-14} W/√Hz under optimal cooling.[64][65][66] Responsivity (R) measures the output signal per unit input optical power, defined as R = \frac{I_{ph}}{P_{in}}, where I_{ph} is the photocurrent in A and P_{in} is the input power in W; units are A/W.[64][66] In photovoltaic infrared detectors, such as those based on MCT, responsivity can reach 3–5 A/W, enabling strong signal generation from low-power infrared sources.[32] Quantum efficiency (\eta) quantifies the fraction of incident photons that generate charge carriers, expressed as \eta = \frac{h \nu I_{ph}}{e P_{in}}, where h is Planck's constant, \nu is the photon frequency, e is the electron charge, I_{ph} is photocurrent, and P_{in} is input power; \eta is dimensionless and typically ranges from 0 to 1.[65][66] High \eta values, often exceeding 70% in MCT quantum detectors, directly enhance overall detectivity by maximizing photon-to-electron conversion.[66] Response time (\tau) is the time constant for the detector signal to settle to 63% of its final value after a step input, influencing the maximum operating frequency. Quantum detectors exhibit fast response times of less than 1 ns, limited primarily by carrier transit and recombination, while thermal detectors have slower times on the order of milliseconds due to thermal diffusion processes.[3][66] This distinction makes quantum types suitable for high-speed applications requiring rapid signal recovery.
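The metrics above chain together: measured noise and responsivity give NEP, and NEP with the pixel geometry gives D*. A minimal sketch with illustrative (assumed) numbers for a cooled photovoltaic pixel:

```python
import math

def nep_w_per_rthz(noise_a_per_rthz: float, responsivity_a_per_w: float) -> float:
    """NEP = RMS noise current density / responsivity, in W/sqrt(Hz)."""
    return noise_a_per_rthz / responsivity_a_per_w

def detectivity_jones(area_cm2: float, bandwidth_hz: float, nep_w: float) -> float:
    """D* = sqrt(A * delta_f) / NEP, in cm*sqrt(Hz)/W (Jones)."""
    return math.sqrt(area_cm2 * bandwidth_hz) / nep_w

# Assumed values: 1e-12 A/sqrt(Hz) noise, 4 A/W responsivity (within the
# 3-5 A/W range cited above), a (50 um)^2 pixel = 2.5e-5 cm^2, 1 Hz bandwidth.
nep = nep_w_per_rthz(1e-12, 4.0)
dstar = detectivity_jones(2.5e-5, 1.0, nep)
print(f"NEP = {nep:.1e} W/sqrt(Hz), D* = {dstar:.1e} Jones")
```

With these assumptions the result lands at D* = 2 × 10^10 Jones, inside the 10^10–10^12 range the text quotes for cooled MCT detectors, showing how the area normalization lets pixels of different sizes be compared fairly.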
Noise and limitations
Infrared detectors are subject to various noise sources that degrade signal-to-noise ratio and limit sensitivity, particularly in low-signal environments. These noises arise from thermal fluctuations, statistical variations in charge carriers, and material imperfections, influencing the overall detectivity and dynamic range of the devices. Understanding these mechanisms is crucial for optimizing detector performance across thermal and quantum types. Johnson-Nyquist noise, also known as thermal noise, originates from random thermal motion of charge carriers and is prominent in uncooled infrared detectors where cooling is not feasible. This noise is characterized by the root-mean-square current fluctuation given by \sigma_J = \sqrt{4kT \Delta f / R}, where k is Boltzmann's constant, T is the temperature, \Delta f is the bandwidth, and R is the detector resistance. In unbiased high-operating-temperature (HOT) mid-infrared detectors, such as those based on photovoltaic structures, Johnson-Nyquist noise often dominates due to its dependence on temperature and resistance, setting a fundamental limit on the noise equivalent power in ambient conditions. For uncooled bolometric detectors, this thermal noise correlates with the device's electrical properties and can be mitigated through material optimization, but it remains a key constraint in room-temperature operation. Shot noise stems from the Poisson statistics of discrete charge carriers, whether photo-generated or thermally excited, and is particularly relevant in quantum detectors under low-flux conditions. The noise current is expressed as \sigma_{shot} = \sqrt{2q I_{dc} \Delta f}, with q as the elementary charge and I_{dc} the direct current (such as dark or photocurrent). In photovoltaic infrared detectors, shot noise arises from the random arrival of photons or carriers, limiting the signal detection in scenarios where the photocurrent is comparable to dark current contributions.
This noise sets the fundamental quantum limit for ideal detectors, though practical devices often exhibit higher levels due to additional generation-recombination processes. 1/f noise, or flicker noise, manifests as low-frequency fluctuations and is prevalent in amorphous materials used for uncooled thermal detectors, such as amorphous silicon or vanadium oxide thin films. This noise spectrum follows a 1/f dependence, increasing inversely with frequency and thereby restricting applications requiring low-speed or DC signal processing, like imaging in static scenes. In bolometric films with high temperature coefficient of resistance, such as hydrogenated amorphous germanium, 1/f noise levels are elevated due to defect trapping and surface effects, often exceeding thermal noise at frequencies below 100 Hz and degrading the noise equivalent temperature difference. Comparative studies of prospective infrared imaging films highlight that amorphous structures inherently produce higher 1/f noise than crystalline alternatives, necessitating passivation techniques to suppress it for practical focal plane arrays. Beyond intrinsic noises, infrared detectors face performance limitations from environmental and material factors. In high-background-flux scenarios, such as outdoor imaging under sunlight, detectors reach background-limited infrared performance (BLIP), where photon noise from the scene dominates over internal noises, capping the specific detectivity regardless of cooling. For quantum detectors employing narrow-gap semiconductors like InAs/GaSb superlattices, dark current—arising from thermal generation across the bandgap—poses a severe constraint, as it generates excess carriers that mimic signal and amplify noise in long-wavelength infrared regimes.
This thermally activated dark current follows an exponential dependence on temperature and bandgap, often requiring cryogenic cooling below 80 K to achieve BLIP operation.

A key trade-off in infrared detector design involves operating temperature: raising it from cryogenic levels (e.g., 77 K for liquid-nitrogen cooling) to higher values (e.g., 200 K for HOT configurations) reduces cooling costs and system complexity but exacerbates noise. In mercury cadmium telluride (MCT) detectors, higher temperatures increase thermal generation rates, boosting dark current and Johnson noise by factors of 10 to 100 per 50 K rise, while enabling uncooled or thermoelectric cooling at the expense of reduced detectivity. This balance is evident in mid-wave MCT arrays, where 77 K operation yields near-BLIP performance but demands bulky cryocoolers, whereas 200 K designs prioritize portability for tactical applications despite elevated noise floors.
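The exponential temperature dependence described above can be sketched with a simplified Arrhenius model in which dark current scales as exp(-E_g / (n k T)), with n = 1 for diffusion-limited and n = 2 for generation-recombination-limited behavior. The bandgap value and temperature range below are illustrative assumptions, not figures from the text.

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann's constant in eV/K

def dark_current_ratio(bandgap_ev, t_low_k, t_high_k, ideality=2.0):
    """Ratio I_dark(T_high) / I_dark(T_low) for a thermally activated dark
    current ~ exp(-E_g / (n k T)); simplified Arrhenius model with n = 1
    (diffusion-limited) or n = 2 (generation-recombination-limited)."""
    activation_ev = bandgap_ev / ideality
    return math.exp((activation_ev / K_B_EV) * (1.0 / t_low_k - 1.0 / t_high_k))

# Assumed mid-wave bandgap of ~0.25 eV, diffusion-limited regime (n = 1)
ratio = dark_current_ratio(0.25, 150.0, 200.0, ideality=1.0)
print(f"Dark-current increase from 150 K to 200 K: ~{ratio:.0f}x")
```

For these assumed parameters the 50 K rise multiplies the dark current by roughly two orders of magnitude, in line with the 10-to-100-fold-per-50-K figure quoted for MCT detectors.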
Applications
Military and security
Infrared detectors play a pivotal role in military and security applications, enabling enhanced situational awareness, threat detection, and precision targeting in low-visibility conditions. These systems leverage both cooled quantum detectors, such as those based on mercury cadmium telluride (HgCdTe), for high-sensitivity mid-wave infrared (MWIR) and long-wave infrared (LWIR) imaging, and uncooled thermal detectors such as microbolometers for rugged, cost-effective deployment.[67][68]

Historically, the development of infrared detectors traces back to World War II, when lead sulfide (PbS) detectors were employed in early night vision devices known as snooperscopes or sniperscopes. These PbS-based systems, sensitive to wavelengths up to approximately 3 μm, were mounted on rifles such as the M1 carbine to provide active infrared illumination and detection, marking the first combat use of infrared technology in 1944-1945 for nighttime engagements.[69][70] Although limited by low resolution and the need for active infrared sources, these devices demonstrated the tactical advantage of infrared vision in darkness, influencing post-war advancements in detector materials.[71]

In modern night vision and targeting systems, third- and fourth-generation MWIR/LWIR focal plane arrays using HgCdTe detectors enable precise guidance in munitions such as the FGM-148 Javelin anti-tank missile.
The Javelin's command launch unit incorporates a 64 × 64 pixel HgCdTe focal plane array operating in the 8-12 μm LWIR band, allowing fire-and-forget targeting of armored vehicles at ranges exceeding 2.5 km by detecting thermal signatures without requiring the operator to maintain line of sight.[72][67] This quantum detector technology provides high detectivity and low noise, essential for discriminating hot engine exhaust from cooler vehicle bodies in cluttered environments.[73]

Missile warning systems utilize dual-band UV/IR detectors to detect incoming threats and reject countermeasures such as flares, improving the survivability of aircraft and ground vehicles. Two-color mid-wave infrared sensors, often combining 3-5 μm and 4-5 μm bands, enhance plume-to-background contrast by analyzing spectral signatures, significantly reducing false alarms from decoys compared to single-band systems.[74] These detectors integrate with automated cueing to trigger evasive maneuvers or flare deployment within milliseconds of launch detection.[75]

For border surveillance, uncooled microbolometer cameras offer reliable perimeter monitoring without cryogenic cooling, making them ideal for fixed and mobile deployments in harsh environments. These vanadium oxide (VOx)-based detectors, operating in the 8-14 μm LWIR range, enable detection of human-sized targets at distances up to 20 km, supporting real-time intruder tracking along international borders.[76] Systems such as those from Lynred incorporate 640 × 480 pixel arrays with NETD under 50 mK, facilitating 24/7 operation for airport, harbor, and frontier security.[77]

As of 2025, advancements in AI-integrated hyperspectral infrared detectors are transforming threat detection by enabling material-specific identification in real-time military operations.
These systems fuse hyperspectral imaging across multiple IR bands with convolutional neural networks to detect explosives or concealed threats from standoff distances, achieving identification accuracies above 95% even in obscured conditions.[78] Integrated into tactical unmanned aerial systems (UAS), such as those demonstrated by Lockheed Martin and Arquimea, they provide anomaly detection for improvised explosive devices (IEDs) and drone swarms, enhancing intelligence, surveillance, and reconnaissance (ISR) capabilities.[79][80]
Scientific and commercial
Infrared detectors are central to astronomical research, enabling the observation of distant celestial phenomena through near- and mid-infrared spectroscopy. The James Webb Space Telescope (JWST), launched in December 2021 and in full science operations since mid-2022, utilizes HAWAII-2RG mercury cadmium telluride (HgCdTe) detector arrays in its Near Infrared Spectrograph (NIRSpec) to capture spectra in the 0.6 to 5.3 micrometer range, facilitating studies of star formation, exoplanet atmospheres, and galaxy evolution.[81] These 2048 × 2048 pixel arrays meet a dark-current requirement of below 0.01 electrons per second per pixel, providing the high sensitivity needed to detect faint infrared signals from deep space, where low-noise performance is essential for resolving subtle spectral features.[82]

In medical applications, uncooled bolometer-based infrared detectors support thermography for non-invasive diagnostics. These devices, operating at room temperature without cryogenic cooling, detect thermal variations on the skin surface to identify physiological anomalies, such as elevated temperatures indicative of fever during screening protocols.[83] For breast cancer detection, uncooled microbolometer arrays in thermal cameras visualize hypervascularity and inflammation around tumors through temperature asymmetries, offering a complementary tool to mammography despite limitations in standalone accuracy.[84]

Industrial uses leverage infrared detectors for safety and efficiency monitoring.
Tunable diode laser absorption spectroscopy paired with indium antimonide (InSb) detectors enables precise gas leak detection by targeting mid-infrared absorption lines of hydrocarbons such as methane, allowing remote sensing over distances of several kilometers with sensitivities below 1 part per million.[85] Thermal imaging with uncooled focal plane arrays supports predictive maintenance by identifying overheating in electrical panels, bearings, and pipelines, reducing downtime through early fault detection in the manufacturing and energy sectors.[86]

Consumer products integrate near-infrared detectors for enhanced functionality and safety. Smartphones employ near-infrared illuminators and sensors at wavelengths around 940 nanometers for secure face unlock, capturing depth and liveness data to prevent spoofing even in low light.[87] In automotive advanced driver-assistance systems (ADAS), long-wave infrared (LWIR) detectors, typically operating in the 8-14 micrometer band, enable pedestrian detection by sensing body-heat signatures at night or in adverse weather, contributing to automatic emergency braking systems that reduce collision risk by up to 30 percent.[88]

By 2025, emerging trends include flexible quantum dot (QD)-based infrared sensors tailored for wearables, which use colloidal QDs such as lead sulfide for broadband detection in the near- to mid-infrared, enabling compact, skin-conformable devices for continuous health monitoring such as vital-sign tracking.[89] These advancements also extend to environmental monitoring, where LWIR detectors on satellite platforms and ground networks detect wildfire hotspots through thermal anomalies, supporting rapid response to early-stage fires over vast areas.[90]
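The absorption-based gas sensing described in this section (tunable diode laser spectroscopy targeting molecular absorption lines) rests on the Beer-Lambert law: the fraction of laser intensity reaching the detector falls off exponentially with the product of absorption cross-section, absorber density, and path length. The sketch below uses entirely illustrative numbers (the cross-section, ambient density, and path length are assumptions, not values from the text).

```python
import math

def transmitted_fraction(cross_section_cm2, number_density_cm3, path_cm):
    """Beer-Lambert law: I/I0 = exp(-sigma * N * L) for a uniform
    absorbing gas along an open optical path."""
    return math.exp(-cross_section_cm2 * number_density_cm3 * path_cm)

# Assumed illustration: a methane line cross-section of 1e-18 cm^2,
# 1 ppm of methane at ambient air density (~2.5e19 molecules/cm^3),
# over a 100 m (1e4 cm) open path.
n_methane = 1e-6 * 2.5e19
t = transmitted_fraction(1e-18, n_methane, 1e4)
print(f"Transmitted fraction over 100 m at 1 ppm: {t:.3f}")
```

Even at parts-per-million concentrations the exponential attenuation over a long path produces a measurable dip at the line center, which is why open-path mid-infrared instruments can reach sub-ppm sensitivity.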