TNT equivalent
TNT equivalent is a standardized measure of explosive energy, defined as the mass of trinitrotoluene (TNT) that would release an equivalent amount of destructive output, typically assessed through parameters such as peak overpressure or impulse from blast waves.[1] This convention facilitates direct comparisons of yields among diverse explosives, where the equivalence factor for a given material is derived empirically by matching its effects to those of TNT under controlled conditions.[2] The unit is expressed in tons (or kilotons and megatons for larger events), with one metric ton of TNT conventionally equivalent to approximately 4.184 gigajoules of energy, though practical equivalence often prioritizes blast effects over total chemical energy release due to variations in detonation products and afterburning.[3] It originated in military and industrial contexts for evaluating munitions but extends to nuclear yields—such as the 15-kiloton Hiroshima bomb—and geophysical phenomena like volcanic blasts or meteor airbursts, enabling consistent risk assessments and scaling laws in safety engineering and hazard modeling.[4] Key limitations include context-dependent values, as confined versus open-air detonations or fuel-rich compositions can alter effective yields, necessitating test-specific calibrations for accuracy.[5]
Fundamentals
Definition
The TNT equivalent is a conventional measure for expressing the energy output or blast effects of an explosion in terms of the equivalent mass of trinitrotoluene (TNT) required to produce a comparable result. This standardization facilitates comparisons across diverse explosive sources, including conventional chemical detonations, nuclear fission or fusion reactions, asteroid impacts, and volcanic eruptions, by normalizing their destructive potential relative to TNT's well-characterized performance. Unlike direct energy units such as joules, TNT equivalence emphasizes practical blast parameters like peak overpressure and impulse, though it is often approximated via total chemical or physical energy release when direct measurement is infeasible.[1][6] The baseline for equivalence is derived from TNT's detonation characteristics, where 1 metric tonne (1,000 kg) of TNT is defined to yield 4.184 gigajoules (4.184 × 10⁹ joules) of energy, equivalent to 1,000 megacalories. This value corresponds to one gigacalorie per tonne under the thermochemical calorie convention, approximating calorimetric measurements of TNT's heat of explosion, and serves as the reference for scaling yields—e.g., a 1-kiloton event corresponds to 1,000 tonnes of TNT. Equivalence ratios for other materials are determined experimentally through methods like cratering tests, air-blast pressure recordings, or cylinder expansion, revealing that many high explosives (e.g., PETN or RDX) exhibit TNT equivalences exceeding 1.0 due to higher detonation velocities and brisance, while factors like confinement or afterburn can alter effective yields.[3][7]
Standard Units
The standard unit for TNT equivalent is the tonne of TNT, defined using the metric tonne (1,000 kilograms) as the base mass equivalent. This unit quantifies explosive energy release by convention, with one tonne of TNT equivalent to exactly 4.184 gigajoules (4.184 × 10⁹ joules).[8][3] This defined value derives from the international calorie standard (1 kcal = 4.184 kJ), equating to one gigacalorie per tonne, rather than empirical detonation measurements of TNT, which vary between 4.0 and 4.3 gigajoules due to composition factors like purity and detonation efficiency.[3] For scaling larger yields, metric prefixes are applied: one kilotonne (kt or kT) equals 1,000 tonnes (4.184 × 10¹² joules), and one megatonne (Mt) equals 1,000,000 tonnes (4.184 × 10¹⁵ joules).[9] These units facilitate comparisons across explosive events, converting directly to SI joules for precise calculations while providing intuitive mass-based scaling for non-specialists. Although occasionally expressed in short tons (907 kg) in U.S. contexts, international standards prioritize the metric tonne to align with global metrology.[8]
Historical Development
Origins and Derivation
The concept of TNT equivalence originated in explosives engineering practices to standardize comparisons of explosive performance across different materials, addressing the absence of a universal metric for blast effects, cratering, or structural damage prior to widespread adoption in military and industrial testing. Trinitrotoluene (TNT) was selected as the reference standard owing to its chemical stability, insensitivity to shock, consistent detonation velocity of about 6,900 m/s, and reproducible energy output under controlled conditions, allowing reliable benchmarking against more variable explosives like nitroglycerin or picric acid.[1] Derivation of TNT equivalence relies on empirical testing rather than purely thermodynamic calculations, as blast effects depend on factors such as detonation pressure, brisance, and gas expansion, which correlate imperfectly with chemical heat of explosion. Common methods include the ballistic mortar test, where the explosive's ability to propel a mortar pendulum is measured against TNT; the sand crush or Trauzl lead block expansion test for relative volume displacement; and air blast overpressure recordings scaled to TNT's characteristic impulse profile. These yield a relative effectiveness factor (often denoted as RE factor), where TNT is defined as 1.0, enabling conversion such that an explosive's yield in TNT tons equals its mass times its RE factor.[1] The baseline energy release for TNT is calibrated at 4.184 megajoules per kilogram, derived from bomb calorimetry of its detonation products (primarily CO, CO₂, N₂, and H₂O), though equivalence ratios frequently deviate from this value due to differences in coupling efficiency to air or ground—for instance, high explosives like RDX may show 1.5–1.6 RE in blast tests despite similar or higher chemical energy.
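The mass-times-RE conversion just described can be sketched in a few lines. The factors below are illustrative placeholders roughly consistent with the ranges quoted in this article, not authoritative test values:

```python
# Illustrative RE factors relative to TNT = 1.0 (assumed example values;
# real factors vary with the test method and the blast parameter chosen).
RE_FACTOR = {"TNT": 1.0, "RDX": 1.6, "ANFO": 0.8}

def tnt_equivalent_kg(charge_mass_kg: float, explosive: str) -> float:
    """Yield in TNT mass: charge mass times the relative effectiveness factor."""
    return charge_mass_kg * RE_FACTOR[explosive]

# 100 kg of RDX maps to roughly 160 kg of TNT under this single-factor model
rdx_equiv = tnt_equivalent_kg(100.0, "RDX")
```

A single scalar factor is a deliberate simplification: as discussed later, the same explosive carries different factors for overpressure, impulse, and cratering.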
This approach prioritizes observable hydrodynamic and shock phenomena over isolated combustion enthalpy, reflecting causal mechanisms in real detonations where incomplete energy transfer to mechanical work occurs.[1]
Evolution in Measurement Standards
The concept of TNT equivalence originated in the late 19th century with empirical tests designed to quantify relative explosive power, such as the Trauzl lead block test introduced around 1880, which measured volume expansion in a lead container to assess brisance, and the ballistic mortar test, which gauged projectile propulsion as a proxy for total energy output.[10] These methods compared an unknown explosive's performance directly to TNT under controlled conditions but suffered from variability due to factors like charge density, confinement, and test geometry, with no absolute energy calibration.[1] Sand crush and plate dent tests later supplemented these, focusing on shock pressure and detonation products, yet equivalence ratios often differed by 20-50% across tests, reflecting TNT's selection as a reference for its chemical stability and reproducible detonation velocity of approximately 6,900 m/s.[1][11] During World War II and the Manhattan Project, the advent of nuclear weapons necessitated scalable yield comparisons, shifting emphasis toward air blast overpressure and impulse measurements, calibrated against large-scale TNT detonations like those in Operation Sailor Hat (1965), which validated scaling laws for spherical charges.[12] Early nuclear yield estimates, such as the 1945 Trinity test's 21 kilotons, relied on empirical correlations from conventional explosives, but inconsistencies in TNT's effective energy release—due to incomplete combustion and afterburning—prompted refinements in measurement protocols, including standardized densities (typically 1.6 g/cm³) and hemispherical charge geometries for blast equivalence.[1] By the 1950s, military standards like those in U.S. Army manuals incorporated multiple metrics, recognizing that heat-of-explosion equivalence (around 4.2 MJ/kg for TNT) diverged from blast equivalence, where only about 50% of chemical energy converts to air shock waves.[1] In the post-war era, the convention evolved into a fixed energy-based standard for consistency across applications: one metric tonne of TNT is defined as releasing exactly 4.184 × 10⁹ joules, derived from approximating TNT's heat of detonation at 1 international kilocalorie per gram (4.184 MJ/kg), though laboratory measurements vary from 4.0 MJ/kg in open conditions to 4.6 MJ/kg under confinement due to differences in reaction completeness.[1] This value, solidified by the 1960s in nuclear effects literature and engineering handbooks, prioritizes computational uniformity over precise calorimetry, enabling Hopkinson-Cranz scaling for blast predictions (Z = R/W^{1/3}, where W is TNT mass).[12] Contemporary critiques highlight ongoing limitations, as equivalence depends on phenomenology—e.g., lower for impulse than peak pressure—and recommend supplementing with absolute performance data from high-speed gauging and hydrodynamic simulations to address historical variabilities.[1]
Calculation and Methodology
Energy Release Basis
The TNT equivalent on an energy release basis quantifies the total chemical energy liberated per unit mass during detonation, standardized for trinitrotoluene (TNT) at 4.184 megajoules per kilogram (MJ/kg). This defined value establishes that one metric tonne of TNT releases precisely 4.184 gigajoules (GJ), equivalent to 10⁹ thermochemical calories, aligning the unit with historical calorimetric standards for computational convenience in yield assessments.[13][14] Empirical measurements of TNT's heat of detonation, accounting for the rapid decomposition reaction 2 C₇H₅N₃O₆ → 3 N₂ + 5 H₂O + 7 CO + 7 C (or variants with CO₂ formation under oxygen-rich conditions), typically yield 4.0–4.7 MJ/kg, influenced by factors such as charge density, confinement, and post-detonation combustion of carbon residues.[15][16][17] The standardized 4.184 MJ/kg, however, prioritizes uniformity over variability in lab-derived values, enabling direct proportionality for equivalence: the TNT mass m in kilograms is m = E / (4.184 × 10⁶), where E is the total event energy in joules. This basis treats the explosion as a near-complete conversion of molecular bond energy into gaseous expansion, heat, and kinetic energy of products, with approximately 25–30% of the output manifesting as blast wave energy in air bursts. For non-chemical events like nuclear fission or impacts, equivalence scales the prompt energy release analogously, though differences in radiation fractions or multi-phase dynamics necessitate adjustments for specific applications.[7] Limitations arise when total energy poorly predicts localized effects, as equivalence ratios can deviate by 20–50% from pressure-based metrics due to variances in detonation velocity (around 6,900 m/s for TNT) and brisance.[18]
Blast and Pressure Equivalence Methods
Blast equivalence methods for determining TNT equivalent yield focus on matching the dynamic pressure profiles of the blast wave produced by an unknown explosive to those generated by a reference TNT charge under comparable conditions, typically in free air. These approaches prioritize observable blast effects, such as peak incident (side-on) overpressure P_s and positive-phase impulse i_s, over total chemical energy release, as blast wave propagation depends on factors like detonation velocity, product temperature, and gas expansion efficiency. Experimental setups involve detonating the test explosive at ground zero or in a spherical charge configuration, with piezoelectric gauges positioned at multiple scaled distances Z = r / W^{1/3} (where r is radial distance in meters and W is equivalent TNT mass in kilograms) to capture time-resolved pressure histories. The equivalence factor f is then derived as f = W_{\text{TNT}} / W_{\text{test}}, where W_{\text{TNT}} is back-calculated from measured parameters using TNT-calibrated models to achieve parity in P_s or i_s at specified Z.[1][19] A primary tool for this calculation is the Kingery-Bulmash parameterization, an empirical curve fit to large datasets of TNT air-blast measurements, expressing P_s, i_s, and duration as functions of Z for hemispherical surface bursts or spherical free-air detonations. For a given measured P_s at distance r, the method inverts the fit to solve for W_{\text{TNT}} such that the predicted P_s matches the observation, yielding f = W_{\text{TNT}} / W_{\text{test}}. This yields blast-specific equivalences, which can diverge from calorimetric values; for instance, high-brisance explosives like PETN exhibit f \approx 1.4-1.6 for peak overpressure due to sharper shock fronts, but lower f for impulse owing to faster decay in the afterflow phase.
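The bookkeeping described above can be sketched as follows. Inverting a full Kingery-Bulmash fit requires its empirical polynomials, which are omitted here; the sketch assumes the back-calculated TNT mass is already known and only shows the Hopkinson-Cranz scaled distance and the resulting equivalence factor, with hypothetical numbers:

```python
def scaled_distance(r_m: float, w_kg: float) -> float:
    """Hopkinson-Cranz scaled distance Z = r / W**(1/3), in m/kg^(1/3)."""
    return r_m / w_kg ** (1.0 / 3.0)

def equivalence_factor(w_tnt_kg: float, w_test_kg: float) -> float:
    """TNT mass matching the measured effect, per unit mass of test explosive."""
    return w_tnt_kg / w_test_kg

# Hypothetical test: a gauge 10 m from an 8 kg charge sits at Z = 5 m/kg^(1/3);
# if the measured overpressure there matches the TNT curve for 11.2 kg,
# the 8 kg test charge has f = 1.4 for peak overpressure.
z = scaled_distance(10.0, 8.0)
f = equivalence_factor(11.2, 8.0)
```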
Impulse-based equivalence, computed as \int_0^{t_d} (P(t) - P_0) dt where t_d is positive duration and P_0 ambient pressure, better captures structural loading and is preferred for damage assessment, as it integrates both peak and duration effects.[20][21][19] Pressure equivalence is particularly useful for near-field applications, where initial shock strength dominates, and can be estimated from detonation parameters via P \propto \rho_0 D^2 (initial density \rho_0, detonation velocity D), scaled to TNT's Chapman-Jouguet detonation pressure of approximately 2.1 × 10⁴ MPa (21 GPa). However, discrepancies arise for explosives with afterburning or non-ideal detonation, as TNT's lower detonation temperature (≈3200 K) results in more efficient far-field coupling compared to hotter aluminized compositions, potentially underpredicting f by 10-20% if impulse is ignored. Validation requires multiple gauges to confirm self-similarity in the intermediate-to-far field (Z > 1 m/kg^{1/3}), with uncertainties reduced via least-squares fitting to curve families. These methods underpin standards like those in UFC 3-340-02 for munitions testing, emphasizing empirical measurement over theoretical prediction due to variability in explosive composition and confinement.[7][1][19]
Applications
Conventional Explosives
The TNT equivalent provides a standardized measure for evaluating the blast effects and energy output of conventional explosives, enabling consistent comparisons across diverse chemical formulations such as ammonium nitrate-fuel oil (ANFO), cyclotrimethylenetrinitramine (RDX), and pentaerythritol tetranitrate (PETN). This metric accounts for differences in detonation characteristics, including heat of explosion and shock wave propagation, relative to TNT as the baseline. In military and industrial contexts, it informs weapon design, storage safety distances, and hazard assessments by correlating explosive mass with equivalent TNT yield via experimentally derived relative effectiveness factors or direct blast testing.[1][7] The reference energy release for TNT detonation is 4.184 megajoules per kilogram, a conventional value fixed from the thermochemical calorie rather than direct calorimetry of detonation products. Equivalence for other conventional explosives is often calculated as mass multiplied by a relative effectiveness (RE) factor, where TNT has an RE of 1.0; for example, PETN-based charges exhibit higher air-blast equivalence due to greater detonation pressure, as determined through arena tests measuring peak overpressure. This approach prioritizes empirical validation over theoretical predictions, as blast scaling laws like Hopkinson-Cranz exhibit deviations for non-ideal explosives at varying distances and geometries.[7][22] In military applications, yields of conventional bombs are typically expressed in tons of TNT equivalent to gauge lethality and collateral damage. The U.S. GBU-43/B Massive Ordnance Air Blast (MOAB), deployed in 2017, contains 8,500 kg of H-6 explosive (a mixture including RDX and aluminum) and delivers approximately 11 short tons (10 metric tonnes) TNT equivalent, producing a blast radius exceeding 150 meters for lethal overpressures.
Russia's Aviation Thermobaric Bomb of Increased Power (FOAB), tested on September 11, 2007, reportedly achieves 44 tons TNT equivalent through volumetric combustion enhancement, surpassing the MOAB by a factor of four despite a similar total mass, though independent verification remains limited to Russian Ministry of Defense statements. Smaller munitions, such as the U.S. Mark 82 bomb with 192 kg Tritonal fill (TNT plus aluminum), equate to about 200 kg TNT, scaling effects proportionally via cube-root laws for incident pressure.[23][24] Large-scale or accidental detonations of conventional stockpiles demonstrate kiloton-scale potentials. The Halifax Explosion on December 6, 1917, resulted from the collision involving SS Mont-Blanc laden with 2,653 tons of high explosives (primarily picric acid and TNT), releasing 2.9 kilotons TNT equivalent and generating a shock wave that devastated 2 square kilometers. Similarly, the August 4, 2020, Beirut port explosion of 2,750 tons ammonium nitrate produced 0.5 to 1.1 kilotons TNT equivalent, as seismically recorded and modeled, underscoring ammonium nitrate's variable RE factor of 0.3 to 0.42 depending on confinement and initiation. These events highlight TNT equivalence's utility in forensic reconstruction and risk modeling, where actual yields often fall below theoretical maxima due to incomplete deflagration or fragmentation losses.[25][26]
Nuclear Weapons
The explosive yield of nuclear weapons is measured in TNT equivalents to standardize comparisons of their energy release, with yields typically expressed in kilotons (kt) or megatons (Mt) of TNT, where 1 kt equals the energy from exploding 1,000 tons of TNT, or 4.184 × 10¹² joules.[27] This metric derives from the total kinetic energy imparted to the bomb casing and subsequent blast wave, primarily from fission of uranium or plutonium in atomic bombs and additional fusion in thermonuclear devices.[28] Yields are calculated theoretically from the mass of fissile material and efficiency of the chain reaction, using E = mc² to convert mass defect to energy, then scaled by empirical test data.[29] Post-detonation yields from tests are verified empirically through methods such as analyzing seismic magnitudes, bhangmeter readings of fireball brightness, radiochemical debris sampling, and hydrodynamic simulations calibrated against known explosions.[30] For instance, the Trinity test on July 16, 1945, at Alamogordo, New Mexico, produced a yield initially assessed at 18.6 kt but later refined through multiple analyses.[31] The uranium-based "Little Boy" device airburst over Hiroshima on August 6, 1945, at an altitude of about 580 meters, yielded approximately 15 kt, fissioning roughly 0.7% of its 64 kg of highly enriched uranium core.[32] The plutonium "Fat Man" bomb over Nagasaki three days later achieved 21 kt.[32] Thermonuclear weapons dramatically increased yields via multi-stage designs, where a fission primary triggers fusion in a secondary stage, releasing energies orders of magnitude greater than fission alone.[28] The U.S. Ivy Mike test on November 1, 1952, yielded 10.4 Mt, while the Soviet AN602 "Tsar Bomba," detonated on October 30, 1961, over Novaya Zemlya, produced 50 Mt—the highest ever tested—an energy release corresponding to the conversion of roughly 2.3 kg of mass directly into energy, though the design allowed up to 100 Mt and was scaled down to reduce fallout.[33] These measurements rely on cross-verified data from instruments, as theoretical predictions alone underestimate real-world inefficiencies like neutron losses and incomplete burn-up.[30] While TNT equivalence captures total prompt energy, nuclear blasts differ from chemical explosions in delivering a higher fraction as thermal radiation and ionizing particles, altering damage profiles beyond simple pressure scaling.[27]
Non-Military Examples
The TNT equivalent is applied to quantify energies from natural disasters, providing a standardized measure for comparing seismic, volcanic, and impact events to explosive yields. Earthquakes release vast seismic energy, often expressed in this unit via established conversion formulas relating moment magnitude to joules, then to TNT tons (1 ton TNT = 4.184 × 10⁹ joules).[34][35] The 1960 Valdivia earthquake (moment magnitude 9.5), the largest recorded, released seismic energy estimated at 2 to 3 gigatons of TNT equivalent, calculated from log₁₀(E) ≈ 5.24 + 1.44 × M_w where E is in joules.[34] This dwarfs the 2011 Tōhoku earthquake (M_w 9.0–9.1), at approximately 475 megatons, yet illustrates how each full magnitude increase multiplies energy by roughly 31.6 times.[35] Volcanic supereruptions also yield immense energies. The most recent major Yellowstone Caldera eruption, around 640,000 years ago, expelled over 1,000 km³ of material with an estimated total energy release exceeding 2 million megatons of TNT, far surpassing modern nuclear arsenals. Asteroid and comet impacts provide extreme examples.
The Chicxulub impactor, linked to the Cretaceous–Paleogene extinction ~66 million years ago, delivered kinetic energy equivalent to 100 teratons (100 million megatons) of TNT, vaporizing rock and triggering global climate disruption.[36] The 1908 Tunguska event, an airburst over Siberia, released about 10–20 megatons, flattening 2,000 km² of forest without forming a crater.[37] Similarly, Comet Shoemaker–Levy 9's fragments struck Jupiter in 1994, with total impact energy around 300–6,000 gigatons of TNT equivalent, producing fireballs larger than Earth.[38] Industrial accidents occasionally use TNT equivalents for analysis, such as the 2020 Beirut port explosion from 2,750 tons of ammonium nitrate, yielding roughly 0.5–1.1 kilotons TNT equivalent—comparable to a small tactical nuclear device but non-military in origin.[26] These measurements aid risk assessment without implying direct explosivity parallels, as natural events involve different energy dissipation mechanisms like seismic waves or ejecta.
Limitations and Criticisms
Inherent Inaccuracies
The TNT equivalent is fundamentally an approximation, as experimental determinations introduce errors from uncompensated energy losses, such as those due to shock heating, gas expansion, and interactions with test apparatus like steel casings, which can consume up to 500 cal/g in standard configurations.[1] Air blast tests, commonly used for equivalence, exhibit variations of up to 25% in reported factors even for TNT itself, arising from differences in blast wave profiles that evolve with standoff distance and fail to fully capture impulse or negative phase accurately.[1] Standard testing methods exacerbate these issues through inconsistencies like unspecified or uncontrolled explosive charge densities, which directly influence Chapman-Jouguet pressures and overall output predictions. For example, the sand crush test skews results by neglecting density effects on detonation velocity, while the Trauzl test shows poor correlation with heat of explosion values, limiting its reliability for broad equivalence assessments.[1] Ballistic mortar and plate dent tests similarly suffer from unaccounted deformation energies and confinement artifacts, yielding equivalence values that diverge significantly across methods for the same explosive.[1] Literature-reported TNT equivalence factors for identical explosives display substantial scatter, often spanning ranges that propagate large uncertainties into hazard modeling and structural load estimates. This variability stems partly from the metric's sensitivity to specific blast parameters—equivalence for peak overpressure rarely matches that for total impulse—and is compounded in non-ideal detonations or confined environments, where oxygen availability and wall interactions alter energy partitioning beyond free-air chemical benchmarks. 
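One concrete consequence of this scatter, sketched below under assumed numbers: because blast parameters depend on charge mass only through the cube root in Z = r / W^{1/3}, a ±25% uncertainty in the equivalence factor shifts the scaled distance by only about a third of that relative error, which is why such large factor spreads can still yield usable far-field predictions:

```python
def scaled_distance(r_m: float, w_kg: float) -> float:
    """Hopkinson-Cranz scaled distance Z = r / W**(1/3)."""
    return r_m / w_kg ** (1.0 / 3.0)

# Assumed example: nominal 100 kg TNT equivalent, reported factors +/-25%.
r = 10.0
z_nominal = scaled_distance(r, 100.0)
z_over = scaled_distance(r, 125.0)   # equivalence factor overestimated by 25%
z_under = scaled_distance(r, 75.0)   # equivalence factor underestimated by 25%

# The cube root compresses the +/-25% mass spread to roughly -7% / +10% in Z.
rel_over = z_over / z_nominal - 1.0
rel_under = z_under / z_nominal - 1.0
```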
For aluminized or composite explosives, standard models underestimate yields by overlooking post-detonation afterburning, as these late-phase reactions contribute additional energy not reflected in initial detonation states.[1][39] Such discrepancies highlight the caloric basis of TNT equivalence as a convenient but inherently limited proxy, particularly when extrapolating to heterogeneous events like nuclear yields where radiation and thermal fractions dominate over pure hydrodynamic effects.[1]
Debates on Applicability
The applicability of TNT equivalence to events beyond conventional chemical high explosives is contested due to discrepancies in energy partitioning and release mechanisms. For nuclear weapons, the measure quantifies total fission or fusion yield but overlooks how energy is distributed—typically 35% to blast, 50% to thermal radiation, and 15% to initial nuclear radiation in an air burst—contrasting with TNT's near-total conversion to mechanical shock and heat without prompt ionizing effects or electromagnetic pulses. Critics argue this renders the equivalence insufficient for assessing comprehensive damage, as radiation and fallout extend lethal radii far beyond blast zones comparable to a TNT detonation of equivalent mass.[40] In assessments of non-ideal or aluminized explosives, such as those used in military ordnance, TNT equivalence varies significantly by metric: up to 25% deviation in air-blast scaling with distance from the source, and differing factors for peak overpressure versus impulse, complicating predictions of structural response or fragment hazards. Empirical tests, including cylinder expansion and air-blast measurements, reveal that equivalence derived from one method (e.g., detonation pressure) poorly predicts outcomes in others, prompting recommendations to favor Chapman-Jouguet pressure calculations for precision in heterogeneous compositions.[1] For geophysical events like asteroid impacts or volcanic eruptions, TNT equivalence serves as a rough total-energy proxy but misleads on localized effects, as kinetic or magmatic energy dissipates primarily into seismic waves, cratering, and ejecta rather than isotropic air blasts akin to surface detonations. 
Asteroid airbursts, for instance, channel energy into thermal pulses and shock waves with minimal ground coupling, yielding blast radii divergent from TNT models despite matched yields; a 500-kiloton event like Chelyabinsk in 2013 produced overpressures akin to a nuclear device but without subsurface excavation. Similarly, earthquake magnitudes equate to TNT via seismic efficiency (around 1-10% of elastic strain energy as radiated waves), yet the protracted release precludes explosive overpressures, rendering the metric irrelevant for blast damage analogies.[41]
Conversions and Comparisons
To Other Energy Units
The TNT equivalent for one metric tonne is defined as exactly 4.184 gigajoules (GJ), or 4.184 × 10⁹ joules (J), a convention established to standardize explosive energy measurements independent of variations in actual TNT detonation efficiency.[8] This value aligns with the thermochemical calorie, where one tonne TNT corresponds to exactly 10⁹ calories (cal), as the thermochemical calorie is defined as precisely 4.184 J.[8] Equivalent values in other common energy units, derived from the joule definition and standard conversion factors, include:

| Unit | Equivalent for 1 tonne TNT |
|---|---|
| British thermal unit (BTU, international table) | 3.965667 × 10⁶ BTU |
| Foot-pound force (ft·lbf) | 3.08596 × 10⁹ ft·lbf |
| Watt-hour (Wh) | 1.162222 × 10⁶ Wh |
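
The table follows directly from the defining 4.184 × 10⁹ J per tonne and the SI sizes of each target unit. A quick cross-check, which also inverts the definition to express an arbitrary energy as a TNT-equivalent mass:

```python
TONNE_TNT_J = 4.184e9        # defining value: joules per metric tonne of TNT
BTU_IT_J = 1055.05585262     # international-table BTU, in joules
FT_LBF_J = 1.3558179483      # foot-pound force, in joules
WH_J = 3600.0                # watt-hour, in joules

def joules_to_tnt_tonnes(energy_j: float) -> float:
    """Express an energy in tonnes of TNT equivalent."""
    return energy_j / TONNE_TNT_J

btu = TONNE_TNT_J / BTU_IT_J      # ~3.965667e6 BTU per tonne TNT
ftlbf = TONNE_TNT_J / FT_LBF_J    # ~3.08596e9 ft·lbf per tonne TNT
wh = TONNE_TNT_J / WH_J           # ~1.162222e6 Wh per tonne TNT
```

The same inversion reproduces the kilotonne and megatonne scalings given earlier, e.g. 4.184 × 10¹² J maps back to 1,000 tonnes.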
Relative Effectiveness Factors
The relative effectiveness factor (RE factor), synonymous with TNT equivalence in many engineering contexts, measures the demolition or blast power of an explosive relative to TNT on a mass-for-mass basis. It is calculated as the ratio of the mass of TNT needed to achieve the same specific effect (such as peak overpressure or impulse) as one unit mass of the tested explosive, enabling scaling of effects for design, safety, and hazard assessment purposes.[42] This factor is not a fixed property but varies with the performance metric—e.g., cylinder expansion for brisance, air blast parameters for overpressure, or cratering for ground effects—and experimental conditions like charge confinement, density, and geometry.[1] RE factors are derived empirically through standardized tests, including ballistic mortar for work output, plate dent for high-pressure effects, or arena tests for air blast scaling. Theoretical estimates may supplement data using detonation parameters like Chapman-Jouguet pressure or heat of detonation, but discrepancies arise for non-ideal explosives (e.g., those with afterburning components like aluminum), where energy release timing affects outcomes. For instance, aluminized compositions often show lower RE in early blast phases but higher total impulse due to delayed combustion.[1] In regulatory and military applications, conservative RE values are selected to ensure safe standoff distances and storage separations.[42] The following table summarizes RE factors from air blast measurements for select high explosives, based on scaled tests comparing peak incident overpressure and positive-phase impulse to TNT baselines (values expressed as decimals relative to TNT = 1.0):

| Explosive | Peak Overpressure RE | Impulse RE |
|---|---|---|
| Composition B | 1.10 | 1.15 |
| Torpex | 1.22 | 1.67 |
| Amatol (60/40) | 0.95 | 0.59 |
| Ammonium Picrate | 0.85 | 0.74 |
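
As a sketch of the metric dependence noted above, the tabulated factors give the same charge two different TNT-equivalent masses depending on whether peak overpressure or positive-phase impulse is being matched:

```python
# RE factors from the table above (relative to TNT = 1.0)
RE = {
    "Composition B":  {"overpressure": 1.10, "impulse": 1.15},
    "Torpex":         {"overpressure": 1.22, "impulse": 1.67},
    "Amatol (60/40)": {"overpressure": 0.95, "impulse": 0.59},
}

def tnt_equiv_kg(mass_kg: float, explosive: str, metric: str) -> float:
    """TNT-equivalent mass for the chosen blast parameter."""
    return mass_kg * RE[explosive][metric]

# 100 kg of Torpex: ~122 kg TNT when matching peak overpressure,
# but ~167 kg TNT when matching positive-phase impulse.
op_equiv = tnt_equiv_kg(100.0, "Torpex", "overpressure")
imp_equiv = tnt_equiv_kg(100.0, "Torpex", "impulse")
```

The gap between the two figures is exactly the reason conservative applications state which parameter an RE value was derived from before using it for standoff or storage calculations.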