
Entropy

Entropy is a fundamental physical quantity that quantifies the degree of disorder, randomness, or energy dispersal in a system, with origins in thermodynamics, where it measures the amount of thermal energy unavailable for conversion into mechanical work during a thermodynamic process. Introduced by Rudolf Clausius in 1865, entropy is defined such that its change dS for a reversible process equals the heat transferred dQ divided by the absolute temperature T, or dS = \frac{dQ_{\text{rev}}}{T}, making it a state function independent of the path taken. The second law of thermodynamics states that in any spontaneous process occurring in an isolated system, the entropy of the system increases, reflecting the natural tendency toward equilibrium and greater disorder.

In statistical mechanics, Ludwig Boltzmann provided a microscopic foundation for entropy in 1877, expressing it as S = k \ln W, where k is Boltzmann's constant and W is the number of microscopic configurations (microstates) corresponding to a given macroscopic state (macrostate). This formulation links thermodynamic entropy to the probability of states, explaining the second law as a statistical inevitability: systems evolve toward macrostates with higher multiplicity, increasing entropy on average, though fluctuations can temporarily decrease it in small systems.

Beyond physics, entropy extends to information theory, where Claude Shannon defined it in 1948 as a measure of uncertainty or information content in a message source, given by H = -\sum p_i \log_2 p_i for a discrete probability distribution \{p_i\}, with units in bits. This Shannon entropy quantifies the average surprise or unpredictability of outcomes, serving as the foundation for data compression, coding, and communication efficiency limits, and drawing an analogy to thermodynamic entropy by measuring "disorder" in message probabilities. The concept has further applications in cosmology, where entropy is linked to the arrow of time and the expansion of the universe, but these build upon the core thermodynamic and informational interpretations. Overall, entropy unifies diverse phenomena under the principle of increasing disorder, influencing fields ranging from engineering and chemistry to biology, cosmology, and economics.
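As an illustration of the information-theoretic form quoted above, the short sketch below computes the Shannon entropy of a discrete distribution in bits; the function name and example probabilities are illustrative, not taken from any source.

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum_i p_i log2 p_i of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty; a heavily biased coin carries much less.
print(shannon_entropy_bits([0.5, 0.5]))   # 1.0
print(shannon_entropy_bits([0.9, 0.1]))   # ~0.47
```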

Historical Development

Discovery and Early Concepts

The development of the concept of entropy emerged in the context of 19th-century efforts to understand and improve the efficiency of heat engines, particularly the steam engines powering the Industrial Revolution. In 1824, French engineer Sadi Carnot published Reflections on the Motive Power of Fire, which analyzed the theoretical limits of engine performance by modeling an idealized reversible cycle between a hot and cold reservoir. Carnot's work, though based on the then-prevailing caloric theory of heat, demonstrated that no engine could exceed a certain efficiency determined by the temperature difference, providing a foundational framework for quantifying energy transformations in thermal systems without directly introducing entropy. This analysis was motivated by practical studies of engine inefficiencies, where much of the heat input was wasted, highlighting the need for a deeper understanding of heat's role in work production.

Building on Carnot's ideas, German physicist Rudolf Clausius advanced the mechanical theory of heat in his 1850 paper "On the Moving Force of Heat," where he corrected Carnot's caloric assumptions by incorporating James Prescott Joule's experimental findings on heat–work equivalence and introduced an early measure of the energy unavailable for mechanical work in heat engines. Clausius posited that in any irreversible process, a portion of the heat becomes "uncompensated" or transformed in a way that limits further work extraction, laying the groundwork for entropy as a quantity tracking this degradation. This concept arose from quantitative analyses of engine cycles, emphasizing that real processes deviated from Carnot's ideal reversibility due to dissipative effects like friction and heat loss. In 1865, Clausius formalized these ideas in his memoir "On Several Convenient Forms of the Fundamental Equations of the Mechanical Theory of Heat," where he defined a quantity S such that its differential dS = \frac{dQ_{\text{rev}}}{T} (with Q_{\text{rev}} as reversible heat and T as absolute temperature) represents the "transformation content" of a body, and introduced the term "entropy," derived from the Greek word for transformation. He articulated the second law of thermodynamics as the principle that the entropy of the universe tends to a maximum, explaining the directionality of natural processes in terms of this increasing measure. Clausius's formulation resolved inconsistencies in earlier theories by providing a precise, path-independent criterion for irreversibility, directly informed by experimental data on engine performance and calorimetric measurements.

During the 1870s, Austrian physicist Ludwig Boltzmann provided a microscopic foundation for entropy by connecting it to the statistical distribution of molecular configurations in gases. In his 1872 and 1877 papers on the second law, Boltzmann derived that entropy S is proportional to the logarithm of the number of microstates W corresponding to a macrostate, S = k \ln W (where k is a constant), interpreting higher entropy as greater molecular randomness or probability. This statistical view explained entropy's increase as the natural tendency toward more probable disordered states, bridging macroscopic thermodynamic observations from heat engines to atomic-scale dynamics without relying on caloric fluids. Concurrently, American physicist J. Willard Gibbs contributed to the statistical interpretation of entropy through his 1876–1878 memoirs "On the Equilibrium of Heterogeneous Substances," where he incorporated entropy into thermodynamic potentials like the Gibbs free energy to analyze phase equilibria and chemical reactions in multi-component systems. Gibbs's work extended Clausius's macroscopic entropy to statistical ensembles, treating it as an average over probable molecular distributions, which facilitated applications to complex systems beyond simple heat engines, such as solutions and alloys studied experimentally in the late 19th century. These developments collectively transformed entropy from an empirical tool for engine efficiency into a fundamental principle governing natural processes.

Etymology and Terminology Evolution

The term "entropy" was coined by Rudolf Clausius in 1865, derived from the Greek prefix en- meaning "in" and tropē meaning "transformation" or "turning," selected to parallel the word "energy" due to their closely related physical roles. In his ninth memoir on the mechanical theory of heat, Clausius explained the choice explicitly: "I have intentionally formed the word entropy so as to be as similar as possible to the word energy, since both these quantities are so nearly allied in their physical significance that they might be called twins." Originally introduced in German as Entropie in Clausius's 1865 publication Über verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie, the term quickly entered English scientific discourse. It appeared in English by 1868 through Peter Guthrie Tait's Sketch of Thermodynamics, the first English-language textbook on the subject, where Tait adopted and explained Clausius's nomenclature to British audiences. Before the coinage of "entropy," Clausius described the underlying quantity as "transformation content" (Verwandlungsinhalt), emphasizing its role in irreversible energy changes. The term's usage evolved with Ludwig Boltzmann's contributions in the 1870s, where his statistical interpretation linked entropy to probability and the multiplicity of molecular configurations, infusing it with probabilistic connotations that expanded beyond Clausius's thermodynamic focus. Early English translations, such as the 1867 edition of Clausius's The Mechanical Theory of Heat, facilitated this adoption but occasionally led to terminological ambiguities with concepts like energy due to overlapping energetic themes in nascent thermodynamic literature.

Core Definitions

Thermodynamic Entropy in Classical Terms

In classical thermodynamics, the foundational relation governing energy transformations is the first law, which states that the infinitesimal change in the internal energy dU of a system equals the heat transferred to the system \delta Q minus the work done by the system \delta W, expressed as dU = \delta Q - \delta W. This law establishes energy conservation but does not address the directionality of processes. Entropy S, introduced by Clausius, is an extensive state function that serves as a measure of the thermal energy within a system that is unavailable for conversion into useful work. Unlike path functions such as heat \delta Q and work \delta W, which depend on the specific trajectory of a process, entropy is a state function whose value depends solely on the current state of the system, independent of the path taken to reach it. The defining relation for entropy arises in the context of reversible processes, where the infinitesimal change dS is given by dS = \frac{\delta Q_\text{rev}}{T}, with \delta Q_\text{rev} representing the reversible heat transfer and T the absolute temperature in kelvins. This expression integrates to yield the entropy change between states as \Delta S = \int \frac{\delta Q_\text{rev}}{T}, confirming its path independence for reversible paths connecting the states. In classical descriptions, spontaneous processes in isolated systems are characterized by an increase in entropy, reflecting the irreversible degradation of available energy.
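As a minimal worked example of the defining integral \Delta S = \int \delta Q_\text{rev}/T, the sketch below evaluates the entropy change for reversibly heating a body of constant heat capacity and compares numerical integration against the closed form C \ln(T_f/T_i); all numbers are illustrative.

```python
import numpy as np

# Entropy change for reversibly heating 1 kg of water (c taken as a constant
# 4186 J/(kg*K) for illustration) from 300 K to 350 K: delta_Q_rev = m*c*dT.
m, c = 1.0, 4186.0
Ti, Tf = 300.0, 350.0

T = np.linspace(Ti, Tf, 100_001)
integrand = m * c / T                                                       # delta_Q_rev / T per unit dT
dS_numeric = np.sum((integrand[:-1] + integrand[1:]) / 2) * (T[1] - T[0])   # trapezoidal sum
dS_closed = m * c * np.log(Tf / Ti)                                         # closed form for constant c

print(dS_numeric, dS_closed)   # both ~645 J/K
```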

Statistical Mechanics Formulation

In statistical mechanics, entropy emerges as a measure of the multiplicity or uncertainty associated with the microscopic configurations of a system, bridging macroscopic thermodynamic properties to the underlying dynamics of particles. This formulation relies on foundational concepts from classical mechanics, particularly phase space, which represents all possible states of a system as points in a high-dimensional space spanned by the positions and momenta of its particles. Ensembles, collections of hypothetical systems sharing specified macroscopic constraints, provide the framework for averaging over these states; the microcanonical ensemble, applicable to isolated systems with fixed energy, volume, and particle number, assumes uniformity across accessible regions. The fundamental postulate of statistical mechanics, positing that in the absence of additional information, all accessible microstates of an isolated system in equilibrium are equally probable a priori, underpins this probabilistic description.

A macrostate, defined by macroscopic variables like total energy and volume, corresponds to a vast number of microstates—specific configurations of particle positions and velocities consistent with those constraints. The multiplicity Ω quantifies this degeneracy, representing the volume of phase space (or number of discrete states in quantized models) occupied by the macrostate; for large systems, Ω grows exponentially with system size, reflecting the immense number of ways to realize the same macroscopic conditions. Ludwig Boltzmann introduced the entropy as the logarithm of this multiplicity, scaled by Boltzmann's constant k (approximately 1.38 × 10^{-23} J/K), yielding the formula S = k \ln \Omega, which connects the additive nature of entropy to the multiplicative growth of accessible states. This definition arises directly from the fundamental postulate in the microcanonical ensemble: with each microstate equally likely, the probability of any specific one is 1/Ω, and the entropy follows as the negative sum over these probabilities weighted by their logarithms, reducing to k ln Ω. For systems not strictly isolated or where probabilities p_i deviate from uniformity—such as systems in contact with a reservoir—J. Willard Gibbs generalized the expression to the entropy S = -k \sum_i p_i \ln p_i, where the sum runs over all microstates, and p_i denotes the probability of the i-th state; this form accommodates ensembles like the canonical one and reduces to Boltzmann's in the equal-probability limit. Both formulations emphasize entropy's role in quantifying the dispersal of information about microscopic details, with higher Ω or broader probability distributions corresponding to greater entropy.
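The following short sketch (illustrative code, with an arbitrary number of microstates) checks numerically that the Gibbs form S = -k \sum_i p_i \ln p_i reduces to Boltzmann's S = k \ln \Omega when all Ω microstates are equally probable, and that any non-uniform distribution over the same states has lower entropy.

```python
import numpy as np

k_B = 1.380649e-23  # J/K

def gibbs_entropy(probs, k=k_B):
    """S = -k * sum_i p_i ln p_i over microstate probabilities."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return -k * np.sum(p * np.log(p))

Omega = 10**6
uniform = np.full(Omega, 1.0 / Omega)
print(gibbs_entropy(uniform), k_B * np.log(Omega))   # equal: Gibbs reduces to k ln Omega

biased = np.full(Omega, 0.5 / (Omega - 1))
biased[0] = 0.5                                      # concentrate half the probability on one state
print(gibbs_entropy(biased) < k_B * np.log(Omega))   # True: non-uniform distributions have lower S
```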

Fundamental Principles

Second Law of Thermodynamics

The second law of thermodynamics asserts that for any process in an isolated system, the change in entropy satisfies \Delta S \geq 0, where equality holds only for reversible processes and strict inequality applies to irreversible ones. This principle, first articulated by Rudolf Clausius in 1865, quantifies the tendency of natural processes to evolve toward states of greater disorder or energy dispersal, with the total entropy of the universe increasing over time. Clausius's earlier formulation in 1850 provided a foundational statement of the law in terms of heat flow: heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time. Complementing this, the Kelvin–Planck statement, proposed by William Thomson (Lord Kelvin) in 1851, addresses the limitations of heat engines: it is impossible for a cyclic process to produce net work by absorbing heat solely from a single reservoir at uniform temperature, without rejecting heat to a colder reservoir. These equivalent expressions underscore the law's role in prohibiting perpetual motion machines of the second kind and establishing fundamental efficiency limits for energy conversion. The second law dictates the directionality of spontaneous processes, ensuring that transformations in isolated systems proceed toward equilibrium with entropy never decreasing. This unidirectional progression underpins the thermodynamic arrow of time, distinguishing past from future in macroscopic phenomena by the irreversible growth of entropy. In non-equilibrium thermodynamics, the second law manifests through the entropy production rate \sigma = \frac{d_i S}{dt} \geq 0, where d_i S represents the internally generated entropy, positive for dissipative processes far from equilibrium. This formulation, developed by Ilya Prigogine, highlights how irreversible phenomena, such as diffusion and chemical reactions, continuously generate entropy, driving systems toward steady states while maintaining the law's universality.
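A simple numerical illustration of positive entropy production is steady heat conduction between two reservoirs across a finite temperature difference; the figures below are purely illustrative.

```python
# Steady conduction of Q_dot = 100 W from a hot reservoir at 500 K to a cold one at 300 K.
Q_dot, T_h, T_c = 100.0, 500.0, 300.0

sigma = Q_dot / T_c - Q_dot / T_h   # net entropy production rate of the two reservoirs, W/K
print(sigma)                        # ~0.133 W/K > 0, as the second law requires
```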

Reversible and Irreversible Processes

In thermodynamics, the concept of reversible heat transfer, denoted as \delta Q_{\text{rev}}, arises from the Clausius definition of entropy, where it represents the infinitesimal heat exchanged during a process that maintains the system in equilibrium at every stage. This reversible heat is crucial for quantifying entropy changes, as it allows the integration along an idealized path where the system's temperature T is well-defined throughout. A reversible process involves infinitesimal changes to the system, ensuring that the system and its surroundings remain in equilibrium at all times, with no net entropy production in the universe. For such processes, the entropy change of the system is given by \Delta S = \int \frac{\delta Q_{\text{rev}}}{T}, where the integral is taken along the reversible path connecting the initial and final states. In contrast, an irreversible process features finite deviations from equilibrium, such as sudden expansions or temperature gradients, leading to dissipative effects like friction or unrestrained expansion; here, the total entropy change of the universe (system plus surroundings) is strictly greater than zero, \Delta S_{\text{universe}} > 0. This increase reflects the second law's assertion that natural processes tend toward greater disorder.

Representative examples illustrate these distinctions without quantitative computation. A quasi-static isothermal expansion of a gas, where the external pressure is adjusted infinitesimally to match the gas's pressure, proceeds reversibly, allowing the gas to absorb heat reversibly while maintaining equilibrium and producing no net entropy increase in the universe. Conversely, free expansion occurs when a gas suddenly expands into a vacuum, with no work done and no heat exchanged; this generates entropy in the system due to the increased volume and molecular randomness, while the surroundings remain unaffected, resulting in an overall entropy rise for the universe. The role of reversible processes in defining entropy is foundational: since entropy is a state function, its change between two states is path-independent and computed exclusively using a hypothetical reversible path, even if the actual process is irreversible. This methodological reliance on reversibility ensures consistent evaluation of \Delta S regardless of the real-world path taken.
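The contrast between the two examples can be made quantitative with a short sketch (one mole of an ideal gas doubling its volume at 300 K; values are illustrative): the system's entropy change is identical for the reversible and free expansions, but the entropy of the universe increases only in the irreversible case.

```python
import math

R = 8.314                                    # J/(mol*K)
n, V_i, V_f, T = 1.0, 1.0, 2.0, 300.0        # one mole doubling its volume at 300 K

dS_system = n * R * math.log(V_f / V_i)      # same for both paths (state function)

dS_surr_reversible = -dS_system              # reversible: surroundings supply Q_rev = T*dS
dS_surr_free = 0.0                           # free expansion: no heat leaves the surroundings

print(dS_system + dS_surr_reversible)        # 0.0   (reversible)
print(dS_system + dS_surr_free)              # ~5.76 J/K > 0 (irreversible)
```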

Quantitative Relations

Entropy as a State Function

In thermodynamics, entropy S is a state function, meaning its value for a system in equilibrium depends solely on the current state of the system—characterized by variables such as internal energy U, volume V, and particle number N—and not on the history or path by which that state was reached. This property was established by Clausius in his 1865 formulation, where he demonstrated that the integral of reversible heat divided by temperature over a cyclic process vanishes, implying that changes in entropy are path-independent for processes connecting equilibrium states. To compute the change in entropy \Delta S between two states, one evaluates the integral along any convenient reversible path connecting those states, given by \Delta S = \int \frac{\delta Q_\text{rev}}{T}, where \delta Q_\text{rev} is the reversible heat transfer and T is the absolute temperature in kelvins. This holds because, for irreversible processes, the actual heat transfer differs, but the state-function nature of entropy ensures the net change \Delta S remains the same regardless of the chosen reversible path. The infinitesimal change in entropy for a single-component system is expressed through the thermodynamic identity dS = \frac{1}{T} dU + \frac{P}{T} dV - \frac{\mu}{T} dN, where P is pressure and \mu is the chemical potential; this relation derives from combining the first and second laws and confirms entropy's dependence only on state variables. For a composite system consisting of subsystems in mutual equilibrium but without interactions affecting their individual entropies, the total entropy is the sum of the subsystem entropies: S_\text{total} = \sum_i S_i. This additivity reflects entropy's extensive nature, scaling with system size under such conditions. In the International System of Units (SI), entropy is measured in joules per kelvin (J/K).
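Path independence can be verified directly for an ideal gas: the sketch below (illustrative numbers, monatomic gas) takes two different reversible routes between the same end states and shows that the heat exchanged differs between the routes while the entropy change does not.

```python
import math

R = 8.314
n, Cv = 1.0, 1.5 * R                  # one mole of a monatomic ideal gas
T1, V1 = 300.0, 1.0
T2, V2 = 450.0, 3.0

# Route A: isochoric heating at V1 (T1 -> T2), then isothermal expansion at T2 (V1 -> V2).
Q_A = n * Cv * (T2 - T1) + n * R * T2 * math.log(V2 / V1)
dS_A = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

# Route B: isothermal expansion at T1 (V1 -> V2), then isochoric heating at V2 (T1 -> T2).
Q_B = n * R * T1 * math.log(V2 / V1) + n * Cv * (T2 - T1)
dS_B = n * R * math.log(V2 / V1) + n * Cv * math.log(T2 / T1)

print(Q_A == Q_B)                     # False: heat is a path function
print(math.isclose(dS_A, dS_B))       # True: entropy change depends only on the end states
```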

Equivalence Between Thermodynamic and Statistical Definitions

The equivalence between the thermodynamic definition of entropy, which emerges from macroscopic observations of heat and work in reversible processes, and the statistical definition, given by Boltzmann's formula S = k \ln \Omega where \Omega is the number of microstates consistent with the macrostate, is a cornerstone of statistical mechanics. This alignment is most clearly demonstrated for the monatomic ideal gas, where the thermodynamic entropy change for a process can be explicitly matched to the statistical expression derived from phase-space considerations. In classical thermodynamics, the change in entropy for an ideal gas undergoing a reversible process is \Delta S = n R \ln \frac{V_f}{V_i} + n C_V \ln \frac{T_f}{T_i}, where n is the number of moles, R is the gas constant, V is volume, T is temperature, and C_V is the molar heat capacity at constant volume (with C_V = \frac{3}{2} R for a monatomic gas). Statistically, the absolute entropy is provided by the Sackur–Tetrode equation: S = N k \left[ \ln \left( \frac{V}{N} \left( \frac{4 \pi m U}{3 N h^2} \right)^{3/2} \right) + \frac{5}{2} \right], where N is the number of particles, k is Boltzmann's constant, m is the particle mass, U = \frac{3}{2} N k T is the internal energy, and h is Planck's constant; here, \Omega is the phase-space volume divided by h^{3N} N! to account for quantum discreteness and particle indistinguishability. Computing the differential dS from this equation yields the thermodynamic relation dS = \frac{dU}{T} + \frac{P dV}{T}, confirming exact agreement for changes in state variables, as first derived by Sackur and Tetrode in 1912.

Boltzmann's H-theorem further bridges the two definitions by proving that the statistical entropy monotonically increases toward its maximum at equilibrium under molecular collisions, mirroring the second law of thermodynamics. The theorem defines the H-function as H(t) = \int f(\mathbf{v}, t) \ln f(\mathbf{v}, t) \, d\mathbf{v}, where f is the velocity distribution; its time derivative satisfies \frac{dH}{dt} \leq 0, with equality only at the Maxwell–Boltzmann distribution, thus S = -k H increases irreversibly toward its equilibrium value. This kinetic theory result, established in 1872, shows how microscopic dynamics produce the macroscopic entropy growth observed thermodynamically. The equivalence holds rigorously in the thermodynamic limit, where the number of particles N \to \infty and V \to \infty with fixed N/V, rendering fluctuations in macroscopic observables negligible (of order 1/\sqrt{N}). In finite systems, however, statistical fluctuations can cause temporary decreases in S, violating the strict monotonicity of the H-theorem and highlighting that thermodynamic entropy is an ensemble average over microstates. Edwin T. Jaynes resolved apparent paradoxes in this correspondence, such as the Gibbs mixing paradox, through an information-theoretic interpretation: thermodynamic entropy quantifies the uncertainty about the microstate given macroscopic constraints, ensuring consistency by treating identical particles as indistinguishable without additional information about labels. This 1957 framework recasts statistical mechanics as inference under incomplete knowledge, aligning the definitions without contradictions.
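As a concrete check of the Sackur–Tetrode expression, the sketch below evaluates the molar entropy of argon at 298.15 K and 1 bar; under the ideal-gas assumption it comes out near 155 J/(mol·K), close to the calorimetric standard value for argon (the choice of gas, conditions, and comparison are illustrative).

```python
import math

k_B = 1.380649e-23      # J/K
h   = 6.62607015e-34    # J*s
N_A = 6.02214076e23     # 1/mol
R   = k_B * N_A

m = 39.948 * 1.66053907e-27     # mass of an argon atom, kg
T, P = 298.15, 1.0e5            # 298.15 K and 1 bar
v = k_B * T / P                 # volume per particle V/N from the ideal gas law

# Sackur-Tetrode in the equivalent form S/(N k) = ln[(V/N)/lambda^3] + 5/2,
# with lambda the thermal de Broglie wavelength.
lam = h / math.sqrt(2 * math.pi * m * k_B * T)
S_molar = R * (math.log(v / lam**3) + 2.5)

print(S_molar)   # ~154.8 J/(mol*K)
```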

Applications in Closed Systems

Carnot Cycle and Heat Engines

The Carnot cycle represents an idealized reversible heat engine cycle that operates between two thermal reservoirs at temperatures T_h (hot) and T_c (cold, with T_h > T_c), serving as a benchmark for the maximum efficiency achievable by any heat engine. Proposed by Sadi Carnot in 1824, this cycle demonstrates how entropy principles limit the conversion of heat into work, ensuring that no real engine can surpass its efficiency under the same conditions. In the context of entropy, the cycle's reversibility implies that the total entropy change over a complete cycle is zero (\oint dS = 0), as the system returns to its initial state without net entropy production. The cycle consists of four reversible processes performed on an ideal gas working substance: (1) isothermal expansion at T_h, where the gas absorbs heat Q_h from the hot reservoir while expanding and doing work; (2) adiabatic expansion, where the gas cools to T_c without heat exchange, continuing to do work; (3) isothermal compression at T_c, where the gas rejects heat Q_c (negative by convention) to the cold reservoir while work is done on it; and (4) adiabatic compression, where the gas is heated back to T_h without heat exchange. During the isothermal steps, the entropy change is given by \Delta S = Q/T, so the hot reservoir loses entropy Q_h / T_h and the cold reservoir gains -Q_c / T_c. For the cycle to be reversible, the net entropy change must balance, yielding Q_h / T_h = -Q_c / T_c. The adiabatic steps contribute zero entropy change since dQ = 0.

This entropy balance directly leads to the Carnot efficiency, defined as the ratio of net work output W to heat input Q_h: \eta = W / Q_h = 1 + Q_c / Q_h = 1 - T_c / T_h, where temperatures are in kelvins. The derivation follows from energy conservation (W = Q_h + Q_c) and the entropy condition, highlighting that the efficiency depends solely on the temperature ratio, independent of the working substance. This establishes the fundamental limit imposed by the second law, as expressed through entropy. In practical heat engines, such as steam turbines or internal combustion engines, irreversibilities like friction, heat losses, and non-quasi-static processes generate additional entropy, resulting in efficiencies below the Carnot limit—typically 20–40% for real systems versus up to 60–70% theoretically for high temperature ratios. These entropy increases violate the reversible condition, reducing the work extractable from the heat input and underscoring the Carnot cycle's role as an unattainable ideal for engineering design.
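A minimal numerical sketch of these relations (reservoir temperatures and heat input chosen arbitrarily) recovers the Carnot efficiency from the entropy balance and confirms that the reservoirs' total entropy change over one reversible cycle is zero.

```python
T_h, T_c, Q_h = 600.0, 300.0, 1000.0    # reservoir temperatures (K) and heat absorbed per cycle (J)

Q_c = -Q_h * T_c / T_h                  # heat rejected (negative), from Q_h/T_h = -Q_c/T_c
W = Q_h + Q_c                           # net work, by energy conservation
eta = W / Q_h

print(Q_c, W, eta)                      # -500 J, 500 J, 0.5
print(1 - T_c / T_h)                    # 0.5: the same efficiency from the temperatures alone

# Reversibility check: entropy lost by the hot reservoir equals entropy gained by the cold one.
print(-Q_h / T_h + (-Q_c) / T_c)        # 0.0
```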

Entropy Changes in Simple Processes

In closed systems, the entropy changes for simple non-cyclic processes involving an ideal gas are determined using the thermodynamic relation dS = \frac{\delta Q_{\text{rev}}}{T}, combined with the properties of the ideal gas. The ideal gas law, PV = nRT, where P is pressure, V is volume, n is the number of moles, R is the gas constant, and T is temperature, provides the foundational equation of state for these calculations. Additionally, the molar heat capacity at constant volume, C_V, is essential, with the molar heat capacity at constant pressure given by C_P = C_V + R, assuming constant specific heats. As entropy is a state function, its change between two states is independent of the path taken. For an ideal gas, the general expression for the entropy change is \Delta S = n C_V \ln \left( \frac{T_f}{T_i} \right) + n R \ln \left( \frac{V_f}{V_i} \right), where subscripts i and f denote initial and final states, respectively. This formula arises from integrating the differential form dS = \frac{n C_V dT}{T} + \frac{n R dV}{V} for reversible processes.

For an isothermal expansion or compression of an ideal gas, where T_f = T_i, the entropy change simplifies to \Delta S = n R \ln \left( \frac{V_f}{V_i} \right). In a reversible isothermal process, heat is absorbed or released to maintain constant temperature, leading to an entropy increase for expansion (V_f > V_i) and a decrease for compression. The magnitude reflects the dispersal of energy as the gas molecules occupy a larger or smaller volume. In a reversible adiabatic process for an ideal gas, no heat is exchanged (Q = 0), and the process is isentropic, resulting in \Delta S = 0. This occurs because the temperature–volume relation T V^{\gamma - 1} = \text{constant}, where \gamma = C_P / C_V, ensures the terms in the general entropy expression cancel exactly. For isochoric heating or cooling, where volume is constant (V_f = V_i), the entropy change is \Delta S = n C_V \ln \left( \frac{T_f}{T_i} \right), obtained by integrating dS = \frac{n C_V dT}{T} assuming constant C_V. Heating (T_f > T_i) increases entropy as thermal energy is added at varying temperatures, while cooling decreases it. Free expansion of an ideal gas into a vacuum is an irreversible process with no work done and no heat exchanged. For an ideal gas, the temperature remains constant (T_f = T_i), so the system's entropy change is \Delta S_{\text{system}} = n R \ln \left( \frac{V_f}{V_i} \right) > 0, identical to that of a reversible isothermal expansion to the same final volume due to the state-function property. The surroundings undergo no entropy change, yielding \Delta S_{\text{universe}} = \Delta S_{\text{system}} > 0, consistent with the second law of thermodynamics.
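The general formula and its special cases can be wrapped in a single helper function, as in the sketch below (illustrative code for a monatomic ideal gas); the adiabatic case uses T V^{\gamma-1} = const to show that the two terms cancel.

```python
import math

R = 8.314  # J/(mol*K)

def delta_S_ideal_gas(n, Cv, Ti, Tf, Vi, Vf):
    """Delta S = n*Cv*ln(Tf/Ti) + n*R*ln(Vf/Vi) for an ideal gas with constant molar Cv."""
    return n * Cv * math.log(Tf / Ti) + n * R * math.log(Vf / Vi)

n, Cv = 1.0, 1.5 * R                   # one mole of a monatomic ideal gas

# Isothermal doubling of volume: Delta S = n R ln 2 ~ 5.76 J/K
print(delta_S_ideal_gas(n, Cv, 300.0, 300.0, 1.0, 2.0))
# Isochoric heating 300 K -> 600 K: Delta S = n Cv ln 2 ~ 8.64 J/K
print(delta_S_ideal_gas(n, Cv, 300.0, 600.0, 1.0, 1.0))
# Reversible adiabat: with T*V^(gamma-1) constant the two terms cancel and Delta S = 0
gamma = (Cv + R) / Cv
Vf = 2.0
Tf = 300.0 * (1.0 / Vf) ** (gamma - 1.0)
print(round(delta_S_ideal_gas(n, Cv, 300.0, Tf, 1.0, Vf), 10))   # 0.0
```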

Applications in Open Systems

Fundamental Thermodynamic Relation

The fundamental thermodynamic relation integrates the first and second laws of thermodynamics into a differential form that describes the behavior of closed thermodynamic systems without mass exchange. For a simple closed system consisting of a single component undergoing only pressure–volume work, the relation is given by dU = T \, dS - P \, dV, where U denotes the internal energy, T the absolute temperature, S the entropy, P the pressure, and V the volume. This equation establishes entropy as a fundamental state variable alongside energy, volume, and other extensive properties. The derivation arises from combining the first law of thermodynamics, which states that the change in internal energy equals heat added minus work done (dU = \delta Q - \delta W), with the second law's definition of entropy change for reversible processes (dS = \delta Q_\text{rev} / T). For a reversible process in a closed system where the only work is quasi-static expansion against pressure, \delta W = P \, dV and \delta Q_\text{rev} = T \, dS, yielding the relation directly. Clausius introduced the entropy differential in his 1865 paper, providing the foundational link between heat and entropy in reversible transformations.

From this relation, thermodynamic variables emerge as partial derivatives of the internal energy treated as a natural function of entropy and volume: temperature is T = \left( \frac{\partial U}{\partial S} \right)_V and pressure is P = -\left( \frac{\partial U}{\partial V} \right)_S. These definitions ensure the consistency of U(S, V) as an exact differential, reflecting the state function nature of energy. For systems involving changes in particle number, such as in chemical reactions within a closed volume, the relation extends to include chemical work: dU = T \, dS - P \, dV + \mu \, dN, where \mu is the chemical potential and N the number of particles or moles. This generalization, crucial for multicomponent systems, was developed by J. Willard Gibbs in his analysis of heterogeneous equilibria. The fundamental relation serves as the basis for deriving other thermodynamic potentials via Legendre transformations. The Helmholtz free energy A = U - T S has differential dA = -S \, dT - P \, dV, useful for constant-temperature processes. The Gibbs free energy G = U - T S + P V yields dG = -S \, dT + V \, dP + \mu \, dN, applicable to constant-temperature and pressure conditions, such as in phase equilibria. These potentials were systematically introduced by Gibbs to simplify the study of complex systems.
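The partial-derivative identities can be checked symbolically for a monatomic ideal gas whose fundamental relation U(S, V, N) is obtained by inverting the Sackur–Tetrode equation; the sketch below uses sympy and is an illustrative verification under that assumption, not a derivation from the text.

```python
import sympy as sp

S, V, N, k, m, h = sp.symbols('S V N k m h', positive=True)

# U(S, V, N) for a monatomic ideal gas, from inverting the Sackur-Tetrode equation.
U = (3 * N * h**2 / (4 * sp.pi * m)) * (N / V) ** sp.Rational(2, 3) \
    * sp.exp(2 * S / (3 * N * k) - sp.Rational(5, 3))

T = sp.diff(U, S)          # T = (dU/dS)_V
P = -sp.diff(U, V)         # P = -(dU/dV)_S

print(sp.simplify(U - sp.Rational(3, 2) * N * k * T))   # 0  ->  U = (3/2) N k T
print(sp.simplify(P * V - N * k * T))                   # 0  ->  P V = N k T
```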

Entropy Balance for Open Systems

In open systems, where mass and energy can cross the system boundary, the entropy balance accounts for changes due to heat transfer, mass flow carrying entropy, and internal generation from irreversibilities. Specific entropy, defined as s = S / m where S is the total entropy and m is the mass, serves as a key intensive property for analyzing flows in such systems. The general rate form of the entropy balance for an open system (control volume) is given by \frac{dS_\text{CV}}{dt} = \sum \left( \frac{\dot{Q}}{T} \right)_b + \sum \dot{m}_\text{in} s_\text{in} - \sum \dot{m}_\text{out} s_\text{out} + \dot{\sigma}, where \frac{dS_\text{CV}}{dt} is the rate of change of entropy within the control volume, \sum \left( \frac{\dot{Q}}{T} \right)_b represents the net rate of entropy transfer by heat across the boundary at temperature T_b, the mass terms account for entropy convected by inlet and outlet streams, and \dot{\sigma} \geq 0 is the rate of entropy production due to irreversibilities, consistent with the second law of thermodynamics. This equation extends the principles from closed systems by incorporating convective entropy transport.

For steady-state conditions, where the control-volume properties do not change with time, \frac{dS_\text{CV}}{dt} = 0, simplifying the balance to 0 = \sum \left( \frac{\dot{Q}}{T} \right)_b + \sum \dot{m}_\text{in} s_\text{in} - \sum \dot{m}_\text{out} s_\text{out} + \dot{\sigma}. This form is widely applied in analyses of devices like turbines, compressors, and heat exchangers, where inlet and outlet flows are steady, allowing quantification of entropy generation from inefficiencies such as friction or heat transfer across finite temperature differences. In availability analysis, also known as exergy analysis, the entropy production term \dot{\sigma} directly relates to the destruction of available work, or exergy destruction, quantified as \dot{X}_\text{destroyed} = T_0 \dot{\sigma}, where T_0 is the ambient temperature. This connection highlights how irreversibilities in open systems reduce the potential for useful work output, guiding optimizations in processes like power generation and refrigeration cycles.
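As a worked sketch of the steady-state balance (all numbers illustrative), consider a steadily operating air compressor modeled with ideal-gas air properties and a known heat loss; the entropy production rate and the corresponding exergy destruction follow directly.

```python
import math

# Steady-state entropy balance for a control volume:
#   0 = Q_dot/T_b + m_dot*(s_in - s_out) + sigma_dot
#   => sigma_dot = m_dot*(s_out - s_in) - Q_dot/T_b
cp, R_air = 1005.0, 287.0              # J/(kg*K), ideal-gas air properties
m_dot = 0.5                            # kg/s
T1, P1 = 300.0, 100e3                  # inlet state
T2, P2 = 450.0, 400e3                  # outlet state
Q_dot, T_b = -20e3, 300.0              # 20 kW heat loss through a boundary at 300 K

ds = cp * math.log(T2 / T1) - R_air * math.log(P2 / P1)   # s_out - s_in for an ideal gas
sigma_dot = m_dot * ds - Q_dot / T_b

print(ds)                  # ~9.6 J/(kg*K)
print(sigma_dot)           # ~71.5 W/K, positive as required
print(300.0 * sigma_dot)   # exergy destruction rate ~21.4 kW, taking T0 = 300 K
```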

Specialized Contexts

Chemical Thermodynamics and Phase Transitions

In chemical thermodynamics, entropy plays a central role in determining the spontaneity and equilibrium conditions of reactions and phase changes, particularly through its contribution to the Gibbs free energy, ΔG = ΔH - TΔS. For phase transitions and chemical processes at constant temperature and pressure, a positive change in entropy (ΔS > 0) often favors spontaneity, reflecting increased molecular disorder or dispersal of energy. Absolute entropies, which provide a reference for these calculations, are grounded in the third law of thermodynamics, stating that the entropy of a perfect crystalline substance approaches zero as temperature approaches absolute zero (0 K). This enables the determination of standard molar entropies (S°) for substances, typically measured in J/mol·K, by integrating heat capacity data from 0 to 298 K and accounting for phase transitions.

During phase transitions at equilibrium, such as melting or vaporization, the process is reversible, and the entropy change is given by ΔS = ΔH / T, where ΔH is the enthalpy of transition and T is the transition temperature in kelvins. For melting (fusion), the entropy of fusion is ΔS_fus = ΔH_fus / T_m; for example, water's fusion at 273 K yields ΔS_fus ≈ 22 J/mol·K, reflecting the increased disorder as solid becomes liquid. Similarly, for vaporization, ΔS_vap = ΔH_vap / T_b; water's vaporization at 373 K gives ΔS_vap ≈ 109 J/mol·K, a larger value due to the significant increase in molecular freedom in the gas phase compared to the liquid. These values are positive, as phase changes from more ordered to less ordered states increase entropy, and Trouton's rule approximates ΔS_vap ≈ 85–90 J/mol·K for many non-associated liquids at their boiling points.

For chemical reactions, the standard entropy change (ΔS°_rxn) is calculated as the sum over stoichiometric coefficients ν_i of the standard molar entropies of products minus those of reactants: ΔS°_rxn = ∑ ν_i S°_i (products) - ∑ ν_i S°_i (reactants). Standard entropies are tabulated for elements and compounds in their standard states at 298 K and 1 bar, derived from calorimetric measurements and the third law. For instance, the reaction 2H_2(g) + O_2(g) → 2H_2O(l) has ΔS°_rxn ≈ -327 J/mol·K, negative due to the decrease in gaseous moles and formation of a more ordered liquid. This quantity helps predict reaction favorability; reactions producing gases or dissolving solids often have positive ΔS°_rxn.

In mixtures and solutions, entropy increases upon mixing due to the greater number of microstates available to the dispersed components. For ideal solutions, the entropy of mixing is ΔS_mix = -nR ∑ x_i ln x_i, where n is the total number of moles, R is the gas constant, and x_i are mole fractions; this expression arises from the combinatorial increase in configurations and is always positive for mixing distinct species. For a binary mixture with equal moles (x_1 = x_2 = 0.5), ΔS_mix = nR ln 2 ≈ 5.76 n J/K, illustrating the entropic drive for mixing in non-interacting systems. This mixing entropy is crucial in alloy formation, polymer blends, and solution chemistry, where it opposes phase separation unless enthalpic effects dominate.

Multiphase systems at equilibrium maximize total entropy subject to constraints, as described by the Gibbs phase rule: F = C - P + 2, where F is the number of degrees of freedom, C the number of components, and P the number of phases. This rule determines the variance in temperature, pressure, and composition for phase coexistence, such as in a one-component system (C = 1) with two phases (P = 2), where F = 1, fixing the transition along a univariant curve like the vapor pressure line. Entropy's role emerges in the equality of chemical potentials across phases, ensuring minimum free energy and maximum total entropy for the system and its surroundings; for example, at a eutectic point, three-phase coexistence is invariant once pressure is fixed, corresponding to an entropy-balanced equilibrium at the eutectic temperature. Phase diagrams thus map regions of entropy-driven stability, guiding applications in materials science and purification processes.
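The transition and mixing formulas above lend themselves to quick numerical checks; the sketch below reproduces the quoted values for water and for an equimolar binary mixture (the enthalpy inputs are approximate textbook figures).

```python
import math

R = 8.314  # J/(mol*K)

# Entropy of transition from Delta_S = Delta_H / T, using approximate molar enthalpies
# for water: 6.01 kJ/mol (fusion, 273.15 K) and 40.7 kJ/mol (vaporization, 373.15 K).
dS_fus = 6010.0 / 273.15      # ~22 J/(mol*K)
dS_vap = 40700.0 / 373.15     # ~109 J/(mol*K); above Trouton's ~85-90 J/(mol*K), water being associated

# Ideal entropy of mixing: Delta_S_mix = -n R sum_i x_i ln x_i
def mixing_entropy(n_total, mole_fractions):
    return -n_total * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

print(dS_fus, dS_vap)
print(mixing_entropy(1.0, [0.5, 0.5]))   # R ln 2 ~ 5.76 J/K per mole of mixture
```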

Quantum and Information Entropy

In quantum mechanics, the von Neumann entropy serves as the fundamental measure of uncertainty or mixedness in the state of a quantum system, generalizing the classical concept of entropy to quantum states. For a quantum system described by a density matrix \rho, the von Neumann entropy is defined as S(\rho) = -k \operatorname{Tr}(\rho \ln \rho), where k is Boltzmann's constant and \operatorname{Tr} denotes the trace. This quantity was introduced by John von Neumann in his foundational work on quantum statistical mechanics, providing a basis-independent way to quantify the information content or disorder in quantum states. Unlike classical entropy, it accounts for both classical probabilities and quantum superpositions, vanishing for pure states (\rho^2 = \rho) and reaching a maximum for completely mixed states. In the classical limit, where the quantum system reduces to a probabilistic mixture of orthogonal states with probabilities p_i, the von Neumann entropy recovers the Shannon entropy from information theory: S(\rho) = -k \sum_i p_i \ln p_i, or equivalently in bits, H = -\sum_i p_i \log_2 p_i. This equivalence highlights the deep connection between thermodynamic entropy, statistical mechanics, and information measures, as Shannon's formulation quantifies the average uncertainty in a message source. The Shannon entropy, introduced in 1948, underpins modern communication theory and data compression, while its quantum analog extends these principles to scenarios involving superposition and entanglement.

A key application of the von Neumann entropy lies in quantum information processing, where it quantifies entanglement in bipartite systems. The entanglement entropy of a subsystem is the von Neumann entropy of its reduced density matrix, serving as a measure of quantum correlations that cannot be captured by classical means; for a maximally entangled state of two qubits, it equals \ln 2. This metric is crucial for assessing quantum resources in tasks like quantum teleportation and dense coding, and it bounds the fidelity of quantum error-correcting codes. In many-body physics, entanglement entropy reveals scaling laws in critical systems, such as the area-law behavior in gapped phases. In black hole thermodynamics, the Bekenstein–Hawking formula assigns an entropy to black holes proportional to the area A of their event horizon: S = \frac{k c^3 A}{4 \hbar G}, linking gravitational phenomena to thermodynamic principles. Proposed by Jacob Bekenstein in 1973 and supported by Stephen Hawking's 1975 demonstration of black hole radiation, this entropy suggests black holes store information at the scale of one bit per Planck area, resolving puzzles about their thermodynamic behavior. Insights from the holographic principle, which posits that the degrees of freedom in a volume of space are encoded on its boundary, further interpret this entropy as arising from entanglement across the horizon, with dualities like AdS/CFT providing microscopic realizations.

Recent advances in quantum thermodynamics, particularly for open quantum systems, have extended entropy concepts to non-equilibrium dynamics governed by the Lindblad master equation, \dot{\rho} = -i[H, \rho] + \sum_k \left( L_k \rho L_k^\dagger - \frac{1}{2} \{ L_k^\dagger L_k, \rho \} \right), where H is the Hamiltonian and L_k are jump operators modeling coupling to the environment. Entropy production terms in this framework, \dot{S} = -\operatorname{Tr}(\dot{\rho} \ln \rho), decompose into reversible and irreversible contributions, enabling fluctuation theorems that bound work extraction and heat dissipation in quantum engines. Developments in the 2020s, including universal Lindbladian structures for weakly coupled baths, have clarified how quantum coherence influences entropy generation, with applications to nanoscale devices and quantum batteries. These insights unify closed-system reversibility with open-system irreversibility, advancing resource theories in quantum thermodynamics.
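To make the entanglement-entropy statement concrete, the sketch below (illustrative numpy code, working in nats with k = 1) builds the Bell state |Φ⁺⟩, traces out one qubit, and recovers an entanglement entropy of \ln 2 for the reduced state.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), from the eigenvalues of rho (in nats, with k = 1)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Bell state |phi+> = (|00> + |11>)/sqrt(2), a pure state of two qubits.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)                              # global density matrix (pure)

# Reduced density matrix of qubit A: partial trace over qubit B.
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(von_neumann_entropy(rho))     # ~0      (pure global state)
print(von_neumann_entropy(rho_A))   # ~0.693  = ln 2, maximal for a single qubit
```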

Interpretations and Broader Implications

Order, Disorder, and Energy Dispersal

One common interpretive analogy for entropy portrays it as a measure of microscopic disorder or randomness within a system. This perspective, originating from Ludwig Boltzmann's statistical mechanics, equates higher entropy with a greater number of possible microscopic configurations, or microstates, that correspond to the observed macroscopic state. For instance, when gas molecules confined to one half of a container are allowed to expand freely into the full volume, the entropy increases because the molecules can occupy far more positional arrangements, reflecting increased randomness at the molecular level. This interpretation underscores how entropy quantifies the improbability of ordered states in large systems, driven by the vast multiplicity of disordered configurations.

An alternative analogy, proposed by physical chemist Frank Lambert, frames entropy in terms of energy dispersal rather than disorder. In this view, entropy measures the extent to which a system's energy becomes spread out over available quantum states or spatial regions at a given temperature. For example, during the spontaneous cooling of a hot metal block in a cooler environment, the block's concentrated thermal energy disperses into the surroundings, raising the total entropy as the energy loses its capacity to perform useful work. This dispersal perspective emphasizes the second law's dictate that processes naturally proceed toward states where energy is more uniformly distributed, without invoking subjective notions of messiness.

Despite their pedagogical value, both analogies face critiques for oversimplifying entropy's precise thermodynamic meaning. The disorder interpretation, in particular, is vague and subjective—"disorder" lacks a universal definition and fails to apply universally, as high-entropy states do not always appear chaotic. For instance, a perfect crystal at absolute zero has zero entropy per the third law of thermodynamics, embodying perfect order with no accessible microstates beyond the ground state, yet as temperature rises, entropy increases due to vibrational excitations while the macroscopic structure remains highly ordered. Similarly, equilibrated mixtures like oil and vinegar can reach higher entropy through phase separation, appearing more "ordered" than a uniform but lower-entropy emulsion. These examples illustrate that entropy tracks accessible states objectively, not perceptual tidiness.

In non-equilibrium thermodynamics, recent analyses further nuance these analogies by showing how local order can emerge transiently despite global entropy increase. In open systems far from equilibrium, such as dissipative structures, subsystems may exhibit decreased local entropy (increased organization) through energy flows, while the total entropy of the system plus environment rises irreversibly. This reconciles apparent violations of the second law in self-organizing phenomena, like convection patterns in fluids, with the overarching principle of entropy increase. Such interpretations remain illustrative aids, not formal definitions, highlighting entropy's role in directional change without reducing it to everyday metaphors.

Entropy in Biology, Cosmology, and Economics

In biology, living organisms maintain their internal low-entropy states by importing free energy from their environment, as proposed by Erwin Schrödinger in his 1944 book What is Life?, where he described life as feeding on negative entropy—essentially ordered energy from sources like food or sunlight—to counteract the entropy increase produced by metabolic processes. This concept highlights how open systems like cells export entropy as heat and disordered byproducts, allowing ordered structures to persist locally despite the second law's global tendency toward disorder. Ilya Prigogine's theory of dissipative structures further explains this in non-equilibrium systems, where biological entities such as cells or organisms self-organize through energy dissipation, forming complex patterns like metabolic cycles that increase overall entropy while sustaining internal order; Prigogine received the 1977 Nobel Prize in Chemistry for this work on irreversible thermodynamics. Recent studies from 2023–2025 have applied entropy concepts to ecology, suggesting that biodiversity often maximizes entropy production by optimizing resource flows and energy dissipation, as seen in models where diverse assemblages achieve higher steady-state entropy production rates than monocultures, enhancing ecosystem resilience against perturbations. For instance, a 2024 analysis of ant communities used entropy metrics to quantify diversity, revealing that higher diversity correlates with maximized configurational entropy, supporting adaptive organization in variable environments.

In cosmology, the second law of thermodynamics implies that the universe's entropy increases over time, contributing to its accelerated expansion by dispersing energy across expanding volume, as explored in models where entropy gradients drive cosmic dynamics similar to gravitational forces. This progression culminates in the heat death hypothesis, first articulated by William Thomson (Lord Kelvin) in 1852 and extended to the universe as a whole in 1917, positing that the universe will reach maximum entropy in a cold, uniform state where no work can be extracted, potentially trillions of years from now as stars exhaust fuel and black holes evaporate. The universe's current low-entropy state traces back to the Big Bang, where Roger Penrose's Weyl curvature hypothesis proposes that the initial singularity featured near-zero Weyl curvature, corresponding to minimal gravitational entropy and enabling the observed homogeneity and subsequent entropy growth; this low starting entropy, estimated at a fraction of 10^{-10^{123}} relative to the maximum possible, remains a profound puzzle in cosmology. Observations of the cosmic microwave background (CMB) provide a key entropy benchmark, with the photon contribution alone yielding a total entropy of approximately 10^{88} k_B (Boltzmann's constant) across the observable universe, dominated by the relic radiation's blackbody spectrum at 2.725 K.

In economics, entropy serves as a metaphor and quantitative measure for resource degradation and inefficiency, as developed in Nicholas Georgescu-Roegen's bioeconomics in his 1971 book The Entropy Law and the Economic Process, which integrates thermodynamic principles to argue that economic activities irreversibly transform low-entropy natural resources into high-entropy waste, imposing biophysical limits on growth. This perspective critiques standard growth models by emphasizing entropy as a gauge of production waste, where inefficiency arises from incomplete resource utilization, such as in processes that dissipate energy as heat faster than it can be recycled. Georgescu-Roegen's approach influenced ecological economics, highlighting how market "disorder"—like volatile prices or unequal distribution—mirrors entropic dispersal, reducing systemic efficiency and sustainability.

Philosophical and Theoretical Extensions

Adiabatic Accessibility and Maximum Entropy

In thermodynamics, adiabatic accessibility refers to the relation between equilibrium states of a system where one state can be reached from another through an adiabatic process, meaning no heat exchange with the surroundings and only work interactions via a mechanical source. Formally, state Y is adiabatically accessible from state X, denoted X \prec Y, if there exists a process transforming X to Y such that the only net effect on the external world is the displacement of a weight, with all auxiliary devices returning to their initial states. This relation is reflexive and transitive but not necessarily symmetric, with X \sim_A Y indicating adiabatic equivalence (X \prec Y and Y \prec X), and strict accessibility X \prec\prec Y when X \prec Y but not vice versa. Carathéodory's principle underpins this framework by positing that in every neighborhood of any state X, there exist states Z that are not adiabatically accessible from X, implying the existence of irreversible processes. This principle, equivalent to the axiom of irreversibility in axiomatic thermodynamics, ensures that the adiabatic accessibility relation is not trivial and supports the construction of entropy as a state function. Specifically, a state B is adiabatically accessible from state A only if S_B \geq S_A, with entropy S defined such that X \prec\prec Y implies S(X) < S(Y), and with S extensive and additive across subsystems. The second law emerges as the entropy principle: entropy never decreases in adiabatic processes, reaching a maximum at equilibrium where no further irreversible changes are possible.

In non-equilibrium thermodynamics, the maximum entropy production principle (MEPP) extends this by stating that systems evolve such that the local rate of entropy production \sigma is maximized under given constraints, driving the system toward states of higher overall entropy. While MEPP has been applied successfully in various contexts, such as steady-state processes, its status as a fundamental law remains controversial and is not universally accepted. With the entropy production written as \sigma = \sum_k J_k X_k \geq 0, where J_k are fluxes and X_k are thermodynamic forces, MEPP posits that at each stage of evolution, the system selects paths or configurations that locally maximize \sigma, consistent with the second law's requirement that total entropy production is non-negative. This principle, independently proposed in the mid-20th century and later generalized, applies to steady-state processes in open systems, such as heat conduction or chemical reactions, where it predicts transitions like laminar-to-turbulent flow at critical Reynolds numbers around 1200.

Edwin T. Jaynes provided an information-theoretic foundation for maximum entropy in statistical mechanics, interpreting it as a method of inference that selects the probability distribution maximizing the informational entropy S = -k \sum p_i \ln p_i subject to known constraints, ensuring the least biased representation of incomplete knowledge. For a system with fixed average energy \langle E \rangle = \sum p_i E_i, the maximum entropy distribution yields the canonical ensemble p_i = \frac{1}{Z} e^{-\beta E_i}, where Z is the partition function and \beta = 1/(kT), justifying the Maxwell–Boltzmann distribution for classical ideal gases without assuming ergodicity or equal a priori probabilities. This approach unifies thermodynamics and information theory by treating entropy maximization as a universal rule for probabilistic inference, applicable beyond physics to any scenario with partial data.
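Jaynes' construction can be illustrated numerically: given a fixed mean energy over a small set of levels, the entropy-maximizing distribution is the canonical one, and the Lagrange multiplier β can be found by solving the constraint equation. The levels and target mean below are arbitrary, with units chosen so that k = 1.

```python
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # illustrative energy levels
E_target = 1.2                            # imposed constraint on <E>

def mean_energy(beta):
    w = np.exp(-beta * E)
    return float(np.sum(E * w) / np.sum(w))

# Solve <E>(beta) = E_target; the maximum-entropy distribution is then p_i = exp(-beta*E_i)/Z.
beta = brentq(lambda b: mean_energy(b) - E_target, 1e-6, 50.0)
p = np.exp(-beta * E)
p /= p.sum()

S = -np.sum(p * np.log(p))    # the maximized entropy, in units of k
print(beta, p.round(4), S)
```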

Entropy in Philosophy and Theoretical Physics

In philosophy, entropy raises profound questions about the arrow of time, exemplified by Loschmidt's reversibility paradox, which highlights the tension between time-reversible microscopic laws and irreversible macroscopic processes. This paradox arises because reversing particle velocities in a system should, in principle, reverse its evolution, yet entropy appears to increase unidirectionally, suggesting an asymmetry not inherent in the underlying dynamics. Resolutions emphasize that irreversibility stems from geometric constraints on accessibility rather than dynamical asymmetry; stable manifolds contract below quantum resolution, rendering reversed trajectories unobservable and enforcing an apparent forward arrow of time. Such explanations preserve the time symmetry of the microscopic laws while accounting for entropy's role in defining temporal direction without invoking special initial conditions. A related philosophical puzzle involves Boltzmann brains, hypothetical self-aware entities arising from rare thermal fluctuations in high-entropy states, which challenge our low-entropy universe's coherence. In eternal universes or multiverse scenarios, these brains would vastly outnumber ordered observers like humans, implying that typical conscious experiences should be illusory and disconnected from a structured external world, leading to cognitive instability and skepticism toward physical theories. Philosophers argue that externalist views of consciousness—where awareness depends on broader environmental correlations—can "zombify" Boltzmann brains, rendering them non-conscious and shielding cosmological models from this paradox. This debate underscores entropy's implications for cosmology, as low-entropy initial conditions appear improbably fine-tuned to favor stable, reliable observers over fleeting fluctuations.

In theoretical physics, entropy extends to gravitation through gravitational entropy, which quantifies the disorder associated with spacetime geometry, particularly at horizons. Formulated as a Noether charge, it applies to black hole and cosmological horizons, linking entropy to lightsheet boundaries and correcting formalisms for configuration-dependent vector fields. This approach reveals entropy's role in gravitational dynamics, where it emerges from the free gravitational field's microstates, paralleling thermodynamic entropy but tied to geometric invariants. Quantum gravity poses challenges here, as reconciling general relativity's continuous spacetime with quantum discreteness disrupts entropy calculations, potentially violating unitarity or locality in curved spacetimes. Recent proposals derive gravity itself from quantum relative entropy, coupling matter fields to geometry via entropic actions, offering a pathway to unify quantum theory and general relativity by treating gravitational effects as emergent from information measures. The black hole information paradox exemplifies these tensions, pitting Hawking radiation's entropy increase against quantum unitarity, which demands information preservation. In the 2020s, AdS/CFT holography resolved this by introducing quantum extremal surfaces and entanglement islands, recovering the Page curve to show unitary evaporation where radiation entropy initially rises but later decreases, preserving information. This framework, advanced in works like Almheiri et al. (2020), demonstrates how bulk quantum fields' entropy aligns with boundary conformal theories, bridging semiclassical gravity and quantum information.

Progress in quantum gravity from 2024–2025 further illuminates entropy's foundational role, with modular theory deriving semiclassical Einstein equations from quantum relative entropy between vacuum states and excitations near horizons. This generalizes thermodynamic derivations, positing that energy fluxes across horizons—proportional to entropy–area relations—yield the Einstein equations, suggesting entanglement underpins curvature in curved backgrounds. Such advances address unitarity in such spacetimes via direct-sum formulations, avoiding singularities by discretizing transformations. Multiverse low-entropy puzzles intensify these issues, as eternally inflating or fluctuating cosmologies predict our universe's improbably ordered state as a rare fluctuation from maximal entropy, akin to Boltzmann's own fluctuation hypothesis. This implies most observers should inhabit high-entropy voids, rendering low-entropy realms like ours atypical and challenging inflationary models without additional constraints. Debates persist on whether entropy is objective or observer-dependent, particularly in quantum gravity, where von Neumann algebras shift types across perspectives, yielding varying entropies for subregions. Using quantum reference frames, calculations show gravitational entropy diverges significantly between observers in semiclassical regimes, suggesting it reflects relational information rather than an intrinsic property. This observer-relativity aligns with holographic principles but complicates universal definitions, fueling discussions on entropy's status in fundamental physics.

    Equilibrium of Heterogeneous /Substances. We will farther simplify the problem by supposing that the varia- tions of the parts of the energy and ...
  38. [38]
    6.9 The second law of thermodynamics for open systems
    For open systems, the second law of thermodynamics is often written in rate form, considering entropy transfer via heat and mass transfer. The entropy balance ...
  39. [39]
    13.6: The Third Law of Thermodynamics - Chemistry LibreTexts
    May 13, 2023 · The absolute entropy of a pure substance at a given temperature is the sum of all the entropy it would acquire on warming from absolute zero ( ...Learning Objectives · Definition: Third Law of... · The Third Law Lets us...
  40. [40]
    21.3: The Entropy of a Phase Transition can be Calculated from the ...
    Mar 9, 2025 · 21.3: The Entropy of a Phase Transition can be Calculated from the Enthalpy of the Phase Transition ; Vaporization / boiling · Condensation.
  41. [41]
    19.4: Entropy Changes in Chemical Reactions - Chemistry LibreTexts
    Jul 7, 2023 · The entropy change in a chemical reaction is given by the sum of the entropies of the products minus the sum of the entropies of the reactants.Learning Objectives · Standard Molar Entropy, S0 · Example 19 . 4 . 1 : Haber...
  42. [42]
    Entropy of Mixing - Chemistry LibreTexts
    Apr 1, 2023 · This corresponds to increasing the W in the equation S = k B ⁢ ln ⁡ W . The Mixing of Ideal Gases. For our example, we shall again consider a ...
  43. [43]
    13.1: The Gibbs Phase Rule for Multicomponent Systems
    Apr 12, 2022 · The phase rule assumes the system is at thermal and mechanical equilibrium. We shall assume furthermore that in addition to the temperature and ...
  44. [44]
    Quantum entropies - Scholarpedia
    Feb 8, 2022 · Von Neumann entropy is a natural generalization of the classical Shannon entropy. Surprisingly, von Neumann entropy was introduced by von ...von Neumann entropy · Rényi entropy · Min entropy · Tsallis entropy
  45. [45]
    Pedagogical introduction to the entropy of entanglement for ... - arXiv
    Sep 12, 2012 · The most useful measure of a bipartite entanglement is the von Neumann entropy of either of the reduced density matrices. For a particular class ...<|control11|><|separator|>
  46. [46]
    Bekenstein-Hawking entropy - Scholarpedia
    Oct 30, 2008 · The Bekenstein-Hawking entropy is the entropy to be ascribed to any black hole: one quarter of its horizon area expressed in units of the Planck area.Formula for black hole entropy · What is behind the black hole...
  47. [47]
    [0712.3945] Black hole entropy and the holographic principle - arXiv
    Dec 23, 2007 · Here we review the classical properties of black holes and their associated event horizons, as well as the quantum and thermodynamic properties, ...
  48. [48]
    [PDF] BOLTZMANN ENTROPY : PROBABILITY AND INFORMATION - arXiv
    Boltzmann entropy thus provides a measure of the disorder of the distribution of the states over permissible microstates [4]. Let us derive the expression ...
  49. [49]
    [PDF] The Second Law
    The measure of the dispersal of energy or matter used in thermodynamics is called the entropy, S. We shall soon define entropy precisely and quantitatively, but ...
  50. [50]
    [PDF] Entropy as Disorder: History of a Misconception
    Entropy doesn't always mean disorder; it's a precise, measurable quantity, while disorder is a vague matter of opinion.
  51. [51]
    Entropy and the Second Law of Thermodynamics—The ... - MDPI
    Systems in local thermodynamic equilibrium, but global nonequilibrium, are relatively easy to describe, since the thermal and caloric equations of state ...
  52. [52]
    The Driving Force of Natural Selection: Maximizing Entropy ...
    Jun 11, 2024 · Darwinian natural selection achieves sustainable growth in entropy increase rates and order through genetic variation, a sustainability selected ...
  53. [53]
    Entropy‐Based Assessment of Biodiversity, With Application to Ants ...
    Oct 12, 2024 · This study uses entropy measures to assess biodiversity, applying it to ant nest data, and provides interpretation guidelines for observed and ...
  54. [54]
    Cosmology based on entropy - arXiv
    Mar 18, 2024 · Entropy forces naturally lead to slow late-time accelerated expansion of the Universe. Report issue for preceding element. Within this ...
  55. [55]
    [1203.3382] On the Weyl Curvature Hypothesis - arXiv
    Mar 15, 2012 · The Weyl curvature hypothesis of Penrose attempts to explain the high homogeneity and isotropy, and the very low entropy of the early universe.
  56. [56]
    A LARGER ESTIMATE OF THE ENTROPY OF THE UNIVERSE
    Feb 3, 2010 · We make the first tentative estimate of the entropy of weakly interacting massive particle dark matter within the observable universe, Sdm = 10 ...
  57. [57]
    The Entropy Law and the Impossibility of Perpetual Economic Growth
    describes the proportion of an extracted resource which is wasted in the economic process at the given level of inefficiency. ... Georgescu-Roegen, N.
  58. [58]
    [PDF] the physics and mathematics of the second law of thermodynamics
    The order relation is that of adiabatic accessibility, which, physically, is defined by processes whose only net effect on the surroundings is exchange of ...
  59. [59]
  60. [60]
  61. [61]
    Resolution of Loschmidts Paradox via Geometric Constraints on Information Accessibility
    ### Summary of Key Points on Resolution of Loschmidt's Paradox
  62. [62]
    The Reversibility Paradox: Role of the Velocity Reversal Step
    Sep 2, 2023 · The reversibility, or Loschmidt paradox, is a thought experiment in which microscopic reversibility is exploited to generate an apparently ...
  63. [63]
    Boltzmann brain - University of Pittsburgh
    With that conclusion, Boltzmann had found a way to reconcile the time reversibility of micro-physics and the unidirectional increase of thermodynamic entropy.
  64. [64]
    Lessons from the void: What Boltzmann brains teach
    Jun 16, 2024 · This paper develops a strategy for shielding physical theorizing from the threat of Boltzmann brains.<|separator|>
  65. [65]
    Boltzmann's Universe - Sean Carroll
    Jan 14, 2008 · The low-entropy universe we see is a statistical fluctuation around an equilibrium state of maximal entropy.
  66. [66]
    [2306.04172] From entropy to gravitational entropy - arXiv
    Jun 7, 2023 · This entropy, known as gravitational entropy, reflects the degrees of freedom of the free gravitational field.
  67. [67]
    [2509.10921] Gravitational Entropy - arXiv
    Sep 13, 2025 · We formulate the classical gravitational entropy of a horizon as a Noether charge that does not require the notion of a temperature, and which ...
  68. [68]
    Quantum Relative Entropy implies the Semiclassical Einstein Equations
    ### Summary of Progress on Quantum Relative Entropy Implying Semiclassical Einstein Equations in QFT
  69. [69]
    None
    Nothing is retrieved...<|separator|>
  70. [70]
    Resolving the Black Hole Information Paradox: A Review of ...
    May 19, 2025 · This review explores these breakthroughs, focusing on how the Page curve is recovered, indicating that black hole evaporation is, in fact, unitary.
  71. [71]
    Towards a Unitary Formulation of Quantum Field Theory in Curved ...
    Dec 18, 2024 · This paper proposes a direct-sum QFT (DQFT) in black hole space-time, using discrete space-time transformations, to restore unitarity and avoid ...
  72. [72]
    Gravitational entropy is observer-dependent
    ### Summary of Argument: Gravitational Entropy is Observer-Dependent