
Boltzmann's entropy formula

Boltzmann's entropy formula, denoted as S = k \ln W, expresses the thermodynamic entropy S of a macroscopic system in terms of its microscopic configurations, where k is Boltzmann's constant and W represents the number of accessible microstates consistent with the system's macroscopic properties. This formula bridges the microscopic realm of particle configurations with the macroscopic behavior described by classical thermodynamics, quantifying entropy as a measure of disorder or the multiplicity of ways a system can arrange itself at equilibrium. Introduced through Ludwig Boltzmann's foundational work in the late nineteenth century, it underpins the second law of thermodynamics by linking entropy increase to the probabilistic evolution toward the most likely state.

The concept emerged from Boltzmann's efforts to reconcile the irreversible nature of thermodynamic processes with the reversible, time-symmetric equations of classical mechanics. In his 1877 paper, Boltzmann developed a probabilistic framework for entropy, defining a measure related to the logarithm of the "permutability", a combinatorial measure quantifying the number of molecular distributions. Although the form S = k \ln W was later formalized by Max Planck in 1900 to align with quantum considerations, it directly stems from Boltzmann's H-theorem and statistical interpretation, where entropy is proportional to the logarithm of the phase-space volume occupied by the system. This statistical view resolves paradoxes like the apparent violation of reversibility in isolated systems, as entropy growth reflects the overwhelming probability of transitioning to states with higher W.

The formula's significance extends beyond ideal gases to broader applications in physics and information theory, where it inspires analogs like Shannon entropy. For an isolated system, S reaches its maximum at equilibrium, corresponding to the macrostate with the largest W, and for reversible processes involving ideal gases (with appropriate corrections like the 1/N! factor), it matches the classical thermodynamic entropy. In non-equilibrium contexts, extensions like the Boltzmann entropy for probability distributions, S = -k \int f \ln f \, d\Gamma, generalize the idea to continuous distributions over phase space, aiding the study of relaxation dynamics. Despite foundational challenges, such as the Gibbs paradox addressed by indistinguishability corrections, the formula remains a cornerstone of modern statistical mechanics, enabling predictions of heat capacities, phase transitions, and quantum statistics.

Background Concepts

Thermodynamic Entropy

In classical thermodynamics, entropy is defined as a measure of the disorder in a system or the amount of thermal energy that is unavailable for doing useful work. It quantifies the degree to which energy is dispersed and cannot be converted into mechanical work without external intervention. The infinitesimal change in entropy dS for a reversible process is given by dS = \frac{\delta Q_{\text{rev}}}{T}, where \delta Q_{\text{rev}} represents the reversible heat transfer to the system and T is the absolute temperature in kelvin. The concept of entropy was formally introduced by Rudolf Clausius in his 1865 paper, where he coined the term from the Greek word for "transformation" to describe a quantity that captures the directionality of processes. Clausius's formulation established entropy as a state function, independent of the path taken between states, and linked it directly to the second law of thermodynamics, which states that the entropy of an isolated system cannot decrease over time: it either remains constant in reversible processes or increases in irreversible ones. This principle implies that natural processes tend toward states of greater entropy, reflecting the inevitable degradation of energy quality in the universe. In the International System of Units (SI), entropy is measured in joules per kelvin (J/K), reflecting its nature as energy divided by temperature. A classic example of entropy increase occurs in the irreversible flow of heat from a hot body to a colder one: if a quantity of heat Q transfers from a reservoir at T_h to one at T_c (with T_h > T_c), the total entropy change is \Delta S = Q \left( \frac{1}{T_c} - \frac{1}{T_h} \right) > 0, demonstrating how such spontaneous processes enhance overall disorder without violating the second law. This thermodynamic perspective on entropy later found a microscopic explanation in statistical mechanics, as explored in subsequent sections.
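The entropy bookkeeping for this irreversible heat flow is easy to verify numerically. The following Python sketch (illustrative only; the reservoir temperatures and heat quantity are arbitrary example values) evaluates \Delta S = Q(1/T_c - 1/T_h) and confirms it is positive whenever T_h > T_c.

```python
def entropy_change_heat_flow(Q, T_hot, T_cold):
    """Total entropy change when heat Q (J) flows irreversibly
    from a reservoir at T_hot (K) to a reservoir at T_cold (K)."""
    dS_hot = -Q / T_hot    # hot reservoir loses heat
    dS_cold = Q / T_cold   # cold reservoir gains heat
    return dS_hot + dS_cold

# Example: 1000 J flowing from 400 K to 300 K
dS = entropy_change_heat_flow(1000.0, 400.0, 300.0)
print(f"Total entropy change: {dS:.3f} J/K")  # positive, about 0.833 J/K
```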

Microstates and Macrostates

In statistical mechanics, a macrostate refers to the thermodynamic properties of a system, such as its total energy, volume, temperature, and number of particles, which can be measured without resolving individual particle details. These properties characterize the system's equilibrium condition on a macroscopic scale, and multiple underlying configurations can yield the same macrostate. In contrast, a microstate specifies the complete, detailed configuration of all particles in the system, including their exact positions and momenta in phase space. The concept of phase space provides the framework for describing microstates in classical systems, representing a multidimensional space where each axis corresponds to a particle's position or momentum coordinate, resulting in a 6N-dimensional space for N particles (3 position and 3 momentum coordinates per particle). In this continuous space, individual microstates are idealized as points, but for practical counting in statistical mechanics, the space is often partitioned into small, discrete volumes to approximate the number of accessible states, bridging the gap between continuous classical dynamics and discrete probabilistic treatments. This coarse-graining avoids infinities while preserving the underlying continuous nature of the dynamics. The multiplicity, denoted as W or Ω, is the total number of microstates associated with a given macrostate, quantifying the degeneracy or accessibility of that macroscopic condition. For systems of identical particles, such as atoms in a gas, indistinguishability plays a crucial role in counting microstates: since particles cannot be distinguished, permutations of their labels do not produce distinct configurations, requiring a division by N! (where N is the number of particles) to avoid overcounting the effective number of microstates. This correction ensures that the statistical description aligns with the physical reality of indistinguishable constituents, as emphasized in foundational treatments of ideal gases.
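The distinction between microstates and macrostates can be made concrete with a toy model. The sketch below (a minimal illustration, not drawn from the article's sources) enumerates every microstate of a small system of two-state spins and groups them by macrostate, here taken to be the number of "up" spins.

```python
from itertools import product
from collections import Counter

N = 4  # number of two-state spins (small enough to enumerate exhaustively)

# A microstate specifies every spin: one tuple per configuration.
microstates = list(product((0, 1), repeat=N))   # 2**N microstates in total

# A macrostate is a coarse observable: here, the total number of up spins.
multiplicity = Counter(sum(state) for state in microstates)

for n_up in sorted(multiplicity):
    print(f"macrostate n_up={n_up}: W = {multiplicity[n_up]} microstates")
# The middle macrostate (n_up = N/2) has the largest multiplicity W.
```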

Core Formula

Definition

Boltzmann's entropy formula provides the fundamental link between the microscopic structure of a system and its macroscopic thermodynamic properties. It is expressed as S = k_B \ln W, where S denotes the entropy of the system, k_B is Boltzmann's constant with the exact value 1.380649 \times 10^{-23} J/K, \ln is the natural logarithm, and W represents the multiplicity, defined as the number of microstates consistent with a given macrostate. This formula applies specifically to isolated systems in equilibrium, under the assumption that all accessible microstates are equally probable. Microstates and macrostates form the basis for W, where the macrostate specifies observable thermodynamic variables such as energy, volume, and particle number. The formula connects macroscopic entropy, a measure of disorder or unavailable energy in thermodynamics, to the underlying microscopic multiplicity quantified by the number of possible configurations. Regarding units, W is dimensionless as a count of states, and \ln W is also dimensionless, so k_B ensures S has the thermodynamic units of joules per kelvin (J/K), scaling the statistical logarithm to physical entropy.
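Because W is astronomically large for macroscopic systems, it is usually the logarithm \ln W that is carried through calculations rather than W itself. The sketch below (an illustrative helper, with the spin-count example chosen arbitrarily) applies S = k_B \ln W directly from \ln W to avoid numerical overflow.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K (exact SI value)

def boltzmann_entropy_from_log(ln_W):
    """Entropy S = k_B * ln(W), taking ln(W) directly since W itself
    overflows any floating-point type for macroscopic systems."""
    return K_B * ln_W

# Example: N two-state spins with both states accessible has W = 2**N,
# so ln W = N ln 2. For one mole of spins:
N = 6.02214076e23
S = boltzmann_entropy_from_log(N * math.log(2))
print(f"S = {S:.3f} J/K")  # roughly 5.76 J/K
```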

Physical Interpretation

Boltzmann's entropy formula, S = k_B \ln W, where k_B is the Boltzmann constant and W is the number of microstates corresponding to a given macrostate, provides a statistical interpretation of thermodynamic entropy as a measure of the multiplicity or number of possible microscopic configurations consistent with the observed macroscopic properties of the system. This formulation quantifies the degree of disorder or uncertainty in the system, as a larger W indicates more ways to arrange the particles while preserving the macrostate, reflecting greater "surprise" in identifying the exact microstate from macroscopic observations. The use of the natural logarithm in \ln W ensures that entropy is an additive quantity for subsystems, making it an extensive quantity proportional to the system size, which aligns with thermodynamic requirements. For two non-interacting subsystems with multiplicities W_A and W_B, the total multiplicity is W = W_A W_B, so S = k_B \ln (W_A W_B) = k_B \ln W_A + k_B \ln W_B = S_A + S_B, preserving additivity without additional scaling factors. This logarithmic form is unique in satisfying the extensivity condition for composite systems in statistical mechanics. The formula connects directly to the second law of thermodynamics, which states that isolated systems evolve toward equilibrium by increasing their entropy, corresponding to macrostates with the maximum possible multiplicity W. In this view, spontaneous processes favor the most probable macrostates, where the overwhelming number of microstates drives the system irreversibly toward higher entropy, explaining the observed directionality of natural processes without invoking additional principles. A classic example is the free expansion of an ideal gas into a vacuum, where the gas initially confined to volume V expands to 2V without work or heat exchange. Here, the multiplicity W increases exponentially with the accessible volume, as each particle has more positional microstates available, leading to an entropy change \Delta S = N k_B \ln 2 for N particles, illustrating how \ln W captures the irreversible rise in entropy. Conceptually, \ln W measures the logarithm of the number of possibilities, serving as a precursor to Shannon's information entropy in information theory, where it similarly quantifies uncertainty or the average information needed to specify a microstate from the macrostate.
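The additivity argument can be checked numerically for small multiplicities. The following sketch (purely illustrative, with arbitrary subsystem sizes) confirms that combining two independent subsystems multiplies their multiplicities while adding their entropies.

```python
import math

K_B = 1.380649e-23  # J/K

def entropy(W):
    """Boltzmann entropy for a (small, exactly representable) multiplicity W."""
    return K_B * math.log(W)

# Two independent subsystems of spins: W_A = C(10, 5) and W_B = C(20, 10).
W_A = math.comb(10, 5)
W_B = math.comb(20, 10)

S_A, S_B = entropy(W_A), entropy(W_B)
S_total = entropy(W_A * W_B)   # multiplicities multiply for independent systems

# Entropies add, to within floating-point rounding.
assert abs(S_total - (S_A + S_B)) < 1e-30
print(S_A, S_B, S_total)
```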

Historical Context

Boltzmann's Original Formulation

In 1872, Ludwig Boltzmann published his seminal paper "Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen," where he introduced the H-theorem and derived the Maxwell-Boltzmann velocity distribution for gases. The H-theorem demonstrated that a quantity H, defined as the integral of the distribution function times its logarithm, decreases monotonically over time, reaching its minimum at the Maxwellian distribution, thereby providing a mechanical basis for the second law of thermodynamics. Between 1875 and 1877, Boltzmann shifted toward a probabilistic interpretation of entropy in response to challenges like Josef Loschmidt's reversibility paradox, which questioned the irreversibility implied by the second law given time-reversible mechanics. In his 1875 extension of the theory to include external forces and subsequent 1877 papers, he argued that while reversed states are theoretically possible, they are overwhelmingly improbable due to the vast number of molecular configurations. This led to his key insight that entropy is proportional to the logarithm of the "permutability" (Permutabilitätsmaß), a measure of the number of ways microstates can realize a given macrostate. Boltzmann's original formulation appeared in his 1877 memoir "Über die Beziehung zwischen dem zweiten Hauptsatz der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung respektive den Sätzen über das Wärmegleichgewicht," where he expressed the entropy as S = \mathrm{const.} + K \log W, with W (or \Omega) representing the permutability measure and K a proportionality constant (later related to Boltzmann's constant k). Here, the natural logarithm simplified combinatorial calculations by converting products of probabilities into sums, enabling the quantification of state multiplicity in thermal equilibrium. Boltzmann noted that the entropy is identical to this logarithmic measure times a constant, up to an additive constant, directly linking thermodynamic entropy to probabilistic considerations.

Planck's Contributions

In 1900, Max Planck adopted Boltzmann's entropy formula to address discrepancies in blackbody radiation spectra, particularly to reconcile theoretical predictions with experimental data that deviated from Wien's law at longer wavelengths. Facing the ultraviolet catastrophe predicted by classical Rayleigh-Jeans theory, Planck hypothesized that the energy of his material oscillators is quantized in discrete units of ε = hν, where h is a new constant and ν is the frequency. He applied the entropy expression S = k \ln W, where W represents the multiplicity of ways to distribute the total energy among N oscillators, to calculate the average energy per oscillator, yielding a spectral distribution that accurately fit the observed data. Planck refined the formula in his 1901 paper by explicitly deriving the entropy of a single resonator (oscillator) as S = k \left[ \left(1 + \frac{u}{\varepsilon}\right) \ln \left(1 + \frac{u}{\varepsilon}\right) - \frac{u}{\varepsilon} \ln \frac{u}{\varepsilon} \right], where u is the average energy and ε the energy quantum. This discrete approach, building on Boltzmann's combinatorial method, led to the average energy per oscillator u = \frac{\varepsilon}{e^{\varepsilon / kT} - 1}, with k established as a universal constant linking microscopic statistics to macroscopic thermodynamics. By treating energy exchange between radiation and matter as occurring in finite quanta, Planck's application resolved the infinite energy divergence at high frequencies, marking a pivotal shift from classical to quantum physics. The impact of Planck's work was profound, as Boltzmann's entropy formula served as the essential bridge to quantum theory, enabling the derivation of a radiation law that eliminated the ultraviolet catastrophe and laid the groundwork for subsequent developments in quantum mechanics. In his 1914 book The Theory of Heat Radiation, Planck formalized this statistical interpretation, emphasizing the role of discrete energy levels and the constant k in unifying thermodynamic entropy with probabilistic microstates.
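Planck's resonator entropy and the average oscillator energy are thermodynamically consistent: differentiating S(u) with respect to u recovers 1/T. The sketch below (an independent numerical check, with arbitrary frequency and temperature values) verifies this relation by finite differences.

```python
import math

K_B = 1.380649e-23   # J/K
H = 6.62607015e-34   # J*s

def resonator_entropy(u, eps):
    """Planck's 1901 entropy of a single resonator with mean energy u and quantum eps."""
    x = u / eps
    return K_B * ((1 + x) * math.log(1 + x) - x * math.log(x))

def mean_energy(eps, T):
    """Average energy per oscillator, u = eps / (exp(eps/kT) - 1)."""
    return eps / math.expm1(eps / (K_B * T))

nu = 5e14            # optical frequency, Hz (example value)
T = 1500.0           # temperature, K (example value)
eps = H * nu
u = mean_energy(eps, T)

# Numerical derivative dS/du should equal 1/T.
du = u * 1e-6
dS_du = (resonator_entropy(u + du, eps) - resonator_entropy(u - du, eps)) / (2 * du)
print(f"dS/du = {dS_du:.6e} 1/K, 1/T = {1/T:.6e} 1/K")
```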

Mathematical Foundations

Multiplicity Calculation

In statistical mechanics, the multiplicity W, also known as the number of microstates \Omega, quantifies the number of distinct microscopic configurations consistent with a given macrostate, such as fixed energy, volume, and particle number for an isolated system. This calculation assumes that all microstates are equally probable, a postulate rooted in the ergodic hypothesis, which posits that over long times, a system will explore all accessible microstates uniformly. A simple illustration of multiplicity arises in a two-state system, such as a collection of N non-interacting spins, each able to occupy an "up" or "down" state with energies differing by a fixed amount. If N_\uparrow spins are up and N_\downarrow = N - N_\uparrow are down, the number of distinct arrangements is given by the binomial coefficient W = \binom{N}{N_\uparrow} = \frac{N!}{N_\uparrow! \, N_\downarrow!}, which counts the ways to choose positions for the up spins among the total. This combinatorial approach extends to systems with multiple discrete energy levels, where particles are distributed as N_i in level i. For distinguishable particles, the multiplicity becomes the multinomial coefficient W = \frac{N!}{\prod_i N_i!}, representing the number of ways to partition N particles into groups of size N_i for each level. For classical ideal gases, where continuous phase space is involved, the multiplicity derivation shifts to integrating over position and momentum coordinates while accounting for indistinguishability and the quantum discretization of phase space. The total phase space volume for N particles with total energy U and volume V is divided by h^{3N} N!, where h is Planck's constant, to yield a dimensionless count of states: W = \frac{1}{N! h^{3N}} \int d^{3N}q \, d^{3N}p, restricted to the hypersurface of constant energy. Evaluating this integral for non-interacting particles leads to an expression proportional to V^N U^{3N/2} / N!, with the factorial ensuring correct counting for indistinguishable particles. Since direct computation of W is impractical for large N, the Stirling approximation simplifies analysis: \ln N! \approx N \ln N - N, so for the multinomial form, \ln W \approx N \ln N - \sum_i N_i \ln N_i. This logarithmic form facilitates handling the vast scale of multiplicities in thermodynamic limits, where N \gg 1, while preserving the extensive nature of the underlying counts.
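For the two-level spin example, the multiplicity and the corresponding entropy can be computed exactly with a binomial coefficient. The sketch below (illustrative only, with an arbitrary spin count) shows that W, and hence S = k_B \ln W, is maximized when the up and down populations are equal.

```python
import math

K_B = 1.380649e-23  # J/K

def spin_multiplicity(N, n_up):
    """Number of microstates of N two-state spins with n_up spins up."""
    return math.comb(N, n_up)

N = 100
entropies = {n_up: K_B * math.log(spin_multiplicity(N, n_up)) for n_up in range(N + 1)}

# The macrostate of maximum multiplicity (and entropy) is the even split.
n_max = max(entropies, key=entropies.get)
print(f"Entropy is maximal at n_up = {n_max} (expected {N // 2})")
print(f"S_max = {entropies[n_max]:.3e} J/K")
```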

Role of the Natural Logarithm

The natural logarithm appears in Boltzmann's entropy formula due to Ludwig Boltzmann's introduction of it in his 1877 paper, where he employed it to maximize the multiplicity W of microstates in deriving equilibrium distributions using Lagrange multipliers. This approach connected the combinatorial growth of W to thermodynamic entropy through a logarithmic measure, ensuring the formula aligned with the second law of thermodynamics. A primary mathematical reason for using \ln W is its additivity for independent systems, as \ln(W_1 W_2) = \ln W_1 + \ln W_2, which guarantees that entropy is extensive, scaling linearly with system size in accordance with thermodynamic principles. Without the logarithm, the product of multiplicities for subsystems would not sum to the total, violating extensivity; the logarithm transforms this multiplicative structure into an additive one, reflecting how isolated systems combine without correlations. For large systems, where W often involves factorials like W \approx N! for permutations of N particles, \ln(N!) \approx N \ln N - N provides a practical simplification, converting the intractable factorial into a manageable expression that facilitates calculations. This approximation is essential in statistical mechanics, as it yields tractable forms such as \ln W \approx N \ln N - \sum_i N_i \ln N_i for distributions over discrete levels, enabling analytical treatment of macroscopic behaviors without enumerating all microstates. The choice of the base-e natural logarithm stems from its mathematical properties, particularly its differentiability, which is crucial in variational principles for optimizing distributions: the derivative of \ln x is 1/x, naturally leading to probability-normalized terms in derivations. In contrast, other bases like base-2 (yielding bits in information theory) introduce scaling constants but lack the same seamless integration with calculus-based derivations; the base-e form keeps the entropy functional concave and amenable to standard optimization techniques. This logarithmic role also emerges in the differential form, where the infinitesimal change in entropy is dS = k \, d(\ln W) = k \, \frac{dW}{W}, linking entropy increase to relative changes in multiplicity and interpreting it as a measure of probabilistic uncertainty in microstate accessibility.
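The accuracy of the Stirling approximation that makes \ln W tractable can be checked directly. The sketch below (a simple numerical comparison, not tied to any particular source) compares the exact \ln N! computed via the log-gamma function with N \ln N - N for increasing N.

```python
import math

def stirling(N):
    """Leading-order Stirling approximation to ln(N!)."""
    return N * math.log(N) - N

for N in (10, 100, 1_000, 1_000_000):
    exact = math.lgamma(N + 1)       # ln(N!) without computing N! itself
    approx = stirling(N)
    rel_err = (exact - approx) / exact
    print(f"N = {N:>9}: ln N! = {exact:.4e}, Stirling = {approx:.4e}, "
          f"relative error = {rel_err:.2e}")
# The relative error shrinks as N grows, which is why the approximation is
# harmless in the thermodynamic limit N >> 1.
```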

Generalizations

Gibbs Entropy Formula

The Gibbs entropy formula provides a generalization of Boltzmann's formula for statistical ensembles where the probabilities of microstates are not necessarily equal, allowing for a broader treatment of thermodynamic systems in contact with reservoirs or under varying conditions. Introduced by J. Willard Gibbs in his seminal 1902 work Elementary Principles in Statistical Mechanics, the formula expresses the entropy S_G of a system as S_G = -k_B \sum_i p_i \ln p_i, where k_B is Boltzmann's constant, p_i is the probability of the system occupying microstate i, and the sum is taken over all possible microstates. This expression quantifies the average uncertainty or disorder in the system's state, weighted by the probabilities, and forms the foundation for entropy in non-uniform distributions. The formula emerges from the framework of statistical ensembles defined by Gibbs, where probabilities are determined by auxiliary conditions such as fixed energy or temperature. In the microcanonical ensemble, which assumes an isolated system with fixed energy and uniform probabilities p_i = 1/W (with W the total number of accessible microstates), the Gibbs formula reduces to Boltzmann's entropy S = k_B \ln W, since \sum_i p_i \ln p_i = \ln(1/W) = -\ln W, yielding S_G = k_B \ln W. In contrast, the canonical ensemble, applicable to systems in thermal contact with a heat bath at temperature T, assigns probabilities p_i = e^{-E_i / k_B T}/Z (with E_i the energy of microstate i and Z the partition function for normalization), enabling the control of temperature through the exponential weighting. These ensembles highlight how the Gibbs formula accommodates different physical constraints while preserving thermodynamic consistency. A key derivation of the Gibbs formula and associated probabilities draws from the principle of maximum entropy, which posits that the most unbiased distribution consistent with known constraints maximizes the entropy expression. Under the constraints of probability normalization \sum_i p_i = 1 and fixed average energy \sum_i p_i E_i = \langle E \rangle, Lagrange multipliers yield the canonical distribution p_i = \frac{1}{Z} e^{-\beta E_i}, where \beta = 1 / k_B T and Z = \sum_i e^{-\beta E_i}; substituting this into the entropy formula confirms its role in deriving equilibrium distributions. This information-theoretic perspective, later formalized by E. T. Jaynes, underscores the formula's robustness in inferring probabilities from incomplete information.
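The reduction of the Gibbs formula to Boltzmann's, and its canonical-ensemble form, are easy to check numerically. The sketch below (illustrative, with an arbitrary three-level energy spectrum) evaluates S_G for both the uniform and the canonical distributions.

```python
import math

K_B = 1.380649e-23  # J/K

def gibbs_entropy(probs):
    """S_G = -k_B * sum p_i ln p_i over microstate probabilities."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Uniform (microcanonical) case: p_i = 1/W reduces to k_B ln W.
W = 6
uniform = [1.0 / W] * W
assert abs(gibbs_entropy(uniform) - K_B * math.log(W)) < 1e-30

# Canonical case: p_i = exp(-E_i / k_B T) / Z for an example spectrum.
T = 300.0
energies = [0.0, 1e-21, 2e-21]          # example microstate energies, J
weights = [math.exp(-E / (K_B * T)) for E in energies]
Z = sum(weights)
canonical = [w / Z for w in weights]
print(f"Canonical Gibbs entropy: {gibbs_entropy(canonical):.3e} J/K")
```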

Extensions to Probability Distributions

The Gibbs entropy formula for discrete probability distributions over microstates extends to continuous probability densities in classical statistical mechanics, particularly for systems in phase space where positions and momenta are treated as continuous variables. For a normalized probability density p(\mathbf{x}), the entropy takes the form S = -k_B \int p(\mathbf{x}) \ln p(\mathbf{x}) \, d\mathbf{x}, where k_B is Boltzmann's constant and the integral is over the phase space volume. This expression arises naturally in the continuum limit for large systems, providing a measure of uncertainty or disorder in continuous configurations, and it underpins derivations of thermodynamic properties in classical fluids and gases. In quantum mechanics, the analogous extension is the von Neumann entropy, which applies to mixed states described by a density operator \rho. The formula is S = -k_B \operatorname{Tr}(\rho \ln \rho), where \operatorname{Tr} denotes the trace over the Hilbert space. This quantum entropy reduces to the classical Gibbs form in the semiclassical limit and quantifies entanglement and information loss in quantum systems, as originally formulated by John von Neumann to reconcile quantum mechanics with thermodynamics. A key link to information theory emerges through the Shannon entropy H = -\sum_i p_i \log_2 p_i, which measures uncertainty in discrete distributions using base-2 logarithms for bits of information; the physical entropy relates via S = k_B \ln 2 \cdot H, bridging thermodynamic disorder to informational content and enabling applications in communication and compression inspired by statistical mechanics. Complementing this, the Kullback-Leibler divergence D(p \| q) = \sum_i p_i \ln (p_i / q_i) serves as a relative entropy measure between distributions p and q, playing a central role in statistical physics for assessing deviations from equilibrium and optimizing inference in non-equilibrium processes. In modern contexts, such as machine learning, these entropy extensions inform uncertainty quantification in probabilistic models, where entropy-based metrics evaluate predictive confidence and guide model optimization for tasks like molecular simulations and generative modeling.
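The discrete Shannon entropy and the Kullback-Leibler divergence mentioned above are straightforward to compute. The sketch below (a generic illustration using arbitrary example distributions) evaluates both and converts the Shannon entropy in bits to a physical entropy via S = k_B \ln 2 \cdot H.

```python
import math

K_B = 1.380649e-23  # J/K

def shannon_entropy_bits(p):
    """H = -sum p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """D(p || q) = sum p_i ln(p_i / q_i), the relative entropy in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]          # example distribution
q = [1/3, 1/3, 1/3]            # reference (uniform) distribution

H = shannon_entropy_bits(p)
print(f"Shannon entropy: {H:.4f} bits")
print(f"Physical entropy: {K_B * math.log(2) * H:.3e} J/K")
print(f"KL divergence D(p||q): {kl_divergence(p, q):.4f} nats")  # >= 0, zero iff p == q
```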

Applications and Limitations

Use in Ideal Gases

In the context of classical ideal gases, where particles are non-interacting point masses obeying Maxwell-Boltzmann statistics, Boltzmann's entropy formula S = k_B \ln W provides a direct link between microscopic multiplicity W and macroscopic thermodynamic entropy, enabling explicit calculations for systems like monatomic gases. For such gases, the multiplicity W arises from the phase space volume available to the particles, leading to an absolute entropy expression known as the Sackur-Tetrode equation, derived in 1911-1912 by Otto Sackur and Hugo Tetrode through combinatorial counting of quantum-corrected classical states. This equation for the entropy S of N indistinguishable monatomic particles of mass m in volume V with total energy E is S = N k_B \left[ \ln \left( \frac{V}{N} \left( \frac{4 \pi m E}{3 N h^2} \right)^{3/2} \right) + \frac{5}{2} \right], where k_B is Boltzmann's constant and h is Planck's constant, incorporating a semiclassical correction to resolve the Gibbs paradox and ensure extensivity. The logarithmic term reflects the dominant contribution from positional and momentum degrees of freedom, with the +5/2 arising from Stirling's approximation in the multiplicity and quantum phase space discretization.

A key application is calculating entropy changes in reversible processes, such as isothermal expansion of an ideal gas at temperature T. Here, the multiplicity scales with volume as W \propto V^N, so the entropy change is \Delta S = N k_B \ln (V_f / V_i), where V_f and V_i are the final and initial volumes, respectively; this follows directly from differentiating S = k_B \ln W while holding energy fixed, matching the thermodynamic result from dS = \delta Q_{\rm rev}/T = (P \, dV)/T using the ideal gas law. For example, expanding one mole of gas from 1 L to 2 L at 300 K yields \Delta S \approx 5.76 \, \rm J/K, illustrating the entropy increase in irreversible free expansion, where \Delta S > 0 despite no heat exchange.

In equilibrium, the Maxwell-Boltzmann velocity distribution emerges as the distribution that maximizes W subject to fixed total energy and particle number, ensuring the system occupies the macrostate of highest multiplicity and thus highest entropy. This maximization, via Lagrange multipliers on the constraint \sum n_i \epsilon_i = E, yields the exponential form n_i \propto g_i e^{-\epsilon_i / k_B T}, where g_i is the degeneracy, linking statistical weights to the temperature.

The formula also explains thermodynamic properties like heat capacity. Differentiating the Sackur-Tetrode equation with respect to temperature gives the molar heat capacity at constant volume C_V = (3/2) R for monatomic gases, where R = N_A k_B, as the energy term contributes (3/2) N k_B \ln T to S, so T (\partial S / \partial T)_V = C_V. Regarding the third law (Nernst theorem), the classical expression fails to approach zero as T \to 0, and the formula predicts C_V \to 0 at low temperatures only through quantum corrections, highlighting its utility in approximating behaviors near but not at absolute zero.

In modern computational contexts, Boltzmann's entropy is employed in simulations of ideal gases to track nonequilibrium evolution, such as entropy growth during free expansion, where trajectory sampling estimates W(t) and verifies that S(t) approaches the thermodynamic value for large N. These simulations, using hard-sphere potentials, confirm the formula's predictions for isolated systems evolving toward equilibrium.
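As a concrete check on the Sackur-Tetrode equation, the sketch below (an illustrative calculation, with argon chosen as an arbitrary monatomic example) evaluates the molar entropy at roughly standard conditions and the \Delta S of an isothermal volume doubling; the argon result can be compared against tabulated standard entropies of about 155 J/(mol·K).

```python
import math

K_B = 1.380649e-23       # J/K
H = 6.62607015e-34       # J*s
N_A = 6.02214076e23      # 1/mol
R = N_A * K_B            # gas constant, J/(mol*K)

def sackur_tetrode(N, V, T, m):
    """Entropy (J/K) of N monatomic ideal-gas atoms of mass m (kg) in volume V (m^3) at T (K).
    Uses E = (3/2) N k_B T in the standard Sackur-Tetrode form."""
    lam = H / math.sqrt(2 * math.pi * m * K_B * T)   # thermal de Broglie wavelength
    return N * K_B * (math.log(V / (N * lam**3)) + 2.5)

# One mole of argon at 298.15 K and 1 bar.
T, P = 298.15, 1.0e5
V = N_A * K_B * T / P                 # ideal-gas molar volume, m^3
m_ar = 39.948 * 1.66053907e-27        # argon atomic mass, kg

S_molar = sackur_tetrode(N_A, V, T, m_ar)
print(f"Molar entropy of argon: {S_molar:.1f} J/(mol K)")   # close to ~155 J/(mol K)

# Isothermal doubling of the volume: Delta S = n R ln(V_f / V_i).
dS = sackur_tetrode(N_A, 2 * V, T, m_ar) - S_molar
print(f"Delta S for V -> 2V: {dS:.2f} J/(mol K)")            # ~ R ln 2 ≈ 5.76
```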

Exclusions for Interacting Systems

Boltzmann's entropy formula, S = k \ln W, where W represents the number of accessible microstates, fundamentally assumes that the microstates are independent and that particles behave without significant interactions, a condition that holds primarily for dilute gases. This independence implies that the multiplicity W can be straightforwardly calculated as the product of individual particle contributions, ignoring any correlations that might arise from interparticle forces. However, in real systems, such assumptions break down when interactions are present, as correlations between particles reduce the effective phase-space volume and thus the actual W, leading to inaccuracies in entropy estimates. In denser or more structured systems such as liquids, solids, or plasmas, strong interparticle interactions introduce significant correlations that constrain the accessible microstates, causing Boltzmann's formula to fail in capturing the true thermodynamic entropy. For instance, the Gibbs paradox illustrates a related limitation: when mixing two identical gases, treating particles as distinguishable yields an unphysical entropy increase, but resolving it through particle indistinguishability adjusts the multiplicity to maintain extensivity, highlighting how correlations and counting conventions alter the enumeration of states in non-ideal systems. These failures underscore that Boltzmann's approach overestimates entropy in correlated environments by not accounting for the reduced configurational freedom due to forces like van der Waals or Coulomb interactions. To address these shortcomings in interacting systems, alternatives such as the full Gibbs entropy, S = -k \sum p_i \ln p_i, which incorporates probability distributions over microstates, provide a more robust framework for handling correlations by weighting states according to their likelihood rather than assuming uniform accessibility. For weakly interacting cases, mean-field approximations simplify the problem by replacing detailed interactions with an average field experienced by each particle, enabling approximate calculations of entropy and other thermodynamic quantities while capturing collective effects. A prominent example is the Ising model, which describes interacting magnetic spins on a lattice; here, nearest-neighbor correlations make the simple multiplicity W invalid, necessitating the use of the partition function to derive the entropy from the full statistical ensemble. Furthermore, Boltzmann's formula offers no direct coverage of links to the fluctuation-dissipation theorem, which connects microscopic fluctuations to macroscopic dissipative responses in equilibrium interacting systems.
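The Ising-model point can be made concrete by exact enumeration on a very small lattice. The sketch below (a minimal, brute-force illustration with arbitrary coupling and temperature, not a production method) computes the canonical entropy of a short one-dimensional Ising chain from its partition function and contrasts it with the naive interaction-free value k_B \ln 2^N.

```python
import math
from itertools import product

K_B = 1.380649e-23  # J/K

def ising_chain_entropy(N, J, T):
    """Canonical entropy of an open 1D Ising chain of N spins with coupling J (joules)
    at temperature T (K), by brute-force enumeration of all 2**N microstates.
    Uses S = k_B * (ln Z + beta * <E>)."""
    beta = 1.0 / (K_B * T)
    Z = 0.0
    E_weighted = 0.0
    for spins in product((-1, 1), repeat=N):
        E = -J * sum(spins[i] * spins[i + 1] for i in range(N - 1))
        w = math.exp(-beta * E)
        Z += w
        E_weighted += E * w
    E_mean = E_weighted / Z
    return K_B * (math.log(Z) + beta * E_mean)

N, J, T = 10, 1.0e-21, 300.0      # example chain length, coupling, and temperature
S_exact = ising_chain_entropy(N, J, T)
S_naive = K_B * N * math.log(2)   # Boltzmann count ignoring correlations
print(f"Canonical entropy: {S_exact:.3e} J/K")
print(f"Naive k_B ln 2^N:  {S_naive:.3e} J/K (an overestimate when J > 0)")
```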
