Boltzmann distribution
The Boltzmann distribution specifies the equilibrium probability p_i that a system occupies a discrete state i with energy \varepsilon_i while in thermal contact with a heat bath at temperature T: p_i = \frac{1}{Z} \exp\left(-\frac{\varepsilon_i}{kT}\right), where k is Boltzmann's constant and Z = \sum_j \exp\left(-\frac{\varepsilon_j}{kT}\right) is the partition function ensuring normalization.[1] This canonical ensemble formulation, derived by maximizing entropy subject to fixed average energy via Lagrange multipliers, underpins the statistical mechanical explanation of thermodynamic properties such as pressure, heat capacity, and phase transitions in classical and quantum systems.[1][2]

Developed by Ludwig Boltzmann in the late 1860s as part of his atomic explanation of the second law of thermodynamics, the distribution connects microscopic state counts to macroscopic observables through the relation S = k \ln W, where W is the number of microstates, resolving the apparent irreversibility of entropy increase via probabilistic fluctuations rather than deterministic laws.[2]

[Figure: Boltzmann distribution graph showing probability density versus energy levels]

The distribution's exponential decay with energy favors low-energy states at low temperatures and approaches uniformity at high temperatures. This behavior enables predictions of ideal gas laws, blackbody radiation spectra (via extensions such as Planck's law), and chemical reaction rates through transition state theory, though quantum indistinguishability requires modifications, namely Fermi-Dirac or Bose-Einstein statistics, for degenerate cases.[3][2]

Fundamental Definition
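The discrete form of the distribution can be sketched numerically. The function below is a minimal illustration (the name, energy levels, and temperatures are illustrative choices, with units chosen so that k = 1); subtracting the minimum energy before exponentiating is a standard numerical-stability trick that cancels in the normalization.

```python
import math

def boltzmann_probabilities(energies, kT):
    """Equilibrium probabilities p_i = exp(-e_i / kT) / Z.

    Shifting energies by their minimum keeps the exponentials
    from underflowing; the shift cancels when dividing by Z.
    """
    e0 = min(energies)
    weights = [math.exp(-(e - e0) / kT) for e in energies]
    Z = sum(weights)  # partition function (up to the common shift)
    return [w / Z for w in weights]

# Four evenly spaced levels in arbitrary units (k = 1):
levels = [0.0, 1.0, 2.0, 3.0]

cold = boltzmann_probabilities(levels, kT=0.1)    # ground state dominates
hot = boltzmann_probabilities(levels, kT=100.0)   # nearly uniform
```

At kT = 0.1 essentially all probability sits in the ground state, while at kT = 100 the four probabilities are close to 1/4 each, matching the low- and high-temperature limits described above.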
Classical Formulation
In classical statistical mechanics, the Boltzmann distribution provides the equilibrium probability distribution for a system in the canonical ensemble, where the system exchanges energy with a large heat reservoir at fixed temperature T. The probability of the system occupying a microstate with energy \varepsilon_i is p_i = \frac{1}{Z} \exp\left(-\frac{\varepsilon_i}{kT}\right), where k is Boltzmann's constant and Z = \sum_j \exp\left(-\frac{\varepsilon_j}{kT}\right) is the partition function ensuring normalization \sum_i p_i = 1.[4] This form emerges from maximizing the entropy subject to constraints on average energy and normalization in the thermodynamic limit.[4]

For continuous phase space in classical systems of N indistinguishable particles, the probability density is \rho(\mathbf{q}, \mathbf{p}) = \frac{1}{Z} \exp\left(-\frac{H(\mathbf{q}, \mathbf{p})}{kT}\right), with the partition function Z = \frac{1}{N! h^{3N}} \int \exp\left(-\frac{H(\mathbf{q}, \mathbf{p})}{kT}\right) d^{3N}q \, d^{3N}p, where H is the Hamiltonian, h is Planck's constant setting the phase-space measure, and the 1/N! factor accounts for particle indistinguishability, avoiding the Gibbs paradox.[5] This formulation assumes weak interactions or ideal conditions under which the system explores phase space ergodically, so that time averages equal ensemble averages.[6]

The distribution implies that lower-energy states are exponentially more probable than higher-energy ones, with the Boltzmann factor \exp\left(-\frac{\Delta \varepsilon}{kT}\right) quantifying the relative likelihood for an energy difference \Delta \varepsilon. In applications such as ideal gases, integrating over momenta yields the Maxwell-Boltzmann speed distribution f(v) \propto v^2 \exp\left(-\frac{m v^2}{2kT}\right), describing the spread of particle speeds.[7] This classical limit holds when quantum effects are negligible, such as at high temperatures or low densities where occupation numbers per state are much less than unity.[8]
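The Maxwell-Boltzmann speed distribution mentioned above can be checked numerically. The sketch below (units chosen so m = 1 and kT = 1; the function name and grid parameters are illustrative) writes out the fully normalized form f(v) = 4\pi \left(\frac{m}{2\pi kT}\right)^{3/2} v^2 \exp\left(-\frac{m v^2}{2kT}\right) and verifies by simple numerical integration that it integrates to 1 and that the mean speed matches the analytic value \sqrt{8kT/\pi m}.

```python
import math

def maxwell_boltzmann_speed_pdf(v, m, kT):
    """Normalized Maxwell-Boltzmann speed distribution:
    f(v) = 4*pi * (m / (2*pi*kT))**1.5 * v**2 * exp(-m*v**2 / (2*kT))."""
    norm = 4.0 * math.pi * (m / (2.0 * math.pi * kT)) ** 1.5
    return norm * v * v * math.exp(-m * v * v / (2.0 * kT))

# Crude rectangle-rule integration over 0 <= v <= 20 (in units of
# sqrt(kT/m)); the tail beyond v = 20 is negligible.
m, kT = 1.0, 1.0
dv = 1e-3
vs = [i * dv for i in range(20000)]
f = [maxwell_boltzmann_speed_pdf(v, m, kT) for v in vs]

total = sum(f) * dv                              # should be ~1
mean_v = sum(v * fv for v, fv in zip(vs, f)) * dv  # ~ sqrt(8*kT/(pi*m))
```

With these units the analytic mean speed is \sqrt{8/\pi} \approx 1.596, which the numerical integral reproduces to well within the step-size error.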