
Von Neumann entropy

Von Neumann entropy is a fundamental measure in quantum statistical mechanics and quantum information theory that quantifies the uncertainty or mixedness of a quantum state described by a density operator \rho, serving as the quantum analog of the classical Shannon entropy and of the Gibbs entropy in statistical mechanics. It is formally defined as S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho), where \operatorname{Tr} denotes the trace over the system's Hilbert space, and the logarithm is base-2 to express the entropy in bits (or qubits, in the context of quantum data compression). This quantity arises naturally from extending thermodynamic concepts to quantum mechanics, generalizing the classical entropy to density operators via their eigenvalues.

Introduced by John von Neumann in his seminal 1932 work Mathematische Grundlagen der Quantenmechanik, the entropy was derived via a gedanken experiment involving the thermodynamic mixing of orthogonal quantum states without energy exchange, linking quantum statistical operators to the second law of thermodynamics. Von Neumann's formulation, originally presented in a 1927 paper and elaborated in the book, established S(\rho) = -k \sum_i p_i \log p_i for discrete eigenvalue decompositions \rho = \sum_i p_i |\phi_i\rangle\langle\phi_i|, where k is Boltzmann's constant, though in quantum information contexts it is often normalized to k=1 with base-2 logarithms so that entropy is measured in bits. This definition highlights its origins in bridging quantum mechanics with information theory and thermodynamics, influencing fields from condensed-matter physics to quantum computing.

Key properties of von Neumann entropy include non-negativity (S(\rho) \geq 0), with equality exactly for pure states (\rho = |\psi\rangle\langle\psi|), and a maximum value of \log_2 d on a d-dimensional Hilbert space, achieved by the maximally mixed state \rho = I/d. It is unitarily invariant, meaning S(U\rho U^\dagger) = S(\rho) for any unitary U, and concave, satisfying S(\sum_i \lambda_i \rho_i) \geq \sum_i \lambda_i S(\rho_i) for probabilities \lambda_i \geq 0 summing to 1. Additionally, it obeys subadditivity S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B) for bipartite systems and strong subadditivity S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}), which underpin monogamy-type constraints on quantum correlations.

In quantum information theory, von Neumann entropy is pivotal for tasks such as quantum data compression, where Schumacher's noiseless coding theorem states that n copies of a source state \rho can be compressed to approximately n S(\rho) qubits with vanishing error as n \to \infty. For pure bipartite states |\psi\rangle_{AB}, it measures entanglement via E(|\psi\rangle) = S(\operatorname{Tr}_B(|\psi\rangle\langle\psi|)), enabling protocols such as entanglement concentration that yield n S(\rho_A) ebits from n copies. Beyond information processing, it appears in quantum thermodynamics to describe work extraction from quantum heat engines and in many-body physics and quantum field theory as the entanglement entropy across subsystems, illustrating its broad interdisciplinary impact.

Definition and Basics

Mathematical Definition

The von Neumann entropy S(\rho) of a density operator \rho acting on a finite-dimensional Hilbert space is defined as S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho), where \operatorname{Tr} denotes the trace, the sum of the diagonal elements of the operator in any orthonormal basis, and \log_2 is the base-2 logarithm. This definition quantifies the uncertainty or mixedness of the quantum state described by \rho, a Hermitian, positive semi-definite operator with \operatorname{Tr}(\rho) = 1. Since \rho admits an eigenvalue decomposition \rho = \sum_i \lambda_i |\psi_i\rangle\langle\psi_i|, where \{\lambda_i\} are the non-negative eigenvalues satisfying \sum_i \lambda_i = 1 and \{|\psi_i\rangle\} form an orthonormal basis of eigenvectors, the entropy simplifies to the Shannon-like form S(\rho) = -\sum_i \lambda_i \log_2 \lambda_i. By continuity (x \log_2 x \to 0 as x \to 0^+), the convention 0 \log_2 0 = 0 is adopted to handle zero eigenvalues. This expression highlights that the von Neumann entropy depends only on the spectrum of \rho, making it basis-independent.

For a single qubit, the density operator can be parametrized using the Bloch representation as \rho = \frac{1}{2} (I + \mathbf{r} \cdot \boldsymbol{\sigma}), where I is the 2×2 identity matrix, \boldsymbol{\sigma} = (\sigma_x, \sigma_y, \sigma_z) are the Pauli matrices, and \mathbf{r} is the Bloch vector with \|\mathbf{r}\| \leq 1. The eigenvalues of \rho are \frac{1 \pm \|\mathbf{r}\|}{2}, yielding the explicit formula S(\rho) = -\frac{1 + \|\mathbf{r}\|}{2} \log_2 \left( \frac{1 + \|\mathbf{r}\|}{2} \right) - \frac{1 - \|\mathbf{r}\|}{2} \log_2 \left( \frac{1 - \|\mathbf{r}\|}{2} \right). This reaches its maximum value of 1 bit at \|\mathbf{r}\| = 0 (the maximally mixed state) and its minimum of 0 at \|\mathbf{r}\| = 1 (a pure state).

The units of S(\rho) are bits when the base-2 logarithm is used, analogous to classical information measures; using the natural logarithm \ln instead yields nats, with a conversion factor of \log_2 e \approx 1.4427 bits per nat. The von Neumann entropy is concave, satisfying S\left( \sum_i p_i \rho_i \right) \geq \sum_i p_i S(\rho_i) for probabilities \{p_i\} summing to 1 and density operators \{\rho_i\}, with equality when all the \rho_i with nonzero weight coincide. When \rho is diagonal in some basis, S(\rho) reduces to the classical Shannon entropy of the corresponding probability distribution.
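
To make the qubit formula concrete, the following minimal Python sketch (assuming only NumPy; the helper name vn_entropy is illustrative rather than from any particular library) computes the entropy both from the eigenvalues of \rho and from the closed-form Bloch-vector expression.

```python
import numpy as np

def vn_entropy(rho, base=2.0):
    """Von Neumann entropy from the eigenvalues of a density matrix."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 log 0 = 0
    return float(-np.sum(evals * np.log(evals)) / np.log(base))

# Pauli matrices and a Bloch vector of length 0.6 (a mixed qubit state)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
r = np.array([0.0, 0.0, 0.6])
rho = 0.5 * (np.eye(2) + r[0] * sx + r[1] * sy + r[2] * sz)

# Closed-form expression in terms of the Bloch vector length
p = (1 + np.linalg.norm(r)) / 2
closed_form = -p * np.log2(p) - (1 - p) * np.log2(1 - p)

print(vn_entropy(rho), closed_form)   # both approximately 0.7219 bits
```

Both routes agree because the eigenvalues of the Bloch-parametrized state are exactly (1 \pm \|\mathbf{r}\|)/2.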

Relation to Classical Entropy

The classical Shannon entropy quantifies the uncertainty or information content of a discrete probability distribution p = \{p_i\} over a set of outcomes, defined as H(p) = -\sum_i p_i \log_2 p_i, where the sum runs over all i with p_i > 0 and the logarithm is base 2 for bits of information. This measure, introduced by Claude Shannon in 1948, arises naturally in classical information theory as the average number of yes/no questions needed to determine the outcome of a random variable. Von Neumann entropy S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho) generalizes this concept to quantum density operators \rho, capturing the intrinsic uncertainty in quantum states beyond classical probabilities.

When \rho is diagonal in its eigenbasis with eigenvalues \{p_i\}, corresponding to an incoherent classical mixture of eigenstates, the von Neumann entropy reduces exactly to the Shannon entropy: S(\rho) = H(p). This equivalence holds because the trace operation then simplifies to the classical sum over the eigenvalues, treating the quantum state as a classical probability distribution. For coherent superpositions, however, such as a pure state |\psi\rangle = \sum_i c_i |i\rangle with \rho = |\psi\rangle\langle\psi|, the off-diagonal terms ensure S(\rho) = 0, reflecting zero entropy for a pure quantum state, unlike the positive Shannon entropy that would be assigned to the squared amplitudes |c_i|^2 in a classical reading.

The basis independence of von Neumann entropy distinguishes it from classical entropy, which depends on the chosen partitioning of the sample space. Quantum mechanically, S(\rho) is invariant under unitary transformations, providing a unique measure of a state's mixedness regardless of the basis. In contrast, to recover a Shannon entropy via measurement, one must measure \rho in its eigenbasis, yielding the classical distribution \{p_i\} and thus H(p); measuring in a different basis generally produces a different probability distribution with a Shannon entropy at least as large, because the measurement destroys coherences.

John von Neumann introduced this quantum entropy in 1927, motivated by the need to extend concepts from classical statistical mechanics—such as the entropy of mixtures in thermodynamics—to the formalism of quantum mechanics, thereby quantifying the loss of information in quantum ensembles. His formulation predated Shannon's work by two decades, yet it parallels the measure later developed for communication channels, with von Neumann even advising Shannon to adopt the term "entropy" for its established physical connotations.
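
The contrast between a coherent superposition and its classical counterpart can be seen in a few lines of Python. This is a minimal sketch assuming NumPy; the helper names vn_entropy and shannon are illustrative.

```python
import numpy as np

def vn_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def shannon(p):
    p = np.asarray(p)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Pure superposition |psi> = (|0> + |1>)/sqrt(2)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())

# Classical (dephased) mixture with the same populations
rho_mixed = np.diag(np.abs(psi) ** 2)

print(vn_entropy(rho_pure))            # 0.0  (pure state)
print(shannon(np.abs(psi) ** 2))       # 1.0  (Shannon entropy of the amplitudes)
print(vn_entropy(rho_mixed))           # 1.0  (coincides with the Shannon entropy)
```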

Key Properties

Subadditivity and Additivity

One fundamental property of the von Neumann entropy is subadditivity, which bounds the entropy of a composite quantum system by the sum of the entropies of its subsystems. For a bipartite system with density operator \rho_{AB} acting on the Hilbert space \mathcal{H}_A \otimes \mathcal{H}_B, the subadditivity inequality states S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B), where \rho_A = \mathrm{Tr}_B(\rho_{AB}) and \rho_B = \mathrm{Tr}_A(\rho_{AB}) denote the reduced density operators obtained by tracing out the respective subsystems. This inequality was established early in the development of quantum statistical mechanics.

A concise proof of subadditivity relies on the non-negativity of the quantum relative entropy, defined as S(\rho \| \sigma) = \mathrm{Tr}(\rho \log \rho - \rho \log \sigma) for density operators \rho and \sigma with compatible supports. Taking \sigma_{AB} = \rho_A \otimes \rho_B gives S(\rho_{AB} \| \rho_A \otimes \rho_B) = -S(\rho_{AB}) - \mathrm{Tr}(\rho_{AB} \log (\rho_A \otimes \rho_B)) = -S(\rho_{AB}) + S(\rho_A) + S(\rho_B) \geq 0, which directly implies the desired inequality. Equality in subadditivity holds if and only if \rho_{AB} = \rho_A \otimes \rho_B, corresponding to a product state with no correlations between the subsystems. A special case arises when one reduced state is pure, i.e., S(\rho_A) = 0 (or similarly for \rho_B); in that case \rho_{AB} must factorize as |\psi\rangle\langle\psi|_A \otimes \rho_B for some pure state |\psi\rangle_A, again achieving equality since S(\rho_{AB}) = S(\rho_B).

Subadditivity highlights the role of correlations in reducing the total uncertainty of a joint system relative to its independent parts. The non-negative quantity S(\rho_A) + S(\rho_B) - S(\rho_{AB}), known as the quantum mutual information I(A:B), measures these correlations—encompassing both classical and quantum (entanglement) components—and vanishes precisely for product states. This property underscores how entanglement or classical correlations in \rho_{AB} lead to a joint entropy strictly less than the sum of the marginal entropies, distinguishing quantum entropy from its classical Shannon counterpart.
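
A short numerical sketch makes subadditivity and the mutual information concrete for a maximally entangled two-qubit state. It assumes NumPy only; vn_entropy and reduced are illustrative helper names, not library functions.

```python
import numpy as np

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def reduced(rho_ab, dA, dB, keep):
    """Reduced state of a bipartite density matrix (keep='A' or 'B')."""
    R = rho_ab.reshape(dA, dB, dA, dB)
    return np.trace(R, axis1=1, axis2=3) if keep == 'A' else np.trace(R, axis1=0, axis2=2)

# Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4, dtype=complex)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = np.outer(phi, phi.conj())

S_ab = vn_entropy(rho_ab)                       # 0 (pure joint state)
S_a = vn_entropy(reduced(rho_ab, 2, 2, 'A'))    # 1 bit
S_b = vn_entropy(reduced(rho_ab, 2, 2, 'B'))    # 1 bit

print(S_ab <= S_a + S_b)          # subadditivity holds
print(S_a + S_b - S_ab)           # mutual information I(A:B) = 2 bits
```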

Strong Subadditivity

The strong subadditivity inequality for the von Neumann entropy extends bipartite subadditivity to tripartite quantum states, providing a tighter bound essential for analyzing correlations in multipartite systems. For a density operator \rho_{ABC} on systems A, B, and C, the inequality states that S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}), where S denotes the von Neumann entropy and \rho_{AB} = \mathrm{Tr}_C(\rho_{ABC}), \rho_{BC} = \mathrm{Tr}_A(\rho_{ABC}), and \rho_B = \mathrm{Tr}_{AC}(\rho_{ABC}) are the reduced density operators. An equivalent formulation is S(\rho_{ABC}) + S(\rho_A) \leq S(\rho_{AB}) + S(\rho_{AC}), which highlights the role of the shared, conditioning system in bounding the total entropy. This property holds on any finite-dimensional Hilbert space and underpins many results in quantum information theory by ensuring that the conditional mutual information is non-negative, I(A:C|B) \geq 0.

Equality in strong subadditivity occurs under specific structural conditions on the state \rho_{ABC}. It holds with equality for product states across the relevant bipartitions, such as \rho_{ABC} = \rho_A \otimes \rho_{BC} or \rho_{ABC} = \rho_{AB} \otimes \rho_C, where the entropies become additive. More generally, equality is achieved exactly for quantum Markov states, which satisfy a quantum analog of the classical Markov chain condition A–B–C: the full state can be reconstructed from \rho_{AB} by a recovery channel acting on B alone, \rho_{ABC} = (\mathrm{id}_A \otimes \mathcal{R}_{B \to BC})(\rho_{AB}), a condition equivalent to I(A:C|B) = 0.

Strong subadditivity was first proved by Lieb and Ruskai in 1973 using concavity results for operator functions (Lieb's concavity theorem) together with properties of the quantum relative entropy S(\rho\|\sigma) = \mathrm{Tr}(\rho \log \rho - \rho \log \sigma) \geq 0, establishing it as a fundamental inequality analogous to classical information-theoretic bounds. The proof has been simplified in subsequent works, but the original result remains seminal for its rigor in the non-commutative setting.

In quantum information theory, strong subadditivity plays a crucial role in data-processing arguments, implying monotonicity of correlation measures under local quantum operations. It also enables chain rules for the von Neumann entropy, which are vital in quantum hypothesis testing for bounding error rates in asymptotic regimes.
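
Strong subadditivity can be checked numerically for a generic state. The sketch below, assuming NumPy only (the partial traces are taken by reshaping, and the helper names are illustrative), verifies the inequality for a random full-rank three-qubit density matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def random_density(d):
    """Random full-rank density matrix via G G† / Tr(G G†)."""
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    M = G @ G.conj().T
    return M / np.trace(M)

# Three qubits A, B, C; tensor indices ordered (a, b, c, a', b', c')
rho = random_density(8).reshape(2, 2, 2, 2, 2, 2)

rho_AB = np.trace(rho, axis1=2, axis2=5).reshape(4, 4)          # trace out C
rho_BC = np.trace(rho, axis1=0, axis2=3).reshape(4, 4)          # trace out A
rho_B = np.trace(np.trace(rho, axis1=2, axis2=5), axis1=0, axis2=2)   # trace out A and C
rho_ABC = rho.reshape(8, 8)

lhs = vn_entropy(rho_ABC) + vn_entropy(rho_B)
rhs = vn_entropy(rho_AB) + vn_entropy(rho_BC)
print(lhs <= rhs + 1e-10)   # strong subadditivity holds
print(rhs - lhs)            # conditional mutual information I(A:C|B) >= 0
```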

Monotonicity Under Operations

Under the partial trace operation, which discards a subsystem of a composite quantum state, the von Neumann entropy is not monotone in general: discarding a subsystem can either raise or lower the entropy. For a bipartite density operator \rho_{AB} on \mathcal{H}_A \otimes \mathcal{H}_B, the joint and reduced entropies are constrained by the Araki–Lieb triangle inequality together with subadditivity, |S(\rho_A) - S(\rho_B)| \leq S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B). A pure entangled state illustrates the quantum peculiarity: S(\rho_{AB}) = 0 while S(\rho_A) = S(\rho_B) > 0, so tracing out a subsystem increases the entropy, something impossible for classical Shannon entropy, where H(X) \leq H(X,Y) always holds.

Genuine monotonicity of the von Neumann entropy holds for unital completely positive trace-preserving (CPTP) maps, i.e., quantum channels \Phi satisfying \Phi(I) = I, such as mixtures of unitaries or non-selective projective measurements. For any density operator \rho and unital channel \Phi, S(\Phi(\rho)) \geq S(\rho), with equality when \Phi acts reversibly (unitarily) on the support of \rho. This follows from the monotonicity of the quantum relative entropy under CPTP maps applied to S(\rho \| I/d), and it expresses that unital operations can only add mixedness, never remove it. Non-unital channels, such as amplitude damping, can decrease the entropy by driving the state toward a purer fixed point; this is consistent with thermodynamics because the entropy decrease is accompanied by entropy flow into the environment.

Under local operations on multipartite systems, the behavior of the joint entropy combines these features with subadditivity. For independent CPTP maps \Phi_A and \Phi_B applied to the subsystems of \rho_{AB}, subadditivity yields S((\Phi_A \otimes \Phi_B)(\rho_{AB})) \leq S(\Phi_A(\rho_A)) + S(\Phi_B(\rho_B)), where \rho_A = \operatorname{Tr}_B \rho_{AB} and \rho_B = \operatorname{Tr}_A \rho_{AB}; if the local maps are unital, the joint entropy additionally satisfies S((\Phi_A \otimes \Phi_B)(\rho_{AB})) \geq S(\rho_{AB}) and the marginal entropies do not decrease. This framework connects to the Holevo bound, which caps the classical information extractable from quantum ensembles in terms of entropy differences.

The data-processing inequality encapsulates the monotonicity that does hold in full generality. The quantum mutual information I(A:B)_\rho = S(\rho_A) + S(\rho_B) - S(\rho_{AB}) is non-increasing under local CPTP processing, i.e., I(A:B)_\rho \geq I(A':B')_{\sigma} for \sigma_{A'B'} = (\Phi_A \otimes \Phi_B)(\rho_{AB}). This follows from the monotonicity of the quantum relative entropy (equivalently, from strong subadditivity) and forms the cornerstone for bounding capacities in quantum communication protocols, ensuring that no local processing can amplify the extractable correlations beyond those initially present.
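
The contrast between unital and non-unital channels can be demonstrated on a single qubit. This sketch uses the standard Kraus forms of phase damping and amplitude damping (NumPy assumed; helper names illustrative) and shows that the unital channel raises the entropy while the non-unital one lowers it for this particular input.

```python
import numpy as np

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def apply_channel(rho, kraus):
    return sum(K @ rho @ K.conj().T for K in kraus)

# A mixed qubit state with coherences
rho = np.array([[0.7, 0.3], [0.3, 0.3]], dtype=complex)

# Unital channel: phase damping (dephasing) with strength p
p = 0.5
dephasing = [np.sqrt(1 - p) * np.eye(2), np.sqrt(p) * np.diag([1.0, -1.0])]

# Non-unital channel: amplitude damping with strength gamma
g = 0.5
amp_damp = [np.array([[1, 0], [0, np.sqrt(1 - g)]]),
            np.array([[0, np.sqrt(g)], [0, 0]])]

print(vn_entropy(rho))                              # ~0.58 bits initially
print(vn_entropy(apply_channel(rho, dephasing)))    # ~0.88 bits: unital map raises entropy
print(vn_entropy(apply_channel(rho, amp_damp)))     # ~0.44 bits: non-unital map lowers it
```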

Dynamics and Evolution

Under Unitary Transformations

The von Neumann entropy of a quantum state, represented by the density operator \rho, remains unchanged under any unitary transformation. For a unitary operator U, the transformed state is U \rho U^\dagger, and S(U \rho U^\dagger) = S(\rho). This invariance arises because unitary operations preserve both the trace and the eigenvalues of \rho, upon which the entropy solely depends.

A direct proof leverages the definition S(\rho) = -\operatorname{Tr}(\rho \log \rho). Substituting the transformed operator yields S(U \rho U^\dagger) = -\operatorname{Tr}\bigl( U \rho U^\dagger \log(U \rho U^\dagger) \bigr). Since the functional calculus (applying a function to an operator through its spectral decomposition) ensures \log(U \rho U^\dagger) = U (\log \rho) U^\dagger, the expression simplifies to -\operatorname{Tr}\bigl( U \rho U^\dagger \cdot U (\log \rho) U^\dagger \bigr) = -\operatorname{Tr}\bigl( U \rho (\log \rho) U^\dagger \bigr). The cyclicity of the trace, \operatorname{Tr}(A B) = \operatorname{Tr}(B A), then gives -\operatorname{Tr}(\rho \log \rho), confirming S(U \rho U^\dagger) = S(\rho).

In closed quantum systems, this property ensures that the entropy stays constant during isolated, reversible evolution, reflecting the unitary nature of the dynamics. An illustrative case is time evolution governed by a Hamiltonian H, where the unitary is U(t) = e^{-i H t / \hbar}. The evolved state \rho(t) = U(t) \rho(0) U(t)^\dagger thus satisfies S(\rho(t)) = S(\rho(0)) for all times t. Unitary maps saturate the entropy non-decrease that holds for unital completely positive trace-preserving operations.
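
A quick numerical check of the invariance, assuming NumPy: a random density matrix is conjugated by a random unitary obtained from a QR decomposition (not exactly Haar-distributed, but unitary, which is all that matters here).

```python
import numpy as np

rng = np.random.default_rng(1)

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

d = 4
G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
rho = G @ G.conj().T
rho /= np.trace(rho)                    # random density matrix

Q, _ = np.linalg.qr(rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)))
rho_rot = Q @ rho @ Q.conj().T          # unitary conjugation U rho U†

print(vn_entropy(rho), vn_entropy(rho_rot))   # identical up to rounding error
```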

Under Quantum Measurements

In quantum mechanics, a projective measurement of a density operator \rho in an orthonormal basis \{|k\rangle\} yields outcome probabilities p_k = \langle k | \rho | k \rangle. The post-measurement conditional state for outcome k is the pure state \rho_k = |k\rangle\langle k|, and the non-selective post-measurement density operator, averaged over all possible outcomes, is \rho' = \sum_k p_k |k\rangle\langle k|. The von Neumann entropy of the post-measurement state simplifies to S(\rho') = H(\{p_k\}) = -\sum_k p_k \log p_k, where H(\{p_k\}) is the Shannon entropy of the outcome probabilities, since each \rho_k is pure and contributes zero entropy. More generally, for projective measurements with projectors P_k of rank possibly greater than one, the conditional states \rho_k = P_k \rho P_k / p_k with p_k = \mathrm{Tr}(P_k \rho) may remain mixed, and because they have mutually orthogonal supports the post-measurement entropy is S(\rho') = H(\{p_k\}) + \sum_k p_k S(\rho_k). In the rank-one case, S(\rho_k) = 0 for all k.

The entropy change due to the non-selective measurement, \Delta S = S(\rho') - S(\rho), satisfies \Delta S \geq 0. This non-negativity follows because the map \rho \mapsto \sum_k P_k \rho P_k is a unital quantum channel, under which the von Neumann entropy cannot decrease; equivalently, in the rank-one case the diagonal of \rho in the measurement basis is majorized by its spectrum, so H(\{p_k\}) \geq S(\rho). Equality holds if and only if \rho is already block-diagonal with respect to the measurement projectors, meaning no quantum coherences are destroyed. This measurement-induced entropy production \Delta S quantifies the irreversibility of the process: the information gained about the system (encoded in the classical outcome probabilities) comes at the cost of increased uncertainty in the quantum description. After a rank-one projective measurement, the post-measurement state \rho' is a classical mixture of basis states, so S(\rho') equals the classical Shannon entropy H(\{p_k\}), marking a transition toward a classical statistical description.

Under more general positive operator-valued measures (POVMs) with elements \{E_m\} satisfying \sum_m E_m = I, one standard square-root implementation gives the non-selective post-measurement state \rho' = \sum_m E_m^{1/2} \rho E_m^{1/2}, with outcome probabilities p_m = \mathrm{Tr}(E_m \rho) and conditional states \rho_m = E_m^{1/2} \rho E_m^{1/2} / p_m. Because the conditional states need not have orthogonal supports, one has only the bound S(\rho') \leq H(\{p_m\}) + \sum_m p_m S(\rho_m); the non-selective entropy change \Delta S = S(\rho') - S(\rho) remains non-negative for this implementation, since the square-root Kraus operators again define a unital channel, and the increase still reflects information gain and decoherence.
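
A minimal sketch of the measurement-induced entropy change for a qubit with coherences, assuming NumPy (the helper name vn_entropy is illustrative): a non-selective measurement in the computational basis simply keeps the diagonal of \rho.

```python
import numpy as np

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

# Qubit state with coherences in the computational basis
rho = np.array([[0.6, 0.2], [0.2, 0.4]], dtype=complex)

# Non-selective projective measurement in the computational basis:
# rho' = sum_k P_k rho P_k keeps only the diagonal entries
p = np.real(np.diag(rho))
rho_post = np.diag(p)

S_before = vn_entropy(rho)
S_after = vn_entropy(rho_post)           # equals the Shannon entropy H({p_k})

print(S_before, S_after, S_after - S_before >= 0)   # entropy change is non-negative
```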

In Open Quantum Systems

In open quantum systems, the evolution of the density operator \rho is described by the Lindblad (GKSL) master equation, \frac{d\rho}{dt} = -i [H, \rho] + \sum_k \left( L_k \rho L_k^\dagger - \frac{1}{2} \{ L_k^\dagger L_k, \rho \} \right), where H is the system Hamiltonian and the L_k are Lindblad operators encoding environmental interactions. The von Neumann entropy S(\rho) = -\operatorname{Tr}(\rho \log \rho) then evolves according to \frac{dS}{dt} = -\operatorname{Tr}\left( \frac{d\rho}{dt} \log \rho \right). For unital dynamics, where the dissipative part preserves the maximally mixed state (i.e., maps the identity to itself), this rate is non-negative, \frac{dS}{dt} \geq 0, reflecting irreversible entropy growth from decoherence processes.

The total entropy change in open systems decomposes into production and flow terms, \Delta S = \int (\sigma + \Phi) \, dt, where the entropy production rate \sigma \geq 0 arises internally from decoherence and irreversibility, while the entropy flow \Phi accounts for exchange with the environment and can be positive or negative depending on the direction of heat and correlation transfer. This distinction ensures the second law holds locally, with production driving the system toward equilibrium even when entropy is exported.

Under prolonged unital dissipation, such as full depolarization, open systems approach steady states that maximize the von Neumann entropy, like the infinite-temperature (fully mixed) state \rho_\infty = I/d with S(\rho_\infty) = \log d for a d-dimensional Hilbert space. These states emerge as fixed points of the Lindblad dynamics, where \frac{d\rho}{dt} = 0 and the entropy production vanishes.
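
As an illustration, the following minimal sketch integrates a dephasing Lindblad equation for a single qubit with a crude forward-Euler step (NumPy assumed; the rate, step size, and helper names are chosen for illustration, not as a production integrator). Because dephasing is unital, the printed entropy grows monotonically from 0 toward 1 bit.

```python
import numpy as np

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

sz = np.diag([1.0, -1.0]).astype(complex)
H = 0.5 * sz                      # system Hamiltonian
L = np.sqrt(0.2) * sz             # dephasing Lindblad operator (unital dissipator)

def lindblad_rhs(rho):
    comm = -1j * (H @ rho - rho @ H)
    diss = L @ rho @ L.conj().T - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L)
    return comm + diss

# Start from a pure superposition state
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

dt = 0.01
for step in range(501):
    if step % 100 == 0:
        print(step * dt, vn_entropy(rho))   # entropy increases toward 1 bit
    rho = rho + dt * lindblad_rhs(rho)
    rho = 0.5 * (rho + rho.conj().T)        # re-symmetrize against round-off
```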

Interpretations

Information-Theoretic Aspects

The von Neumann entropy S(\rho) serves as a fundamental measure of quantum uncertainty, quantifying the degree of mixedness of a state \rho. For pure states, where \rho is a projector onto a single state vector, S(\rho) = 0, indicating no uncertainty and complete knowledge of the system's preparation. In contrast, the entropy reaches its maximum value of \log_2 d for a d-dimensional maximally mixed state \rho = I/d, reflecting the highest possible uncertainty about the system. This range captures the informational content inherent in quantum descriptions, extending beyond classical notions by accounting for both classical probabilities and quantum coherences. When the quantum state \rho is diagonal in a given basis, representing a classical probability distribution, the von Neumann entropy reduces to the Shannon entropy, bridging classical and quantum measures.

In the context of quantum communication, the von Neumann entropy bounds the amount of classical information that can be reliably transmitted using an ensemble of quantum states, as established by the Holevo theorem. This limit highlights the entropy's role in assessing the capacity of quantum channels for encoding classical data.

From the perspective of quantum Bayesianism (QBism), the von Neumann entropy expresses the observer's subjective uncertainty about the quantum system, viewing \rho as a personal credence assignment rather than an objective reality. This interpretation treats the entropy as a tool for updating beliefs based on measurement outcomes, in line with Bayesian principles in probability theory. The von Neumann entropy also relates to the distinguishability of quantum states, providing entropic bounds on the error probability in state discrimination tasks. While measures such as the trace distance quantify operational closeness between states, the entropic bounds offer information-theoretic constraints on discrimination errors, particularly useful for ensembles where direct distance computations are challenging.
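
A small sketch of the entropic quantity behind the Holevo theorem, \chi = S(\sum_i p_i \rho_i) - \sum_i p_i S(\rho_i), for an ensemble of two non-orthogonal qubit states (NumPy assumed; ket and vn_entropy are illustrative helpers).

```python
import numpy as np

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def ket(theta):
    """Pure qubit state cos(theta)|0> + sin(theta)|1> as a density matrix."""
    v = np.array([np.cos(theta), np.sin(theta)], dtype=complex)
    return np.outer(v, v.conj())

# Ensemble: two non-orthogonal pure states with equal priors
probs = [0.5, 0.5]
states = [ket(0.0), ket(np.pi / 8)]

avg = sum(p * s for p, s in zip(probs, states))
chi = vn_entropy(avg) - sum(p * vn_entropy(s) for p, s in zip(probs, states))

print(chi)   # Holevo bound on accessible classical information, well below 1 bit here
```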

Thermodynamic Significance

In quantum thermodynamics, the von Neumann entropy plays a central role in describing thermal states. For a system in thermal equilibrium at inverse temperature \beta = 1/(kT), the density operator is the Gibbs state \rho = e^{-\beta H}/Z, where H is the Hamiltonian and Z = \operatorname{Tr}(e^{-\beta H}) is the partition function. The von Neumann entropy of this state (in natural-log units with k set to 1) is S(\rho) = \beta \operatorname{Tr}(\rho H) - \beta F, where F = -kT \ln Z is the Helmholtz free energy. This expression connects directly to thermodynamic quantities: the heat capacity C_V = \partial \operatorname{Tr}(\rho H)/\partial T follows from the temperature dependence of S(\rho) via C_V = T \, \partial S/\partial T, mirroring classical statistical mechanics.

The Gibbs state maximizes the von Neumann entropy among all states with a fixed average energy \operatorname{Tr}(\rho H) = E, subject to the normalization constraint \operatorname{Tr}(\rho) = 1. This maximum-entropy principle establishes the thermal state as the equilibrium configuration, generalizing Boltzmann's classical formula S = k \ln W—where W counts microstates—to quantum systems with continuous spectra and non-commuting observables. In this framework, deviations from maximum entropy quantify nonequilibrium features, such as coherences or correlations that can drive thermodynamic processes.

Fluctuation theorems in quantum thermodynamics incorporate the von Neumann entropy to characterize irreversibility and entropy production in processes such as quantum heat engines. These theorems state that the statistics of forward and time-reversed trajectories satisfy \langle e^{-\sigma} \rangle = 1, where \sigma = \Delta S + \sum_k \beta_k Q_k is the total entropy production, with \Delta S the change in the system's von Neumann entropy and Q_k the heat transferred to the k-th bath. The second law emerges as the average \langle \sigma \rangle \geq 0, implying \Delta S \geq -\sum_k \beta_k Q_k; for a single bath, this reads \Delta S + \beta Q \geq 0, where Q is the heat dissipated to the bath. This formulation holds for arbitrary open quantum dynamics, including non-Markovian effects, and applies to cyclic operations in heat engines, where entropy production bounds the extractable work.

Within the resource theory of athermality, where thermal states at fixed temperature T are free resources, the von Neumann entropy quantifies the thermodynamic value of nonequilibrium states through monotones under thermal operations. For a state \rho with the same average energy as the thermal state \rho_{th} = e^{-\beta H}/Z, the maximum work extractable via unitaries and coupling to the bath is bounded by W \leq kT [S(\rho_{th}) - S(\rho)], reflecting the athermality resource encoded in the entropy deficit relative to equilibrium. This bound, derived from the relative entropy D(\rho \| \rho_{th}) = \operatorname{Tr}(\rho \ln \rho) - \operatorname{Tr}(\rho \ln \rho_{th}) and its monotonicity, sets the fundamental limit on work extraction, unifying information-theoretic and thermodynamic interpretations in quantum resource conversion.
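
The identity S(\rho) = \beta\langle H\rangle + \ln Z for a Gibbs state can be checked directly. A minimal sketch assuming NumPy and SciPy (the Hamiltonian is a small random symmetric matrix chosen purely for illustration, with k = 1 and natural-log units):

```python
import numpy as np
from scipy.linalg import expm

def vn_entropy_nats(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

rng = np.random.default_rng(2)
A = rng.normal(size=(4, 4))
H = 0.5 * (A + A.T)                 # small random Hamiltonian (k = 1 units)
beta = 1.3

rho = expm(-beta * H)
Z = np.trace(rho).real
rho /= Z                            # Gibbs state e^{-beta H}/Z

E = np.trace(rho @ H).real          # average energy
lhs = vn_entropy_nats(rho)
rhs = beta * E + np.log(Z)          # beta<H> - beta F, with F = -ln(Z)/beta

print(lhs, rhs)                     # agree up to numerical precision
```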

Generalizations

Conditional and Mutual Entropies

The conditional von Neumann entropy S(A|B) of a bipartite state \rho_{AB} quantifies the uncertainty in subsystem A given access to subsystem B, and is defined as S(A|B) = S(\rho_{AB}) - S(\rho_B), where S(\cdot) denotes the von Neumann entropy and \rho_B = \mathrm{Tr}_A(\rho_{AB}) is the reduced state on B. This definition extends the classical conditional entropy to the quantum setting via the chain rule for von Neumann entropies. Unlike its classical counterpart, which is always non-negative, the quantum conditional entropy can take negative values for entangled states, signaling that the subsystems A and B cannot be described independently and highlighting the non-classical correlations inherent in entanglement. Negative conditional entropy has an operational meaning in quantum state merging, where it indicates that merging the state requires no additional quantum communication and can even yield ebits as a resource.

The quantum mutual information I(A:B) measures the total correlations between subsystems A and B in a bipartite state \rho_{AB}, and is defined as I(A:B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}), where \rho_A = \mathrm{Tr}_B(\rho_{AB}). This quantity is always non-negative by the subadditivity inequality S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B), with equality holding if and only if \rho_{AB} = \rho_A \otimes \rho_B. Quantum mutual information captures both classical and quantum correlations and serves as an upper bound on the distillable entanglement between A and B.

The coherent information I_c(A \rangle B) of the state is given by I_c(A \rangle B) = S(\rho_B) - S(\rho_{AB}) = -S(A|B). This measure, introduced in the context of quantum channel capacity, quantifies the amount of quantum information that can be reliably transmitted from A to B through a noisy quantum channel, with the channel's quantum capacity obtained by maximizing (and regularizing) I_c over input states.

One entanglement monotone built from these quantities is the squashed entanglement E_\mathrm{sq}(A:B), defined as half the infimum of the conditional quantum mutual information over all extensions \rho_{ABE} of \rho_{AB}: E_\mathrm{sq}(A:B) = \frac{1}{2} \inf_E I(A:B|E), where I(A:B|E) = S(AE) + S(BE) - S(ABE) - S(E). This measure is an LOCC monotone, additive on tensor products, and continuous, providing a faithful quantification of entanglement that vanishes exactly on separable states.
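
The sign of S(A|B) separates entangled from classically correlated states in the simplest two-qubit examples. A minimal sketch assuming NumPy (helper names illustrative):

```python
import numpy as np

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def rho_B(rho_ab):
    return np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=0, axis2=2)

def cond_entropy(rho_ab):
    """S(A|B) = S(AB) - S(B) for a two-qubit state."""
    return vn_entropy(rho_ab) - vn_entropy(rho_B(rho_ab))

# Maximally entangled Bell state |Phi+>
phi = np.zeros(4, dtype=complex)
phi[0] = phi[3] = 1 / np.sqrt(2)
bell = np.outer(phi, phi.conj())

# Classically correlated state: equal mixture of |00> and |11>
classical = 0.5 * np.diag([1.0, 0, 0, 1.0]).astype(complex)

print(cond_entropy(bell))        # -1: negative for the entangled state
print(cond_entropy(classical))   #  0: non-negative for classical correlations
```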

Relative Entropy

The quantum relative entropy, also known as Umegaki's relative entropy, between two density operators \rho and \sigma on a finite-dimensional Hilbert space is defined as S(\rho \| \sigma) = \operatorname{Tr}(\rho \log \rho - \rho \log \sigma), provided the support of \rho is contained within the support of \sigma; it is taken to be +\infty otherwise. This measure quantifies the distinguishability of the states \rho and \sigma, serving as the quantum analog of the classical Kullback-Leibler divergence. The relative entropy vanishes, S(\rho \| \sigma) = 0, if and only if \rho = \sigma.

A fundamental property of the quantum relative entropy is its joint convexity: for any collections of density operators \{\rho_i, \sigma_i\}_{i=1}^n and probabilities \{p_i\}_{i=1}^n with \sum_i p_i = 1, it holds that S\left( \sum_i p_i \rho_i \Big\| \sum_i p_i \sigma_i \right) \leq \sum_i p_i S(\rho_i \| \sigma_i). Another key property is monotonicity under completely positive trace-preserving (CPTP) maps: for any CPTP map \Phi, S(\Phi(\rho) \| \Phi(\sigma)) \leq S(\rho \| \sigma). These properties make the relative entropy a versatile tool for analyzing information loss in quantum processes.

The relative entropy also bounds the trace distance between states via the quantum Pinsker inequality, S(\rho \| \sigma) \geq \frac{1}{2 \ln 2} \|\rho - \sigma\|_1^2, where \|\cdot\|_1 denotes the trace norm. This inequality establishes a quantitative link between the divergence-like relative entropy and the operational distinguishability captured by the trace distance. The non-negativity of relative entropy, S(\rho \| \sigma) \geq 0, follows from Klein's inequality for the convex function f(x) = x \log x and underpins several proofs in quantum information theory. In particular, applied to a bipartite state \rho_{AB} via S(\rho_{AB} \| \rho_A \otimes \rho_B) \geq 0, it yields the subadditivity of the von Neumann entropy, S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B).
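
A short numerical sketch, assuming NumPy and SciPy (scipy.linalg.logm), computes S(\rho\|\sigma) in bits for two full-rank qubit states and checks non-negativity and the Pinsker inequality; the helper names are illustrative.

```python
import numpy as np
from scipy.linalg import logm

def rel_entropy(rho, sigma):
    """Umegaki relative entropy in bits, assuming supp(rho) is inside supp(sigma)."""
    val = np.trace(rho @ (logm(rho) - logm(sigma))).real
    return val / np.log(2)

def trace_norm(X):
    return np.sum(np.abs(np.linalg.eigvalsh(X)))   # valid because X is Hermitian here

rho = np.array([[0.7, 0.1], [0.1, 0.3]], dtype=complex)
sigma = np.array([[0.5, 0.0], [0.0, 0.5]], dtype=complex)

D = rel_entropy(rho, sigma)
T = trace_norm(rho - sigma)

print(D >= 0)                                # non-negativity
print(D >= T**2 / (2 * np.log(2)))           # quantum Pinsker inequality
```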

Entanglement and Other Measures

For a pure bipartite state |\psi\rangle_{AB}, the von Neumann entropy of the reduced density matrix \rho_A = \operatorname{Tr}_B (|\psi\rangle\langle\psi|) (or equivalently \rho_B) defines the entanglement entropy E(|\psi\rangle) = S(\rho_A) = S(\rho_B), which quantifies the entanglement between subsystems A and B and vanishes if and only if |\psi\rangle is a product state. This measure arises naturally from the Schmidt decomposition of pure states and equals the asymptotic rate at which maximally entangled pairs can be distilled from many copies.

For mixed states \rho_{AB}, the entanglement cannot be captured by the von Neumann entropy of the subsystems alone, but one extension is the relative entropy of entanglement E_R(\rho) = \inf_{\sigma \in \operatorname{SEP}} S(\rho \| \sigma), where the infimum runs over all separable states \sigma and S(\cdot \| \cdot) is the quantum relative entropy. This quantity, which involves the von Neumann entropy through its definition, is an entanglement monotone that is zero for separable states and positive otherwise, though it is generally hard to compute exactly. The logarithmic negativity, defined as \mathcal{E}_N(\rho) = \log \|\rho^{T_A}\|_1, where \rho^{T_A} is the partial transpose over subsystem A and \|\cdot\|_1 the trace norm, offers a distinct, more easily computable upper bound on distillable entanglement. Like these measures, the entanglement entropy S(\rho_A) of pure states is an entanglement monotone under local operations and classical communication (LOCC), meaning it does not increase on average under such transformations, as required of any valid entanglement quantifier.

Quantum discord, a measure of quantum correlations beyond entanglement, is given by D(A:B) = I(A:B) - \max_{\{\Pi_k\}} J(A:B)_{\{\Pi_k\}}, where the quantum mutual information I(A:B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}) is built from von Neumann entropies, and J captures the maximum classical correlation obtainable by measuring B. This structure highlights the von Neumann entropy's role in bounding total correlations, with discord vanishing for classically correlated states but remaining nonzero even in some separable quantum states.

In quantum error correction, von Neumann entropy inequalities bound the parameters of codes, as in entropic derivations of the quantum Singleton bound k \leq n - 2(d-1) for [[n,k,d]] codes, where k relates to the entropy of the logical subsystem and the correction capability ties to subsystem entropies under noise. Such entropy-based limits have received renewed attention post-2020 for designing fault-tolerant codes on noisy intermediate-scale quantum devices, linking entanglement structure to error-correction thresholds.
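
The pure-state entanglement entropy and the logarithmic negativity can both be evaluated directly for a small example. A minimal sketch assuming NumPy, for the partially entangled state \cos t\,|00\rangle + \sin t\,|11\rangle (helper names illustrative):

```python
import numpy as np

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

# Partially entangled two-qubit pure state cos(t)|00> + sin(t)|11>
t = np.pi / 6
psi = np.zeros(4, dtype=complex)
psi[0], psi[3] = np.cos(t), np.sin(t)
rho_ab = np.outer(psi, psi.conj())

# Entanglement entropy: S of the reduced state on A
rho_a = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(vn_entropy(rho_a))          # about 0.811 bits for t = pi/6

# Logarithmic negativity: log2 of the trace norm of the partial transpose over A
R = rho_ab.reshape(2, 2, 2, 2)
rho_pt = R.transpose(2, 1, 0, 3).reshape(4, 4)    # transpose the A indices
log_neg = np.log2(np.sum(np.abs(np.linalg.eigvalsh(rho_pt))))
print(log_neg)                    # equals log2(1 + 2 cos(t) sin(t)), about 0.900
```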

Rényi Entropies

The quantum Rényi entropies provide a parameterized generalization of the von Neumann entropy, offering a family of measures that capture different aspects of quantum uncertainty depending on the parameter \alpha > 0, \alpha \neq 1. For a density operator \rho they are defined as S_\alpha(\rho) = \frac{1}{1 - \alpha} \log_2 \operatorname{Tr}(\rho^\alpha). This form arises naturally from quantum extensions of the classical Rényi entropies, preserving key informational structure while adapting to non-commuting observables. In the limit \alpha \to 1, the quantum Rényi entropy continuously recovers the von Neumann entropy, as seen by applying L'Hôpital's rule to the defining expression, ensuring consistency with the standard measure of quantum mixedness.

The family is monotonically non-increasing in \alpha: for 0 < \alpha < \beta, it holds that S_\alpha(\rho) \geq S_\beta(\rho), reflecting how higher-order entropies weight the largest eigenvalues (the purest contributions) more strongly. Unlike the von Neumann case, the Rényi entropies with \alpha \neq 1 do not satisfy subadditivity, S_\alpha(\rho_{AB}) \leq S_\alpha(\rho_A) + S_\alpha(\rho_B), in general; the inequality does hold for bosonic Gaussian states and in specific parameter regimes for other systems, where it bounds the total uncertainty by the sum of the subsystem uncertainties.

A prominent special case occurs at \alpha = 2, where S_2(\rho) = -\log_2 \operatorname{Tr}(\rho^2). Here \operatorname{Tr}(\rho^2) quantifies the purity of the state, with S_2(\rho) = 0 for pure states and values approaching \log_2 d for a maximally mixed state in dimension d; this entropy is related to collision probabilities, the likelihood that two independent measurements of identically prepared copies of \rho coincide.

Post-2020 developments have highlighted the utility of quantum Rényi entropies in quantum hypothesis testing, where variants such as the Petz-Rényi and sandwiched Rényi divergences underpin optimal error rates and sample complexity for distinguishing quantum hypotheses, extending classical limits to entangled scenarios. In machine-learning-assisted quantum state characterization, these entropies enable robust quantification of non-stabilizerness and entanglement, facilitating efficient algorithms for state estimation and feature extraction in high-dimensional quantum data. Extensions to entanglement Rényi entropies apply the family to bipartite reductions, providing \alpha-dependent measures of quantum correlations beyond the \alpha = 1 case.
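
The ordering in \alpha and the \alpha \to 1 limit are easy to see numerically. A minimal sketch assuming NumPy, evaluated on a fixed diagonal mixed state (helper names illustrative):

```python
import numpy as np

def renyi_entropy(rho, alpha):
    """Quantum Renyi entropy S_alpha in bits (alpha > 0, alpha != 1)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(np.log2(np.sum(w ** alpha)) / (1 - alpha))

def vn_entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

rho = np.diag([0.5, 0.3, 0.15, 0.05]).astype(complex)   # a fixed mixed state

for alpha in (0.5, 0.999, 1.001, 2.0, 5.0):
    print(alpha, renyi_entropy(rho, alpha))    # values decrease as alpha grows
print("alpha -> 1 limit:", vn_entropy(rho))    # matches the values near alpha = 1
```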

Historical Context

Origins with John von Neumann

John von Neumann first introduced the concept of quantum entropy in his 1927 paper "Die Thermodynamik quantenmechanischer Gesamtheiten," published in the Nachrichten der Gesellschaft der Wissenschaften zu Göttingen. In this work, he developed a thermodynamic framework for quantum mechanical ensembles, defining entropy as a measure of disorder in statistical mixtures of quantum states, expressed through the trace functional of the statistical operator U: S = -N k \operatorname{Tr}(U \ln U), where N is the number of systems, k is Boltzmann's constant, and U represents the density matrix normalized such that \operatorname{Tr}(U) = 1. This formulation implicitly addressed entropy for open quantum systems by considering ensembles interacting with heat reservoirs and undergoing reversible transformations, building on earlier ideas from Einstein's thermodynamic fluctuations and Szilard's statistical mechanics to quantify irreversibility in quantum processes. Von Neumann expanded and formalized this idea in his 1932 book Mathematische Grundlagen der Quantenmechanik, where he explicitly defined the entropy of a quantum state described by a density operator \rho as S(\rho) = -\operatorname{Tr}(\rho \log \rho). This definition was motivated by the ergodic hypothesis, which posits that time averages equal ensemble averages in isolated systems, and by the need to describe the outcomes of quantum measurements on mixed states, providing a rigorous measure of uncertainty beyond mere energy eigenvalues. The book arose in the context of early quantum mechanics' foundational challenges, including paradoxes related to measurement and superposition. A key insight in von Neumann's work was deriving the entropy expression from the spectral theorem applied to the density operator, yielding S(\rho) = -\sum_i \lambda_i \log \lambda_i, where \lambda_i are the eigenvalues of \rho, directly generalizing the classical Gibbs entropy formula S = -k \sum_i p_i \log p_i from statistical mechanics to the quantum domain. This quantum generalization anticipated later information-theoretic developments, such as Shannon's entropy defined in 1948, which shares a similar functional form but applies to classical probability distributions.

Subsequent Developments

Following the foundational work on von Neumann entropy, its connection to classical information theory became apparent in 1948, when Claude Shannon introduced his entropy measure for probabilistic sources in communication systems, a quantity whose functional form parallels the earlier quantum definition. John von Neumann reportedly advised Shannon to adopt the term "entropy" for this measure, citing its established use in statistical mechanics and quipping that no one truly understands entropy. A pivotal advance occurred in 1973 with the proof of strong subadditivity for von Neumann entropy by Elliott H. Lieb and Mary Beth Ruskai, establishing that the entropy of a combined quantum system satisfies S(\rho_{AB}) + S(\rho_{BC}) \geq S(\rho_{ABC}) + S(\rho_B) for any tripartite density operator, which became essential for deriving channel capacities and bounds in quantum information protocols. That same year, Alexander S. Holevo proved his eponymous theorem, bounding the classical information extractable from a quantum ensemble by the Holevo quantity \chi(\{p_i, \rho_i\}) = S(\sum_i p_i \rho_i) - \sum_i p_i S(\rho_i) \leq \log d, where d is the Hilbert space dimension, thus formalizing limits on quantum communication channels. During the 1980s and 1990s, von Neumann entropy gained prominence in emerging quantum information research through its role as a measure of entanglement. Charles H. Bennett and collaborators demonstrated that the entanglement entropy, defined as the von Neumann entropy of the reduced density matrix of a subsystem, quantifies the distillable entanglement of bipartite pure states and enables protocols for concentrating partial entanglement via local operations. In the post-2000 era, von Neumann entropy featured centrally in the revival of quantum thermodynamics, particularly through extensions of fluctuation theorems to open quantum systems in the 2010s, where it describes entropy production and work extraction in nonequilibrium processes beyond classical limits. More recently, in the 2020s, it has been applied to quantum error correction by characterizing entanglement structures in stabilizer codes via graph-theoretic interpretations of the entropy, aiding fault-tolerant designs.

References

  1. [1]
    [PDF] Chapter 5 Quantum Information Theory
    We will argue that the Von Neumann entropy quantifies the incompressible information content of the quantum source (in the case where the signal states are pure) ...
  2. [2]
  3. [3]
    [PDF] quantum-computation-and-quantum-information-nielsen-chuang.pdf
    One of the most cited books in physics of all time, Quantum Computation and Quantum. Information remains the best textbook in this exciting field of science.
  4. [4]
    [PDF] A Mathematical Theory of Communication
    In the present paper we will extend the theory to include a number of new factors, in particular the effect of noise in the channel, and the savings possible ...
  5. [5]
    Quantum entropies - Scholarpedia
    Feb 8, 2022 · Von Neumann entropy is a natural generalization of the classical Shannon entropy. Surprisingly, von Neumann entropy was introduced by von ...
  6. [6]
    [1502.04489] Understanding von Neumann's entropy - arXiv
    Feb 16, 2015 · We review the postulates of quantum mechanics that are needed to discuss the von Neumann's entropy. We introduce it as a generalization of Shannon's entropy.
  7. [7]
    [PDF] Entropy of quantum states - arXiv
    Apr 26, 2021 · In particular, the von Neumann entropy (15) is the Shannon entropy associated with the spectral decomposition of ρ. The problem of the ambiguity ...
  8. [8]
    Entropy - Scholarpedia
    Nov 16, 2007 · Entropy was generalized to quantum mechanics in 1932 by John von Neumann [N]. Later this led to the invention of entropy as a term in ...
  9. [9]
    Max Delbrück Oral History Interview
    ... Max Delbrück and Gert Molière, “Statistische Quantenmechanik und Thermodynamik,” Abh. a. K. Preuss. Akad. d. Wiss., Phys. Math. Klasse, Nr. 1, 1-42 (1936) ...
  10. [10]
    Proof of the strong subadditivity of quantum‐mechanical entropy
    Dec 1, 1973 · We prove several theorems about quantum‐mechanical entropy, in particular, that it is strongly subadditive.
  11. [11]
    Structure of States Which Satisfy Strong Subadditivity of Quantum ...
    Feb 24, 2004 · We give an explicit characterisation of the quantum states which saturate the strong subadditivity inequality for the von Neumann entropy.
  12. [12]
    Structure of states which satisfy strong subadditivity of quantum ...
    Apr 1, 2003 · We give an explicit characterisation of the quantum states which saturate the strong subadditivity inequality for the von Neumann entropy.
  13. [13]
    [PDF] A simple proof of the strong subadditivity inequality - Rinton Press
    Classically, when the von Neumann entropy is replaced by the Shannon entropy function, the conditional entropy has an intepretation as the average uncertainty.
  14. [14]
    Inequalities for quantum entropy: A review with conditions for equality
    Sep 1, 2002 · This article presents self-contained proofs of the strong subadditivity inequality for von Neumann's quantum entropy, S(ρ), and some related ...
  15. [15]
    [PDF] Lecture 11: Strong subadditivity of von Neumann entropy
    S(X,Y,Z) + S(Z) ≤ S(X,Z) + S(Y,Z). There are multiple known ways to prove this theorem. The approach we will take is to first establish a property of the ...
  16. [16]
    [PDF] 1 Quantum entropy and subadditivity - CSE Home
    1 Quantum entropy and subadditivity. Recall the von Neumann entropy of a density ρ ∈ D(Cn) is defined as. S(ρ) −Tr(ρ log ρ). This is precisely the Shannon ...
  17. [17]
    [PDF] Inequalities for Quantum Entropy - arXiv
    This paper presents self-contained proofs of the strong subadditivity inequality for von Neumann's quantum entropy, S(ρ), and some related in- equalities for ...
  18. [18]
    [PDF] Lecture 4 Properties of the von Neumann entropy
    Thus von Neumann entropy during unitary transformation does not change. As an example of unitary transformation we can consider evolution of density matrix in ...
  19. [19]
    Quantum Measurement and Entropy Production - cond-mat - arXiv
    Jul 20, 2000 · We prove that the Kolmogorov-Sinai entropy of this sequence coincides with rate of von Neumann entropy increase. Comments: 5 pages, 2 ...
  20. [20]
    Information conservation and entropy change in quantum ...
    Nov 2, 2010 · The characteristics and nature of information transfer and entropy change in a quantum measurement are of basic significance for quantum theory.
  21. [21]
    Fundamental limits on quantum dynamics based on entropy change
    Jan 29, 2018 · This implies that it is possible for the rate of entropy change to be negative for strictly super-unital Markovian dynamics. Using the Lindblad ...
  22. [22]
    Nonequilibrium Entropy Production for Open Quantum Systems
    Sep 30, 2011 · In this Letter, we provide generic microscopic expressions for the entropy production, and the entropy production rate, for open quantum systems ...
  23. [23]
    Entropy Production in Measured Gaussian Quantum Systems
    Dec 4, 2020 · ... Tr ... (4), it is possible to single out the entropy production and flux rates as the quadratic and linear part of dS/dt in the irreversible ...
  24. [24]
    Active Fault-Tolerant Quantum Error Correction: The Curse of the ...
    Jan 1, 2022 · Active QEC in this context can be seen as the attempt to “cool down” the open system, thus preventing its thermalization, and the threshold ...
  25. [25]
    [PDF] Entropy in general physical theories - QuTech
    Mar 17, 2017 · The Shannon entropy H(Ep) = −Pi pi log pi and von Neumann entropy S(ρ) = −tr(ρlogρ) are extremely useful tools for analysing information ...
  26. [26]
    Von Neumann was not a Quantum Bayesian - Journals
    May 28, 2016 · The most radical approach is QBism, which maintains that all quantum states are expressions of personalist Bayesian probabilities about ...
  27. [27]
    [1710.00054] Quantum fluctuation theorems for arbitrary environments
    Sep 29, 2017 · This paper analyzes entropy production in quantum systems, showing fluctuation theorems and decomposing it into adiabatic and non-adiabatic ...
  28. [28]
    An introductory review of the resource theory approach to ... - arXiv
    Jul 30, 2018 · Title:An introductory review of the resource theory approach to thermodynamics. Authors:Matteo Lostaglio. View a PDF of the paper titled An ...
  29. [29]
    Negative entropy and information in quantum mechanics - arXiv
    Oct 30, 1997 · Unlike in classical (Shannon) information theory, quantum (von Neumann) conditional entropies can be negative when considering quantum entangled systems.
  30. [30]
    On the quantum, classical and total amount of correlations in ... - arXiv
    Oct 12, 2004 · Abstract: We give an operational definition of the quantum, classical and total amount of correlations in a bipartite quantum state.
  31. [31]
    [quant-ph/9604015] The capacity of the noisy quantum channel - arXiv
    Apr 19, 1996 · View a PDF of the paper titled The capacity of the noisy quantum channel, by Seth Lloyd (MIT Mechanical Engineering). View PDF. Abstract: An ...
  32. [32]
    An Additive Entanglement Measure - quant-ph - arXiv
    Aug 16, 2003 · We derive certain properties of the new measure which we call "squashed entanglement": it is a lower bound on entanglement of formation and an ...
  33. [33]
  34. [34]
    [PDF] Quantum entropy and source coding
    The quantum relative entropy was defined by Umegaki (1962). A fact from which Klein's inequality (as stated in Proposition 5.22) may be derived was proved many ...
  35. [35]
    Mixed-state entanglement and quantum error correction | Phys. Rev. A
    Nov 1, 1996 · Entanglement purification protocols (EPPs) and quantum error-correcting codes (QECCs) provide two ways of protecting quantum states from interaction with the ...
  36. [36]
    Entanglement measures and purification procedures | Phys. Rev. A
    Mar 1, 1998 · We briefly explain the statistical basis of our measure of entanglement in the case of the quantum relative entropy. ... Vedral, M. B. Plenio, ...
  37. [37]
    Quantum Discord: A Measure of the Quantumness of Correlations
    Dec 14, 2001 · Quantum discord is the difference between classically identical expressions for mutual information in quantum systems, and it measures the ...
  38. [38]
    [2010.07902] Entropic proofs of Singleton bounds for quantum error ...
    Oct 15, 2020 · We show that a relatively simple reasoning using von Neumann entropy inequalities yields a robust proof of the quantum Singleton bound for quantum error- ...
  39. [39]
    [1306.3142] On quantum Renyi entropies: a new generalization and ...
    Jun 13, 2013 · The Renyi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its ...
  40. [40]
    Strong subadditivity of the Rényi entropies for bosonic and fermionic ...
    Jan 31, 2019 · We show that for bosons the Rényi entropies always satisfy subadditivity, but not necessarily strong subadditivity. Conversely, for fermions ...
  41. [41]
    The subadditivity of quantum entropy in Gaussian quantum systems
    We show that the Rényi entropies are subadditive, further giving rise to the subadditivity of the Tsallis and Unified entropies in certain parameter range. We ...
  42. [42]
    An invitation to the sample complexity of quantum hypothesis testing
    Jun 5, 2025 · Both the Petz–Rényi and sandwiched Rényi divergences reduce to the classical Rényi divergence, and the quantum relative entropy reduces to the ...
  43. [43]
    Lower and upper bounds for entanglement of Rényi-α entropy - Nature
    Dec 23, 2016 · Entanglement Rényi-α entropy is an entanglement measure. It reduces to the standard entanglement of formation when α tends to 1.
  44. [44]
    [PDF] Von Neumann's 1927 Trilogy on the Foundations of Quantum ... - arXiv
    May 20, 2025 · The concept of von Neumann entropy would be further deepened by Claude Shannon in his ... 3My paper “Mathematical foundations of quantum mechanics ...
  45. [45]
    [PDF] Von Neumann did not claim that his entropy corresponds to the ...
    The resolution of von Neumann's classical/quantum measurement paradox lies within the knowledge of the observer. The entropy change for the expansion.
  46. [46]
    Information, Reimagined | American Scientist
    “You should call it entropy, for two reasons,” von Neumann advised. “In the first place your uncertainty function has been used in statistical mechanics under ...
  47. [47]
    [PDF] What is Shannon information? - PhilSci-Archive
    Jul 27, 2014 · traditional story, the term 'entropy' was suggested by John von Neumann to Shannon in the following terms: “You should call it entropy, for two ...
  48. [48]
    Concentrating partial entanglement by local operations | Phys. Rev. A
    Apr 1, 1996 · The concentration process asymptotically conserves entropy of entanglement—the von Neumann entropy of the partial density matrix seen by either ...
  49. [49]
    The role of quantum information in thermodynamics—a topical review
    This topical review article gives an overview of the interplay between quantum information theory and thermodynamics of quantum systems.
  50. [50]
    A graph-based approach to entanglement entropy of quantum error ...
    Jan 11, 2025 · This method offers a straightforward interpretation for the entanglement entropy of quantum error correcting codes through graph-theoretical concepts.
  51. [51]
    (PDF) Artificial Intelligence in Quantum Communications
    Aug 6, 2025 · This paper offers a comprehensive survey of AI applications in quantum communication, with a focus on machine learning (ML) models such as ...