
Landauer's principle

Landauer's principle is a physical principle of the thermodynamics of information, positing that the irreversible erasure of a single bit of information in a computational system maintained at temperature T requires a minimum energy dissipation equal to k_B T \ln 2, where k_B is Boltzmann's constant. This fundamental limit arises from the connection between logical irreversibility—such as resetting a bit from an unknown state to a known one—and physical entropy increase, ensuring compatibility with the second law of thermodynamics. Formulated by Rolf Landauer in 1961, the principle challenged the prevailing view that information processing could be thermodynamically cost-free, highlighting instead that certain computations inherently generate heat. Landauer's insight emerged from applying thermodynamic reasoning to digital logic, particularly to operations lacking a one-to-one mapping between inputs and outputs, which he argued must dissipate energy to resolve ambiguity in the system's state. Initially controversial due to the abstract nature of linking bits to physical entropy, the principle was rigorously substantiated in the 1970s and 1980s by Charles H. Bennett, who demonstrated that reversible computing—using logic gates that preserve all input information—could circumvent the erasure cost, enabling theoretically dissipation-free computation. Bennett's work, building on Landauer's foundation, showed that while erasure is costly, reversible processes allow information to be manipulated without net dissipation, provided the final erasure is deferred or avoided. The implications of Landauer's principle extend to the energy bounds of modern technologies, including limits in transistor scaling, where at room temperature (T \approx 300 K), the erasure limit is approximately 2.9 \times 10^{-21} J per bit—far below current devices but relevant for future ultralow-power and reversible computing. It underscores the physical reality of information, influencing fields from nanoscale electronics to quantum computing, and has inspired research into energy-efficient algorithms that minimize irreversible steps.
Experimental confirmations, starting with a 2012 demonstration using a colloidal particle in an optical trap that verified the heat dissipation for bit erasure, have solidified its status, with subsequent studies in superconducting qubits and quantum many-body systems achieving even closer approaches to the bound.

Core Concepts

Statement of the Principle

Landauer's principle asserts that erasing one bit of information in a computational system dissipates a minimum amount of energy given by k_B T \ln 2, where k_B is Boltzmann's constant and T is the absolute temperature of the environment. This fundamental bound links the logical process of information erasure to physical entropy, ensuring that the decrease in information entropy is compensated by an equivalent increase in thermodynamic entropy elsewhere in the system. At room temperature of approximately 300 K, the Landauer limit equates to about 2.9 \times 10^{-21} J, or roughly 0.018 eV, per bit erased. This value represents the theoretical floor for energy-efficient computation under irreversible processes. The principle specifically governs irreversible logical operations, where the output does not uniquely determine the input, such as resetting a bit from an unknown initial state (0 or 1 with equal probability) to a definite known state, like 0. In practice, modern processors, reliant on such irreversible operations in non-ideal silicon-based implementations, dissipate energy per bit operation that exceeds this limit by roughly 10^9 to 10^{10} times.
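The figures above can be reproduced directly. The following sketch evaluates k_B T \ln 2 at 300 K using the exact SI value of Boltzmann's constant and converts the result to electronvolts:

```python
# Sketch: evaluate the Landauer limit k_B * T * ln 2 and express it in
# both joules and electronvolts.
import math

K_B = 1.380649e-23    # Boltzmann's constant, J/K (exact SI value)
EV = 1.602176634e-19  # joules per electronvolt (exact SI value)

def landauer_limit(temperature_k: float) -> float:
    """Minimum heat (J) dissipated to erase one bit at temperature T."""
    return K_B * temperature_k * math.log(2)

q = landauer_limit(300.0)
print(f"Landauer limit at 300 K: {q:.3e} J ({q / EV:.4f} eV)")
# → about 2.9e-21 J, i.e. roughly 0.018 eV per bit
```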

Thermodynamic Interpretation

Landauer's principle establishes a fundamental link between information processing and thermodynamics by asserting that the erasure of one bit of information in a computational system requires a minimum dissipation of energy as heat to the environment. This physical basis arises from the second law of thermodynamics, which prohibits a decrease in the total entropy of an isolated system. When a bit is erased—effectively resetting a memory state from an unknown value (0 or 1) to a definite state (say, 1)—the entropy associated with the system's logical degrees of freedom decreases by k_B \ln 2, where k_B is Boltzmann's constant. To compensate and maintain the second law, this entropy reduction must be offset by an equal or greater increase in the entropy of the environment, typically through the release of heat. The process of erasure is inherently thermodynamically irreversible because it compresses the phase space available to the system, mapping multiple possible states (e.g., two for a bit) onto a single state without a corresponding expansion elsewhere in the system. This reduction in the system's configurational possibilities eliminates information about the initial state, but the lost "order" manifests as increased disorder in the surrounding environment. In practical terms, the minimum heat Q dissipated to the environment per erased bit satisfies Q \geq k_B T \ln 2, where T is the temperature of the environment; this bound represents the thermodynamic cost of discarding information and ensures compliance with the second law. This thermodynamic interpretation draws an analogy to classical processes like the isothermal compression of an ideal gas, where reducing the volume of the gas decreases its entropy while the work performed on it increases the entropy of the surroundings by an equivalent amount through heat rejection. Similarly, in information erasure, the logical compression of states requires work input that is ultimately dissipated as heat, preventing any violation of the second law.
This mechanism underscores why irreversible computational operations, such as those in conventional logic gates, incur a fundamental energy penalty, distinct from reversible processes that preserve information without net dissipation.
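The gas analogy can be made quantitative for a one-particle "Szilard" gas: halving the accessible volume isothermally requires work k_B T \ln(V_i/V_f) = k_B T \ln 2, all of which is rejected as heat, mirroring the cost of compressing a bit's two states into one. A minimal sketch, assuming a single ideal-gas particle:

```python
# Sketch: isothermal compression of a one-particle ideal gas from
# volume V to V/2 costs work k_B * T * ln 2, matching the erasure bound.
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def isothermal_compression_work(temperature_k, v_initial, v_final):
    """Work (J) done ON a single ideal-gas particle at constant T."""
    return K_B * temperature_k * math.log(v_initial / v_final)

w = isothermal_compression_work(300.0, 2.0, 1.0)  # halve the volume
print(f"work to halve the volume at 300 K: {w:.3e} J")
print("equal to k_B T ln 2:", math.isclose(w, K_B * 300.0 * math.log(2)))
```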

Historical Development

Origins in Information Theory

The origins of Landauer's principle can be traced to early 20th-century efforts to reconcile information processing with the laws of thermodynamics, particularly through thought experiments addressing apparent violations of entropy increase. In 1929, Leo Szilard proposed a seminal model to resolve the paradox posed by Maxwell's demon, an imaginary entity that sorts fast and slow molecules to create a temperature difference without expending work. Szilard demonstrated that the demon's measurement of a single particle's position in a box—acquiring one bit of information—necessarily increases the system's entropy by at least kT \ln 2, where k is Boltzmann's constant and T is the temperature, due to the irreversible nature of the process. This work established a thermodynamic cost for information acquisition and processing, laying foundational groundwork for linking logical operations to physical dissipation. Building on Szilard's analysis, the formalization of information theory in the mid-20th century provided quantitative tools to explore these connections further. Claude Shannon's 1948 paper introduced entropy as a measure of uncertainty in communication systems, defining it in terms of bits to quantify the average information required to resolve probabilistic outcomes. This mathematical entropy, while initially abstract and focused on communication, bore a striking formal resemblance to thermodynamic entropy, prompting physicists to investigate whether informational processes inherently carry energetic implications. Shannon's framework thus supplied the conceptual vocabulary for later analyses of computation's physical limits, emphasizing bits as units of uncertainty reduction. John von Neumann extended these ideas in the late 1940s, integrating information theory with computational and thermodynamic considerations during his work on automata and early computers. In discussions with Shannon around 1948, von Neumann highlighted the analogy between Shannon's entropy and Boltzmann's, famously advising that the term "entropy" be retained for the information measure since "no one knows what entropy really is" in physics, underscoring the deep parallels.
His 1948-1949 lectures further probed the energy costs of computation, estimating a minimum of approximately 10^{-21} joules per elementary decision at room temperature and questioning whether all logical operations must produce heat to comply with thermodynamic laws. These explorations at the Institute for Advanced Study influenced the emerging field of automata theory, setting the stage for inquiries into irreversibility in logical processes. Rolf Landauer, working at IBM in the late 1950s, drew directly from these precursors to address a pressing question in the nascent era of electronic computing: whether information processing is fundamentally dissipative or if truly reversible operations could evade thermodynamic penalties. Motivated by von Neumann's unresolved concerns and Szilard's resolution of measurement costs, as well as Shannon's bit-based entropy, Landauer sought to clarify the physical bounds on logical erasure, countering optimistic views that computation could be thermodynamically cost-free. His investigations at IBM, amid rapid advances in memory technologies, emphasized that many standard computing steps—such as resetting a bit—involve irreversible loss of information, necessitating heat dissipation to maintain the second law. This pre-1961 context framed Landauer's eventual formalization as a bridge between abstract information theory and concrete physical constraints.

Key Theoretical and Experimental Milestones

In 1961, Rolf Landauer published his seminal paper "Irreversibility and Heat Generation in the Computing Process," proposing that any logically irreversible operation in a computer, such as the erasure of one bit of information, incurs a minimum thermodynamic cost of k_B T \ln 2 energy dissipation as heat, where k_B is Boltzmann's constant and T is the temperature. This established the principle as a fundamental limit for irreversible logic in information processing, bridging information theory and thermodynamics. During the 1970s and 1980s, Charles H. Bennett advanced the theoretical framework by developing reversible computing, demonstrating that logical reversibility allows most computational steps to avoid dissipation, with bit erasure remaining the sole source of the Landauer cost. In his 1973 work "Logical Reversibility of Computation," Bennett proved that universal computation can be achieved with reversible gates, eliminating unnecessary irreversibility. He expanded this in his 1982 review "The Thermodynamics of Computation—a Review," showing how reversible processes can theoretically approach zero dissipation except at erasure, thus providing a pathway to circumvent the limit for practical computing. In 2008 and 2009, researchers provided rigorous thermodynamic derivations of the Landauer bound without relying on assumptions of equilibrium. Sagawa and Ueda derived a generalized second law incorporating measurement and feedback, confirming the bound for erasure in systems with memory and establishing its validity for non-equilibrium processes. Building on this, Kolchinsky and Wolpert proved the universality of the principle across arbitrary physical systems, extending it to cases involving correlations and generalizing the entropy increase associated with information loss. A theoretical proposal by Vaccaro and Barnett demonstrated that bit erasure can be achieved without energy cost through quasi-static processes, by transferring the thermodynamic burden to changes in conserved quantities like spin angular momentum in a spin reservoir, though practical implementation remains challenging due to debates over reversibility. The first experimental measurement of heat during bit erasure occurred in 2012, when Bérut et al.
used a single colloidal particle trapped in a double-well optical potential to implement a one-bit memory, observing average heat release approaching k_B T \ln 2 in the quasi-static regime. In 2014, Jun et al. provided a high-precision confirmation using a colloidal particle in a feedback-controlled virtual potential, measuring dissipation within 0.6% of the Landauer bound for erasure cycles much longer than the thermal relaxation time. In 2016, an experiment with nanomagnetic logic devices demonstrated erasure approaching the Landauer bound of approximately 0.018 eV at 300 K, using adiabatic reset operations on nanoscale magnetic bits to measure intrinsic energy losses as low as 0.035 eV, highlighting the feasibility of low-dissipation magnetic logic. By 2018, the principle was extended to quantum systems through single-photon experiments, where erasure of information encoded in photonic setups verified the bound while accounting for quantum fluctuations, confirming the thermodynamic cost even in fully quantum information processing. In 2025, measurements in ultracold Bose gases verified Landauer's principle in quantum many-body systems, approaching the bound amid strong correlations.

Theoretical Foundations

The concept of information entropy, introduced by Claude Shannon in 1948, quantifies the uncertainty or average information content associated with a probabilistic system, measured in bits. For a discrete random variable with possible outcomes i having probabilities p_i, the Shannon entropy H is given by H = -\sum_i p_i \log_2 p_i, where the base-2 logarithm ensures units of bits, reflecting the minimum number of yes/no questions needed to identify an outcome on average. This measure captures the inherent unpredictability in a message source or data ensemble, serving as a foundational tool in information theory for assessing data compression and transmission efficiency. In contrast, thermodynamic entropy, originating from Ludwig Boltzmann's statistical mechanics in 1877, describes the disorder or multiplicity of microstates corresponding to a macroscopic state. For a system in equilibrium with W accessible microstates, Boltzmann's formula is S = k_B \ln W, where k_B is Boltzmann's constant and S has units of energy over temperature (joules per kelvin). This expression links macroscopic irreversibility, as governed by the second law of thermodynamics, to the underlying probabilistic distribution of microscopic configurations, emphasizing that entropy increases reflect the spreading of energy across more possible arrangements. The bridge between these entropies arises in physical realizations of information processing, where logical states are encoded in physical degrees of freedom, such as the position or energy level of particles. In this context, the information entropy H (in bits) corresponds to a change in thermodynamic entropy \Delta S = k_B \ln 2 \, H, with the factor \ln 2 converting from base-2 to natural logarithms to match physical units; this equivalence posits that one bit of information erasure reduces the system's thermodynamic entropy by k_B \ln 2, necessitating a compensatory increase elsewhere to uphold the second law.
John von Neumann played a pivotal role in recognizing this formal similarity during early discussions with Claude Shannon, advising the adoption of "entropy" for the information measure due to its parallel structure with thermodynamic quantities. In computational systems, this linkage manifests through the physical representation of states: a bit storing 0 or 1 corresponds to distinct, non-overlapping volumes in the system's configuration space, each with equal probability 1/2, yielding H = 1 bit of uncertainty. Erasing the bit—merging these volumes into a single state—decreases the system's entropy by k_B \ln 2, but the process correlates the lost information with the environment, increasing its entropy by at least the same amount to ensure the overall entropy change remains non-negative. This conceptual tie underscores how abstract logical operations impose physical constraints, treating logical irreversibility as a form of thermodynamic dissipation.
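The bit-counting above can be illustrated numerically. This sketch computes the Shannon entropy of a distribution in bits and converts it to the corresponding thermodynamic entropy via \Delta S = k_B \ln 2 \, H:

```python
# Sketch: Shannon entropy H (bits) and its thermodynamic counterpart
# S = k_B * ln 2 * H (J/K) for a bit with given state probabilities.
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def shannon_entropy(probs):
    """H = -sum p_i log2 p_i, in bits (terms with p = 0 contribute 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def thermodynamic_entropy(h_bits):
    """Convert an information entropy in bits to J/K."""
    return K_B * math.log(2) * h_bits

h = shannon_entropy([0.5, 0.5])  # unknown bit: exactly 1 bit of uncertainty
print(h, thermodynamic_entropy(h))  # 1 bit corresponds to ~9.57e-24 J/K
```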

Derivation of the Energy Bound

To derive the energy bound of Landauer's principle, consider a classical system consisting of a single bit of memory in contact with a heat bath at temperature T. The bit can occupy two logical states, labeled 0 and 1, each with equal probability of 1/2 in the initial unknown configuration (state A). After erasure, the bit is reset to a known state, such as 0 (state B), eliminating the uncertainty. In phase space, the system's volume corresponding to state A is twice that of state B, since the two logical states are equally probable and the phase space is partitioned equally between them. The entropy S of the system is given by S = k_B \ln W, where k_B is Boltzmann's constant and W is the number of accessible microstates (phase space volume). Initially, for state A, W_A = 2W_B, so the initial entropy is S_A = k_B \ln(2W_B) = k_B \ln 2 + k_B \ln W_B. After erasure to state B, the entropy becomes S_B = k_B \ln W_B, yielding a decrease in entropy \Delta S_{sys} = S_B - S_A = -k_B \ln 2. The erasure process is a logical operation that merges the phase space regions of states 0 and 1 into the single region for state 0, performed quasi-statically to maintain equilibrium. This operation reduces the system's configurational entropy by one bit, as the information entropy (Shannon entropy H = -\sum_i p_i \log_2 p_i = 1 bit initially) maps directly to thermodynamic entropy via S = k_B H \ln 2. According to the second law of thermodynamics, the total entropy of the system plus environment (heat bath) cannot decrease: \Delta S_{total} = \Delta S_{sys} + \Delta S_{env} \geq 0. Thus, the environment's entropy must increase by at least k_B \ln 2 to compensate: \Delta S_{env} \geq k_B \ln 2. For a quasi-static heat exchange with the bath at temperature T, the entropy change is \Delta S_{env} = Q / T, where Q is the heat dissipated to the bath. Therefore, Q / T \geq k_B \ln 2, implying Q \geq k_B T \ln 2. This establishes the minimum energy dissipation for erasing one bit.
The energy bound can be further related to work and internal energy via the first law of thermodynamics: \Delta E = W - Q, where \Delta E is the change in internal energy, W is the work done on the system, and Q is the heat released to the bath. For an isothermal erasure with \Delta E = 0, the work done on the system equals the heat dissipated, W = Q \geq k_B T \ln 2, confirming the thermodynamic cost. This derivation assumes a quasi-static erasure process with no initial correlations between the bit and the bath, and it applies to classical systems where the bit states are well-defined macrostates.
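The derivation's entropy bookkeeping can be checked numerically. In this sketch the microstate count W_B of the erased state is an arbitrary placeholder and, as the derivation requires, drops out of the result:

```python
# Sketch: entropy bookkeeping for one-bit erasure. The system entropy
# drops by k_B * ln 2, so the bath entropy must rise by at least as
# much, forcing Q >= k_B * T * ln 2.
import math

K_B = 1.380649e-23  # J/K
T = 300.0           # bath temperature, K
W_B = 1e6           # arbitrary microstate count of erased state (cancels)

S_A = K_B * math.log(2 * W_B)  # unknown bit: double phase-space volume
S_B = K_B * math.log(W_B)      # erased bit: single region
delta_S_sys = S_B - S_A        # = -k_B ln 2, independent of W_B
Q_min = -delta_S_sys * T       # from Q / T >= -delta_S_sys
print(f"dS_sys = {delta_S_sys:.3e} J/K, minimum heat Q = {Q_min:.3e} J")
```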

Implications for Computing

Irreversible Operations and Heat Dissipation

In standard digital architectures, fundamental logic operations are often irreversible, meaning they map multiple possible inputs to a single output, thereby discarding information. For instance, the AND gate produces an output of 0 for any of the input combinations (0,0), (0,1), or (1,0), erasing details about which specific inputs produced the result. Similarly, the OR gate outputs 1 for (0,1), (1,0), or (1,1), losing the distinction between these cases. The NOT gate, by contrast, is reversible as it uniquely maps 0 to 1 and 1 to 0 without information loss. These irreversible operations are ubiquitous in conventional logic circuits, forming the basis of most computational tasks. According to Landauer's principle, each act of erasing one bit of information in such operations dissipates a minimum amount of energy as heat, given by k_B T \ln 2, where k_B is Boltzmann's constant and T is the ambient temperature (approximately 2.9 \times 10^{-21} J at room temperature). A typical CPU instruction, such as a 32-bit addition in an arithmetic logic unit, involves thousands of gate operations, each potentially contributing one or more bit erasures, resulting in total dissipation orders of magnitude above the per-bit bound. In von Neumann architectures, this cost is amplified by the need for frequent data shuttling between processor and memory, which requires copying operations that introduce additional erasures, as well as clock-driven synchronization that enforces periodic resets and signal distribution across the chip. Fan-out requirements, where a single output drives multiple downstream signals, further escalate energy use per logical step due to increased capacitive loading. Contemporary CMOS transistors, operating at process nodes around 3 nm (as of 2025), exhibit switching energies on the order of 10^{-19} J per operation, roughly 10^2 times the Landauer limit, primarily due to charging/discharging capacitances and voltage overheads for reliability.
Projections for sub-2 nm scales, such as 1.4 nm (A14) or 1 nm nodes expected in the late 2020s, suggest switching energies could approach 10^{-19} J or lower, narrowing the gap to 10^1–10^2 times the bound and rendering Landauer's limit a constraining factor for ultimate energy efficiency in exascale computing. With 2 nm nodes entering volume production in 2025, offering 15–30% power improvements over 3 nm, further progress toward the limit is anticipated. As device sizes continue to shrink, parasitic effects and thermal noise will make minimizing irreversible erasures essential, with reversible computing paradigms offering a path to circumvent these fundamental costs.
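The gap between device energies and the bound can be tabulated. The per-operation energies below are illustrative order-of-magnitude assumptions taken from the discussion above, not vendor specifications:

```python
# Sketch: ratio of assumed per-operation switching energies to the
# Landauer limit at 300 K. Values are illustrative, not measured data.
import math

K_B = 1.380649e-23
LANDAUER_300K = K_B * 300.0 * math.log(2)  # ~2.9e-21 J

switching_energy_j = {          # assumed order-of-magnitude figures
    "older CMOS node": 1e-15,
    "3 nm node (2025)": 1e-19,
    "projected sub-2 nm": 3e-20,
}

for node, e in switching_energy_j.items():
    print(f"{node}: {e / LANDAUER_300K:,.0f}x the Landauer limit")
```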

Reversible Computing Approaches

Reversible computing approaches aim to avoid the energy dissipation associated with information erasure by designing computational processes that preserve all information throughout the operation, deferring any irreversible steps until the final output stage. In 1973, Charles Bennett proposed a reversible Turing machine that simulates any conventional irreversible computation using only reversible logical operations, ensuring that the machine's state at any step can be uniquely reversed to recover previous states without loss. This model employs a three-tape machine where input, history, and output tapes interact through reversible transitions, avoiding bit erasure during intermediate steps. Bennett's construction demonstrates that universal computation can be achieved logically reversibly, with thermodynamic costs confined to the initial copying of inputs and the final disposal of the history. Implementations of reversible computing often rely on physically realizable models that maintain both logical and thermodynamic reversibility. The billiard ball model, introduced by Edward Fredkin and Tommaso Toffoli in 1982, envisions computation as the elastic collisions of idealized billiard balls in a two-dimensional plane, where collisions at curved mirrors or other balls perform logic operations without energy loss or information destruction. This conservative logic framework uses gates like the Fredkin gate, which permutes signals based on control inputs while conserving the number of "on" states. Similarly, adiabatic circuits implement reversible logic through slowly varying voltage supplies that minimize resistive dissipation, allowing signals to evolve quasi-statically in response to input changes; in the ideal limit, intermediate computations incur near-zero energy cost as the system remains close to equilibrium. These approaches ensure that dissipation occurs only at the boundaries, such as input preparation and output readout, aligning with Landauer's bound for those unavoidable erasures.
The primary energy savings in reversible computing stem from eliminating erasure during the bulk of the computation: in quasi-static adiabatic processes, intermediate steps can approach zero dissipation, with total energy scaling favorably compared to irreversible counterparts, potentially enabling 1000-fold or greater efficiency improvements in low-power devices. Classical reversible gates, such as the Toffoli gate—which performs controlled-controlled-NOT operations without destroying inputs—serve as building blocks for such systems, allowing universal computation while preserving bit counts. However, practical challenges persist; error correction in real devices often necessitates occasional information erasure to reset faulty states, incurring Landauer-limited costs that accumulate over time. Additionally, clocking mechanisms for synchronizing operations introduce irreversibility through sudden state changes or power spikes, complicating the achievement of full reversibility in high-speed implementations.
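The logical reversibility of the Toffoli gate is easy to verify by brute force: it is a bijection (in fact an involution) on the eight possible three-bit inputs, so it erases no information. A minimal sketch:

```python
# Sketch: the Toffoli (controlled-controlled-NOT) gate, with a
# brute-force check that it is a bijection on all 8 input triples,
# i.e. logically reversible.
from itertools import product

def toffoli(a, b, c):
    """Flip target c iff both controls a and b are 1; controls pass through."""
    return a, b, c ^ (a & b)

inputs = list(product((0, 1), repeat=3))
outputs = {toffoli(*bits) for bits in inputs}
print("bijective:", len(outputs) == 8)       # every output is distinct
print("self-inverse:", all(toffoli(*toffoli(*bits)) == bits for bits in inputs))
```

Setting the target to c = 0 makes the output's third bit equal to a AND b while retaining both inputs, which is how an irreversible function like AND is computed without discarding information.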

Experimental Evidence

Initial Verifications

The first experimental verification of Landauer's principle was achieved in 2012 using a single colloidal particle as a model one-bit memory. Researchers trapped a 2 μm silica bead in a double-well potential formed by two overlapping optical traps (laser wavelength 1,064 nm) in bidistilled water at room temperature (300 K). The erasure process involved cyclically lowering the potential barrier height from above 8 k_B T to 2.2 k_B T over 1 s, applying a linear tilt force (maximum up to 1.89 × 10^{-14} N) for a variable duration t, and then restoring the barrier symmetrically, completing each cycle in t + 2 s. The particle's position was tracked at 502 Hz with nanometer precision using a fast camera. Heat dissipation was measured through the work done by the trap, inferred from the particle's trajectory via the Jarzynski equality. The average dissipated heat ⟨Q⟩ saturated at the Landauer bound of k_B T \ln 2 ≈ 3 × 10^{-21} J for long cycle times (quasi-static limit), with ⟨Q⟩ ≈ k_B T \ln 2 + B/t for finite t, where B is a constant related to relaxation dynamics. This confirmed that erasing one bit of information requires at least k_B T \ln 2 of dissipated heat, with success rates up to 95% at high tilt forces. Building on this, a 2014 experiment provided a high-precision test using a feedback-controlled virtual potential acting on a colloidal particle. A 200-nm fluorescent silica sphere was confined in an 800-nm-thick aqueous cell using an Anti-Brownian Electrokinetic (ABEL) trap, where feedback on the measured position (updated every 10 ms via a 532-nm laser) created a time-dependent virtual potential U(x,t) = 4 E_b [-1/2 g(t) \tilde{x}^2 + 1/4 \tilde{x}^4 - A f(t) \tilde{x}], with barrier height E_b = 13 k_B T and well separation x_m = 2.5 μm. Erasure cycles consisted of four stages: lowering the barrier, tilting the potential to compress phase space, raising the barrier, and untilting, with total cycle time τ varying from 0.5 s to 100 s.
Work was measured directly from feedback forces and particle displacements, yielding mean work ⟨W⟩ approaching k_B T \ln 2 ≈ 0.693 k_B T for full erasure (phase-space compression p = 1) in the slow limit (τ → ∞), and 0 for no erasure (p = 0.5). Statistical analysis showed Gaussian work distributions consistent with the bound, with asymptotic precision ±0.03 k_B T and systematic errors ±0.10 k_B T, verifying the principle without relying solely on ensemble averages. A control with reversible (no-compression) cycles dissipated negligible work, isolating the erasure cost. These initial experiments faced challenges from thermal noise and measurement precision, as the Landauer bound represents only about 0.018 eV per bit at room temperature, requiring long integration times (up to hundreds of cycles) and high-resolution tracking to resolve dissipation above noise floors. Despite this, they provided the first direct thermodynamic evidence of the bound, resolving long-standing debates on whether information erasure inevitably produces measurable heat and affirming the physical reality of the information-entropy equivalence.
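The finite-time scaling ⟨Q⟩ ≈ k_B T \ln 2 + B/t reported in the 2012 experiment can be sketched as follows; the constant B here is an arbitrary illustrative value, not a fitted experimental parameter:

```python
# Sketch: mean dissipated heat <Q> = k_B*T*ln 2 + B/t approaches the
# Landauer bound only in the quasi-static limit t -> infinity.
# B is chosen purely for illustration.
import math

K_B = 1.380649e-23
T = 300.0
BOUND = K_B * T * math.log(2)  # Landauer bound, J
B = 10.0 * BOUND               # assumed relaxation constant, J*s

def mean_heat(cycle_time_s: float) -> float:
    """Model of mean dissipated heat (J) for an erasure cycle of length t."""
    return BOUND + B / cycle_time_s

for t in (1, 10, 100):
    print(f"t = {t:>3} s: <Q> / (k_B T ln 2) = {mean_heat(t) / BOUND:.2f}")
```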

Modern Measurements and Confirmations

In 2016, researchers conducted an experimental test of Landauer's principle using nanoscale magnetic memory bits, specifically focusing on the erasure process in a single nanomagnetic bit via adiabatic reset operations. The experiment involved nanomagnets at room temperature (300 K), where the measured energy dissipation for erasing one bit was approximately 0.026 eV, or about 1.5 times the Landauer bound of k_B T \ln 2 \approx 0.017 eV. This demonstration confirmed the principle under ambient conditions, with dissipation quantified through magneto-optical Kerr effect (MOKE) measurements of hysteresis loop areas. A significant quantum confirmation came in 2018 through a single-atom experiment employing a trapped ultracold ^{171}Yb^+ ion as a qubit in a Paul trap, implementing erasure of information via a controlled interaction with a reservoir. The setup used control pulses to manipulate the ion's internal states, achieving an average dissipation of (0.99 \pm 0.07) k_B T \ln 2 per erasure cycle in a non-equilibrium reservoir decoupled from a standard thermal bath. This verified a quantum extension of Landauer's principle, showing equality between the increase in the reservoir's entropy and the heat released to the reservoir, even for superpositions of logical states. Post-2020 advancements have pushed closer to the bound in solid-state systems. In 2022, an experiment with a gate-defined quantum dot in silicon demonstrated minimally dissipative erasure using geometric control protocols, achieving heat dissipation as low as 1.2 k_B T \ln 2 on average, with erasure energies around 10^{-20} J, by optimizing the erasure path in parameter space. This approach leveraged adiabatic passages to reduce irreversibility, confirming the bound's applicability in devices relevant to scalable quantum technologies. In 2025, researchers experimentally probed Landauer's principle in the quantum many-body regime using a quantum field simulator based on ultracold Bose gases, verifying the bound's validity and uncovering additional details on its ramifications in complex interacting systems.
These modern verifications rely on advanced techniques for precise heat measurements. Nanoscale thermometry, such as scanning thermal microscopy (SThM), enables nanoscale resolution of heat dissipation in magnetic and electronic systems at room temperature, capturing local temperature gradients during erasure. Complementarily, fluctuation theorems—extensions of the second law accounting for stochastic fluctuations—allow extraction of average dissipation from statistical distributions of work in small-scale experiments, as applied in colloidal and trapped-ion setups to validate bounds with high fidelity. Recent work addresses key gaps in earlier proofs-of-concept, particularly room-temperature operation and scalability. The 2016 nanomagnetic and 2022 quantum dot experiments operate at or near 300 K, demonstrating practical relevance beyond cryogenic conditions, while extensions to multi-bit arrays in silicon-based dots show cumulative dissipation scaling linearly with bits erased, nearing the bound per bit without excess overhead. These confirmations underscore Landauer's principle as a viable constraint for future nanoscale and molecular information processors.
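The fluctuation-theorem analyses mentioned above can be illustrated with the Jarzynski equality ⟨e^{-W/kT}⟩ = e^{-\Delta F/kT}. This sketch recovers a free-energy difference of kT \ln 2 from synthetic Gaussian work samples (not experimental data), with work measured in units of kT:

```python
# Sketch: estimate dF from fluctuating work values via the Jarzynski
# equality. Work samples are synthetic Gaussian draws; for Gaussian
# work the equality requires mean W = dF + sigma^2 / (2 kT).
import math
import random

random.seed(0)
kT = 1.0               # work measured in units of k_B * T
dF_true = math.log(2)  # erasure: dF = kT * ln 2 in these units
sigma = 0.5
samples = [random.gauss(dF_true + sigma**2 / (2 * kT), sigma)
           for _ in range(200_000)]

dF_est = -kT * math.log(sum(math.exp(-w / kT) for w in samples) / len(samples))
print(f"estimated dF/kT = {dF_est:.3f} (true value {dF_true:.3f})")
```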

Challenges and Extensions

Theoretical Criticisms

Critics of Landauer's principle have raised concerns about the circularity inherent in its standard derivations, arguing that the proof presupposes the thermodynamic irreversibility it seeks to establish, thereby begging the question. For instance, philosopher of physics John D. Norton has contended that demonstrations of the principle typically invoke the second law of thermodynamics to link logical irreversibility to heat dissipation, rendering the argument non-independent and question-begging. This critique, echoed in later analyses, suggests that the principle may simply restate aspects of the second law rather than providing a novel bound on computational dissipation. Similar objections appeared in 2016 discussions, where the reliance on macroscopic thermodynamic assumptions was seen as undermining the principle's foundational rigor for microscopic information processing. A related theoretical challenge involves the mismatch between logical reversibility and thermodynamic reversibility. Proponents of the principle, such as Charles Bennett, assert that logically reversible operations can be implemented without net entropy increase, but critics argue that this overlooks practical physical constraints like pre-existing correlations between the computational system and its environment or decoherence effects that enforce thermodynamic irreversibility regardless of logical design. These factors can introduce unavoidable dissipation even in operations intended to be reversible, highlighting a gap between abstract logical structure and physical realization. In 2010, physicists Takahiro Sagawa and Masahito Ueda proposed a refinement grounded in fluctuation theorems, demonstrating that while the Landauer bound holds on average for erasure under nonequilibrium conditions, rare fluctuations can produce individual realizations in which dissipation falls below kT \ln 2 per bit, albeit with exponentially low probability. This probabilistic refinement suggests the principle's strictness applies statistically rather than universally to individual processes.
To address circularity concerns, alternative derivations have emerged that eschew direct appeals to the second law's irreversibility, instead relying on microscopic dynamics and stochastic thermodynamics. A notable proof by Artemy Kolchinsky and David Wolpert establishes the erasure cost using only probabilistic and dynamical assumptions, confirming the bound without presupposing macroscopic irreversibility. These approaches affirm that the principle holds in a statistical sense, as fluctuation theorems guarantee average compliance across ensembles of processes. The principle encountered early theoretical challenges in the decades after its publication, with some physicists questioning whether information erasure truly entailed physical dissipation independent of broader thermodynamic principles. Ongoing debates persist regarding the principle's validity in open systems, where uncontrolled information leakage to the surroundings could disrupt the precise entropy accounting required for the bound, necessitating refined formulations to capture such dynamics.

Quantum and Broader Generalizations

In the quantum domain, Landauer's principle extends to qubits, where the minimal energy cost for erasing one logical bit remains k_B T \ln 2, with k_B the Boltzmann constant and T the temperature, but the bound applies more generally to the von Neumann entropy S = -\Tr(\rho \log_2 \rho) for mixed quantum states described by density operator \rho. This generalization accounts for quantum coherence and entanglement, allowing the erasure cost to depend on the system's entropy reduction rather than just classical bit flips. Recent experimental advances from 2018 to 2023 have verified this quantum Landauer bound using superconducting qubits. In one study, researchers implemented thermal-fluctuation-driven erasure on a superconducting memory cell, demonstrating heat dissipation consistent with the bound for logical bit reset in a nonequilibrium setting. Another experiment achieved near-Landauer-bound performance in quantum spintronic systems using single spins as qubits. Theoretical work in 2020 explored sub-Landauer erasure costs in systems with negative temperatures, where population inversion enables entropy decreases without equivalent heat dissipation, potentially relaxing the bound under specific quantum conditions. In 2025, an experiment in quantum many-body systems further confirmed the principle's applicability in complex nonequilibrium settings. Beyond computing, Landauer's principle applies to broader fields. In biology, DNA replication in cells incurs energy costs roughly two orders of magnitude above the Landauer bound per bit, reflecting the thermodynamic overhead of cellular copying machinery. In nanotechnology, molecular motors in electronic junctions can be driven by the Landauer blowtorch effect, where information gradients produce directed motion with minimal dissipation near the bound. Links to black hole physics emerge in the black hole information paradox, where holographic principles suggest entropy bounds akin to Landauer's for information erasure, with recent analyses tying black hole area quantization to informational energy costs. Further generalizations address continuous variables and non-equilibrium systems.
For analog computing with continuous states, the bound scales with changes in differential entropy, providing a thermodynamic limit for real-valued information processing. In driven systems, the Jarzynski equality connects work fluctuations to free-energy differences, recovering the Landauer bound for erasure even in finite-time irreversible protocols. Looking ahead, these extensions imply significant challenges for quantum computing energy efficiency, as measurement-induced collapse effectively erases superposition information, incurring at least the Landauer cost per collapsed bit and limiting scalable fault-tolerant operations.
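The route from the Jarzynski equality back to Landauer's bound can be sketched in a few lines (a standard textbook-style derivation, assuming a symmetric double-well memory with no internal energy change on erasure):

```latex
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}
\quad\Longrightarrow\quad
\langle W \rangle \ge \Delta F
\qquad \text{(Jensen's inequality, } \beta = 1/k_B T\text{)}
\\[4pt]
\Delta F_{\text{erase}} = \Delta U - T\,\Delta S
= 0 - T\,(-k_B \ln 2) = k_B T \ln 2
\\[4pt]
\therefore \quad \langle W \rangle \ge k_B T \ln 2
```

The entropy change is -k_B \ln 2 because erasure compresses two equally likely memory states into one, and the inequality holds on average over work fluctuations even for fast, irreversible protocols, which is why the fluctuation-theorem framework recovers Landauer's bound in finite time.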
