
Hidden-variable theory

Hidden-variable theory refers to a class of physical theories that seek to explain the probabilistic nature of quantum predictions through the introduction of additional, unobserved variables that determine definite outcomes for measurements, thereby restoring determinism and realism to the theory. These theories emerged as a response to perceived incompleteness in standard quantum mechanics, particularly following the 1935 Einstein-Podolsky-Rosen (EPR) argument, which posited that quantum mechanics cannot provide a complete description of physical reality because it allows for correlations between distant particles that seem to violate locality without underlying mechanisms. The foundational motivation for hidden-variable theories traces back to early debates on the interpretation of quantum mechanics, where Einstein and collaborators argued that the theory's statistical predictions imply the existence of "elements of reality" not captured by the wave function alone. In 1952, David Bohm proposed a specific hidden-variable model, now known as Bohmian mechanics, in which particles follow definite trajectories guided by the quantum wave function, which evolves according to the Schrödinger equation; this approach reproduces all empirical predictions of standard quantum mechanics while introducing particle positions as hidden variables. Bohm's formulation demonstrates that hidden-variable theories can be nonlocal, allowing instantaneous influences between entangled particles, thus avoiding conflicts with quantum predictions in certain ways. However, hidden-variable theories face significant challenges from no-go theorems. John Bell's 1964 theorem derives inequalities that must hold for any local realistic hidden-variable theory but are violated by quantum mechanics, as confirmed by numerous experiments showing correlations exceeding these limits by many standard deviations. Additionally, the 1967 Kochen-Specker theorem proves that non-contextual hidden-variable theories—those assigning definite values to observables independently of measurement context—are impossible in Hilbert spaces of dimension greater than or equal to three, further restricting viable models. Despite these constraints, nonlocal variants like Bohmian mechanics remain consistent with quantum predictions and continue to influence research in quantum foundations.

Core Concepts

Definition and Classification

Hidden-variable theories seek to provide a deterministic foundation for quantum mechanics by introducing additional parameters, known as hidden variables, which are not accounted for in the standard quantum description but would, if known, eliminate the apparent randomness of measurement outcomes. These variables are posited to underlie the probabilistic predictions of quantum mechanics, offering a more complete specification of physical systems akin to how underlying molecular motions explain macroscopic statistical behavior in classical statistical mechanics. In essence, a successful hidden-variable theory would restore predictability by determining outcomes exactly once the hidden variables are included, while still reproducing all empirical predictions of quantum mechanics. A key prerequisite for understanding hidden-variable theories is the probabilistic framework of quantum mechanics itself, particularly the Born rule, which interprets the square of the modulus of the wave function's coefficient for a given eigenstate as the probability of measuring that outcome upon observation. This rule, formulated in the early development of quantum theory, assigns probabilities to measurement results based solely on the quantum state, denoted typically as the wave function ψ. Hidden-variable theories extend this by incorporating supplementary variables, often collectively labeled λ, such that the actual outcome of a measurement is a deterministic function of both ψ and λ, thereby converting quantum probabilities into certainties conditional on knowledge of λ. Hidden-variable theories are classified along several dimensions, with locality and contextuality being the most prominent. Local theories restrict the influence of hidden variables to events within light cones, ensuring compatibility with relativistic causality by preventing faster-than-light signaling or instantaneous action at a distance; in contrast, nonlocal theories permit correlations that transcend such spatial separations. Independently, theories are deemed contextual if the assignment of values to observables depends on the specific measurement apparatus or context selected, or noncontextual if these assignments remain invariant regardless of the compatible measurements performed alongside them. This dual classification highlights the trade-offs in attempting to reconcile quantum predictions with classical intuitions of determinism and locality.
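This conversion of Born-rule probabilities into deterministic outcomes is easy to illustrate for a single qubit. The sketch below implements the standard toy model for one spin measurement (the kind of single-particle construction Bell noted was always possible): λ is drawn uniformly from [0, 1], and the outcome at angle θ to the preparation axis is +1 exactly when λ < cos²(θ/2). The sampling convention and variable names are illustrative choices, not part of any particular published formulation.

```python
import numpy as np

def outcome(theta, lam):
    """Deterministic outcome as a function of (psi, lambda):
    +1 exactly when the hidden variable falls below the Born weight cos^2(theta/2)."""
    return 1 if lam < np.cos(theta / 2) ** 2 else -1

rng = np.random.default_rng(0)
theta = np.pi / 3                       # angle between preparation and measurement axes
lams = rng.uniform(0.0, 1.0, 100_000)   # one hidden variable per repetition
mean = np.mean([outcome(theta, lam) for lam in lams])

print(f"empirical <A> = {mean:+.4f}")            # ~ +0.5
print(f"Born rule <A> = {np.cos(theta):+.4f}")   # cos^2(t/2) - sin^2(t/2) = cos(theta)
```

Each run is fully determined by (θ, λ), yet averaging over λ reproduces ⟨A⟩ = cos θ; the no-go theorems discussed later show that this strategy cannot be extended both locally and noncontextually to entangled multi-particle systems.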

Local versus Nonlocal Theories

In local hidden-variable theories, the outcomes of measurements on a system are determined solely by local hidden variables λ associated with that system, ensuring that distant measurements do not influence local results, thereby upholding the no-signaling principle. These theories are inherently compatible with special relativity, as they prohibit superluminal influences and maintain Einstein causality, where effects propagate at or below the speed of light. A hypothetical example is a local realistic model where λ encodes complete information about particle properties, such as predetermined spin values, allowing predictions without reference to remote events. Nonlocal hidden-variable theories, in contrast, permit hidden variables to correlate outcomes across spatially separated systems instantaneously, accommodating superluminal influences that resolve apparent paradoxes in quantum predictions. While these theories can deterministically reproduce quantum mechanics' statistical results, they challenge relativistic causality by introducing action at a distance, though they often avoid violating causality outright through the no-signaling condition, which prevents usable communication. For instance, in entangled systems, nonlocal variables might enforce perfect anticorrelations in measurement outcomes regardless of separation, explaining entanglement without probabilistic indeterminism. The key implications of this distinction lie in their treatment of locality and causality: local theories preserve the foundational principles of Einsteinian physics by confining influences to light cones, fostering a framework free of paradoxical signaling. Nonlocal theories, however, necessitate a reevaluation of causality, potentially requiring modifications to relativistic frameworks or acceptance of acausal elements, yet they successfully match empirical quantum data where local models falter. Conceptually, local theories offer intuitive compatibility with classical intuitions and determinism but encounter stringent constraints from quantum violations of locality assumptions, rendering viable models scarce. Nonlocal theories evade such constraints by embracing extended influences, providing deterministic underpinnings to quantum mechanics at the cost of introducing non-relativistic features like instantaneous correlations.

Historical Development

Early Motivations and Einstein-Bohr Debates

The development of hidden-variable theories in quantum mechanics was profoundly influenced by the philosophical debates between Albert Einstein and Niels Bohr during the late 1920s, particularly at the 1927 Solvay Conference on Electrons and Photons. Einstein, dissatisfied with the probabilistic nature of the newly formulated quantum mechanics, argued that it represented an incomplete description of physical reality, necessitating additional "hidden" parameters to restore determinism and objective realism. In a letter to Max Born dated December 4, 1926, Einstein famously expressed this view by stating, "God does not play dice with the universe," critiquing the inherent randomness introduced by Werner Heisenberg's uncertainty principle and emphasizing his belief in a complete, causal theory underlying quantum phenomena. At the Solvay Conference, Einstein presented thought experiments challenging the completeness of quantum mechanics, proposing that unobserved elements of reality—such as precise particle positions—must exist independently of measurement to align with classical intuitions of locality and separability. Niels Bohr countered Einstein's position by defending the Copenhagen interpretation, which he had begun articulating in the mid-1920s, asserting that quantum mechanics was a complete and self-consistent framework for describing atomic phenomena. Bohr emphasized the principle of complementarity, whereby wave and particle aspects of quantum objects are mutually exclusive depending on the experimental arrangement, and highlighted the indispensable role of measurement in defining observable outcomes, rejecting the need for hidden variables as superfluous to the theory's predictive success. During the conference discussions, Bohr and Heisenberg declared quantum mechanics' formalism as finalized, with Bohr specifically arguing that attempts to introduce hidden parameters would undermine the theory's empirical foundations without resolving its conceptual tensions. This stance positioned the Copenhagen view as instrumentalist, treating quantum probabilities as fundamental rather than approximations of deeper deterministic processes. These debates underscored a core philosophical divide: Einstein's advocacy for realism, positing an objective reality governed by deterministic laws independent of observation, versus Bohr's instrumentalism, which accepted quantum mechanics' probabilistic predictions as the limits of knowable reality without invoking unobservable mechanisms. Einstein's realism demanded hidden variables to preserve determinism and the independence of distant systems, while Bohr viewed such additions as incompatible with the renunciation of classical descriptions in quantum contexts. This tension motivated early exploratory ideas, such as de Broglie's pilot-wave theory, first proposed in his 1924 thesis but elaborated at the 1927 Solvay Conference, where he suggested particles are guided by an accompanying wave carrying definite trajectories hidden from direct observation. Similarly, Erwin Schrödinger, in the late 1920s while developing wave mechanics, expressed initial interest in hidden parameters to reconcile wave functions with particle behavior, viewing them as potential supplements to address the apparent incompleteness of the probabilistic formalism.

EPR Paradox and Von Neumann's Theorem

In 1932, John von Neumann published an influential proof in his book Mathematical Foundations of Quantum Mechanics demonstrating that hidden-variable theories could not reproduce the statistical predictions of quantum mechanics. The theorem assumes that hidden variables determine the values of observables noncontextually—meaning the value of an observable is fixed independently of which compatible observables are measured simultaneously—and that the quantum probabilities arise from averaging over these hidden variables with statistical independence from the measurement process. Conceptually, von Neumann showed that for the expectation value of any observable A, represented as the trace \operatorname{Tr}(\rho A), where \rho is the density operator, no dispersion-free assignment of values to all observables (as required by hidden variables) could match the quantum predictions without violating these assumptions. This result was widely interpreted as ruling out hidden variables altogether, significantly discouraging research into such theories for nearly two decades. The EPR paradox emerged three years later in a 1935 paper by Albert Einstein, Boris Podolsky, and Nathan Rosen, which challenged the completeness of quantum mechanics by highlighting apparent inconsistencies in its treatment of entangled systems. They considered a pair of particles in a position-momentum entangled state, such that measuring the position of one particle instantaneously determines the position of the other, regardless of distance, implying what Einstein famously called "spooky action at a distance." EPR defined an "element of physical reality" as a quantity whose value can be predicted with certainty without disturbing the system, and argued that a complete theory must include variables representing all such elements; since quantum mechanics failed to specify hidden variables for both position and momentum simultaneously in this setup—due to the uncertainty principle—it must be incomplete. Niels Bohr responded promptly in a 1935 paper, defending the completeness of quantum mechanics by emphasizing the principle of complementarity and the unavoidable role of the measuring apparatus in defining quantum phenomena. He contended that EPR's criterion for physical reality overlooked the holistic character of quantum descriptions, where the specification of experimental conditions inherently limits the applicability of classical concepts like separability, thus resolving the apparent paradox without invoking hidden variables. Together, von Neumann's theorem and the EPR debate solidified the dominance of the Copenhagen view in the 1930s and 1940s, portraying hidden-variable approaches as fundamentally incompatible with established quantum predictions.

Bohm's Pilot-Wave Theory

In 1952, David Bohm independently rediscovered and reformulated Louis de Broglie's 1927 pilot-wave concept as a deterministic hidden-variable theory, responding to the prevailing view that quantum mechanics was complete and irreducibly probabilistic, as argued from John von Neumann's 1932 theorem. Motivated by Albert Einstein's critique of quantum theory's incompleteness—highlighted in the 1935 EPR paradox—Bohm sought to restore causality by positing additional variables beyond the wave function. This revival occurred amid growing dissatisfaction with the Copenhagen interpretation, and Bohm explicitly built upon de Broglie's idea of particles guided by an associated wave, which had been largely overlooked after its initial presentation at the 1927 Solvay Conference. At its core, Bohm's theory posits that particles possess definite positions and velocities at all times, serving as the hidden variables, while the quantum wave function acts as a pilot wave that governs their trajectories without collapsing during measurement. The guidance of particles arises through a novel quantum potential derived from the wave function, which introduces inherent nonlocality: changes in the wave function at one location instantaneously affect particle motions elsewhere, ensuring consistency with quantum predictions like interference patterns. This nonlocal structure allows the theory to reproduce all empirical results of standard quantum mechanics while providing an ontological picture of reality where particles follow continuous paths determined by initial conditions. Bohm detailed his ideas in two seminal papers published in Physical Review in 1952, the first outlining the general framework and the second addressing measurement processes. He and collaborators later extended the theory to incorporate particle spin in 1955, treating spin as additional hidden variables influencing trajectories via the wave function. Efforts toward relativistic formulations followed, with Bohm exploring compatible extensions in subsequent works, though full consistency with special relativity proved challenging due to the theory's nonlocality. Initially, Bohm's proposal faced sharp criticism from the physics community, primarily for its explicit nonlocality, which conflicted with the locality favored in relativistic theories and was seen as artificial. Prominent figures like Wolfgang Pauli dismissed it as untenable, and it gained little traction during the 1950s amid the dominance of the Copenhagen school. Over time, however, the theory earned appreciation for its ontological clarity, offering a realist alternative that avoids measurement-induced collapses and clarifies quantum phenomena through definite particle trajectories, influencing later hidden-variable research.

Bell's Theorem and Its Impact

In 1964, John Stewart Bell published a seminal theorem that addressed the foundational debate sparked by the Einstein-Podolsky-Rosen (EPR) paradox, proving that any local hidden-variable theory—assuming realism and locality—cannot fully reproduce the statistical predictions of quantum mechanics for entangled particles. Bell's result showed that such theories must satisfy specific inequalities derived from their assumptions, but quantum mechanics predicts correlations that violate these bounds in certain experimental setups involving spatially separated measurements on entangled systems. This theorem provided a clear, testable criterion to distinguish local realistic descriptions from quantum theory, transforming the EPR critique from a philosophical concern into an empirically verifiable question. A prominent formulation of Bell's inequality is the Clauser-Horne-Shimony-Holt (CHSH) version, which considers two possible measurement settings (A, A') for one particle and (B, B') for the other, with outcomes typically ±1. Under local realism, the expectation values satisfy \left| \langle AB \rangle + \langle AB' \rangle + \langle A' B \rangle - \langle A' B' \rangle \right| \leq 2, where \langle \cdot \rangle denotes the average over many trials. To derive this, one assumes hidden variables \lambda distributed according to some probability density \rho(\lambda), with local outcomes determined solely by the local setting and \lambda, such as A(a, \lambda) = \pm 1. The joint expectations are then integrals over \lambda, like \langle AB \rangle = \int \rho(\lambda) A(a, \lambda) B(b, \lambda) \, d\lambda, and algebraic manipulation yields the bound of 2. In contrast, for a maximally entangled state, quantum mechanics predicts a maximum violation of 2\sqrt{2} \approx 2.828 when settings are chosen at optimal angles. Bell's theorem rests on three key assumptions: locality, which requires that the outcome at one site depends only on the local measurement setting and shared hidden variables, without superluminal influences; realism, positing that physical properties have definite values independent of measurement; and measurement independence (or freedom-of-choice), ensuring that the choice of settings is uncorrelated with the hidden variables. Violations of the inequality thus imply that at least one assumption fails, ruling out local realistic hidden-variable models while allowing for nonlocal alternatives. The publication of Bell's theorem in 1964 marked a pivotal shift, prompting experimental efforts to resolve the debate empirically rather than philosophically. In 1982, Alain Aspect and collaborators conducted landmark photon-based experiments that confirmed quantum predictions, observing CHSH correlations exceeding 2 by several standard deviations and effectively closing the locality loophole through rapid switching of analyzers. These results solidified quantum mechanics' empirical superiority over local hidden-variable theories, influencing subsequent research in quantum foundations and information science.
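Both halves of this comparison can be verified in a few lines. The sketch below is a minimal illustration (angles and names chosen here, not taken from Bell's paper): it brute-forces every deterministic local strategy to confirm the bound of 2, then evaluates the singlet correlation E(a, b) = −cos(a − b) at the standard optimal settings to recover 2√2. Since mixtures of deterministic strategies cannot exceed the deterministic maximum, checking the 16 extremal assignments suffices.

```python
import itertools
import numpy as np

# Local bound: every deterministic strategy assigns fixed +/-1 outcomes
# to the four measurements (A, A', B, B'); mixtures cannot exceed their max.
best_local = max(
    abs(A * B + A * Bp + Ap * B - Ap * Bp)
    for A, Ap, B, Bp in itertools.product((1, -1), repeat=4)
)
print("local deterministic bound:", best_local)     # -> 2

# Quantum singlet correlation E(a, b) = -cos(a - b) at the optimal angles.
a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
E = lambda s1, s2: -np.cos(s1 - s2)
S = abs(E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp))
print("quantum CHSH value:", S)                     # -> 2.828... = 2*sqrt(2)
```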

Bohmian Mechanics in Detail

Guiding Equation and Particle Trajectories

In Bohmian mechanics, the motion of particles is governed by a guiding equation that defines their velocities in terms of the wave function \psi, which evolves according to the Schrödinger equation. For a single particle of mass m, the position Q satisfies the first-order differential equation \frac{dQ}{dt} = \frac{\hbar}{m} \operatorname{Im} \left( \frac{\nabla \psi}{\psi} \right), where \hbar is the reduced Planck constant and \nabla is the gradient operator. This velocity field arises from expressing the wave function in polar form \psi = R e^{iS/\hbar}, with R = |\psi| and S real, yielding the particle velocity as \mathbf{v} = \frac{1}{m} \nabla S. The guiding equation ensures that particles follow well-defined trajectories at all times, with initial positions distributed according to the quantum probability density |\psi|^2. The deterministic dynamics incorporate a quantum potential (conventionally also written Q, not to be confused with the position above), which modifies the classical Hamilton-Jacobi equation to account for quantum effects. Defined as Q = -\frac{\hbar^2}{2m} \frac{\nabla^2 R}{R}, the quantum potential depends solely on the amplitude R of the wave function and influences particle acceleration through Newton's second law, \frac{d\mathbf{v}}{dt} = -\frac{1}{m} \nabla (V + Q), where V is the classical potential. This term introduces non-local influences, as Q reflects the global structure of \psi, enabling the theory to reproduce quantum interference without probabilistic collapse. For systems of N particles, the configuration space is 3N-dimensional, and each particle's velocity is determined by a collective guiding equation derived from the multi-particle wave function \psi(q_1, \dots, q_N), where q_k denotes the position of the k-th particle. The velocities are \frac{dq_k}{dt} = \frac{\hbar}{m_k} \operatorname{Im} \left( \frac{\nabla_{q_k} \psi}{\psi} \right), ensuring that the entire configuration evolves deterministically while satisfying the continuity equation \frac{\partial R^2}{\partial t} + \sum_k \nabla_{q_k} \cdot (R^2 \mathbf{v}_k) = 0. The configuration point thus traces a continuous path in this high-dimensional space, avoiding singularities and providing a complete specification of the system's state. An illustrative example is the double-slit interference experiment, where Bohmian trajectories demonstrate how particles passing through one slit are guided by the pilot wave propagating through both slits, resulting in deflections that cluster at interference fringes. Computations of these trajectories reveal that the quantum potential creates "channels" directing particles to bright regions, reproducing the probabilistic pattern observed in standard quantum mechanics without invoking wave-particle duality in measurement. Relativistic extensions of Bohmian mechanics include the Dirac-Bohm theory for spin-1/2 particles, which adapts the guiding equation to a spinor wave function satisfying the Dirac equation. Here, the velocity components are derived from the Dirac current j^\mu = \bar{\psi} \gamma^\mu \psi, with particle worldlines satisfying \frac{dx^\mu}{d\tau} \propto j^\mu / (\psi^\dagger \psi), incorporating spin via the wave function's structure while preserving Lorentz covariance in flat spacetime. This formulation addresses single-particle relativistic dynamics, though challenges arise for multi-particle interactions due to field-theoretic requirements.
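A minimal numerical sketch of such trajectory computations is given below, under simplifying assumptions: one transverse dimension, ħ = m = 1, and the two slits modeled as two initially separated free Gaussian packets whose closed-form evolution supplies ψ. Initial positions are sampled near each slit (a good approximation to |ψ₀|² when the packets barely overlap), and the guiding equation is integrated with plain Euler steps; all parameters are illustrative.

```python
import numpy as np

HBAR = M = 1.0
SIGMA, D = 1.0, 5.0            # packet width and slit half-separation (illustrative)

def packet(x, t, x0):
    """Free Gaussian packet centred at x0; s(t) is the complex width parameter."""
    s = SIGMA * (1 + 1j * HBAR * t / (2 * M * SIGMA**2))
    return np.exp(-(x - x0) ** 2 / (4 * SIGMA * s)) / np.sqrt(s)

def psi(x, t):
    """Two-slit wave function: equal superposition of packets at -D and +D."""
    return packet(x, t, -D) + packet(x, t, D)

def velocity(x, t, eps=1e-6):
    """Guiding equation v = (hbar/m) Im(psi'/psi); derivative by central difference."""
    dpsi = (psi(x + eps, t) - psi(x - eps, t)) / (2 * eps)
    return HBAR / M * np.imag(dpsi / psi(x, t))

rng = np.random.default_rng(1)
# Initial positions near each slit; with D = 5*SIGMA the packets barely overlap,
# so this approximates sampling from |psi_0|^2.
x = np.concatenate([rng.normal(-D, SIGMA, 500), rng.normal(D, SIGMA, 500)])

t, dt = 0.0, 0.01
for _ in range(3000):          # Euler integration of dx/dt = v(x, t) up to t = 30
    x = x + velocity(x, t) * dt
    t += dt

hist, _ = np.histogram(x, bins=40)
print(hist)                    # alternating dense/sparse bins: the fringe pattern
```

The histogram of final positions shows alternating dense and sparse bins, the one-dimensional analogue of the bright and dark fringes that full two-dimensional trajectory computations reproduce.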

Quantum Equilibrium Hypothesis

The quantum equilibrium hypothesis in Bohmian mechanics posits that the probability distribution of particle configurations is given by the squared modulus of the wave function, \rho = |\psi|^2, corresponding to the Born rule of standard quantum mechanics. This assumption applies to the initial conditions of the hidden particle positions, ensuring that the statistical predictions of the theory align with observed quantum probabilities despite its fully deterministic underlying dynamics. In this framework, the guiding equation determines individual particle trajectories, but the equilibrium hypothesis introduces the statistical layer necessary for empirical equivalence with quantum mechanics. The justification for this hypothesis draws an analogy to classical statistical mechanics, where equilibrium distributions such as the Maxwell-Boltzmann law emerge from underlying microscopic dynamics without being fundamental laws themselves. In Bohmian mechanics, subquantum processes—arising from the deterministic evolution of the wave function and particle velocities—drive the system toward this equilibrium state through mechanisms akin to ergodic relaxation, where long-time averages equal ensemble averages over typical initial configurations. This relaxation is supported by equivariance: if the initial distribution follows |\psi_0|^2, it remains |\psi_t|^2 at all later times t, preserving the Born-rule form under the Schrödinger evolution. Such arguments emphasize typicality, where the vast majority of possible initial states (weighted by |\psi|^2) yield quantum-like statistics, rendering deviations atypical and unobservable in practice. A key implication of the quantum equilibrium hypothesis is that it explains measurement outcomes deterministically, without invoking wave function collapse or observer-induced randomness, thereby resolving the measurement problem inherent in the standard interpretation. Under quantum equilibrium, the positions of particles in a measuring apparatus, such as a pointer, become effectively random according to |\psi|^2, producing the probabilistic results predicted by quantum mechanics while maintaining a clear ontology of particles guided by the universal wave function. This approach thus recovers all empirical content of quantum mechanics, including interference patterns and entanglement correlations, as emergent statistical phenomena from the underlying hidden-variable dynamics. Criticisms of the hypothesis center on the apparent fine-tuning required for the initial distribution to precisely match |\psi|^2, as deviations could lead to nonequilibrium states where predictions diverge from standard quantum mechanics. In such nonequilibrium scenarios, subquantum variances might manifest as observable anomalies, raising questions about the universality of quantum equilibrium and the theory's explanatory power for all physical systems. While proponents argue that cosmic evolution naturally enforces equilibrium through relaxation processes, the reliance on specific initial conditions remains a point of philosophical contention, potentially undermining the theory's claim to fundamental status.
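Equivariance can be exhibited numerically in one dimension. In the sketch below (illustrative parameters, ħ = m = 1), initial positions are sampled from |ψ₀|² for a free Gaussian packet centered at the origin, transported along the Bohmian velocity field, and compared with the analytically known width of |ψ_t|²; the empirical spread tracks σ(t) = σ₀√(1 + (ħt/2mσ₀²)²), as equivariance requires.

```python
import numpy as np

HBAR = M = 1.0
S0 = 1.0                                  # initial width sigma_0 of the packet

def velocity(x, t):
    """Bohmian velocity field of a free Gaussian packet centred at the origin,
    v = (hbar/m) Im(psi'/psi) = hbar x tau / (2 m sigma_0^2 (1 + tau^2))."""
    tau = HBAR * t / (2 * M * S0**2)
    return HBAR * x * tau / (2 * M * S0**2 * (1 + tau**2))

rng = np.random.default_rng(0)
x = rng.normal(0.0, S0, 200_000)          # sample |psi_0|^2 = N(0, sigma_0^2)

t, dt = 0.0, 0.001
for _ in range(5000):                     # Euler integration up to t = 5
    x = x + velocity(x, t) * dt
    t += dt

tau = HBAR * t / (2 * M * S0**2)
print("empirical std:", x.std())                   # ~ 2.69
print("predicted std:", S0 * np.sqrt(1 + tau**2))  # width of |psi_t|: 2.6926
```

Because each trajectory scales as x(t) = x₀√(1 + τ²), an initial |ψ₀|² ensemble is carried exactly onto |ψ_t|², which is the equivariance property in miniature.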

Challenges and No-Go Theorems

Flaws in Early No-Go Proofs

In 1932, John von Neumann published a proof purporting to demonstrate the impossibility of hidden variables in quantum mechanics, arguing that quantum expectation values could not be expressed as averages over dispersion-free states without violating the additivity postulates of the theory. The core flaw in von Neumann's proof lies in its assumption that hidden variables must assign definite, preexisting values to all observables simultaneously, even those that are incompatible (i.e., do not commute), thereby enabling an additive expectation-value assignment across the entire set of observables. This assumption is invalid because incompatible observables cannot be simultaneously measured or assigned values without context dependence in quantum systems. David Bohm exposed this limitation in 1952 by constructing an explicit nonlocal hidden-variable theory—now known as Bohmian mechanics—that reproduces all quantum predictions while violating von Neumann's additivity condition for noncommuting observables. Andrew Gleason further clarified the issue in 1957, showing through a measure-theoretic approach that von Neumann's error stemmed from an overly restrictive notion of noncontextuality, though Gleason's own result strengthened the case against certain hidden-variable models. Gleason's theorem states that, for any Hilbert space of dimension three or greater, there exists no noncontextual probability measure on the closed subspaces (projection operators) that is additive over orthogonal decompositions and satisfies the required boundary conditions, except for the standard quantum measures derived from density operators. This result effectively rules out noncontextual hidden-variable theories for systems described in such spaces, such as a spin-1 particle, because any attempt to preassign values to all possible observables independently of measurement context leads to inconsistencies with quantum probabilities. However, Gleason's theorem explicitly allows for contextual hidden-variable models, where the value assigned to an observable depends on the specific compatible set (context) in which it is measured, thus preserving the possibility of determinism if contextuality is permitted. Subsequent early no-go proofs built on these ideas but shared similar assumptions about noncontextuality. The 1963 theorem by Josef Maria Jauch and Constantin Piron attempted to exclude hidden variables by analyzing the orthomodular lattice of quantum propositions, claiming that no two-valued homomorphism could embed the quantum lattice into a classical Boolean structure without contradiction. This proof, however, relied on the same noncontextual valuation of all observables and was refuted for overlooking contextual assignments that could still reproduce quantum outcomes, as later highlighted in critiques emphasizing its incomplete coverage of hidden-variable possibilities. Similarly, the 1967 Kochen-Specker theorem provided a concrete proof of noncontextuality's impossibility by constructing a finite set of 117 directions (rays) in three-dimensional Hilbert space (for spin-1 systems) where any noncontextual value assignment—predetermining 0 or 1 outcomes for the corresponding projections independently of context—leads to a logical contradiction due to overconstrained functional relations. Collectively, these early theorems targeted noncontextual hidden variables, assuming that physical properties have definite values prior to and independent of measurement, irrespective of the experimental arrangement. By focusing on this assumption, they left open avenues for contextual hidden-variable theories, which assign values based on the measurement setup, or nonlocal ones that violate locality but maintain determinism.
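The flavor of such an obstruction can be reproduced with a far smaller construction than the original 117 rays: the Peres-Mermin square arranges nine two-qubit Pauli observables so that each row of operators multiplies to +I while the columns multiply to +I, +I, and −I. A noncontextual assignment would give each observable one fixed value ±1 satisfying all six product constraints simultaneously; the brute-force sketch below (an illustration of the general idea, not the 1967 construction itself) checks all 2⁹ candidate assignments and finds none.

```python
import itertools

# Peres-Mermin square; indices 0..8 label the nine two-qubit observables
#   X⊗I  I⊗X  X⊗X
#   I⊗Y  Y⊗I  Y⊗Y
#   X⊗Y  Y⊗X  Z⊗Z
# The operator algebra forces each row product to +I and the column
# products to +I, +I, -I, yielding six constraints that a noncontextual
# value assignment v: {0..8} -> {+1, -1} would have to satisfy at once.
contexts = [
    ((0, 1, 2), +1), ((3, 4, 5), +1), ((6, 7, 8), +1),   # rows
    ((0, 3, 6), +1), ((1, 4, 7), +1), ((2, 5, 8), -1),   # columns
]

solutions = [
    v for v in itertools.product((1, -1), repeat=9)
    if all(v[i] * v[j] * v[k] == sign for (i, j, k), sign in contexts)
]
print("consistent noncontextual assignments:", len(solutions))   # -> 0
```

The impossibility also follows from parity: multiplying all six constraints counts each observable twice, so the product of the six context signs would have to be +1, whereas the square's operator algebra fixes it at −1.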

Implications of Bell's Inequality Violations

The experimental violations of Bell's inequalities have profound theoretical implications for hidden-variable theories, fundamentally challenging the notion of local realism in quantum mechanics. These violations demonstrate that no local hidden-variable theory can fully reproduce the probabilistic predictions of quantum mechanics, as such theories would necessarily satisfy the inequalities derived from locality and realism assumptions. Instead, the results confirm either nonlocality—where distant events can instantaneously influence one another—or the abandonment of realism, the idea that physical properties have definite values independent of measurement. To reconcile these findings without accepting nonlocality, alternative frameworks have been proposed, though they remain controversial and require significant departures from standard assumptions. Superdeterminism posits that the choices of measurement settings are not independent but correlated with hidden variables due to initial conditions of the universe, effectively eliminating free choice in experiments. Retrocausality suggests that future measurement outcomes can influence past states, allowing for apparent nonlocal correlations while preserving locality in forward time. Objective collapse models, which introduce spontaneous modifications to the wave function's evolution, offer another route by altering the underlying dynamics of quantum mechanics to avoid strict adherence to the standard theory's predictions. In response to these implications, alternative interpretations have gained prominence as viable options that do not rely on local hidden variables. The many-worlds interpretation accommodates Bell violations by positing that all possible measurement outcomes occur in branching parallel universes, thereby avoiding the need for collapse or hidden variables altogether. The Copenhagen interpretation, emphasizing the role of measurement in realizing quantum probabilities, similarly sidesteps hidden variables by treating the wave function as a tool for prediction rather than a complete description of reality. For hidden-variable theories to survive, they must incorporate nonlocality, as in Bohmian mechanics, or contextuality, where outcomes depend on the full measurement context rather than individual properties. Philosophically, the violations mark the definitive end of local hidden-variable theories as a realistic alternative to quantum mechanics, shifting the foundational debate toward accepting quantum weirdness or exploring radical revisions to causality and measurement independence. This rejection has significantly boosted quantum information theory, underpinning the development of technologies like quantum cryptography and quantum computing that exploit entanglement's nonlocal correlations for practical applications. Empirically, these implications were strengthened by loophole-free Bell tests in 2015, which closed major experimental gaps such as the detection and locality loopholes. The experiment by Hensen et al. used entangled electron spins separated by 1.3 kilometres, achieving a CHSH value of S = 2.42 \pm 0.20, exceeding the local realist bound of 2. The Giustina et al. test with entangled photons reported a violation of the CH-Eberhard inequality with J = 7.27 \times 10^{-6} (11.5 standard deviations), ensuring space-like separation and high detection efficiency. These 2015 experiments, building on earlier work, contributed to the 2022 Nobel Prize in Physics awarded to John F. Clauser, Alain Aspect, and Anton Zeilinger for experiments with entangled photons establishing the violation of Bell inequalities and pioneering quantum information science. Further loophole-free tests, such as those using superconducting circuits in 2023, have continued to confirm these violations.

Recent Developments

Advances in Experimental Tests

Significant advances in experimental tests of hidden-variable theories occurred between 2015 and 2020, with several groups achieving loophole-free violations of Bell's inequalities. In 2015, researchers at Delft University of Technology demonstrated a loophole-free Bell test using entangled electron spins in diamond, simultaneously closing the detection, locality, and freedom-of-choice loopholes by ensuring high detection efficiency (>82%), spacelike separation of measurements (1.3 μs light-travel time), and random measurement settings generated 400 meters away. Concurrently, teams at NIST and the University of Vienna reported similar achievements with entangled photons, attaining detection efficiencies around 75-90% and confirming violations with values up to 2.42, far exceeding the classical bound of 2. These experiments ruled out local realistic hidden-variable models within the tested parameter spaces, providing strong empirical support against hidden variables. Subsequent experiments through 2025 have confirmed and extended these results with improved precision and diverse systems. For instance, a 2023 experiment using superconducting circuits achieved a loophole-free CHSH violation of 2.075 ± 0.003, incorporating cryogenic detectors with near-unity efficiency and sub-nanosecond timing to further tighten locality constraints. Additional confirmations in photonic and atom-based platforms have pushed violation significances beyond 100 standard deviations, maintaining closure of all major loopholes while scaling to longer distances (up to 50 km) and faster repetition rates (>10 MHz). Recent theoretical expansions have identified potential additional loopholes in standard Bell tests that local hidden-variable theories might exploit. A 2025 analysis in the National High School Journal of Science reviewed and proposed new loopholes related to finite measurement statistics and assumption relaxations, suggesting that certain local hidden-variable models could mimic quantum correlations under incomplete sampling. In response, modified Bell-test protocols have been proposed to probe contextual hidden variables, adapting standard setups to detect context-dependent outcomes in measurements without relying on noncontextual assumptions. Efforts to test the freedom-of-choice loophole—questioning whether hidden variables could influence measurement settings—have advanced with experiments using cosmic sources for randomization. Reported in 2025, these tests employed light from sources billions of light-years away to select measurement bases, ensuring causal independence and yielding Bell violations consistent with quantum predictions, with no evidence of hidden signaling. Overall, these advances have found no evidence supporting local hidden variables, with quantum mechanics consistently prevailing. Growing precision in entanglement swapping protocols and multipartite systems—such as entangled measurements on W states—has further constrained hidden-variable possibilities, enabling applications in quantum networks.

Theoretical Innovations and Compatibility Claims

Recent theoretical advancements in hidden-variable theory (HVT) have explored the integration of machine learning to construct local hidden-variable (LHV) models that replicate quantum statistics for complex entangled systems. In a 2025 study published in PRX Quantum, researchers developed a gradient-descent-based algorithm inspired by machine learning techniques to discover LHV models for arbitrary multipartite entangled states and measurements, demonstrating that such models can reproduce quantum correlations without invoking nonlocality for certain configurations. This approach addresses longstanding challenges in determining the locality of quantum states by optimizing hidden-variable distributions to match experimental predictions, offering a computational pathway to test HVT viability beyond traditional analytical methods. Compatibility between local HVT and quantum mechanics (QM) has been further examined through proofs that reconcile the two under constrained assumptions, such as restricted measurement contexts or initial conditions. Analyses propose that LHV theories can align with QM outcomes by incorporating specific correlations in hidden variables that preserve statistical equivalence, particularly for systems where observer choices are not fully independent. These proofs highlight how assumptions like superdeterminism—where all events, including measurement settings, are predetermined—can salvage local realism, though they remain debated for implying a form of cosmic conspiracy. Similarly, retrocausal models, positing future influences on past events, have been advanced as alternatives to nonlocality, with theoretical frameworks showing consistency with QM while maintaining locality in hidden variables. Innovations in contextual and deconfined HVT models emphasize variables that depend on measurement contexts or exhibit nonlocal yet non-signaling behaviors. A 2025 proposal in Foundations of Physics outlines a theoretical framework for contextual hidden variables, where outcomes vary based on the full experimental setup, potentially detectable through modified Bell-test analyses without violating no-go theorems. Complementing this, a Quantum journal paper argues that features like the measurement problem and the Wigner's friend paradox arise from deeper non-quantum structure rather than inherent quantumness, proposing hidden orders—such as deterministic particle positions in Bohmian-like extensions—as non-fundamental explanations that trace quantum weirdness back to classical underpinnings. Ongoing reviews underscore the progress and remaining gaps in HVT, including integrations of AI for model construction and resolutions to loopholes like free will in Bell tests. A 2024 examination from a statistical mechanics perspective reviews HVT developments, noting how AI-driven optimizations bridge analytical limitations and address incompleteness in traditional formulations, such as accommodating observer independence without superdeterminism. Debates persist on superdeterminism and retrocausality as viable salvages for HVT, with taxonomies classifying them as extensions beyond standard QM while emphasizing their testability through refined theoretical predictions. These efforts collectively aim to update HVT frameworks, filling conceptual voids in reconciling determinism with quantum phenomena.
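To make the gradient-descent idea concrete, the following sketch is a toy version of such an approach (simplified assumptions throughout; it is not the PRX Quantum implementation): for two parties with two settings each, the hidden variable ranges over a small finite set, its distribution is parametrized by a softmax and the local ±1 responses by tanh-squashed weights, and a squared-error loss against target correlations (here a noisy singlet with visibility 0.5, safely inside the local set) is minimized with finite-difference gradients.

```python
import numpy as np

# Target statistics: singlet correlations at the CHSH angles, scaled to
# visibility 0.5 (CHSH value sqrt(2) < 2, so a local model must exist).
A_ANGLES, B_ANGLES = [0.0, np.pi / 2], [np.pi / 4, -np.pi / 4]
TARGET = np.array([[-0.5 * np.cos(a - b) for b in B_ANGLES] for a in A_ANGLES])

N = 8  # the hidden variable lambda takes N discrete values

def correlations(params):
    """E(x, y) = sum_lam p(lam) A(x, lam) B(y, lam) for the parametrized model."""
    w, ra, rb = params[:N], params[N:3 * N], params[3 * N:]
    p = np.exp(w) / np.exp(w).sum()       # softmax: valid distribution over lambda
    A = np.tanh(ra).reshape(2, N)         # local responses squashed into [-1, 1]
    B = np.tanh(rb).reshape(2, N)
    return (A * p) @ B.T

def loss(params):
    return ((correlations(params) - TARGET) ** 2).sum()

rng = np.random.default_rng(0)
params = rng.normal(0.0, 0.5, 5 * N)
for _ in range(2000):                     # gradient descent, finite-difference grads
    grad = np.zeros_like(params)
    for i in range(params.size):
        d = np.zeros_like(params)
        d[i] = 1e-5
        grad[i] = (loss(params + d) - loss(params - d)) / 2e-5
    params -= 0.5 * grad

print("final loss:", loss(params))        # ~ 0: an explicit LHV model was found
```

When the loss reaches zero, the optimizer has exhibited an explicit LHV model for the target statistics; for states outside the local set, a stubbornly nonzero residual is the numerical signature of nonlocality.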
