Bell's theorem

Bell's theorem is a foundational result in quantum physics, established by John Stewart Bell in 1964, which proves that no local hidden-variable theory—positing that quantum outcomes are determined by local factors and pre-existing variables—can fully replicate the probabilistic predictions of quantum mechanics for entangled particles. The theorem arises from the Einstein–Podolsky–Rosen (EPR) paradox of 1935, in which Albert Einstein, Boris Podolsky, and Nathan Rosen argued that quantum mechanics appeared incomplete due to its allowance for "spooky action at a distance" in entangled systems, prompting Bell to derive testable inequalities to resolve the debate. In his seminal paper "On the Einstein Podolsky Rosen Paradox," Bell assumed local causality—that measurement outcomes on one particle are independent of the distant settings on its entangled partner—and realism, leading to the inequality |P(\mathbf{a}, \mathbf{b}) - P(\mathbf{a}, \mathbf{c})| \leq 1 + P(\mathbf{b}, \mathbf{c}), where P denotes correlation functions for spin measurements along directions \mathbf{a}, \mathbf{b}, and \mathbf{c}. Quantum mechanics violates this bound, predicting correlations up to 2\sqrt{2} in variants like the CHSH inequality, implying inherent nonlocality in the theory. Experimental validations began with Freedman and Clauser's 1972 test showing a violation at 6.5σ confidence, followed by Alain Aspect's 1981–1982 experiments, which violated the inequalities by up to 40σ and addressed the locality loophole with time-varying analyzers, and loophole-free confirmations in 2015 by teams led by Ronald Hanson, Krister Shalm, and Marissa Giustina at high statistical significance. These results, culminating in the 2022 Nobel Prize in Physics awarded to Aspect, John Clauser, and Anton Zeilinger, have solidified Bell's theorem as a cornerstone of quantum information science, enabling applications in quantum cryptography and teleportation while challenging classical intuitions of locality and realism.

The Theorem

Statement

Bell's theorem asserts that no local hidden variable theory can reproduce all the predictions of quantum mechanics for systems of entangled particles. This result challenges the classical notion of local realism, which posits that physical properties are determined by local causes and that distant events cannot instantaneously influence one another. Named after physicist John Stewart Bell, who formulated it in 1964, the theorem arose in response to the Einstein-Podolsky-Rosen (EPR) paradox proposed in 1935, which questioned the completeness of quantum mechanics by highlighting apparent "spooky action at a distance" in entangled systems. The theorem is typically illustrated using pairs of spin-entangled particles prepared in a singlet state, where the total spin is zero, ensuring perfect anticorrelation in measurements along the same axis. Two distant observers, conventionally called Alice and Bob, each receive one particle and independently choose measurement settings, such as the angles a and b defining the directions of spin measurement relative to a reference axis. The outcomes of these spin measurements are recorded as +1 or -1, corresponding to spin up or down along the chosen direction. The correlation function E(a,b) quantifies the average agreement between Alice's and Bob's outcomes over many such entangled pairs, defined as the expectation value E(a,b) = \langle A(a) B(b) \rangle, where A(a) and B(b) are the results. Quantum mechanics predicts that these correlations, given by E(a,b) = -\cos(a - b) for the singlet state, exhibit strengths that exceed what any local hidden variable model—where outcomes are predetermined by hidden variables carried locally by each particle—can achieve. This discrepancy arises because quantum entanglement allows for correlations that defy classical intuitions of independent local influences, revealing a fundamental nonlocality in the quantum description of nature.
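
The quantum side of this comparison can be reproduced in a few lines. The sketch below, a minimal illustration assuming spin measurements confined to the x-z plane, builds the singlet state with NumPy and evaluates E(a,b) as the expectation value of the product of spin observables, recovering E(a,b) = -cos(a - b); the helper names are illustrative, not from any particular library.

```python
import numpy as np

# Pauli matrices
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

def spin_along(theta):
    """Spin observable along a direction at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>)/sqrt(2) in the {|00>,|01>,|10>,|11>} basis
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """Quantum correlation E(a,b) = <psi| (a.sigma) x (b.sigma) |psi>."""
    obs = np.kron(spin_along(a), spin_along(b))
    return np.real(singlet.conj() @ obs @ singlet)

for a, b in [(0.0, 0.0), (0.0, np.pi / 3), (0.0, np.pi / 2)]:
    print(f"E({a:.2f},{b:.2f}) = {E(a, b):+.3f}  vs  -cos(a-b) = {-np.cos(a - b):+.3f}")
```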

Assumptions

Bell's theorem is predicated on several foundational assumptions that characterize locally realistic theories, particularly those involving hidden variables. These assumptions collectively define the framework against which quantum mechanics' predictions are tested. The theorem demonstrates that no theory satisfying all these assumptions can reproduce the statistical correlations observed in experiments.

The first assumption is realism, which posits that physical observables possess definite values prior to measurement, independent of the measurement process itself. In the context of hidden variable theories, this means that for any observable, a pre-existing value exists that determines the outcome upon measurement. This idea stems from the Einstein-Podolsky-Rosen paradox, where realism implies that the properties of distant particles are well-defined and objective, without requiring simultaneous influence from measurement choices.

The second assumption is locality, which asserts that no influences or signals can propagate between spatially separated measurement events. Specifically, the outcome of a measurement on one particle cannot instantaneously affect the outcome on a distant particle; any correlations must arise from shared prior conditions rather than direct interaction. This preserves relativistic causality and prevents action-at-a-distance. To formalize these assumptions within hidden variable models, outcomes are determined by a shared hidden variable λ, which encodes all relevant information about the particle pair. For two distant parties, Alice and Bob, measuring observables with settings a and b respectively, the outcomes are functions A(a, λ) and B(b, λ), where A and B take values such as ±1, and λ is distributed according to some probability density ρ(λ). The joint probability for outcomes given settings is then P(A, B | a, b) = ∫ ρ(λ) δ(A - A(a, λ)) δ(B - B(b, λ)) dλ, ensuring that correlations depend only on λ and not on direct signaling between a and b. This setup assumes separability, where the individual outcome functions depend only on local settings and the shared λ.

The third assumption is freedom of choice, also known as measurement independence or statistical independence. It requires that the experimenters' choices of measurement settings (a and b) are free and uncorrelated with the hidden variables λ; that is, ρ(λ | a, b) = ρ(λ). This prevents any pre-established correlation between the hidden variables and the measurement selections, allowing experimenters genuine freedom in their decisions. Without this, the statistical predictions of the theory could be rigged to match quantum results artificially.

The fourth assumption is no-signaling, which stipulates that the marginal probability for one party's outcome is independent of the distant party's choice of setting. For instance, the probability for Alice's outcome should not depend on Bob's setting b: ∑_B P(A, B | a, b) = P(A | a). This ensures that no information or influence travels between the parties, maintaining consistency with relativity even in correlated systems. In local hidden variable models, this follows from locality but is explicitly required to rule out signaling paradoxes.

Violating any of these assumptions can potentially allow a theory to evade the conclusions of Bell's theorem while reproducing quantum predictions. For example, superdeterminism proposes that the universe is entirely deterministic, with measurement choices correlated with hidden variables from the outset, thus breaking measurement independence. Such models restore locality and realism but at the cost of an untestable global conspiracy in initial conditions.
Importantly, Bell's theorem specifically targets separable local hidden variable models, where outcomes are determined by local functions of settings and shared variables, as opposed to more general classical models that might incorporate non-separable or non-local elements. This distinction highlights the theorem's scope: it rules out local realism with hidden variables but does not preclude all classical explanations of quantum phenomena.
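
To make this formalization concrete, the sketch below implements a toy deterministic local hidden variable model of the kind Bell discussed: λ is a uniformly random angle shared by the pair, and each outcome is a fixed ±1 function of the local setting and λ alone. The specific outcome rules are illustrative assumptions; the model reproduces perfect anticorrelation at equal settings but only linear, weaker-than-quantum correlations at intermediate angles.

```python
import numpy as np

rng = np.random.default_rng(0)

def A(a, lam):
    """Alice's predetermined outcome: depends only on her setting and lambda."""
    return np.sign(np.cos(a - lam))

def B(b, lam):
    """Bob's predetermined outcome: depends only on his setting and lambda."""
    return -np.sign(np.cos(b - lam))

def E_lhv(a, b, n=200_000):
    """Monte Carlo estimate of E(a,b) = integral of A(a,l) B(b,l) rho(l) dl,
    with lambda uniform on [0, 2*pi)."""
    lam = rng.uniform(0, 2 * np.pi, n)
    return np.mean(A(a, lam) * B(b, lam))

for theta in [0.0, np.pi / 4, np.pi / 2]:
    print(f"theta={theta:.2f}: LHV E = {E_lhv(0.0, theta):+.3f}, "
          f"quantum -cos = {-np.cos(theta):+.3f}")
```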

Derivation of the Inequality

In a local realistic theory, the measurement outcomes for the first particle, denoted A(a, λ) = ±1, depend on the setting a and the hidden variable λ, while for the second particle, B(b, λ) = ±1 depends on setting b and the same λ. The hidden variables are distributed according to a probability density ρ(λ) with ∫ ρ(λ) dλ = 1. The correlation function is then given by E(a, b) = \int A(a, \lambda) B(b, \lambda) \rho(\lambda) \, d\lambda. This expectation value represents the average product of outcomes under local realism, where the outcomes are predetermined by λ independently of the distant measurement choice. To derive the inequality, consider the combination |E(a, b) - E(a, b')| + |E(a', b) + E(a', b')|. Under local realism, this quantity is bounded by 2. To see this, examine the expression S = E(a, b) - E(a, b') + E(a', b) + E(a', b') = ∫ [A(a, λ)(B(b, λ) - B(b', λ)) + A(a', λ)(B(b, λ) + B(b', λ))] ρ(λ) dλ. For each λ, since B(b, λ) and B(b', λ) are each ±1, one of the quantities B(b, λ) - B(b', λ) and B(b, λ) + B(b', λ) vanishes while the other equals ±2; the integrand therefore takes the value ±2 in every case, and its absolute value is at most 2. Thus |S| ≤ ∫ 2 ρ(λ) dλ = 2. The same case analysis gives |A(a, λ)(B(b, λ) - B(b', λ))| + |A(a', λ)(B(b, λ) + B(b', λ))| = 2 for each λ, since the terms are complementary (one is 0 and the other ±2). Thus, |E(a, b) - E(a, b')| + |E(a', b) + E(a', b')| ≤ ∫ [|A(a, λ)(B(b, λ) - B(b', λ))| + |A(a', λ)(B(b, λ) + B(b', λ))|] ρ(λ) dλ = 2. Any local realistic model must therefore satisfy |E(a, b) - E(a, b')| + |E(a', b) + E(a', b')| ≤ 2.

In quantum mechanics, for the singlet state of two spin-1/2 particles, the correlation function is E(a, b) = -\cos \theta, where \theta is the angle between the measurement directions a and b. This follows from the quantum prediction for the expectation value of the product of spin operators, \langle (\vec{\sigma}_1 \cdot \vec{a}) (\vec{\sigma}_2 \cdot \vec{b}) \rangle = -\vec{a} \cdot \vec{b} for the singlet state. The maximum violation occurs for specific choices of angles, typically a = 0^\circ, a' = 90^\circ, b = 45^\circ, b' = 135^\circ. Substituting these into the CHSH combination yields |E(a, b) - E(a, b')| + |E(a', b) + E(a', b')| = 2\sqrt{2} \approx 2.828 > 2, demonstrating that quantum mechanics exceeds the local realistic bound. This violation confirms that no local hidden variable theory can reproduce the quantum correlations for the singlet state.
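
Because the CHSH expression is linear in the outcome assignments, its extremes over all local hidden variable models are attained on deterministic strategies, of which there are only sixteen; any stochastic local model is a convex mixture of them. A brute-force check of the bound (a minimal sketch; the variable names are illustrative):

```python
from itertools import product

# Each deterministic local strategy assigns a +/-1 outcome to both settings
# on each side. Any local hidden-variable model is a probabilistic mixture
# of these 16 strategies, so the extreme values of S occur on this finite set.
best = 0.0
for Aa, Aap, Bb, Bbp in product([1, -1], repeat=4):
    S = Aa * Bb - Aa * Bbp + Aap * Bb + Aap * Bbp
    best = max(best, abs(S))
print("max |S| over deterministic local strategies:", best)  # prints 2
```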

Historical Development

Background: EPR and Local Realism

In 1935, Albert Einstein, Boris Podolsky, and Nathan Rosen published a seminal paper arguing that quantum mechanics, as formulated at the time, could not be considered a complete theory of physical reality. They focused on the phenomenon of entanglement, using the example of two particles in a shared quantum state where measuring the position of one instantaneously determines the position of the other, regardless of distance. This implied what Einstein famously called "spooky action at a distance," suggesting non-local influences that violated the principle of locality in special relativity. To challenge the completeness of quantum mechanics, they introduced the criterion that an "element of physical reality" exists if, in principle, it can be predicted with certainty without disturbing the system. Since quantum mechanics allowed such predictions for entangled particles without local interaction, they concluded that the theory must be incomplete, requiring additional "hidden variables" to fully describe reality. Central to the EPR argument was Einstein's commitment to local realism, a framework positing that physical properties are objectively real—independent of measurement—and that influences between distant events cannot propagate faster than light, adhering to locality. Einstein had long advocated for realism, viewing quantum mechanics' probabilistic nature and observer dependence as unsatisfactory for a fundamental theory. Niels Bohr, a key proponent of the Copenhagen interpretation, responded promptly, defending quantum mechanics by emphasizing the indivisibility of the experimental setup and the complementary nature of wave and particle descriptions, arguing that the EPR critique misunderstood the theory's epistemological limits. This exchange ignited philosophical debates throughout the 1930s and 1940s on whether quantum mechanics described an objective reality or merely predictive correlations, with figures like Schrödinger introducing the term "entanglement" to highlight the phenomenon.

Efforts to resolve the EPR paradox through hidden variable theories faced early setbacks. In 1932, John von Neumann published a proof claiming that no hidden variable theory could reproduce quantum mechanics' statistical predictions, based on assuming non-contextual hidden variables that assign definite values to all observables simultaneously. This no-go theorem discouraged further exploration for nearly two decades, though it was later shown to be flawed due to an unjustified assumption about the additivity of expectation values for non-commuting observables. In 1952, David Bohm revived interest by reformulating the EPR paradox with a concrete example: two spin-entangled particles where measuring one spin along any axis instantly correlates with the other's opposite outcome, again suggesting non-locality. Bohm proposed a nonlocal hidden variable theory, now known as Bohmian mechanics, where particle trajectories are guided by a pilot wave, allowing definite positions at all times but requiring instantaneous influences across space. These developments sustained the 1950s debates on quantum completeness, pitting realism against the apparent indeterminism of standard quantum theory. Bell's theorem later provided a rigorous test for local hidden variable theories in response to this challenge.

John Bell's Contribution

John Stewart Bell, a Northern Irish theoretical physicist, first developed his interest in the foundations of quantum mechanics during the early 1950s while employed at the Atomic Energy Research Establishment at Harwell, where his primary duties involved theoretical work on particle accelerators. Influenced by David Bohm's 1952 publication of a hidden-variable interpretation of quantum mechanics, Bell pursued philosophical inquiries into the theory's completeness alongside his accelerator research. This fascination with quantum foundations continued after he joined CERN in 1960, where he contributed to theoretical particle physics and accelerator design but reserved time for exploring unresolved conceptual issues in quantum theory. Motivated by the 1935 Einstein-Podolsky-Rosen (EPR) paradox, which argued that quantum mechanics' predictions for entangled particles implied either incompleteness or nonlocality, Bell aimed to make this debate experimentally decisive. In his groundbreaking 1964 paper, "On the Einstein Podolsky Rosen Paradox," published in Physics, he focused on the EPR-Bohm setup involving pairs of spin-1/2 particles in a singlet state and derived an inequality that local hidden-variable theories—those preserving locality and realism—must obey for correlations between distant measurements. Bell's formulation assumed local causality, where outcomes at one location depend only on local hidden variables and not on distant measurement choices, providing a quantitative test of the EPR intuition that quantum mechanics could not be completed by local realistic elements. Bell's central insight was to transform the qualitative EPR critique into a precise prediction testable with feasible experiments, highlighting that quantum predictions for certain correlations exceed the bounds set by local realism, thus favoring nonlocality. In later publications, including his 1971 lecture "Introduction to the Hidden Variable Question" presented at the International School of Physics "Enrico Fermi," Bell refined the assumptions of his original work, particularly emphasizing measurement independence—the freedom of experimenters to choose settings uncorrelated with hidden variables—as essential for the theorem's validity and experimental applicability. Throughout his career, Bell championed the acceptance of nonlocality as a feature of nature, as supported by anticipated violations of his inequality, while rejecting superdeterminism—a loophole violating measurement independence—as an unpalatable alternative requiring a "conspiracy" in which distant experimental choices are pre-correlated by underlying hidden variables. He viewed such conspiratorial setups as undermining the scientific enterprise by constraining experimental freedom, preferring instead to embrace the counterintuitive implications of quantum mechanics for distant influences.

Subsequent Formulations

In 1970, Eugene Wigner provided an alternative derivation of Bell's inequality using a probabilistic counting argument for the singlet state of two spin-1/2 particles, emphasizing the impossibility of hidden variables reproducing quantum correlations without invoking non-locality. This formulation simplified the proof by focusing on the joint probability distributions for measurements along different axes, making it more accessible for analyzing deterministic theories. Building on Bell's original work, John F. Clauser and Michael A. Horne formalized a probability-based inequality in 1974 that extended the testable predictions to scenarios involving imperfect detectors and arbitrary local realistic models. Their approach derived bounds on correlations without assuming perfect anticorrelation in the singlet state, laying the groundwork for experimental implementations while highlighting the need to account for measurement outcomes explicitly. This formulation, known as the Clauser-Horne inequality, proved particularly useful for bridging theoretical predictions with practical photon-based tests. In the mid-1970s, Bell himself refined his analysis to derive stronger bounds applicable to realistic experimental conditions, including low detector efficiencies that could otherwise allow hidden variable models to mimic quantum violations. In a 1976 analysis, he demonstrated that the original inequality requires near-100% detection efficiency for a conclusive test, prompting adjustments to ensure robustness against the "detection loophole" whereby undetected particles bias results. These refinements emphasized inequalities tolerant of inefficiencies below 100%, facilitating transitions to feasible experiments by clarifying minimal efficiency thresholds for conclusive tests. Initially, Bell's 1964 theorem received limited attention in the late 1960s due to perceived experimental inaccessibility, as atomic cascade sources and polarizers were not yet sufficiently advanced for reliable tests. However, the subsequent formulations by Wigner, Clauser-Horne, and Bell spurred growing interest throughout the 1970s, as they directly tackled practical issues like detector inefficiency and probability formalisms, enabling the first experiments and integrating Bell's ideas into mainstream research.

CHSH Inequality

The Clauser-Horne-Shimony-Holt (CHSH) inequality represents a practical formulation of Bell's theorem tailored for experimental verification in bipartite quantum systems, building on John Bell's original inequality by providing a testable bound on correlations without requiring perfect anti-correlations in specific settings. It assumes local realism, where outcomes are determined by predetermined local hidden variables, and is expressed in terms of expectation values of joint measurements on two parties, Alice and Bob, each choosing between two settings, a or a' for Alice and b or b' for Bob. The inequality states that under local realism, the absolute value of the combination of correlation functions satisfies |E(a,b) + E(a,b') + E(a',b) - E(a',b')| \leq 2, where E(x,y) denotes the expectation value of the product of outcomes A(x) B(y), with outcomes A, B = \pm 1. This bound arises from the assumption that measurement outcomes are locally determined by a shared hidden variable \lambda, distributed with density \rho(\lambda), such that A(x) = A(x, \lambda) and B(y) = B(y, \lambda). The correlations are then E(x,y) = \int A(x, \lambda) B(y, \lambda) \rho(\lambda) \, d\lambda, and the bound follows by considering the possible sign combinations: for fixed \lambda, the expression A(a,\lambda)B(b,\lambda) + A(a,\lambda)B(b',\lambda) + A(a',\lambda)B(b,\lambda) - A(a',\lambda)B(b',\lambda) can be grouped as [A(a,\lambda) + A(a',\lambda)] B(b,\lambda) + [A(a,\lambda) - A(a',\lambda)] B(b',\lambda); one bracket always vanishes while the other equals \pm 2, so the expression is bounded by 2 in absolute value, leading to the overall bound upon integration. An equivalent form in terms of joint probabilities, P_{11}(a,b) + P_{11}(a,b') + P_{11}(a',b) - P_{11}(a',b') - P_1(a) - P_1(b) \leq 0 (together with a symmetric lower bound), where P_{11} is the probability of both outcomes being +1 and P_1 the marginal probability of +1, directly incorporates undetected events by not assigning outcomes to them.

Compared to Bell's original inequality, the CHSH version offers key advantages for experimentation: it applies to general dichotomous observables without presupposing perfect anti-correlation for aligned settings, making it suitable for photonic implementations with polarizers, and its probability-based form mitigates issues from imperfect detection by naturally excluding no-detection events from the joint probabilities, though violation still requires detection efficiency above approximately 82.8% to close the detection loophole. In quantum mechanics, the maximum violation reaches 2\sqrt{2} \approx 2.828 for the singlet state with optimal angles (e.g., a = 0°, a' = 90°, b = 45°, b' = -45°), as derived from the quantum correlation E(\theta) = -\cos(\theta). The inequality was developed between 1969 and 1974 specifically to facilitate real-world tests of local hidden variable theories, addressing practical challenges like low detection rates in early entanglement experiments. It has become the standard tool for two-qubit Bell tests in quantum information science, enabling certifications of entanglement, randomness, and security in protocols such as device-independent quantum key distribution.
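
As a numerical cross-check, the sketch below computes the four correlation functions directly from the singlet state and spin observables, then evaluates the CHSH combination at the angles quoted above (a = 0°, a' = 90°, b = 45°, b' = -45°), recovering the Tsirelson value 2\sqrt{2}. The helper names are illustrative.

```python
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """E(a,b) = <singlet| (a.sigma) x (b.sigma) |singlet> = -cos(a-b)."""
    Ma = np.cos(a) * sz + np.sin(a) * sx
    Mb = np.cos(b) * sz + np.sin(b) * sx
    return np.real(singlet.conj() @ np.kron(Ma, Mb) @ singlet)

a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)
print(f"|S| = {abs(S):.4f}  (Tsirelson bound 2*sqrt(2) = {2 * np.sqrt(2):.4f})")
```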

GHZ-Mermin Theorem

The Greenberger–Horne–Zeilinger (GHZ) theorem provides a proof of Bell's theorem for three or more entangled particles, demonstrating an all-or-nothing contradiction between quantum mechanics and local realism without relying on probabilistic inequalities. The setup involves three qubits prepared in the GHZ state, given by |\mathrm{GHZ}\rangle = \frac{1}{\sqrt{2}} \left( |000\rangle + |111\rangle \right), where |0\rangle and |1\rangle are the eigenstates of the Pauli operator \sigma_z. Each qubit is sent to a separate observer who measures either the \sigma_x or \sigma_y Pauli operator on their particle. Quantum mechanics predicts perfect correlations for specific combinations of these measurements: the expectation values \langle \sigma_x^{(1)} \sigma_y^{(2)} \sigma_y^{(3)} \rangle = \langle \sigma_y^{(1)} \sigma_x^{(2)} \sigma_y^{(3)} \rangle = \langle \sigma_y^{(1)} \sigma_y^{(2)} \sigma_x^{(3)} \rangle = -1 and \langle \sigma_x^{(1)} \sigma_x^{(2)} \sigma_x^{(3)} \rangle = +1, where the superscripts denote the particles. Under local realism, each particle carries predetermined values v(\sigma_x^{(i)}) = \pm 1 and v(\sigma_y^{(i)}) = \pm 1 for the possible outcomes, independent of distant measurements. Consider the three observables A = \sigma_x^{(1)} \sigma_y^{(2)} \sigma_y^{(3)}, B = \sigma_y^{(1)} \sigma_x^{(2)} \sigma_y^{(3)}, and C = \sigma_y^{(1)} \sigma_y^{(2)} \sigma_x^{(3)}. Reproducing the quantum predictions requires v(A) = v(B) = v(C) = -1, so the product of the assigned values is v(A) v(B) v(C) = -1. But each v(\sigma_y^{(i)}) appears exactly twice in this product and squares to +1, so the same product algebraically equals v(\sigma_x^{(1)}) v(\sigma_x^{(2)}) v(\sigma_x^{(3)}), which must be +1 to match \langle \sigma_x^{(1)} \sigma_x^{(2)} \sigma_x^{(3)} \rangle = +1. This yields the algebraic inconsistency +1 = -1; equivalently, the operator identity A B C = -\sigma_x^{(1)} \sigma_x^{(2)} \sigma_x^{(3)} shows that quantum mechanics itself enforces the sign that no local assignment can accommodate. This contradiction shows that no local realistic model can reproduce all quantum predictions.

In 1990, N. David Mermin presented a simplified version of the GHZ argument using a gedankenexperiment with three particles and two possible measurement settings per particle (\sigma_x or \sigma_y), emphasizing the paradox through predetermined "instructions" for outcomes that inevitably conflict. Mermin's formulation highlights the deterministic nature of the violation, where quantum mechanics predicts outcomes with certainty (100% probability) for the relevant correlations, eliminating the need for statistical analysis or finite sampling to detect the contradiction. The GHZ-Mermin theorem offers key advantages over earlier Bell inequalities, such as the CHSH version for two particles, by providing a stark, inequality-free disproof of local realism that directly tests multipartite entanglement in three-particle systems. It generalizes Bell's original result to higher numbers of particles and dimensions, revealing stronger forms of nonlocality inherent in multipartite entangled states.
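
Both the perfect correlations and the operator identity can be verified numerically. The sketch below, assuming the standard computational-basis ordering, builds the GHZ state and the relevant Pauli products with NumPy; the helper names are illustrative.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def kron3(P, Q, R):
    """Three-qubit tensor product of single-qubit operators."""
    return np.kron(np.kron(P, Q), R)

# GHZ state (|000> + |111>)/sqrt(2) in the 8-dimensional computational basis
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

XXX = kron3(X, X, X)
A = kron3(X, Y, Y)
B = kron3(Y, X, Y)
C = kron3(Y, Y, X)

for name, op in [("XXX", XXX), ("XYY", A), ("YXY", B), ("YYX", C)]:
    val = np.real(ghz.conj() @ op @ ghz)
    print(f"<{name}> = {val:+.1f}")   # XXX gives +1, the others -1

# Operator identity: (XYY)(YXY)(YYX) = -XXX
print("ABC == -XXX:", np.allclose(A @ B @ C, -XXX))
```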

Kochen-Specker Theorem

The Kochen-Specker theorem, established in 1967, asserts that in quantum mechanical systems with Hilbert-space dimension three or greater, no non-contextual hidden variable theory can reproduce all predictions of quantum mechanics by pre-assigning definite values (such as 0 or 1 for the eigenvalues of projection operators) to every observable independently of the measurement context. This result targets the assumption of non-contextuality, where the value of an observable should be fixed and intrinsic, unaffected by which compatible observables are jointly measured. Unlike Bell's theorem, which requires multipartite entanglement to reveal violations of local realism, the Kochen-Specker theorem applies to single-particle systems, demonstrating contextuality without spatial separation. The original proof by Kochen and Specker focused on a spin-1 particle, corresponding to a three-dimensional Hilbert space, and employed a geometric construction involving 117 directions (rays) in three-dimensional real space. These directions represent possible spin measurement outcomes, grouped into orthogonal triads where quantum mechanics predicts that exactly one vector per triad yields a value of 1 (corresponding to the projection onto that state) and the rest 0, because the three orthogonal projectors in each triad sum to the identity. A non-contextual hidden variable theory would require coloring these vectors with 0s and 1s such that each triad has precisely one 1, while satisfying functional relations like additivity for sums of orthogonal projectors. However, the interlocking structure of these triads leads to a contradiction: no such global coloring exists that is consistent across all contexts.

Subsequent simplifications reduced the number of vectors needed while preserving the proof's essence. In 1984, Asher Peres presented a proof using only 33 vectors in three dimensions, arranged along the symmetry directions of a cube, that again defy non-contextual coloring. Further refinements, such as an unpublished 31-vector construction by Conway and Kochen reported by Peres, and a 1996 state-independent proof by Cabello, Estebaranz, and García-Alcaine using 18 vectors in four dimensions, established lower bounds and highlighted the theorem's robustness across dimensions. These proofs frame the theorem as a graph-coloring problem, where vertices are observables and edges connect compatible ones, underscoring the impossibility of value assignments without context dependence. Quantum contextuality, as revealed by the theorem, means that the outcome assigned to an observable in a measurement depends on the broader context of simultaneously measurable (commuting) observables, contrasting with classical realism where properties exist predetermined and independently. This context dependence arises even in non-entangled states, implying that hidden variable models must either abandon non-contextuality or fail to match quantum statistics. The theorem thus eliminates a broad class of non-contextual hidden variable theories for individual quantum systems, complementing Bell's work by showing that quantum mechanics' challenges to realism extend beyond nonlocality to the very notion of predetermined values.
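
The graph-coloring framing can be illustrated computationally. The sketch below does not use the actual Cabello vectors; it assumes only a hypothetical incidence structure with the same combinatorial pattern as the 18-vector proof (9 contexts of 4 observables, each observable shared by exactly 2 contexts) and brute-forces all 2^18 value assignments, finding none that puts exactly one 1 in every context, which is the parity obstruction at the heart of that proof.

```python
from itertools import product

# Hypothetical incidence structure sharing the combinatorial pattern of the
# 18-vector, 9-context proof: observables are labeled 0..17, each appearing
# in exactly 2 of the 9 contexts. (These are NOT the actual Cabello vectors;
# the parity argument depends only on this pattern.)
contexts = [[c, c + 9, (c - 1) % 9, (c - 1) % 9 + 9] for c in range(9)]

found = False
for assignment in product([0, 1], repeat=18):
    if all(sum(assignment[v] for v in ctx) == 1 for ctx in contexts):
        found = True
        break
print("non-contextual assignment exists:", found)
# Always False: each context needs exactly one 1, so the total count over
# all contexts would be 9 (odd), yet every observable is counted twice (even).
```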

Experimental Tests

Early Experiments (1970s-1980s)

The first experimental test of Bell's inequalities was performed by Stuart J. Freedman and John F. Clauser in 1972 at the University of California, Berkeley. They generated entangled photon pairs through a calcium atomic cascade (J=0 → J=1 → J=0 transition), excited by a deuterium lamp, with photons emitted at wavelengths of 5513 Å and 4227 Å. Polarization correlations were measured using polarization analyzers and photomultiplier detectors separated by about 1.35 m, with analyzer orientations varied in 22.5° increments, including key angles of 22.5° and 67.5° for testing the CHSH form of the inequality. The experiment yielded a violation parameter δ = 0.050 ± 0.008, exceeding the local hidden-variable bound of δ ≤ 0 by more than six standard deviations and agreeing closely with quantum mechanical predictions. However, the setup suffered from low detection efficiency, approximately 0.1% due to photomultiplier quantum efficiencies and collection losses, necessitating the fair-sampling assumption that detected events represented the full ensemble.

Building on this, Alain Aspect and his team at the Institut d'Optique in Orsay conducted refined experiments in 1981 and 1982, using improved atomic cascade sources in calcium vapor excited by lasers for higher pair production rates. The setup involved entangled photons at 4227 Å and 5513 Å, detected over a 12 m baseline with polarizers and photomultipliers, measuring correlations at angles such as 0°, 22.5°, 45°, and 67.5° to test Bell inequalities. In the 1981 experiment, the results violated the relevant Bell inequality by more than 5 standard deviations, confirming quantum predictions with enhanced statistics from longer data runs. The 1982 follow-up introduced acousto-optical modulators as fast switches to alter analyzer settings every 10 ns—after photon emission but before detection—partially addressing concerns over predetermined settings while maintaining locality. This yielded a violation of Bell's inequalities by 5 standard deviations, with measured CHSH parameter S = 2.25 ± 0.02 against the local realist limit of |S| ≤ 2.

These early tests faced common challenges, including overall detection efficiencies of 1-5% across both experiments, limited by source brightness, optical losses, and detector sensitivities, which again relied on fair sampling. Additionally, the fixed or semi-fixed analyzer orientations in the initial setups did not fully exclude potential communication between measurement choices. Despite these limitations, the experiments provided strong evidence for quantum mechanical nonlocality, contradicting local hidden-variable theories and motivating subsequent refinements in experimental design and loophole closures.

Loophole-Free Experiments (2015)

In 2015, three independent experiments achieved the first loophole-free violations of Bell's inequalities, simultaneously closing the detection and locality loopholes that had plagued earlier tests. These landmark results provided robust empirical confirmation of quantum nonlocality without reliance on unverified assumptions, decisively favoring quantum mechanics over local realist theories.

The Delft experiment, conducted by Ronald Hanson and colleagues at Delft University of Technology, utilized entangled electron spins in diamond samples separated by 1.3 kilometers. The setup employed an event-ready protocol with nitrogen-vacancy centers to herald entanglement, followed by fast random basis selection via electro-optic modulators. They measured a CHSH parameter of S = 2.42 \pm 0.20, exceeding the classical bound of S \leq 2 with a p-value of 0.039 (approximately 2.1 standard deviations). This closed the locality loophole through space-like separation of measurement events and the detection loophole via near-unity spin readout efficiency (over 97%), eliminating fair-sampling biases.

Concurrently, the Vienna group led by Anton Zeilinger at the University of Vienna performed a photonic Bell test using polarization-entangled photons from a down-conversion source, distributed over a 58-meter channel. Rapid setting choices were generated by ultrafast quantum random number generators based on laser phase fluctuations, ensuring unpredictability. The experiment violated the CH-Eberhard inequality with S = 2.073 \pm 0.012, yielding a p-value under local realism of 3.74 \times 10^{-31} (11.5 standard deviations). High heralding efficiencies of 78.6% and 76.2% closed the detection loophole, while the 700-nanosecond window for space-like separation addressed the locality loophole; the setup also mitigated the freedom-of-choice loophole through the inherent randomness of the generators.

The NIST experiment, led by Krister Shalm and collaborators at the National Institute of Standards and Technology, similarly used entangled photons from spontaneous parametric down-conversion, with parties separated by 184 meters in a laboratory setting. High-efficiency superconducting nanowire single-photon detectors (over 90% quantum efficiency) and fast random basis selection via quantum entropy sources enabled a loophole-free violation, with p-values as low as 5.9 \times 10^{-9} (approximately 5.9 standard deviations, adjusted to 2.3 \times 10^{-7}). This closed the detection loophole by avoiding fair-sampling assumptions and the locality loophole via strict space-like isolation of events, providing further confirmation of nonlocality.

Across these experiments, the detection loophole was sealed using high-efficiency detectors exceeding the critical thresholds required for their respective inequalities (66.7% for the CH-Eberhard inequality in the Vienna experiment, 82.8% for the CHSH inequality in the NIST experiment) without post-selection, while locality was enforced through rapid, unpredictable measurement settings and sufficient spatial/temporal separation to prevent light-speed signaling. The freedom-of-choice loophole was addressed in Vienna and NIST via quantum-based randomizers, though not explicitly in Delft. Combined analyses of the datasets later demonstrated violations exceeding 5 standard deviations, underscoring the robustness of the results against local hidden variable models.

Recent Advances (2020s)

In 2022, the Nobel Prize in Physics was awarded to John F. Clauser, Alain Aspect, and Anton Zeilinger for their pioneering experiments with entangled photons, establishing the violation of Bell inequalities and paving the way for quantum information science. Their work confirmed quantum mechanical predictions over local hidden variable theories, influencing subsequent advancements in entanglement-based technologies.

A significant experimental milestone in solid-state systems occurred in 2025, when researchers demonstrated a Bell inequality violation using gate-defined quantum dots, achieving a Bell state fidelity of 97.17% without readout error corrections and employing direct parity readout. This result, obtained with semiconductor spin qubits, exceeded the CHSH bound by 86 standard deviations, highlighting the potential of such platforms for scalable quantum networks. In another breakthrough, a 2025 experiment reported a violation of a Bell-like inequality using unentangled photons, where correlations arose from quantum indistinguishability and path identity rather than entanglement, surpassing the classical limit by more than four standard deviations. This demonstrated that non-entanglement mechanisms can produce nonlocal correlations, broadening the scope of Bell tests beyond traditional entangled systems. Advancing loophole closures, a 2025 Bell test on a public quantum computer addressed the objectivity loophole by ensuring measurement outcomes were confirmed independently of observer biases, using protocols that enforce objective randomness in settings. Additionally, experimental self-testing of complex projective measurements via an elegant Bell inequality was achieved in 2025, verifying measurement fidelity in higher-dimensional Hilbert spaces without assuming device calibration. A contemporary review expanded on loopholes like memory effects, analyzing how past interactions could mimic nonlocality in repeated tests.

Emerging trends in the 2020s include integrating Bell tests into quantum networks, where nonlocal correlations enable secure multipartite entanglement distribution over fiber-optic infrastructures. Higher-dimensional Bell experiments have pushed violation strengths, with four-dimensional tests closing detection loopholes and achieving up to 2.7 times the two-dimensional CHSH limit. To counter superdeterminism, tests using cosmic sources like quasar light for random settings have been refined, ensuring causal independence over cosmic distances.

Interpretations

Impact on Local Hidden Variable Theories

Bell's theorem establishes that quantum mechanics is incompatible with any theory that assumes local realism, specifically ruling out local hidden variable theories (LHVTs) that attempt to explain quantum phenomena through underlying variables determining measurement outcomes without nonlocal influences. In such theories, outcomes at distant locations depend only on local settings and shared hidden variables, but Bell showed that the correlations predicted by quantum mechanics for entangled particles exceed the limits imposed by these assumptions. Experimental tests confirming violations of Bell inequalities provide direct evidence against LHVTs, demonstrating that quantum predictions hold while local realistic models cannot account for the observed correlations. LHVTs fall into two main categories: deterministic ones, where a hidden variable \lambda assigns definite outcomes to all possible measurements in advance, and stochastic ones, where \lambda influences outcome probabilities but still respects locality; however, both types are bounded by Bell's inequality and fail to reproduce the full range of quantum correlations. An exception arises with nonlocal hidden variable theories, such as Bohmian mechanics, which incorporate instantaneous influences across space and thus avoid the locality constraint of Bell's theorem while matching quantum predictions. Historically, Bell's theorem prompted a shift in quantum foundations research, redirecting efforts away from local hidden variable models toward frameworks that either abandon locality or realism. The quantitative impact is evident in the extent of inequality violations, which quantum mechanics saturates at Tsirelson's bound—the maximum achievable correlation under quantum rules, 2\sqrt{2} \approx 2.828 for the CHSH variant—far surpassing the LHVT limit of 2 and underscoring the incompatibility.

Nonlocality and Quantum Interpretations

Bell's theorem demonstrates quantum nonlocality, characterized by correlations between distant measurements that cannot be explained by local influences propagating at or below the speed of light, yet these correlations do not allow for superluminal signaling, preserving compatibility with special relativity. This nonlocality arises from the violation of Bell inequalities, typically observed in entangled systems but also demonstrated in unentangled ones via mechanisms like quantum indistinguishability by path identity, as shown in a 2025 experiment using four-photon frustrated interference that violated the CHSH inequality by more than four standard deviations (S = 2.275 ± 0.057) without entanglement. In such setups, the choice of measurement setting at one location instantaneously correlates with outcomes at a distant site, as predicted by quantum mechanics.

In the Copenhagen interpretation, nonlocality is embraced through the mechanism of wave function collapse upon measurement, which enforces the observed correlations without invoking hidden variables or pre-existing local realities. This view, rooted in the probabilistic nature of quantum outcomes, aligns with Bell's theorem by confirming the fundamental role of measurement in resolving superpositions, thereby rejecting local realism while maintaining the no-signaling principle. The 2025 unentangled-photon experiment further supports this by showing that probabilistic correlations can emerge from indistinguishability without entanglement. The many-worlds interpretation addresses nonlocality by positing that all possible measurement outcomes occur in branching parallel worlds, eliminating the need for collapse and interpreting Bell violations as correlation effects across the universal wave function in configuration space. Here, entanglement leads to nonlocal correlations without faster-than-light influences in any single world, as the apparent nonlocality emerges from the deterministic evolution of the entire wave function. For unentangled cases like the 2025 experiment, the correlation arises from path identities across branches. Bohmian mechanics explicitly incorporates nonlocality through a guiding pilot wave that instantaneously influences particle trajectories across space, reproducing quantum predictions including Bell inequality violations via the nonlocal dependence of velocities on the full wave function. This deterministic framework accepts "gross" nonlocality as a feature, where distant particles remain connected through the entangled wave function, though relativistic extensions mitigate conflicts with special relativity. It could potentially extend to unentangled indistinguishability effects through the wave function's global configuration. In relational quantum mechanics, nonlocality is reframed as a relational property dependent on the observer, with quantum states describing information relative to specific systems rather than absolute facts, allowing Bell correlations to arise without absolute spatial influences. This perspective, emphasizing observer-relative outcomes, reconciles the theorem's implications by treating entanglement as a web of relative correlations, and similarly accommodates indistinguishability-based violations as observer-dependent path relations.

Philosophically, Bell's theorem establishes that quantum mechanics is inherently nonlocal, compelling interpretations to either accept instantaneous influences or redefine locality, though they differ on whether these influences represent a real "spooky action at a distance" or emergent relational effects—now extended to non-entangled scenarios.
This resolution underscores the theorem's role in shifting focus from local hidden variable theories—now empirically ruled out—to diverse frameworks that integrate nonlocality into the ontology of quantum reality.

Superdeterminism and Other Loopholes

Superdeterminism proposes a theoretical escape from the implications of Bell's theorem by relaxing the assumption of statistical independence between hidden variables and measurement settings. In this framework, the hidden variables λ that determine particle outcomes are correlated with the experimenters' choices of settings through a common cause originating from the initial conditions of the universe, such as the Big Bang, ensuring perfect alignment without requiring nonlocal influences. This correlation violates the "free choice" or measurement independence assumption in Bell's theorem, allowing local hidden variable theories to reproduce quantum correlations while remaining deterministic and local. The concept was notably developed by physicist Gerard 't Hooft in his cellular automaton interpretation of quantum mechanics, where he argues that superdeterminism provides a consistent, local description of quantum phenomena without invoking randomness at a fundamental level. Criticisms of superdeterminism center on its perceived conspiratorial nature, as it implies a fine-tuned cosmic setup where all experimental choices are predetermined to match the hidden variables, making the theory difficult to falsify or test empirically. Proponents like 't Hooft counter that such correlations arise naturally in a fully deterministic universe, but detractors argue it undermines the scientific method by rejecting the independence required for controlled experiments. Peer-reviewed analyses highlight that superdeterminism's reliance on non-equilibrium initial conditions and signaling dependencies leads to untestable assumptions, rendering it philosophically unappealing despite its logical consistency with Bell violations.

Other potential loopholes in Bell tests include the objectivity loophole, where measurement settings may not be objectively random but influenced by subjective or observer-dependent factors, potentially allowing local realist explanations. Recent experiments in 2025 using public quantum computers have aimed to close this loophole by implementing Bell tests with multiple observers confirming outcomes under stringent unanimity conditions, demonstrating violations of Bell inequalities while observing residual but statistically significant signaling that does not compromise the results. The memory loophole, meanwhile, posits that outcomes of successive trials could depend on previous measurements, violating the independence assumption in standard analyses; this was shown to enable artificial violations of the CHSH inequality in two-sided scenarios, but it can be addressed by using linear CHSH expressions in data analysis, as detailed in comprehensive reviews of Bell experiments.

The free will theorem, formulated by John Conway and Simon Kochen in 2006, further challenges superdeterminism by linking human free choice to particle behavior. The theorem states that if experimenters' choices in spin-1 measurements are not predetermined by prior accessible information, and assuming the quantum predictions for squared spin components (SPIN axiom), perfect correlations for entangled partners (TWIN axiom), and no superluminal transfer of information (FIN axiom), then the outcomes for entangled particles cannot be determined by any variables accessible in principle. This implies that particles must possess a form of indeterminacy akin to free will, directly rejecting superdeterministic predetermination of all events and reinforcing the case against local hidden variables under standard assumptions. As of November 2025, no conclusive experimental evidence supports these loopholes as viable alternatives to quantum nonlocality, with advances in loophole-free tests progressively closing gaps like objectivity and memory, though philosophical debates persist regarding the implications for determinism and free choice in physics.
