
Quantum foundations

Quantum foundations is the branch of physics that investigates the conceptual, mathematical, and philosophical underpinnings of quantum mechanics, addressing core issues such as the interpretation of wave function collapse, the nature of superposition and entanglement, non-locality, and the reconciliation of quantum theory with classical physics and relativity. Despite quantum mechanics' unparalleled predictive accuracy in describing microscopic phenomena like atomic spectra and particle interactions, its foundational aspects remain unresolved, prompting ongoing debates about the ontology of the theory and the reality it describes. The field originated in the early 20th century amid the formulation of quantum mechanics, with pivotal contributions from pioneers such as Max Planck, who introduced the quantum hypothesis in 1900 to explain blackbody radiation, and Werner Heisenberg, whose 1925 matrix mechanics laid the groundwork for non-commutative algebra in quantum theory. Key historical debates, including the Bohr-Einstein debates that began in 1927, centered on the completeness of quantum mechanics and the role of hidden variables; these were advanced by the Heisenberg uncertainty principle and Bohr's principle of complementarity, though Einstein remained skeptical, leading to further challenges. John von Neumann's 1932 axiomatization formalized the measurement process and introduced quantum entropy, quantifying irreversibility as S = -k \sum p_i \log p_i, where k is Boltzmann's constant and p_i are probabilities. Central to quantum foundations are several enduring problems, including the measurement problem, where the Schrödinger equation predicts persistent superpositions but observations yield definite outcomes, necessitating interpretations to explain the apparent collapse. Prominent interpretations include the Copenhagen interpretation, which posits wave function collapse upon measurement and distinguishes classical from quantum domains; the Many-Worlds interpretation, proposing branching universes without collapse; the de Broglie-Bohm pilot-wave theory, assigning definite particle trajectories guided by the wave function; and QBism, viewing the wave function as subjective Bayesian probabilities rather than objective reality. Another cornerstone is Bell's theorem (1964), which demonstrated that local hidden variable theories cannot reproduce quantum predictions, with experimental violations of Bell inequalities—such as those confirmed over distances up to 1200 km via satellite—affirming quantum non-locality while challenging classical intuitions of locality and causality. This experimental confirmation of quantum nonlocality was recognized by the 2022 Nobel Prize in Physics, awarded to John Clauser, Alain Aspect, and Anton Zeilinger for their pioneering work on entangled photons. Contemporary quantum foundations research intersects with quantum information science, exploring implications for entropy, irreversibility, and the quantum-to-classical transition through decoherence, where environmental interactions suppress quantum superpositions to yield classical behavior. Open questions, as highlighted in the 2013 Oxford Questions, include the fundamental nature of time and information in quantum gravity, the possibility of macroscopic superpositions, and whether quantum theory requires extension or replacement to unify with general relativity at the Planck scale of 10^{-35} meters.
These inquiries not only deepen theoretical understanding but also underpin practical advancements in quantum technologies, such as computing, cryptography, and sensing, demonstrating the field's profound societal impact.

Historical development

Origins in early quantum mechanics

The roots of quantum foundations lie in the breakdowns of classical physics during the late 19th and early 20th centuries, particularly in phenomena involving radiation and atomic structure that defied continuous energy assumptions. A pivotal moment occurred in 1900 when Max Planck addressed the blackbody radiation problem, where classical Rayleigh-Jeans theory predicted infinite energy at high frequencies (the ultraviolet catastrophe), contradicting experimental spectra. Planck derived an empirical formula by postulating that energy from atomic oscillators is exchanged in discrete quanta E = h\nu, with h as a universal constant and \nu the frequency, successfully matching observations across all wavelengths. This quantization idea gained traction through Albert Einstein's 1905 explanation of the photoelectric effect, where classical wave theory failed to account for the threshold frequency and linear energy dependence of ejected electrons. Einstein hypothesized light as consisting of localized energy quanta (photons) with E = h\nu, such that electron emission occurs only if h\nu > \phi (with \phi the material's work function), and the electron's maximum kinetic energy is h\nu - \phi. This particle-like view of light extended Planck's discrete energy to electromagnetic waves, challenging classical continuity. Niels Bohr advanced these concepts in his 1913 model of the hydrogen atom, reconciling Rutherford's nuclear structure with quantization to explain discrete spectral lines. Electrons occupy stationary orbits with quantized angular momentum L = n\hbar (where n is an integer and \hbar = h/2\pi), preventing classical radiation losses, and transitions between levels emit or absorb photons of energy \Delta E = h\nu. This semi-classical framework reproduced Balmer series frequencies accurately, introducing ad hoc stability rules that hinted at deeper quantum principles. Wave-particle duality emerged explicitly in Louis de Broglie's 1924 thesis, proposing that all matter possesses wave properties analogous to light's duality, with wavelength \lambda = h/p (with p the momentum) derived from extending Einstein's photon momentum p = h/\lambda. This hypothesis unified particle and wave behaviors, predicting electron diffraction later confirmed experimentally. Building on this, Werner Heisenberg formulated matrix mechanics in 1925 as a non-commutative algebra of observables, replacing trajectories with arrays whose products yield transition amplitudes, aligning with Bohr's correspondence principle for large quantum numbers. Erwin Schrödinger independently developed wave mechanics in 1926, treating particles as waves governed by a differential equation. For time-independent cases, the state \psi satisfies the eigenvalue equation H \psi = E \psi, where H is the Hamiltonian operator incorporating kinetic and potential energies, yielding quantized energies E for bound systems like the hydrogen atom. This approach reproduced Bohr's results exactly and extended to multi-electron systems. Max Born provided the probabilistic interpretation that same year, stating that |\psi|^2 dV represents the probability of finding the particle in volume dV, shifting quantum theory from deterministic waves to statistical predictions and resolving the physical meaning of \psi. These foundational formalisms from 1900 to 1926 established quantum mechanics as a predictive theory, yet their non-classical elements—discreteness, duality, and probability—soon sparked philosophical debates on reality and measurement.
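A minimal numerical sketch (not drawn from any particular source) of the formulas above: it evaluates the Bohr energy levels of hydrogen, derives Balmer-series wavelengths from \Delta E = h\nu, and computes a de Broglie wavelength \lambda = h/p. The constants are standard CODATA values; the choice of a 100 eV electron is an illustrative assumption.

```python
# Sketch: Bohr-model energy levels, Balmer wavelengths, and a de Broglie wavelength.
import math

h = 6.62607015e-34        # Planck constant (J s)
c = 2.99792458e8          # speed of light (m/s)
e = 1.602176634e-19       # elementary charge (C)
m_e = 9.1093837015e-31    # electron mass (kg)
eps0 = 8.8541878128e-12   # vacuum permittivity (F/m)

def bohr_energy(n):
    """Energy of the n-th Bohr level of hydrogen: E_n = -m e^4 / (8 eps0^2 h^2 n^2)."""
    return -m_e * e**4 / (8 * eps0**2 * h**2 * n**2)

# Balmer series: transitions n -> 2 emit photons with E = h*nu = E_n - E_2.
for n in (3, 4, 5):
    delta_E = bohr_energy(n) - bohr_energy(2)
    wavelength = h * c / delta_E
    print(f"n={n} -> 2: lambda = {wavelength*1e9:.1f} nm")   # ~656, 486, 434 nm

# de Broglie wavelength lambda = h/p for an electron accelerated through 100 V.
p = math.sqrt(2 * m_e * 100 * e)
print(f"de Broglie wavelength at 100 eV: {h / p * 1e12:.1f} pm")  # ~123 pm
```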

Emergence of foundational questions

The formulation of quantum mechanics in the mid-1920s, while providing a powerful predictive framework, soon gave rise to profound foundational questions about the nature of reality, measurement, and determinism. A pivotal moment came with Werner Heisenberg's introduction of the uncertainty principle in 1927, which posited that certain pairs of physical properties, such as position and momentum, cannot be simultaneously measured with arbitrary precision, implying inherent limits to knowledge in the quantum realm. This principle challenged classical intuitions of a fully determinate world, sparking debates on whether quantum indeterminacy reflected fundamental unpredictability or incomplete theory. These issues intensified at the Fifth Solvay Conference in October 1927, where Niels Bohr and Albert Einstein engaged in a landmark debate on the interpretation of quantum mechanics. Bohr defended his concept of complementarity, arguing that wave-particle duality required mutually exclusive experimental contexts, rendering quantum descriptions inherently observer-dependent and rejecting classical realism. Einstein, countering with thought experiments like the "clock in a box," insisted on the existence of objective reality independent of measurement, questioning whether quantum mechanics could be complete without hidden variables underlying apparent randomness. The exchanges highlighted tensions between probabilistic quantum predictions and the desire for a deterministic, local description of nature. In 1932, John von Neumann formalized these concerns in his axiomatic treatment of quantum mechanics, introducing the projection postulate to describe measurement-induced state collapse and proving a no-hidden-variables theorem that seemed to rule out deterministic underpinnings consistent with quantum statistics. This theorem assumed that hidden variables would need to reproduce quantum probabilities additively, leading to an impossibility result under standard assumptions, though later critiques revealed flaws in its scope. The mid-1930s saw further challenges: the Einstein-Podolsky-Rosen (EPR) paradox of 1935 argued that quantum mechanics was incomplete because entangled particles implied instantaneous influences violating locality, as measuring one particle's property appeared to determine the distant other's without causal connection. Concurrently, Erwin Schrödinger's cat thought experiment illustrated the absurdity of applying superposition to macroscopic objects, where a cat in a sealed box linked to a quantum event would be simultaneously alive and dead until observed, underscoring the measurement problem's scale. These paradoxes persisted into the 1960s, with Eugene Wigner's friend thought experiment in 1961—rooted in earlier ideas from the 1930s—questioning the role of consciousness in wave function collapse. In this scenario, a friend measures a quantum system inside a lab, entangling it with their knowledge, while Wigner outside views the entire setup as superposed until he intervenes, raising irreconcilable descriptions between observers. Such debates crystallized the foundational crises, prompting ongoing scrutiny of quantum mechanics' ontological status without resolving whether reality is observer-independent or fundamentally non-classical.

Core non-classical features

Superposition and the measurement problem

In quantum mechanics, the superposition principle asserts that a quantum system can exist in a linear combination of multiple states simultaneously, described mathematically as |\psi\rangle = \alpha |0\rangle + \beta |1\rangle, where \alpha and \beta are complex coefficients satisfying |\alpha|^2 + |\beta|^2 = 1, and |0\rangle, |1\rangle represent basis states. This principle, fundamental to the theory's linear structure, allows quantum states to interfere constructively or destructively, leading to observable effects that defy classical intuition. For instance, in the double-slit experiment, particles such as electrons or photons passing through two slits produce an interference pattern on a detection screen, as if each particle explores both paths simultaneously and interferes with itself, demonstrating the wave-like aspect of quantum matter. The measurement problem arises from the apparent conflict between this unitary evolution of superpositions under the Schrödinger equation and the definite, classical outcomes observed upon measurement, questioning why and how a superposition "collapses" into a single definite state with probabilities given by the Born rule. John von Neumann formalized this issue in his analysis of the measurement process, describing a chain where the system's state entangles sequentially with the measuring apparatus, then the environment, and ultimately the observer, yet the collapse occurs only at the subjective boundary of the observer's consciousness, leaving unresolved where exactly the transition to a definite outcome happens. This von Neumann chain highlights the problematic role of the observer, as extending the chain indefinitely without collapse would entangle the observer in a superposition, contradicting everyday experience of classical reality. Decoherence provides a partial explanation for the appearance of definite outcomes by showing how interactions with the environment rapidly suppress interference between superposition components, effectively selecting robust "pointer states" that behave classically without invoking a true collapse. Wojciech Zurek's work in the 1980s and 1990s developed this through the concept of environment-induced superselection (einselection), where environmental entanglement leads to the loss of off-diagonal terms in the density matrix, making superpositions unobservable on macroscopic scales, though it does not fully resolve the origin of the probabilistic Born rule or the preferred basis problem. Complementing these challenges, Gleason's theorem demonstrates that non-contextual hidden variable theories—positing definite pre-measurement values independent of measurement context—cannot reproduce quantum probabilities for systems with Hilbert space dimension greater than two, ruling out such deterministic underpinnings for superposition without contextual influences. While entanglement extends superposition to multi-particle correlations, the measurement problem primarily concerns the collapse in single-system superpositions.
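The suppression of interference by environmental entanglement can be illustrated with a deliberately small toy model, a qubit coupled to a two-state "environment" whose record states |E_0\rangle and |E_1\rangle have a tunable overlap; this is an assumption-laden sketch of the einselection idea, not Zurek's full model.

```python
# Toy sketch: tracing out an environment suppresses the off-diagonal (interference)
# terms of a qubit's reduced density matrix as the environment records become distinct.
import numpy as np

alpha, beta = 1/np.sqrt(2), 1/np.sqrt(2)          # |psi> = alpha|0> + beta|1>

def environment_states(theta):
    """Two environment states |E0>, |E1> with overlap <E0|E1> = cos(theta)."""
    E0 = np.array([1.0, 0.0])
    E1 = np.array([np.cos(theta), np.sin(theta)])
    return E0, E1

for theta in (0.0, np.pi/4, np.pi/2):             # pi/2 -> perfectly distinguishable records
    E0, E1 = environment_states(theta)
    # Entangled system-environment state: alpha|0>|E0> + beta|1>|E1>
    psi = alpha * np.kron([1, 0], E0) + beta * np.kron([0, 1], E1)
    rho = np.outer(psi, psi.conj())
    # Partial trace over the environment gives the reduced 2x2 density matrix.
    rho_sys = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
    print(f"<E0|E1> = {np.cos(theta):.2f}, off-diagonal term = {rho_sys[0, 1]:.3f}")
```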

Entanglement and quantum nonlocality

Quantum entanglement refers to a phenomenon in quantum mechanics where the quantum state of two or more particles cannot be described independently, even when separated by large distances, such that the state of the entire system is inseparable. This concept was first articulated by Erwin Schrödinger in 1935, who introduced the term "entanglement" to describe the peculiar correlations arising from such composite systems. A canonical example is the Bell state |\Phi^+\rangle = \frac{1}{\sqrt{2}} (|00\rangle + |11\rangle), where two qubits exhibit perfect correlation in their outcomes upon measurement, regardless of the distance between them. The implications of entanglement for locality were dramatically highlighted in the 1935 Einstein-Podolsky-Rosen (EPR) argument, which questioned the completeness of quantum mechanics by considering an entangled pair of particles with perfectly correlated positions and momenta. Einstein, Podolsky, and Rosen argued that measuring one particle's properties instantaneously determines the other's, suggesting either quantum mechanics is incomplete or involves "spooky action at a distance" that violates the principle of locality in special relativity. This paradox underscored the tension between quantum correlations and classical intuitions of separability, prompting decades of debate on whether hidden variables could restore locality. John Stewart Bell resolved this debate in 1964 by deriving an inequality that any local hidden-variable theory must satisfy for measurements on entangled particles. Specifically, for two parties Alice and Bob performing measurements A, A' and B, B' respectively on their shares of an entangled pair, the correlations obey the CHSH inequality:
|\langle AB \rangle + \langle AB' \rangle + \langle A'B \rangle - \langle A'B' \rangle| \leq 2.
Quantum mechanics predicts violations of this bound, reaching a maximum (the Tsirelson bound) of 2\sqrt{2} for certain entangled states, demonstrating that no local hidden-variable model can reproduce all quantum predictions. This inequality, formalized by Clauser, Horne, Shimony, and Holt in 1969, provided a testable criterion for nonlocality.
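The Tsirelson value can be checked directly. The following sketch computes the four correlators for the Bell state |\Phi^+\rangle at the standard angle settings; the parametrization of the observables as A(\theta) = \cos\theta\, Z + \sin\theta\, X and the particular angles are illustrative choices, not prescribed by the theorem.

```python
# Sketch: CHSH value for |Phi+> with spin observables A(theta) = cos(theta) Z + sin(theta) X,
# reproducing the Tsirelson bound 2*sqrt(2).
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

def obs(theta):
    return np.cos(theta) * Z + np.sin(theta) * X

def E(a, b):
    """Correlator <A(a) x B(b)> in the state |Phi+>."""
    M = np.kron(obs(a), obs(b))
    return np.real(phi_plus.conj() @ M @ phi_plus)

a, a2, b, b2 = 0, np.pi/2, np.pi/4, -np.pi/4
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(S, 2*np.sqrt(2))   # both ~2.828, above the local bound of 2
```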
Experimental validations began with Alain Aspect's 1982 tests using entangled photons, which violated the CHSH inequality by approximately 5 standard deviations and addressed the locality loophole through rapid switching of measurement settings. Decades later, fully loophole-free demonstrations were achieved in 2015: Hensen et al. used electron spins in diamond separated by 1.3 km, observing a CHSH violation of S = 2.42 \pm 0.20, simultaneously addressing detection, locality, and freedom-of-choice loopholes. Independently, Giustina et al. performed a loophole-free test with entangled photons and high-efficiency detectors, reporting a statistically significant violation of a Bell inequality and confirming quantum nonlocality without auxiliary assumptions favoring quantum predictions. Despite these nonlocal correlations, quantum mechanics preserves relativistic causality through the no-signaling theorem, which ensures that the measurement statistics for one party remain unchanged regardless of the distant party's choice of measurement basis, preventing superluminal information transfer. This theorem, inherent to the quantum formalism, reconciles entanglement's nonlocality with special relativity by prohibiting controllable signaling.
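The no-signaling property can likewise be verified numerically: for |\Phi^+\rangle, Alice's marginal outcome probabilities are independent of which basis Bob measures. The measurement angles below are arbitrary illustrative choices.

```python
# Sketch of no-signaling for |Phi+>: Alice's marginal probabilities do not depend on
# Bob's choice of measurement basis.
import numpy as np

phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(phi_plus, phi_plus.conj())

def basis_projectors(theta):
    """Rank-1 projectors for a measurement along angle theta in the x-z plane."""
    v0 = np.array([np.cos(theta/2), np.sin(theta/2)], dtype=complex)
    v1 = np.array([-np.sin(theta/2), np.cos(theta/2)], dtype=complex)
    return [np.outer(v, v.conj()) for v in (v0, v1)]

P_alice = basis_projectors(0.0)                  # Alice measures in the Z basis
for theta_bob in (0.0, np.pi/3, np.pi/2):        # Bob tries several different bases
    P_bob = basis_projectors(theta_bob)
    marginal = [sum(np.real(np.trace(rho @ np.kron(Pa, Pb))) for Pb in P_bob)
                for Pa in P_alice]
    print(theta_bob, [round(p, 3) for p in marginal])   # always [0.5, 0.5]
```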

Contextuality and incompatibility

Quantum contextuality refers to the phenomenon in quantum mechanics where the outcome probabilities of measurements depend on the specific context of compatible observables measured together, violating the assumptions of non-contextual hidden variable models that assign predetermined values to all observables independently of measurement context. In such non-contextual models, each quantum state would correspond to a hidden variable that fixes the value of every observable, with measurement merely revealing that pre-assigned value, but quantum predictions contradict this possibility. The Kochen-Specker theorem, proved in 1967, demonstrates that no non-contextual hidden variable theory can reproduce quantum mechanics for a single quantum system in three-dimensional Hilbert space. Specifically, the theorem shows that it is impossible to assign definite values (0 or 1 for projectors) to all one-dimensional subspaces (rays) such that the assignments are consistent with the functional relations imposed by quantum orthogonality and completeness, without depending on the measurement context. The original proof constructs a set of 117 vectors in \mathbb{R}^3, grouped into 40 orthogonal triads, where any non-contextual assignment leads to a contradiction because the number of vectors assigned value 1 in each complete basis must sum to 1, but the interlocking structure makes this impossible. John S. Bell independently arrived at a related result in 1966, emphasizing the contextuality inherent in quantum measurements for single systems, later termed the Bell-Kochen-Specker theorem, which highlights that quantum mechanics requires contextual value assignments even without spatial separation. This single-system contextuality underscores a core non-classical feature distinct from multipartite correlations, though nonlocality can be viewed as a form of multipartite contextuality. Incompatibility complements contextuality by arising from non-commuting observables, which prevent simultaneous precise measurements and lead to uncertainty relations. The canonical example is the position-momentum commutator [\hat{x}, \hat{p}] = i\hbar, derived from the foundational structure of quantum kinematics, implying that the product of uncertainties satisfies \Delta x \Delta p \geq \hbar/2. This incompatibility ensures that measurement contexts cannot be ignored, as joint probability distributions for incompatible observables do not exist in quantum theory. Experimental verification of Kochen-Specker contextuality has been achieved through inequality tests that non-contextual models must satisfy but quantum predictions violate. In 2008, Klyachko et al. proposed the KCBS inequality for a spin-1 system using five pairwise compatible projectors P_i (i=1 to 5) forming a pentagon compatibility graph, where non-contextual hidden variable theories satisfy \sum_{i=1}^5 \langle P_i \rangle \leq 2, but quantum mechanics achieves up to \sqrt{5} \approx 2.236, violating the bound. This was confirmed experimentally with photons in subsequent works. Subsequent loophole-free tests have further solidified these results, closing detection and compatibility issues.
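The KCBS construction can be reproduced numerically. The sketch below builds the five pentagram rays for a three-dimensional (spin-1) system, checks that neighbouring rays are orthogonal (so their projectors are compatible), and evaluates \sum_i \langle P_i \rangle for the state (0, 0, 1); the parametrization of the rays is the standard textbook choice, and the check is an illustration rather than a substitute for the experimental tests cited above.

```python
# Sketch of the KCBS construction: five rays in R^3 whose neighbouring projectors are
# compatible, with quantum value sum_i <P_i> = sqrt(5) > 2 for |psi> = (0, 0, 1).
import numpy as np

N = 5
cos_t2 = np.cos(np.pi/N) / (1 + np.cos(np.pi/N))      # cos^2(theta) = 1/sqrt(5)
sin_t = np.sqrt(1 - cos_t2)
vectors = [np.array([sin_t*np.cos(4*np.pi*i/N),
                     sin_t*np.sin(4*np.pi*i/N),
                     np.sqrt(cos_t2)]) for i in range(N)]

# Neighbouring rays are orthogonal, so their projectors commute (are jointly measurable).
print([round(float(vectors[i] @ vectors[(i+1) % N]), 10) for i in range(N)])  # all 0.0

psi = np.array([0.0, 0.0, 1.0])
kcbs = sum((vectors[i] @ psi)**2 for i in range(N))   # sum of <psi|P_i|psi>
print(kcbs, np.sqrt(5))                               # ~2.236 > non-contextual bound 2
```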

Interpretations of quantum mechanics

Copenhagen and instrumentalist views

The Copenhagen interpretation, developed in the late 1920s and early 1930s primarily by Niels Bohr and Werner Heisenberg, represents a foundational operational approach to quantum mechanics that emphasizes the theory's predictive power for experimental outcomes rather than a description of an underlying objective reality. This view arose as a response to the measurement problem, where quantum superpositions appear to resolve into definite states upon observation, motivating a focus on the conditions under which measurements are performed. Central to the Copenhagen framework is Bohr's principle of complementarity, introduced in 1928, which posits that wave-particle duality in quantum phenomena—such as light behaving as both waves and particles—cannot be observed simultaneously but requires mutually exclusive experimental arrangements. For instance, the double-slit experiment reveals wave-like interference when no which-path information is sought, but particle-like behavior emerges when such information is obtained, illustrating how complementary descriptions are context-dependent and exhaustive for understanding quantum systems. Bohr argued that this principle resolves apparent paradoxes by acknowledging the limitations of classical concepts in the quantum domain, without invoking hidden variables or deeper mechanisms. Heisenberg and Max Born further shaped the probabilistic core of this interpretation by interpreting the quantum formalism as yielding probabilities for measurement outcomes, rather than deterministic trajectories. In 1927, Heisenberg emphasized the uncertainty principle, linking it to the unavoidable disturbance of quantum systems during observation, while Born's 1926 rule specified that the square of the wave function's amplitude gives the probability density for finding a particle at a given location. This statistical approach, formalized in their joint report at the 1927 Solvay Conference, treats quantum mechanics as a tool for calculating likelihoods of observable events, eschewing claims about unobservable intermediate states. John von Neumann provided a rigorous mathematical underpinning in his 1932 treatise, introducing the projection postulate—often called the collapse postulate—whereby the quantum state vector undergoes an instantaneous, non-unitary reduction to an eigenstate upon measurement of an observable. This postulate delineates the boundary between quantum evolution (governed by the Schrödinger equation) and classical measurement outcomes, reinforcing the Copenhagen emphasis on the act of measurement as the point where probabilities actualize. Von Neumann's framework thus operationalizes the theory, assuming a separation between the quantum system and a classical measuring apparatus without specifying the physical process of collapse. Instrumentalist extensions of these ideas, such as Quantum Bayesianism (QBism) developed by Christopher Fuchs and collaborators in the 2000s, treat the quantum state not as an objective feature of physical systems but as a subjective catalog of an agent's beliefs about future measurement results. In QBism, probabilities derived from the Born rule reflect personal degrees of credence, updated Bayesian-style upon new data, thereby resolving interpretive puzzles like the measurement problem by relocating them to epistemic rather than ontological domains. This approach aligns with Copenhagen's operationalism, prioritizing how quantum predictions guide empirical inquiry over any commitment to a hidden reality beneath the formalism. 
A defining feature of Copenhagen and instrumentalist views is their deliberate avoidance of assumptions about hidden variables or an independent quantum reality, instead concentrating exclusively on verifiable predictions for macroscopic measurements. This pragmatic stance has enabled quantum mechanics' extraordinary success in applications, from atomic spectra to quantum technologies, by focusing on what can be tested rather than unobservable entities. Criticisms of these interpretations, particularly from the 1920s through the 1950s, center on the lack of a detailed mechanism for wave function collapse, which appears as an ad hoc addition to the unitary evolution of the Schrödinger equation without physical justification. Einstein, for instance, famously objected that such a view renders quantum theory incomplete, as it fails to explain the transition from probabilistic superpositions to singular outcomes in a causally coherent manner. Despite these challenges, the instrumentalist emphasis on empirical adequacy has sustained Copenhagen's influence as the de facto framework for much of quantum physics practice.

Ontological interpretations

Ontological interpretations of quantum mechanics aim to provide a realist ontology by attributing objective existence to quantum entities, such as particles or wave functions, thereby addressing foundational issues like the measurement problem through underlying physical dynamics rather than observer-dependent collapse. These approaches posit that the quantum state describes an objective reality, often incorporating hidden variables or universal evolution to recover classical-like definiteness while reproducing empirical predictions. Unlike instrumentalist views, they seek a complete description of physical systems independent of measurement contexts. A prominent example is Bohmian mechanics, also known as the de Broglie-Bohm pilot-wave theory, introduced by David Bohm in 1952. In this framework, particles possess definite trajectories and positions at all times, with their motion determined by a guiding equation derived from the wave function \psi. The velocity of a particle is given by \frac{d\mathbf{x}}{dt} = \frac{\hbar}{m} \Im \left( \frac{\nabla \psi}{\psi} \right), where \mathbf{x} is the particle position, m its mass, \hbar the reduced Planck's constant, and \Im denotes the imaginary part. The wave function evolves according to the standard Schrödinger equation, acting as a nonlocal pilot wave that influences particle motion without altering its form. This deterministic ontology resolves the measurement problem by eliminating collapse, as apparent randomness arises from initial position ignorance, though it implies instantaneous influences across distances, challenging locality. Another key ontological interpretation is the many-worlds interpretation, formulated by Hugh Everett III in 1957. Here, the entire universe is described by a single universal wave function that evolves linearly via the Schrödinger equation, with no wave function collapse occurring during measurements. Instead, interactions between quantum systems and observers lead to entanglement, effectively branching the universe into multiple, non-interacting worlds, each realizing a different outcome of a superposition. For instance, in a spin measurement, the observer-system state splits into correlated branches where the particle is "up" in one world and "down" in another, with all branches coexisting objectively. This approach maintains unitarity and avoids special roles for measurement, providing a fully objective dynamics, but it requires accepting the proliferation of worlds and raises questions about the preferred basis for branching. The consistent histories interpretation offers an alternative ontological framework, pioneered by Robert B. Griffiths in 1984 and extended by Roland Omnès. It generalizes the Born rule to assign probabilities not just to single events but to entire sequences or "histories" of quantum events, defined as chains of projectors on the system's Hilbert space. A set of histories is deemed consistent or decoherent if the interference between different paths vanishes, satisfying the decoherence condition \text{Re}\,\text{Tr}(C_{h} \rho C_{h'}^\dagger) = 0 for distinct histories h \neq h' in the family, where \rho is the density operator and each class operator C_h is the time-ordered product of the projectors defining history h, ensuring that the probabilities p(h) = \text{Tr}(C_h \rho C_h^\dagger) obey classical probability rules without contradiction. This allows objective descriptions of closed quantum systems over time, resolving the measurement problem by selecting decoherent families of histories as physically realizable narratives, akin to classical paths.
However, the approach depends on choosing appropriate decoherent sets, which can introduce ambiguity in the preferred basis. Collectively, these ontological interpretations resolve the measurement problem by invoking objective, collapse-free dynamics—deterministic guidance in Bohmian mechanics, universal branching in many-worlds, or decoherent narratives in consistent histories—but they encounter challenges such as nonlocality, ontological extravagance, or basis selection issues.
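As a concrete illustration of the Bohmian guidance equation quoted above, the following sketch integrates dx/dt = (\hbar/m)\,\Im(\partial_x\psi/\psi) for a freely spreading Gaussian packet in units \hbar = m = 1; the initial width \sigma_0, the particle positions, and the simple Euler integrator are illustrative assumptions, and the exact trajectories x(t) = x(0)\,\sigma(t)/\sigma_0 are printed for comparison.

```python
# Sketch (hbar = m = 1): Bohmian trajectories for a free Gaussian packet, integrating
# the guidance equation dx/dt = Im(d_x psi / psi) with an analytic wave function.
import numpy as np

sigma0 = 1.0

def psi(x, t):
    # Unnormalised free Gaussian packet; x-independent prefactors cancel in d_x psi / psi.
    s = 1 + 1j * t / (2 * sigma0**2)
    return np.exp(-x**2 / (4 * sigma0**2 * s))

def velocity(x, t, eps=1e-6):
    dpsi = (psi(x + eps, t) - psi(x - eps, t)) / (2 * eps)   # numerical derivative
    return np.imag(dpsi / psi(x, t))

dt, T = 0.01, 10.0
x0 = np.linspace(-1.5, 1.5, 7)                   # initial particle positions
xs = x0.copy()
for step in range(int(T / dt)):                  # simple Euler integration
    xs = xs + velocity(xs, step * dt) * dt

print(np.round(xs, 2))                                          # numerically integrated
print(np.round(x0 * np.sqrt(1 + (T / (2 * sigma0**2))**2), 2))  # exact x0 * sigma(T)/sigma0
```

The trajectories fan out with the spreading packet, and the statistical distribution over initial positions reproduces |\psi|^2 at all times, which is how apparent randomness arises from position ignorance in this picture.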

Informational and relational approaches

Informational approaches to quantum foundations posit that the fundamental entities of the universe are not physical objects or fields per se, but rather information and the processes by which it is acquired and processed. This perspective, famously encapsulated in John Archibald Wheeler's "it from bit" hypothesis, suggests that every physical "it"—every particle, force, or spacetime feature—derives its existence and properties from binary yes/no questions answered through quantum measurements, rendering reality as an emergent construct from informational exchanges. Wheeler argued that this framework bridges quantum mechanics with general relativity by treating information as the bedrock, influencing subsequent efforts to derive quantum rules from informational axioms, such as the no-cloning theorem and the Born rule, without presupposing the full Hilbert space structure. Relational quantum mechanics (RQM), proposed by Carlo Rovelli in 1996, extends this informational ethos by viewing quantum states as inherently relative to specific observers or systems, eschewing any absolute, observer-independent facts about the world. In RQM, the state of a system is not a complete description of reality but a tool encoding correlations between that system and another interacting system, such as an observer; outcomes of measurements are thus relativized, with different observers potentially describing the same event differently without contradiction. This relational ontology treats the wave function as epistemic—reflecting an observer's knowledge of correlations rather than an ontic entity representing the system's intrinsic state—thereby avoiding paradoxes like the measurement problem by denying a unique, universal reality and emphasizing observer-dependent perspectives. For instance, in the Schrödinger's cat thought experiment, the cat's state is alive relative to one observer and dead relative to another, resolving apparent superpositions without collapse. Quantum Darwinism, developed by Wojciech H. Zurek starting in the early 2000s, complements these views by explaining how classical objectivity emerges from quantum superpositions through environmental interactions that redundantly broadcast information about preferred states. The theory posits that certain robust "pointer states" of a quantum system—those stable under decoherence—proliferate copies of themselves in the environment via entanglement, allowing multiple observers to access the same classical information without directly interacting with the system. This redundancy acts as a Darwinian selection process, where only the fittest states (those maximizing information survival) become "observable" reality, with the environment serving as a witness that amplifies classical correlations while suppressing quantum alternatives. Decoherence plays a supportive role here by enabling this proliferation without resolving the interpretive issues outright. A key recent development within relational and informational paradigms is the revival of the Page-Wootters mechanism, originally proposed in 1983, which derives the appearance of time and dynamical evolution from static quantum entanglement in a timeless universe. In this framework, time emerges relationally as correlations between a clock subsystem and the rest of the system, with the global wave function remaining stationary while conditional states evolve as if under the Schrödinger equation. 
Revived in the 2010s through experimental tests and theoretical extensions, the mechanism aligns with informational approaches by treating time as an emergent relational property, further supporting the idea that quantum theory describes correlations rather than absolute states. This avoids foundational puzzles like the origin of time in quantum gravity by relativizing temporal facts to entangled subsystems. Despite these diverse approaches, a 2025 Nature survey of over 1,100 physicists revealed sharp divisions, with no single interpretation achieving consensus; the Copenhagen interpretation remains the most favored among respondents, underscoring the field's ongoing debates.
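A toy version of the Page-Wootters mechanism can be written down explicitly: a single static entangled state of a discrete clock and a qubit whose state, conditioned on the clock reading t, reproduces ordinary Schrödinger evolution. The clock size, the Pauli-X Hamiltonian, and the sampled times below are illustrative assumptions (with \hbar = 1).

```python
# Toy Page-Wootters sketch: a static "timeless" clock + qubit state whose conditional
# states reproduce Schroedinger evolution of the qubit.
import numpy as np

N = 8                                             # number of discrete clock readings
H = np.array([[0, 1], [1, 0]], dtype=complex)     # qubit Hamiltonian (Pauli X)
psi0 = np.array([1, 0], dtype=complex)            # system state at clock reading 0

def U(t):
    # exp(-i X t) = cos(t) I - i sin(t) X for the Pauli-X Hamiltonian
    return np.cos(t) * np.eye(2) - 1j * np.sin(t) * H

# Global state |Psi> = (1/sqrt(N)) sum_t |t>_clock (x) U(t)|psi0>, built once, never evolved.
Psi = np.concatenate([U(t) @ psi0 for t in range(N)]) / np.sqrt(N)

for t in range(0, N, 3):
    conditional = Psi[2*t:2*t + 2]                # project the clock onto |t>, keep the system
    conditional = conditional / np.linalg.norm(conditional)
    print(t, np.allclose(conditional, U(t) @ psi0))   # True: relational evolution recovered
```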

Axiomatic reconstructions

Traditional Hilbert space formalism

The traditional Hilbert space formalism establishes the mathematical foundation of quantum mechanics through a set of axioms primarily formulated by Paul Dirac and John von Neumann in the late 1920s and early 1930s. This framework posits that quantum systems are described in an infinite-dimensional, separable Hilbert space \mathcal{H} over the complex numbers, where physical states and observables are represented abstractly without reference to specific coordinates. The axioms emphasize the duality between states and observables, enabling probabilistic predictions for measurement outcomes while incorporating non-classical features like superposition. Central to the formalism are the representations of quantum states and observables. A pure state of the system is given by a ray in \mathcal{H}, corresponding to a normalized vector |\psi\rangle unique up to a global phase factor e^{i\theta}, such that \langle \psi | \psi \rangle = 1. Mixed states are described by density operators \rho, which are positive semi-definite trace-class operators with \operatorname{Tr}(\rho) = 1. Observables, representing measurable quantities like position or momentum, are modeled as self-adjoint operators \hat{A} on \mathcal{H}, ensuring real-valued expectation values \langle \hat{A} \rangle = \langle \psi | \hat{A} | \psi \rangle. The connection between observables and measurement outcomes relies on the spectral theorem for self-adjoint operators, which guarantees a spectral decomposition \hat{A} = \int_{-\infty}^{\infty} a \, dE(a), where E(a) is a projection-valued measure onto the eigenspaces and the spectrum consists of possible outcomes a. For a non-degenerate eigenvector satisfying \hat{A} |\psi\rangle = a |\psi\rangle, a measurement yields the definite outcome a with probability 1. In general, the probability of obtaining outcome a for state |\psi\rangle is given by p(a) = \langle \psi | E(da) | \psi \rangle, aligning with the Born rule derived within this structure. Time evolution between measurements proceeds deterministically via unitary dynamics governed by the Schrödinger equation: i \hbar \frac{\partial}{\partial t} |\psi(t)\rangle = \hat{H} |\psi(t)\rangle, where \hat{H} is the self-adjoint Hamiltonian operator encoding the system's energy. This generates a one-parameter unitary group U(t) = e^{-i \hat{H} t / \hbar}, preserving the norm \| U(t) |\psi\rangle \| = \| |\psi\rangle \| and thus the total probability. The equation applies to closed systems without external interactions, maintaining coherence until a measurement intervenes. Measurement introduces the projection postulate, which describes the non-unitary collapse of the state upon obtaining an outcome. If the measurement of \hat{A} yields a, the post-measurement state is the normalized projection |\psi'\rangle = \frac{E(a) |\psi\rangle}{\sqrt{\langle \psi | E(a) | \psi \rangle}}, with the probability \langle \psi | E(a) | \psi \rangle. This postulate, distinct from unitary evolution, accounts for the irreversible nature of observation but raises foundational questions about its physical mechanism. A key result supporting the uniqueness of this probabilistic structure is Gleason's theorem, which proves that any probability measure on the closed subspaces of a Hilbert space \mathcal{H} with \dim \mathcal{H} \geq 3 must be of the form \mu(P) = \operatorname{Tr}(\rho P) for some density operator \rho, assuming non-contextuality (i.e., frame functions independent of orthonormal basis choice). 
The theorem applies to separable Hilbert spaces of dimension at least three and excludes dimension 2, where explicit probability assignments violating the Born rule can be constructed on the subspaces of a qubit, reinforcing the formalism's reliance on the standard Born rule without non-contextual hidden variables. This axiomatic setup assumes a complex, separable Hilbert space and focuses on non-relativistic phenomena, without incorporating special relativity or field-theoretic extensions. Limitations include the absence of a direct relativistic formulation and challenges in treating infinite-dimensional cases rigorously for unbounded operators.
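The textbook recipe above (spectral decomposition, Born rule, projection postulate, unitary evolution) can be illustrated on a small finite-dimensional example; the observable, state, and evolution time below are arbitrary choices, not tied to any physical system.

```python
# Sketch of the Hilbert-space recipe on a 3-level system: spectral decomposition of a
# self-adjoint observable, Born-rule probabilities, projection postulate, and norm
# preservation under unitary evolution (hbar = 1).
import numpy as np

A = np.array([[2, 1, 0],
              [1, 2, 0],
              [0, 0, 5]], dtype=complex)         # self-adjoint observable
psi = np.array([1, 1j, 1], dtype=complex)
psi = psi / np.linalg.norm(psi)

eigvals, eigvecs = np.linalg.eigh(A)             # spectral decomposition A = sum_a a |a><a|
probs = np.abs(eigvecs.conj().T @ psi)**2        # Born rule p(a) = |<a|psi>|^2
print(dict(zip(np.round(eigvals, 3), np.round(probs, 3))), "sum =", probs.sum())

# Projection postulate: post-measurement state after obtaining the largest eigenvalue.
P = np.outer(eigvecs[:, -1], eigvecs[:, -1].conj())
post = P @ psi / np.sqrt(np.real(psi.conj() @ P @ psi))
print("post-measurement norm:", np.linalg.norm(post))

# Unitary evolution U = exp(-i H t) preserves the norm.
H = A                                            # reuse A as a Hamiltonian
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w * 0.7)) @ V.conj().T
print("norm after evolution:", np.linalg.norm(U @ psi))
```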

Generalized probabilistic theories

Generalized probabilistic theories (GPTs) provide a broad operational framework for axiomatic reconstructions of physical theories, encompassing classical probability theory, quantum theory, and hypothetical extensions beyond both. In this approach, physical systems are described abstractly through their statistical predictions, without presupposing specific mathematical structures like Hilbert spaces. A GPT consists of a state space, which is a convex set representing all possible preparations of the system, with pure states as the extreme points and mixed states as convex combinations thereof. Effects are affine functionals on the state space mapping to probabilities between 0 and 1, corresponding to measurement outcomes, while tests are collections of effects that form complete measurements summing to the unit effect. Atomicity in GPTs posits that every state can be decomposed into a finite mixture of perfectly distinguishable pure states, ensuring that information is carried by indivisible "atoms" of probability. Purification is a key principle stating that every state admits a purification—a pure state on an extended system whose marginal recovers the original state—and that such purifications are unique up to reversible channels on the purifying system. These principles, along with causality (ensuring no superluminal signaling) and atomic parallelism (parallel composition of atomic tests yields atomic tests), structure the theory's composite systems and multi-step protocols via parallel and sequential composition of tests. In the 2000s and 2010s, Giulio Chiribella and collaborators developed an information-theoretic approach within GPTs to derive quantum theory from elementary axioms. Their framework begins with causality, which implies a unique normalization for states; perfect distinguishability, requiring that every state which is not completely mixed can be perfectly distinguished from some other state; and ideal compression, enabling lossless encoding into minimal-dimensional systems. Additional axioms include local distinguishability for composite states and purification, which together single out the quantum formalism of density operators on complex Hilbert spaces. These axioms exclude classical probability theory, whose mixed states admit no purification, while ruling out more exotic theories through constraints on information processing. Key principles in GPTs, such as the no-cloning and no-deleting theorems, arise as limits on discriminability in non-classical theories. The no-cloning theorem states that no channel can perfectly copy an unknown state onto an ancillary system for all input states, a consequence of the non-orthogonality of states in theories beyond classical probability. Similarly, the no-deleting theorem prohibits the perfect erasure of an unknown state while preserving others, emerging from the same operational constraints on reversible transformations and purification in GPTs. These no-go results highlight quantum theory's uniqueness in balancing information preservation with irreversibility. Quantum theory's uniqueness within GPTs is established by sets of 5 to 8 axioms that exclude both classical probability and supra-quantum correlations like Popescu-Rohrlich (PR) boxes.
Lucien Hardy's 2001 formulation uses five axioms—probabilistic consistency, simplicity of degrees of freedom, subspace inheritance, tensor product composition for composites, and continuous reversibility—deriving quantum mechanics while ruling out classical theory (which lacks continuous transformations between pure states) and PR boxes (which violate the tensor structure and bounded correlations). Complementing this, Lluís Masanes and Markus P. Müller in 2011 employed seven axioms, including finite dimensionality for binary systems, local tomography (states determined by local measurements), subspace equivalence, continuous symmetry of pure states, and allowance of all effects for qubits, proving that only classical and quantum theories satisfy them; adding continuity selects quantum theory alone. Subsequent reconstructions, including those by Masanes et al. (2013), Goyal (2014), and Höhn (2017), have further developed these information-theoretic axioms, reinforcing quantum theory's uniqueness within GPTs. These reconstructions demonstrate that quantum theory is the unique GPT compatible with intuitive physical requirements like efficient information encoding and local observability. A pivotal result in this context is Solér's theorem from 1995, which characterizes infinite-dimensional orthomodular lattices (abstracting quantum logic) with an orthonormal basis as Hilbert spaces over the real numbers, complex numbers, or quaternions, thereby implying the complex Hilbert space structure underlying quantum theory when combined with GPT axioms like purification and atomicity. Quantum theory emerges as a special case of GPTs, where the state space is the set of density matrices on a complex Hilbert space, effects are bounded positive operators, and composites follow the tensor product rule.
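The Popescu-Rohrlich box mentioned above is easy to write down explicitly as a conditional probability table: it is no-signaling yet attains CHSH = 4, beyond both the local bound 2 and the quantum (Tsirelson) bound 2\sqrt{2}, which is why additional axioms are needed to exclude it.

```python
# Sketch: the Popescu-Rohrlich (PR) box as a no-signaling distribution reaching CHSH = 4.
import numpy as np
from itertools import product

def pr_box(a, b, x, y):
    """P(a,b|x,y) = 1/2 if a XOR b == x AND y, else 0 (outcomes and settings in {0,1})."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

# CHSH correlators E(x,y) = sum_{a,b} (-1)^(a+b) P(a,b|x,y)
E = {(x, y): sum((-1)**(a + b) * pr_box(a, b, x, y) for a, b in product((0, 1), repeat=2))
     for x, y in product((0, 1), repeat=2)}
S = E[0, 0] + E[0, 1] + E[1, 0] - E[1, 1]
print("CHSH value:", S, "(local bound 2, Tsirelson bound", round(2*np.sqrt(2), 3), ")")

# No-signaling: Alice's marginal P(a|x) is independent of Bob's setting y.
for x, a in product((0, 1), repeat=2):
    marginals = [sum(pr_box(a, b, x, y) for b in (0, 1)) for y in (0, 1)]
    assert marginals[0] == marginals[1] == 0.5
print("no-signaling marginals all equal 0.5")
```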

Categorical and operational frameworks

Categorical quantum mechanics provides a diagrammatic and compositional framework for quantum theory, treating physical processes as morphisms in a dagger symmetric monoidal category modeled on the category of Hilbert spaces, where objects represent quantum systems and parallel composition corresponds to tensor products. Pioneered by Samson Abramsky and Bob Coecke in 2004, this approach emphasizes dagger-compact structure to capture unitarity and complementarity, with dagger symmetry ensuring that processes are reversible and self-adjoint in a graphical sense. Measurements are modeled using Frobenius algebras, which axiomatize classical structures within the quantum category, allowing complementary observables to be represented as mutually unbiased algebras. A key development in this framework is the ZX-calculus, introduced by Bob Coecke and Ross Duncan in 2008, which extends categorical quantum mechanics with a graphical language using Z- and X-spiders to simplify proofs of quantum protocols and circuit equivalences. In process theories, the diagrammatic representation uses wires for systems and boxes for channels, enabling intuitive composition of quantum operations while highlighting complementarity through the dagger structure that enforces probabilistic interpretations. This categorical perspective reconstructs quantum theory from abstract principles of compositionality, avoiding direct reliance on Hilbert space coordinates. Operational frameworks treat quantum systems as black boxes, focusing on statistics derived from preparations, transformations, and measurements without presupposing an underlying ontology. Lucien Hardy's 2001 axiomatization derives the full quantum formalism, including the Born rule, from five axioms concerning probabilities as limiting frequencies, the number of degrees of freedom associated with a system, the behavior of subspaces, the composition of systems, and the existence of continuous reversible transformations between pure states. Building on this, Howard Barnum and Christopher Fuchs emphasized operational probabilities as the core of quantum information, where states are compendia of outcome probabilities for all possible measurements, leading to derivations of quantum features like no-cloning from informational constraints. Robert Spekkens' 2007 toy model serves as a classical analogue within operational frameworks, illustrating epistemic interpretations where quantum-like behavior emerges from limited knowledge about ontic states, with a knowledge-balance restriction on what can be known about the ontic state mimicking quantum complementarity without true superposition. In this model, systems have hidden degrees of freedom, and operations reveal partial information, reproducing key quantum phenomena such as interference in a restricted epistemic setting. More recently, quantum combs, developed by Giulio Chiribella and collaborators in the late 2000s, extend operational frameworks to higher-order processes, representing networks of channels as combs that input and output quantum instruments, enabling the reconstruction of quantum theory from principles of causality and purification. This formalism captures indefinite causal orders and provides a process-theoretic basis for quantum advantage in information tasks.
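In concrete matrix terms, the compositional structure that these categorical frameworks axiomatize amounts to: sequential composition is matrix multiplication, parallel composition is the tensor (Kronecker) product, and the dagger is the conjugate transpose. The sketch below illustrates only this concrete reading, not the diagrammatic ZX-calculus itself, and the Hadamard/CNOT circuit is an arbitrary example.

```python
# Minimal concrete illustration of the compositional structure: sequential composition =
# matrix product, parallel composition = Kronecker product, dagger = conjugate transpose.
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard process
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)

# Parallel then sequential composition: (H on the first qubit) followed by CNOT.
circuit = CNOT @ np.kron(H, I)

# Dagger structure: the adjoint of a composite is the reversed composite of adjoints.
dagger = lambda M: M.conj().T
assert np.allclose(dagger(circuit), dagger(np.kron(H, I)) @ dagger(CNOT))

# The composite is unitary (its dagger is its inverse), and it prepares a Bell state.
assert np.allclose(dagger(circuit) @ circuit, np.eye(4))
print(np.round(circuit @ np.array([1, 0, 0, 0], dtype=complex), 3))  # (|00>+|11>)/sqrt(2)
```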

Extensions and alternative theories

Objective collapse models

Objective collapse models propose modifications to the Schrödinger equation that introduce spontaneous, objective wave function collapses, aiming to resolve the measurement problem without invoking a special role for observers. These theories maintain the predictive success of standard quantum mechanics for microscopic systems while ensuring that macroscopic superpositions decay rapidly, leading to definite outcomes. Unlike interpretations that preserve unitary evolution, objective collapse models alter the dynamics to include stochastic or nonlinear terms that localize the wave function in position space over time. The Ghirardi–Rimini–Weber (GRW) model, introduced in 1986, posits spontaneous localization events occurring at a low rate for individual particles but amplified for macroscopic objects. In this framework, the wave function undergoes rare, random collapses modeled as Gaussian multiplications, with a localization rate \lambda \approx 10^{-16} s^{-1} per particle and a spatial smearing width of about 100 nm, ensuring that superpositions involving many particles collapse almost immediately. This mechanism provides an objective criterion for collapse, independent of measurement, while reproducing quantum predictions for isolated systems. A continuous variant, the continuous spontaneous localization (CSL) model developed in 1990, replaces discrete jumps with a stochastic differential equation incorporating white noise. The evolution is governed by d\psi = -\frac{i}{\hbar} H \, dt \, \psi - \frac{\lambda}{2} (q - \langle q \rangle)^2 \, dt \, \psi + \sqrt{\lambda} (q - \langle q \rangle) \, dW \, \psi, where H is the Hamiltonian, q is the position operator, \langle q \rangle is its expectation value, \lambda is the localization rate, and dW is Wiener noise. CSL avoids the discontinuity of GRW while preserving particle symmetries and leading to similar amplification for macroscopic systems. The Diósi–Penrose model, proposed in the late 1980s (Diósi 1987–1989; Penrose 1989), links collapse to gravitational effects, suggesting that superpositions of differing spacetime geometries become unstable. Diósi's formulation incorporates gravitational self-energy differences driving diffusion, while Penrose argues for objective reduction when the gravitational uncertainty exceeds the Planck scale, with the Planck length l_P = \sqrt{\hbar G / c^3} \approx 10^{-35} m setting the characteristic scale. This gravity-induced collapse occurs on timescales inversely proportional to the superposition size and mass, naturally suppressing macroscopic quantum coherence. Recent analyses as of 2024 indicate that the model predicts gravitationally induced entanglement, offering potential new experimental tests in tabletop setups. These models address the measurement problem by providing an objective, intrinsic collapse mechanism, distinct from environment-induced decoherence, though they share the goal of explaining classical emergence. They predict testable deviations, such as excess radiation from spontaneous localizations or modified interferometry patterns, but current experiments impose tight constraints; for instance, recent molecular interferometry and optomechanical experiments in the 2020s have set upper limits on the CSL rate \lambda below approximately 10^{-13} s^{-1} for correlation lengths around 100 nm, depending on the specific parameters, while underground photon emission tests have ruled out parameter-free versions of the Diósi–Penrose model.
Recent developments in Diósi's stochastic gravity approach, updated in the 2020s, refine the model by incorporating quantum fluctuations in the gravitational field to ensure energy conservation and avoid divergences, as explored in analyses of superposition decay rates and experimental tests.
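The localization mechanism behind GRW-type models can be visualized with a deliberately crude one-dimensional toy: a single "hit" multiplies a two-peak superposition by a Gaussian of width r_C centred at a point drawn from |\psi|^2, suppressing the distant branch. The grid, peak separation, packet width, and the sampling of the collapse centre are simplified illustrative assumptions, not the full GRW dynamics.

```python
# Toy 1D sketch of a single GRW-style localization "hit" on a two-peak superposition.
import numpy as np

x = np.linspace(-1e-6, 1e-6, 4001)               # position grid in metres
r_C = 1e-7                                       # smearing width, ~100 nm
sep, width = 6e-7, 2e-8                          # peak separation and packet width

psi = np.exp(-(x - sep/2)**2 / (4*width**2)) + np.exp(-(x + sep/2)**2 / (4*width**2))
psi = psi / np.sqrt(np.trapz(np.abs(psi)**2, x))

prob = np.abs(psi)**2
centre = np.random.choice(x, p=prob/prob.sum())  # collapse centre sampled from |psi|^2
psi_hit = psi * np.exp(-(x - centre)**2 / (2*r_C**2))   # Gaussian multiplication
psi_hit = psi_hit / np.sqrt(np.trapz(np.abs(psi_hit)**2, x))

left, right = x < 0, x >= 0
weights = [np.trapz(np.abs(psi_hit[m])**2, x[m]) for m in (left, right)]
print("branch weights before hit: ~0.5 each; after hit:", np.round(weights, 6))
```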

Stochastic and nonlinear modifications

Stochastic modifications of quantum mechanics seek to reinterpret the theory through continuous random processes, providing an objective description of particle trajectories without invoking wave function collapse. A seminal example is Edward Nelson's stochastic mechanics, introduced in 1966, which models the motion of particles as a Markov diffusion process akin to Brownian motion. In this framework, the particle's position evolves according to the stochastic differential equation d\mathbf{x} = \mathbf{b}(\mathbf{x}, t) \, dt + d\mathbf{w}, where \mathbf{b} represents the mean drift velocity and d\mathbf{w} is the increment of a Wiener process capturing thermal-like fluctuations. Nelson derived the non-relativistic Schrödinger equation from Newtonian mechanics by incorporating an "osmotic velocity" \mathbf{u}(\mathbf{x}, t) = \frac{\hbar}{m} \nabla \ln \sqrt{\rho(\mathbf{x}, t)}, where \rho is the probability density, thus linking classical stochastic dynamics to quantum behavior while preserving the equivalence to standard quantum predictions. Nonlinear extensions, by contrast, directly alter the unitary evolution of the wave function to introduce deterministic but non-standard dynamics, potentially resolving issues like the measurement problem through objective evolution. Steven Weinberg proposed such a framework in 1989, generalizing the Schrödinger equation to i\hbar \frac{d\psi}{dt} = h(\psi, \psi^*) \psi, where h is a homogeneous function that can include nonlinear dependencies on the wave function and its complex conjugate, allowing for small corrections to linear quantum mechanics while maintaining hermiticity for observables. However, these models predict deviations observable in precision experiments, such as shifts in energy levels or EPR correlations; for instance, recent precision experiments, including those with superconducting qubits as of 2025, have set bounds on the nonlinearity parameter \gamma < 1.15 \times 10^{-12} at 90% confidence level. Both stochastic and nonlinear approaches aim to endow quantum dynamics with greater objectivity, avoiding observer-dependent collapse, and stochastic variants have been extended to relativistic settings in the 1990s via formulations that incorporate Lorentz-invariant diffusions and proper time stochastic processes, ensuring compatibility with special relativity without violating causality. These relativistic extensions, such as those mapping Klein-Gordon solutions to Markov diffusions, maintain the derivation of relativistic wave equations from stochastic principles. In recent developments during the 2020s, continuous-variable stochastic models have emerged for testing these ideas in optomechanical systems, where quantum fluctuations in light-mechanical oscillator couplings induce effective stochastic trajectories for macroscopic probes, enabling experimental probes of quantum-stochastic interfaces at larger scales. For example, linear optomechanical interactions generate quantum-induced diffusion in a semiclassical mechanical mode, with variance scaling as the cooperativity parameter, offering pathways to detect deviations from linear quantum predictions.
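Nelson's construction can be checked in the simplest stationary case. For the harmonic-oscillator ground state (in units \hbar = m = \omega = 1, which are illustrative assumptions), the current velocity vanishes, the osmotic velocity gives a drift b = -x, and the Wiener noise has diffusion coefficient \hbar/2m; the resulting diffusion process has |\psi_0|^2 as its stationary distribution, as the sketch below verifies by comparing the empirical variance of the trajectories with the Born-rule value 1/2.

```python
# Sketch (hbar = m = omega = 1): Nelson diffusion for the harmonic-oscillator ground state.
import numpy as np

rng = np.random.default_rng(0)
n_particles, dt, n_steps = 50_000, 0.01, 2_000

x = rng.normal(0.0, 2.0, n_particles)            # start far from the Born distribution
for _ in range(n_steps):
    drift = -x                                   # b = v + u = -omega * x for the ground state
    x = x + drift * dt + np.sqrt(dt) * rng.normal(size=n_particles)   # E[dw^2] = (hbar/m) dt

# Compare the empirical variance with the Born-rule value <x^2> = 1/2.
print("empirical <x^2>:", np.round(np.var(x), 3), "  Born rule:", 0.5)
```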

Superdeterminism and retrocausality

Superdeterminism proposes a deterministic framework for quantum mechanics that resolves apparent nonlocality in Bell inequality violations by positing correlations between measurement settings and hidden variables through a common cause in the initial conditions of the universe. In this view, championed by Gerard 't Hooft in the 2010s, the universe's evolution is fully deterministic, and quantum probabilities emerge from underlying classical variables, eliminating the need for superluminal influences while closing the "freedom-of-choice" loophole in Bell's theorem. Specifically, 't Hooft's cellular automaton models treat quantum systems as emergent from deterministic rules at a fundamental level, where particle positions and momenta are precisely defined, and statistical independence between experimenters' choices and system states is not assumed. As of 2025, ongoing quantum experiments aim to test the free-will assumption by generating measurement choices with high-entropy quantum sources, potentially constraining superdeterministic models if no correlations are found. A key feature of superdeterminism is the requirement for highly specific initial conditions that correlate all relevant variables, including observer choices, from the Big Bang onward, often described as a "conspiracy" in the universe's setup to produce observed quantum correlations. This setup avoids nonlocality but has drawn criticism for implying a lack of experimenter free will, as measurement settings are predetermined by the same hidden variables influencing outcomes, rendering experiments non-random in a way that challenges statistical assumptions in physics. Debates in the 2020s, including analyses by proponents like Sabine Hossenfelder, highlight that while superdeterminism is logically consistent, its reliance on such fine-tuned correlations makes it empirically indistinguishable from standard quantum mechanics without additional testable predictions. Retrocausality offers an alternative foundational approach by incorporating time-symmetric influences, where future boundary conditions affect past events through advanced waves propagating backward in time. In John G. Cramer's transactional interpretation, introduced in 1986, quantum events arise from "transactions" between retarded (forward-propagating) and advanced (backward-propagating) waves that satisfy boundary conditions at emission and absorption points, ensuring a fully causal, relativistically invariant description without collapse or hidden variables. This model interprets the Schrödinger equation's solutions as real physical waves, with absorbers in the future "handshaking" with emitters via these waves to form definite outcomes, providing a retrocausal mechanism for entanglement correlations. The two-state vector formalism (TSVF), originating from the work of Yakir Aharonov, Peter Bergmann, and Joel L. Lebowitz in the 1960s, formalizes this time symmetry by describing quantum states with both a pre-selected forward-evolving state vector and a post-selected backward-evolving one, enabling the computation of weak values through weak measurements that minimally disturb the system. Recent experiments in the 2020s using weak measurements have tested TSVF predictions, such as anomalous weak values in entangled systems, confirming consistency with standard quantum mechanics while exploring time-symmetric features like the "past of a quantum particle." 
For instance, studies on multipartite entangled qubits have characterized state updates via nondestructive weak measurements, aligning observed weak values with theoretical expectations from the formalism. Experimental investigations of retrocausality, particularly in delayed-choice entanglement scenarios, have found no evidence supporting backward causation beyond standard quantum predictions. In 2023 analyses of delayed-choice experiments, including entanglement swapping, forward-time interpretations suffice without invoking retrocausal influences, as correlations emerge from entanglement alone rather than future choices altering past states. These bounds reinforce that while retrocausal models like TSVF provide interpretive tools, they do not necessitate modifications to the quantum formalism for observed phenomena.
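The anomalous weak values referred to above follow directly from the two-state-vector expression A_w = \langle\phi|A|\psi\rangle / \langle\phi|\psi\rangle. The sketch below evaluates it for the Pauli-Z spin component with a fixed pre-selection and a post-selection driven toward orthogonality; the particular states are illustrative choices.

```python
# Sketch of a two-state-vector weak value A_w = <phi|A|psi>/<phi|psi> for Pauli Z:
# nearly orthogonal pre- and post-selection pushes the weak value far outside the
# eigenvalue range [-1, +1] (an "anomalous" weak value).
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)

def spin_state(theta):
    """Spin-1/2 state polarised along angle theta in the x-z plane."""
    return np.array([np.cos(theta/2), np.sin(theta/2)], dtype=complex)

psi = spin_state(np.pi/2)                        # pre-selected state (+x)
for eps in (np.pi/2, 0.2, 0.05):                 # post-selection approaching orthogonality
    phi = spin_state(-np.pi/2 + eps)
    weak_value = (phi.conj() @ Z @ psi) / (phi.conj() @ psi)
    print(f"overlap {abs(phi.conj() @ psi):.3f} -> weak value {weak_value.real:.2f}")
```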

References

  1. [1]
    Foundations of quantum mechanics and their impact on ... - NIH
    May 28, 2018 · Whereas there remains little doubt that quantum mechanics works and is one of the most accurate theories ever conceived to describe our universe ...
  2. [2]
    Foundations of Quantum Mechanics - MDPI
    May 26, 2022 · The foundations of quantum mechanics are a domain in which physics and philosophy concur in attempting to find a fundamental physical theory ...
  3. [3]
    An evolutionary view of quantum foundations - ScienceDirect
    This is an expository paper surveying briefly the evolution of the foundations of quantum theory from its inception to the present day.
  4. [4]
    The Oxford Questions on the foundations of quantum physics
    Sep 8, 2013 · This uses entanglement and illustrates how the techniques required for quantum foundations and for quantum technologies coincide to a remarkable ...
  5. [5]
    [PDF] The Thermal Radiation Formula of Planck (1900) - arXiv
    Feb 12, 2004 · As regards the black body radiation, the main point is that the Wien limit of the Planck's formula is recovered supposing that radiation is ...
  6. [6]
    [PDF] Einstein's Proposal of the Photon Concept-a Translation
    Of the trio of famous papers that Albert Einstein sent to the Annalen der Physik in 1905 only the paper proposing the photon concept has been unavailable in ...
  7. [7]
    [PDF] 1913 On the Constitution of Atoms and Molecules
    On the theory of this paper the only neutral atom which contains a single electron is the hydrogen atom. The permanent state of this atom should correspond to ...
  8. [8]
    [quant-ph/9911107] 75 Years of Matter Wave: Louis de Broglie and ...
    Nov 25, 2004 · A physically real wave associated with any moving particle and travelling in a surrounding material medium was introduced by Louis de Broglie in a series of ...
  9. [9]
    Understanding Heisenberg's 'Magical' Paper of July 1925 - arXiv
    Apr 1, 2004 · In July 1925 Heisenberg published a paper [Z. Phys. 33, 879-893 (1925)] which ended the period of `the Old Quantum Theory' and ushered in the new era of ...
  10. [10]
    An Undulatory Theory of the Mechanics of Atoms and Molecules
    The paper gives an account of the author's work on a new form of quantum theory. §1. The Hamiltonian analogy between mechanics and optics.
  11. [11]
    [PDF] 1.3 THE PHYSICAL CONTENT OF QUANTUM KINEMATICS AND ...
    First we define the terms velocity, energy, etc. (for example, for an electron) which remain valid in quantum mechanics. It is shown.
  12. [12]
    February 1927: Heisenberg's Uncertainty Principle
    Heisenberg outlined his new principle in a 14-page letter to Wolfgang Pauli, sent February 23, 1927. In March he submitted his paper on the uncertainty ...
  13. [13]
    [PDF] Bohr's way to defining complementarity(*) - arXiv
    Abstract. We go through Bohr's talk about complementary features of quantum theory at the Volta Conference in September 1927, by collating a manuscript ...
  14. [14]
    Mathematical foundations of quantum mechanics : Von Neumann ...
    Jul 2, 2019 · Mathematical foundations of quantum mechanics. by: Von Neumann, John, 1903-1957. Publication date: 1955. Topics: Matrix mechanics. Publisher ...
  15. [15]
    [PDF] Von Neumann's Impossibility Proof - PhilSci-Archive
    Sep 16, 2016 · Most importantly, von Neumann did not claim to have shown the impossibility of hidden variables tout court, but argued that hidden-variable ...
  16. [16]
    [PDF] Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?
    (Received March 25, 1935). In a complete theory there is an element corresponding to each element of reality. A sufficient condition for the reality of a ...
  17. [17]
    [PDF] 13 Remarks on the Mind-Body Question - The Information Philosopher
    F. Dyson, in a very thoughtful article, points to the ever-broadening scope of scientific inquiry. Whether or not the relation of mind to body.
  18. [18]
    The Feynman Lectures on Physics Vol. III Ch. 1: Quantum Behavior
    “Quantum mechanics” is the description of the behavior of matter and light in all its details and, in particular, of the happenings on an atomic scale.
  19. [19]
  20. [20]
  21. [21]
  22. [22]
    Kochen-Specker contextuality | Rev. Mod. Phys.
    Dec 19, 2022 · The Kochen-Specker (KS) theorem ( Kochen and Specker, 1967 ) deals with assignments of truth values to potential measurement results. In quantum ...
  23. [23]
    [2102.13036] Kochen-Specker Contextuality - Quantum Physics - arXiv
    Feb 25, 2021 · It states that quantum mechanics is in conflict with classical models in which the result of a measurement does not depend on which other compatible ...
  24. [24]
    [PDF] arXiv:0706.0126v4 [quant-ph] 15 Jul 2008
    Jul 15, 2008 · In this letter we provide a test for hidden variables in a local spin-1 system, where the original Bell's approach clearly fails. We found ...
  25. [25]
    Copenhagen Interpretation of Quantum Mechanics
    May 3, 2002 · The Copenhagen interpretation was the first general attempt to understand the world of atoms as this is represented by quantum mechanics.
  26. [26]
    The Quantum Postulate and the Recent Development of Atomic ...
    The content of this paper is essentially the same as that of a lecture on the present state of the quantum theory delivered on Sept. 16, 1927, at the Volta ...
  27. [27]
    [1003.5209] QBism, the Perimeter of Quantum Bayesianism - arXiv
    Mar 26, 2010 · This article summarizes the Quantum Bayesian point of view of quantum mechanics, with special emphasis on the view's outer edges---dubbed QBism.
  28. [28]
    On the possibility of a realist ontological commitment in quantum ...
    Jan 13, 2018 · This paper reviews the structure of standard quantum mechanics, introducing the basics of the von Neumann-Dirac axiomatic formulation as well as ...
  29. [29]
    A Suggested Interpretation of the Quantum Theory in Terms of ...
    In this paper and in a subsequent paper, an interpretation of the quantum theory in terms of just such "hidden" variables is suggested.
  30. [30]
    "Relative State" Formulation of Quantum Mechanics | Rev. Mod. Phys.
    Apr 18, 2025 · "Relative State" Formulation of Quantum Mechanics. Hugh Everett, III. Hugh Everett, III. Palmer Physical Laboratory, Princeton University, Princeton, New ...
  31. [31]
    Consistent histories and the interpretation of quantum mechanics
    The usual formula for transition probabilities in nonrelativistic quantum mechanics is generalized to yield conditional probabilities for selected sequence.
  32. [32]
    [PDF] INFORMATION, PHYSICS, QUANTUM: THE SEARCH FOR LINKS
    at a very deep bottom, in most instances — an immaterial source and.
  33. [33]
    [quant-ph/9609002] Relational Quantum Mechanics - arXiv
    Aug 31, 1996 · Relational Quantum Mechanics. Authors:Carlo Rovelli. View a PDF of the paper titled Relational Quantum Mechanics, by Carlo Rovelli. View PDF.
  34. [34]
    The Principles of Quantum Mechanics : Dirac, P.A.M. - Internet Archive
    Oct 11, 2020 · The Principles of Quantum Mechanics. by: Dirac, P.A.M. ...
  35. [35]
    [PDF] Measures on the Closed Subspaces of a Hilbert Space - UCSD Math
    It is easy to see that such a measure can be obtained by selecting a vector v and, for each closed subspace A, taking µ(A) as the square of the norm of the.
  36. [36]
    [PDF] General probabilistic theories: An introduction - arXiv
    Aug 24, 2021 · We introduce the framework of general probabilistic theories (GPTs for short). GPTs are a class of operational theories that generalize both ...
  37. [37]
    [1011.6451] Informational derivation of Quantum Theory - arXiv
    Nov 30, 2010 · Quantum theory can be derived from purely informational principles. Five elementary axioms-causality, perfect distinguishability, ideal compression, local ...
  38. [38]
    Cloning and Broadcasting in Generic Probabilistic Theories - arXiv
    Nov 30, 2006 · We prove generic versions of the no-cloning and no-broadcasting theorems, applicable to essentially any non-classical finite-dimensional probabilistic ...
  39. [39]
    [quant-ph/0101012] Quantum Theory From Five Reasonable Axioms
    Jan 3, 2001 · In this paper it is shown that quantum theory can be derived from five very reasonable axioms. The first four of these are obviously consistent with both ...
  40. [40]
    [0808.1023] Categorical quantum mechanics - arXiv
    Aug 7, 2008 · Authors:Samson Abramsky, Bob Coecke. View a PDF of the paper titled Categorical quantum mechanics, by Samson Abramsky and Bob Coecke. View PDF.
  41. [41]
    [0906.4725] Interacting Quantum Observables: Categorical Algebra ...
    Jun 25, 2009 · This paper has two tightly intertwined aims: (i) To introduce an intuitive and universal graphical calculus for multi-qubit systems, the ZX-calculus.
  42. [42]
    In defense of the epistemic view of quantum states: a toy theory - arXiv
    Jan 9, 2004 · We present a toy theory that is based on a simple principle: the number of questions about the physical state of a system that are answered must ...
  43. [43]
    [0712.1325] Quantum Circuits Architecture - arXiv
    Dec 9, 2007 · Access Paper: View a PDF of the paper titled Quantum Circuits Architecture, by Giulio Chiribella and 2 other authors. View PDF · TeX Source.
  44. [44]
    Collapse Theories - Stanford Encyclopedia of Philosophy
    Mar 7, 2002 · This line of thought has led to what is known as the CSL (Continuous Spontaneous Localization) model (Pearle 1989; Ghirardi, Pearle, and Rimini ...
  45. [45]
    Testing quantum mechanics - ScienceDirect.com
    This paper presents a general framework for introducing nonlinear corrections into ordinary quantum mechanics, that can serve as a guide to experiments
  46. [46]
    Stochastic mechanics of a relativistic spinless particle - AIP Publishing
    Jun 1, 1990 · An extension of Nelson's stochastic mechanics to the relativistic domain is proposed. To each pure state of a spinless relativistic quantum ...
  47. [47]
    A Relativistic Version of Nelson's Stochastic Mechanics - IOPscience
    A relativistic version of the Nelson-Newton law for Markov diffusions is established. Stochastic generalizations of the proper time and of the relativistic ...
  48. [48]
    [2401.16511] Quantum-induced Stochastic Optomechanical Dynamics
    Jan 29, 2024 · We study the effective stochastic dynamics of a semiclassical probe induced by linear optomechanical interactions with a quantum oscillator.
  49. [49]
    Deterministic Quantum Mechanics: the Mathematical Equations - arXiv
    May 13, 2020 · We write down the conditions for the Hamiltonian of a quantum system for rendering it mathematically equivalent to a deterministic system.
  50. [50]
    Does Quantum Mechanics Rule Out Free Will? | Scientific American
    Mar 10, 2022 · In a recent video, physicist Sabine Hossenfelder, whose work I admire, notes that superdeterminism eliminates the apparent randomness of quantum mechanics.
  51. [51]
    The Two-State Vector Formalism of Quantum Mechanics - arXiv
    May 21, 2001 · In particular, the concept of ``weak measurements'' (standard measurements with weakening of the interaction) is discussed in depth ...