
Quantum

Quantum in physics refers to the fundamental discrete unit of a physical quantity, such as energy, angular momentum, or electric charge, which cannot be subdivided further in certain interactions. This concept underpins quantum mechanics, the branch of physics that describes the behavior of matter and energy at atomic and subatomic scales, where classical physics fails to explain phenomena like atomic spectra and the photoelectric effect. Quantum mechanics reveals that particles exhibit both wave-like and particle-like properties, known as wave-particle duality, and can exist in multiple states simultaneously until measured, a principle called superposition.

The theory emerged in the early 20th century through contributions from scientists addressing inconsistencies in classical electromagnetism and thermodynamics. Max Planck introduced the idea of energy quanta in 1900 to explain blackbody radiation, while Albert Einstein applied it to light in 1905 for the photoelectric effect, demonstrating light's particle nature. Niels Bohr's 1913 atomic model incorporated quantized energy levels, and later developments by Werner Heisenberg, Erwin Schrödinger, and others in the mid-1920s formalized the theory with matrix mechanics and wave mechanics, respectively.

Central principles include the Heisenberg uncertainty principle, which states that certain pairs of properties, like position and momentum, cannot be simultaneously measured with arbitrary precision. Quantum entanglement describes correlated particles whose states are interdependent regardless of distance, challenging classical notions of locality. These concepts have profound implications, forming the basis for understanding chemical bonding, nuclear reactions, and the structure of materials. Quantum mechanics has revolutionized technology, enabling the development of semiconductors, lasers, MRI machines, and emerging fields like quantum computing and quantum cryptography. Despite its predictive success, interpretations of the theory, such as the Copenhagen interpretation versus many-worlds, remain debated, reflecting ongoing questions about the nature of reality at the quantum level.

Historical Development

Origins and Early Ideas

In the late 19th century, classical physics faced a profound challenge in explaining the spectrum of radiation emitted by a blackbody, an idealized object that absorbs all incident radiation. Experimental observations showed that the intensity of this radiation peaked at a frequency depending on temperature and fell off at both low and high frequencies, but classical theory, particularly the Rayleigh-Jeans law derived from equipartition of energy among electromagnetic modes, predicted an unrealistic divergence to infinite energy density at high frequencies (short wavelengths), known as the ultraviolet catastrophe. This failure highlighted the inadequacy of classical electromagnetism and thermodynamics for describing thermal radiation, prompting physicists to seek new approaches.

Max Planck, working at the University of Berlin, addressed this puzzle through empirical fitting and theoretical innovation. On October 19, 1900, he presented an interpolation formula to the German Physical Society that matched experimental blackbody spectra, bridging the low-frequency Rayleigh-Jeans regime and the high-frequency Wien's law. To derive this formula rigorously, Planck introduced a revolutionary hypothesis on December 14, 1900, during another society meeting: the energy of oscillators in the blackbody is not continuous but quantized in discrete units proportional to frequency, given by E = h \nu, where h is a universal constant (later named Planck's constant) and \nu is the frequency. This quantization resolved the ultraviolet catastrophe by limiting the energy available to high-frequency modes, yielding the correct spectral law for blackbody radiation, though Planck initially viewed it as a mathematical expedient rather than a fundamental physical reality.

Planck's idea sparked debates within the physics community about the nature of discreteness versus the continuity assumed in classical physics. While Planck hesitated to attribute full physical discreteness to radiation itself, treating quantization as applying only to matter-radiation interactions, others grappled with its implications for energy exchange. In 1905, Albert Einstein extended Planck's hypothesis radically in his paper on the photoelectric effect, proposing that light itself consists of discrete energy packets, or "light quanta" (later termed photons), each with energy E = h \nu. Einstein argued this explained why electrons are ejected from metals only above a threshold frequency, independent of light intensity, resolving discrepancies with wave theory. For this discovery of the photoelectric law, Einstein received the 1921 Nobel Prize in Physics. These early ideas marked the tentative birth of quantum theory, challenging the continuous framework of classical physics.

Key Experiments Leading to Quantum Theory

The photoelectric effect was first observed in 1887 by Heinrich Hertz during experiments on electromagnetic waves, where he noted that ultraviolet light incident on the metal electrodes of a spark gap facilitated the discharge by reducing the sparking voltage required, an effect not explained by classical wave theory. In 1902, Philipp Lenard extended these observations through quantitative measurements, demonstrating that electrons are emitted from a metal surface only when illuminated by light above a specific threshold frequency, and that the maximum kinetic energy of these photoelectrons increases with the light's frequency rather than its intensity, contradicting classical expectations of energy accumulation over time.

In 1923, Arthur Compton conducted scattering experiments using X-rays on graphite targets, observing that the scattered radiation exhibited a wavelength shift dependent on the scattering angle, which could not be accounted for by classical electromagnetic theory but instead matched predictions from treating X-rays as particles (photons) colliding with electrons. The observed shift is given by \Delta \lambda = \frac{h}{m_e c} (1 - \cos \theta), where h is Planck's constant, m_e is the electron mass, c is the speed of light, and \theta is the scattering angle; this result also implied photon momentum p = h / \lambda, providing direct evidence for the particle nature of light.

The wave nature of particles was experimentally confirmed in 1927 by Clinton Davisson and Lester Germer, who directed a beam of electrons at a nickel crystal and detected intensity maxima in the scattered electrons at specific angles, forming an interference pattern consistent with diffraction from a periodic crystal lattice. These peaks corresponded to the de Broglie wavelength \lambda = h / p for the electrons, where p is their momentum, demonstrating that electrons behave as waves under suitable conditions.

In 1922, Otto Stern and Walther Gerlach performed an experiment using a beam of neutral silver atoms passed through an inhomogeneous magnetic field, observing that the beam split into two distinct components on a detection screen rather than a continuous distribution, indicating that the atomic magnetic moments, and thus the associated angular momentum components, take on discrete values aligned or anti-aligned with the field. This spatial separation provided the first direct evidence for the quantization of angular momentum in individual atoms, later attributed to electron spin.
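As a quick numerical illustration of the formulas above (a sketch using standard constant values, not data from the cited experiments), the Python snippet below evaluates the Compton shift at a 90° scattering angle and the de Broglie wavelength of 54 eV electrons, roughly the energy used by Davisson and Germer:

```python
# Illustrative check of the Compton-shift and de Broglie formulas quoted above.
import math

h = 6.62607015e-34      # Planck's constant, J s
m_e = 9.1093837015e-31  # electron mass, kg
c = 2.99792458e8        # speed of light, m/s
eV = 1.602176634e-19    # joules per electronvolt

def compton_shift(theta_rad):
    """Wavelength shift of a photon scattered from a free electron at angle theta."""
    return (h / (m_e * c)) * (1.0 - math.cos(theta_rad))

def de_broglie_wavelength(kinetic_energy_ev):
    """Non-relativistic de Broglie wavelength lambda = h / p for an electron."""
    p = math.sqrt(2.0 * m_e * kinetic_energy_ev * eV)  # from E_k = p^2 / (2m)
    return h / p

print(compton_shift(math.pi / 2))       # ~2.43e-12 m (the electron's Compton wavelength)
print(de_broglie_wavelength(54.0))      # ~1.67e-10 m, comparable to atomic spacings in nickel
```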

Formulation of Modern Quantum Mechanics

The formulation of modern quantum mechanics emerged in the mid-1920s as physicists sought to resolve inconsistencies in the old quantum theory, particularly its ad hoc quantization rules and failure to fully explain atomic spectra and stability. A foundational step was Niels Bohr's 1913 atomic model, which posited that electrons in the hydrogen atom occupy discrete, stationary orbits with quantized angular momentum given by L = n \hbar, where n is a positive integer and \hbar = h / 2\pi is the reduced Planck's constant. These quantized states prevented classical radiation losses, and transitions between them emitted photons with frequencies matching the observed hydrogen spectral lines, such as the Balmer series. This model marked a shift from continuous to discrete quantum postulates, though it remained semi-classical and limited to specific systems.

Building on these ideas amid growing experimental pressures, Werner Heisenberg introduced matrix mechanics in 1925, a fully quantum framework that abandoned visualizable orbits in favor of abstract mathematical relations between directly observable quantities. In his seminal paper, Heisenberg represented dynamical variables like position and momentum as infinite arrays (matrices) that do not commute, leading to relations such as [x, p] = i \hbar, which underpin the origin of uncertainty in measurements. This non-commutative algebra allowed precise calculations of transition probabilities and energy levels for the hydrogen atom, reproducing Bohr's results while extending the theory to multi-electron systems; it was published in Zeitschrift für Physik and quickly advanced by Max Born and Pascual Jordan. Heisenberg's approach emphasized empirical observables over hidden mechanisms, resolving paradoxes in the old quantum theory.

Independently, in 1926, Erwin Schrödinger developed wave mechanics, offering an alternative yet equivalent formulation that interpreted quantum phenomena through wave functions satisfying differential equations. The time-independent Schrödinger equation for bound states, -\frac{\hbar^2}{2m} \nabla^2 \psi(\mathbf{r}) + V(\mathbf{r}) \psi(\mathbf{r}) = E \psi(\mathbf{r}), describes stationary states where \psi is the wave function, V the potential, m the particle mass, and E the energy eigenvalue; solutions yield quantized energy levels and probabilities via |\psi|^2. Schrödinger's four papers of 1926 demonstrated equivalence to matrix mechanics through transformation theory, providing an intuitive wave picture that unified de Broglie's matter waves with quantization. This formulation proved particularly effective for systems with continuous symmetries.

Guiding these developments was Niels Bohr's correspondence principle, formalized around 1920, which required quantum theory to recover classical predictions in the limit of large quantum numbers (n \gg 1), such as high-energy transitions approximating classical radiation frequencies. Articulated in his comprehensive review on atomic structure, the principle served as a tool to select valid quantum rules and ensure theoretical consistency with established physics, influencing both matrix and wave approaches. For instance, it justified semiclassical approximations in the old quantum theory.

These innovations converged at the 1927 Solvay Conference on Electrons and Photons, where leading physicists, including Bohr, Heisenberg, Schrödinger, Einstein, and de Broglie, debated the interpretations and implications of the new quantum mechanics. The proceedings highlighted the equivalence of formulations and addressed foundational issues like measurement and indeterminism, marking the theory's consolidation despite ongoing philosophical tensions.
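A small numerical sketch can make the matrix-mechanics picture concrete. The example below (natural units and a truncated basis, chosen purely for illustration and not taken from the historical papers) builds position and momentum matrices for the harmonic oscillator from ladder operators and checks the commutation relation and the quantized energy levels on the low-lying states:

```python
# Truncated matrix-mechanics illustration for the harmonic oscillator (hbar = m = omega = 1).
import numpy as np

N = 8                                         # truncation dimension (illustrative)
a = np.diag(np.sqrt(np.arange(1, N)), k=1)    # lowering operator: a|n> = sqrt(n)|n-1>
adag = a.conj().T

x = (a + adag) / np.sqrt(2.0)                 # position matrix
p = 1j * (adag - a) / np.sqrt(2.0)            # momentum matrix

# [x, p] = i on the low-lying states; only the last basis state feels the truncation.
print(np.round((x @ p - p @ x)[:4, :4], 10))

# H = p^2/2 + x^2/2 is diagonal with the familiar levels E_n = n + 1/2.
H = (p @ p + x @ x) / 2.0
print(np.round(np.diag(H).real[:4], 6))       # 0.5, 1.5, 2.5, 3.5
```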

Fundamental Principles

Wave-Particle Duality

Wave-particle duality is a foundational concept in quantum mechanics, positing that fundamental entities such as photons and electrons exhibit both wave-like and particle-like properties, depending on the context of observation. This duality challenges classical physics, where waves and particles were mutually exclusive categories, and it emerged from discrepancies between experimental results that could not be reconciled by either description alone. For light, wave behavior was evident in phenomena like interference and diffraction, while particle-like behavior appeared in interactions involving discrete energy transfer, highlighting the need for a unified quantum framework.

The wave nature of light was dramatically demonstrated in Thomas Young's double-slit experiment of 1801, where light passing through two narrow slits produced an interference pattern of alternating bright and dark fringes on a screen, consistent with wave superposition. In contrast, the photoelectric effect, explained by Albert Einstein in 1905, revealed light's particle aspect: when ultraviolet light strikes a metal surface, electrons are ejected only if the light's frequency exceeds a threshold, with energy delivered in discrete packets (quanta) proportional to frequency, behaving as localized particles called photons. This particle interpretation resolved the effect's dependence on frequency rather than intensity, underscoring that light quanta underlie the particle behavior observed.

Extending duality to matter, Louis de Broglie hypothesized in 1924 that particles like electrons possess wave properties, proposing a wavelength given by \lambda = \frac{h}{p}, where h is Planck's constant and p is the particle's momentum. This idea was experimentally confirmed in 1927 by Clinton Davisson and Lester Germer, who observed diffraction patterns when electrons were scattered by a nickel crystal, with maxima matching the de Broglie wavelength for the electrons' momentum. Independently, George Paget Thomson demonstrated electron diffraction through thin metal foils, producing ring patterns analogous to X-ray diffraction, further verifying wave-like behavior for particles.

Niels Bohr provided a conceptual resolution in 1928 through his complementarity principle, which states that wave and particle descriptions are complementary aspects of quantum entities, mutually exclusive in any single measurement but together necessary for a complete understanding. In this view, the choice of experimental setup, such as detecting position (particle-like) or momentum (wave-like), determines which aspect is observed, reflecting an inherent limitation in quantum description rather than incomplete knowledge. This principle reconciled the duality without contradiction, emphasizing that quantum phenomena transcend classical categories.
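The two aspects of the duality can be illustrated numerically: treating a two-slit interference intensity as a Born-rule probability density and sampling individual detection events shows discrete particle-like hits accumulating into a wave-like fringe pattern. The sketch below uses arbitrary illustrative values for the wavelength, slit separation, and screen distance; it is not a model of any specific experiment.

```python
# Discrete "particle" detections sampled from a two-slit interference intensity.
import numpy as np

rng = np.random.default_rng(0)

lam = 50e-12    # de Broglie wavelength, m (illustrative, fast electrons)
d = 1e-9        # slit separation, m (illustrative)
L = 1.0         # slit-to-screen distance, m (illustrative)

x = np.linspace(-0.2, 0.2, 2001)                     # screen coordinate, m
intensity = np.cos(np.pi * d * x / (lam * L)) ** 2   # ideal far-field fringe pattern

prob = intensity / intensity.sum()                   # Born rule: intensity -> probability
hits = rng.choice(x, size=5000, p=prob)              # each detection lands at one point

counts, _ = np.histogram(hits, bins=40)
print(counts)   # alternating high/low counts trace out the interference fringes
```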

Quantization of Physical Quantities

One of the defining features of quantum mechanics is the quantization of physical quantities, where properties such as energy, angular momentum, and charge assume discrete rather than continuous values. This discreteness arises from the boundary conditions imposed on quantum wave functions and leads to observable effects like atomic spectra and non-zero minimum energies even at absolute zero. In contrast to classical physics, where quantities vary smoothly, quantum quantization manifests in stable, non-classical behaviors that underpin the stability of atoms and matter. In atoms, electron energies are quantized into discrete levels corresponding to stationary states, which account for the sharp, line-like spectra observed in atomic emissions and absorptions. For the hydrogen atom, these energy levels are described by the formula
E_n = -\frac{13.6 \, \text{eV}}{n^2},
where n = 1, 2, 3, \dots is the principal quantum number; this was derived in Niels Bohr's semiclassical model, which posited that electrons occupy specific allowed orbits without radiating energy continuously. The full quantum mechanical treatment, solving the Schrödinger equation for the Coulomb potential, confirms these discrete levels and extends the model to multi-electron atoms, revealing the quantization as eigenvalues of the Hamiltonian operator.
Angular momentum in quantum systems is similarly quantized. The orbital angular momentum quantum number l takes integer values from 0 to n-1, determining the magnitude \sqrt{l(l+1)} \hbar and possible projections m_l \hbar along an axis, as obtained from the spherical harmonic solutions of the hydrogen atom wave function. Additionally, electrons possess an intrinsic spin angular momentum with quantum number s = 1/2, yielding a magnitude \sqrt{s(s+1)} \hbar = \sqrt{3/4} \hbar and projections \pm \frac{1}{2} \hbar; this spin hypothesis, introduced to explain fine structure in atomic spectra, treats the electron as having an internal angular momentum independent of its orbital motion. Electric charge is quantized in multiples of the elementary charge e = 1.602 \times 10^{-19} \, \text{C}, the charge of a single electron or proton. This was experimentally verified by Robert Millikan's 1913 oil-drop experiment, in which charged oil droplets suspended between electrified plates exhibited charges that were always integer multiples of a fundamental unit, confirming the discrete nature of charge and enabling precise measurements of e and Avogadro's number. A striking consequence of quantization is the presence of zero-point energy, the non-zero minimum energy of a quantum system even at absolute zero temperature. For the quantum harmonic oscillator, modeling vibrations in molecules or fields, the ground-state energy is
E_0 = \frac{1}{2} \hbar \omega,
where \hbar is the reduced Planck's constant and \omega is the classical angular frequency; this arises because the Heisenberg uncertainty principle prevents simultaneous zero position and momentum, leading to perpetual fluctuations. The derivation follows from solving the time-independent Schrödinger equation for the harmonic potential, yielding evenly spaced levels E_v = \left( v + \frac{1}{2} \right) \hbar \omega for vibrational quantum numbers v = 0, 1, 2, \dots.
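These formulas admit simple numerical checks. The sketch below (standard constants; the vibrational frequency is an illustrative value of roughly molecular-stretch size) evaluates the hydrogen n = 2 → 1 transition energy and wavelength and a representative zero-point energy:

```python
# Quantized hydrogen levels and harmonic-oscillator zero-point energy, numerically.
import math

h = 6.62607015e-34            # J s
hbar = h / (2.0 * math.pi)
c = 2.99792458e8              # m/s
eV = 1.602176634e-19          # J

def hydrogen_level_eV(n):
    return -13.6 / n**2

# n = 2 -> 1 (Lyman-alpha): about 10.2 eV, i.e. an ultraviolet photon near 122 nm.
delta_E = hydrogen_level_eV(2) - hydrogen_level_eV(1)
print(delta_E, h * c / (delta_E * eV))        # ~10.2 eV, ~1.22e-7 m

# Zero-point energy E_0 = hbar*omega/2 for omega ~ 4.1e14 rad/s (illustrative value).
omega = 4.1e14
print(0.5 * hbar * omega / eV)                # ~0.13 eV, present even at absolute zero
```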

Heisenberg Uncertainty Principle

The Heisenberg uncertainty principle posits a fundamental limit on the simultaneous measurement of certain conjugate physical quantities in quantum mechanics, reflecting the intrinsic probabilistic nature of quantum systems. For the position x and momentum p of a particle, the principle states that the product of their uncertainties must satisfy \Delta x \, \Delta p \geq \frac{\hbar}{2}, where \Delta x and \Delta p represent the standard deviations in position and momentum measurements, respectively, and \hbar = h / 2\pi with h denoting Planck's constant. This inequality indicates that improving the precision of one quantity necessarily increases the uncertainty in the other, preventing arbitrary accuracy in joint determinations.

The principle originated in Werner Heisenberg's seminal 1927 paper, where he derived it intuitively through arguments about the limits of measurement processes. Fundamentally, it stems from the non-commuting nature of the quantum mechanical operators for position and momentum, encapsulated in the canonical commutation relation [x, p] = i\hbar, which implies that the two observables cannot be simultaneously diagonalized. This relation, introduced in the context of matrix mechanics, underpins the mathematical rigor of the uncertainty bound, later formalized more generally by others but originating from Heisenberg's framework.

A parallel formulation addresses energy and time, given by \Delta E \, \Delta t \geq \hbar / 2, where \Delta E is the uncertainty in energy and \Delta t the uncertainty in time. This version arises similarly from non-commutativity in the quantum description of time-dependent processes and has direct physical consequences, such as determining the natural linewidth of spectral lines in atomic transitions. In excited atomic states, the finite lifetime \tau of the state sets \Delta t \approx \tau, leading to an energy broadening \Delta E \approx \hbar / \tau, which manifests as the spectral width (in angular frequency) \Gamma = \Delta E / \hbar of emission or absorption lines, explaining their observed finite resolution beyond instrumental limits.

The principle is vividly illustrated by the single-slit diffraction experiment, a cornerstone demonstration of quantum behavior. When a particle, such as an electron or photon, passes through a slit of width \Delta x, its position is localized to that interval, but the resulting diffraction pattern reveals a spread in transverse momentum \Delta p_x \approx h / \Delta x, in accordance with the uncertainty bound and highlighting the trade-off between spatial confinement and momentum dispersion. Heisenberg provided another intuitive example through a thought experiment with a gamma-ray microscope aimed at resolving an electron's position: the high-resolution requirement demands short-wavelength gamma rays, whose photons scatter off the electron via the Compton effect, disturbing its momentum by an amount \Delta p \approx h / \Delta x, thereby enforcing the relation in any attempt at precise localization.
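A minimal numerical check of the position-momentum relation is shown below: for a Gaussian wave packet (an assumed illustrative state, in natural units with ħ = 1), the position and momentum standard deviations computed from the wave function and its Fourier transform multiply to the minimum value ħ/2:

```python
# Gaussian wave packet saturating Delta x * Delta p = hbar / 2 (natural units, hbar = 1).
import numpy as np

hbar = 1.0
sigma = 0.7                                   # packet width (illustrative)
x = np.linspace(-20.0, 20.0, 4096)
dx = x[1] - x[0]

psi = (1.0 / (2.0 * np.pi * sigma**2)) ** 0.25 * np.exp(-x**2 / (4.0 * sigma**2))

# Position spread from |psi(x)|^2.
prob_x = np.abs(psi) ** 2 * dx
delta_x = np.sqrt(np.sum(prob_x * x**2) - np.sum(prob_x * x) ** 2)

# Momentum spread from the Fourier-transformed wave function, p = hbar * k.
k = 2.0 * np.pi * np.fft.fftfreq(x.size, d=dx)
prob_p = np.abs(np.fft.fft(psi)) ** 2
prob_p /= prob_p.sum()
delta_p = hbar * np.sqrt(np.sum(prob_p * k**2) - np.sum(prob_p * k) ** 2)

print(delta_x, delta_p, delta_x * delta_p)    # ~0.7, ~0.714, product ~0.5 = hbar/2
```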

Mathematical Foundations

Quantum States and Hilbert Space

In quantum mechanics, the state of a system is mathematically represented as a vector in a separable Hilbert space \mathcal{H}, which is a complete complex inner-product space that may be infinite-dimensional to accommodate continuous spectra of observables. This abstract framework unifies the matrix mechanics of Heisenberg and the wave mechanics of Schrödinger, providing a basis for describing quantum states without reference to a specific representation. The structure ensures that states form a linear vector space, allowing for operations like addition and scalar multiplication, while the inner product defines probabilities and amplitudes.

A pure quantum state is denoted by a normalized ket |\psi\rangle \in \mathcal{H} satisfying \langle \psi | \psi \rangle = 1, where \langle \psi | is the dual bra vector and the inner product yields a positive real number representing the norm. This condition ensures that the total probability of finding the system in any complete set of basis states is unity. Superposition arises naturally from the linearity of the space: any state can be expressed as a linear combination |\psi\rangle = \alpha |\phi\rangle + \beta |\chi\rangle, where \alpha, \beta \in \mathbb{C} are complex coefficients with |\alpha|^2 + |\beta|^2 = 1 for normalization, enabling quantum interference effects that have no classical analog.

For systems in mixed states, such as statistical ensembles or subsystems entangled with an environment, the density operator \rho provides a general description: \rho = \sum_i p_i |\psi_i\rangle \langle \psi_i |, where \{p_i\} is a probability distribution with \sum_i p_i = 1 and each |\psi_i\rangle is a pure state. The operator \rho is Hermitian, positive semi-definite, and satisfies \operatorname{Tr}(\rho) = 1, with expectation values of observables given by \operatorname{Tr}(\rho A). In open quantum systems, the reduced density operator for a subsystem is obtained by tracing over the environmental degrees of freedom, \rho_S = \operatorname{Tr}_E (|\Psi\rangle \langle \Psi|), capturing decoherence effects.

Quantum states can be expanded in different bases corresponding to observables; for position, the continuous basis states |x\rangle satisfy \langle x | x' \rangle = \delta(x - x'), forming a complete orthonormal set such that any state |\psi\rangle = \int dx \, \psi(x) |x\rangle with \psi(x) = \langle x | \psi \rangle. Similarly, the momentum basis |p\rangle uses eigenstates of the momentum operator with \langle p | p' \rangle = \delta(p - p'), related to the position basis via the Fourier transform, \tilde{\psi}(p) = \langle p | \psi \rangle = \frac{1}{\sqrt{2\pi \hbar}} \int dx \, e^{-ipx/\hbar} \psi(x). Wave functions \psi(x) thus represent states in the position basis.
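The distinction between pure and mixed states, and the effect of the partial trace, can be seen in a minimal two-qubit example (an illustrative sketch, not drawn from the cited literature): tracing one qubit out of an entangled Bell pair leaves the other in a maximally mixed density operator.

```python
# Pure state, Bell pair, and reduced density operator via the partial trace.
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

# Pure superposition (|0> + |1>)/sqrt(2): Tr(rho) = 1 and purity Tr(rho^2) = 1.
psi = (ket0 + ket1) / np.sqrt(2.0)
rho = np.outer(psi, psi.conj())
print(np.trace(rho).real, np.trace(rho @ rho).real)      # 1.0, 1.0

# Bell state (|00> + |11>)/sqrt(2); trace out the second qubit.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2.0)
rho_ab = np.outer(bell, bell.conj()).reshape(2, 2, 2, 2)  # indices (a, b, a', b')
rho_a = np.einsum('ajbj->ab', rho_ab)                     # partial trace over subsystem B

print(rho_a.real)                                         # identity / 2: maximally mixed
print(np.trace(rho_a @ rho_a).real)                       # purity 0.5 < 1
```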

Schrödinger Equation and Wave Functions

The Schrödinger equation represents the cornerstone of non-relativistic quantum mechanics, governing the dynamics of quantum systems through the evolution of their wave functions. Postulated by Erwin Schrödinger in his 1926 series of papers, it posits that the time evolution of the wave function ψ is determined by the system's Hamiltonian operator Ĥ, which encodes the total energy including kinetic and potential contributions. The time-dependent form of the equation is given by i \hbar \frac{\partial \psi}{\partial t} = \hat{H} \psi, where ħ is the reduced Planck's constant, i is the imaginary unit, and ψ is the wave function, a complex-valued function of position and time. This equation is linear and deterministic, implying that given an initial wave function, the future state of the system is uniquely determined. For systems with time-independent Hamiltonians, solutions can often be separated into spatial and temporal parts, leading to stationary states where the probability density remains constant over time.

For such stationary states, the time-independent Schrödinger equation takes the form of an eigenvalue problem: \hat{H} \psi = E \psi, where E is the energy eigenvalue corresponding to the eigenfunction ψ. These eigenvalues represent the discrete or continuous energy levels accessible to the system, reflecting the quantization inherent in quantum mechanics. The wave functions reside in a Hilbert space, providing a mathematical framework for their completeness and orthogonality. Solving this equation for specific potentials yields the allowed energy states and associated wave functions.

The physical interpretation of the wave function was established by Max Born in 1926, who proposed that the square of its modulus, |ψ(r,t)|², represents the probability density of finding the particle at position r at time t. This probabilistic view resolved the initial classical-like interpretation of wave functions as charge densities, aligning with empirical observations of discrete measurement outcomes. Normalization ensures that the integral of |ψ|² over all space equals one, preserving total probability.

A canonical example illustrating these concepts is the particle in an infinite square well potential, where a particle of mass m is confined between x = 0 and x = a with zero potential inside and infinite walls outside. The time-independent solutions are \psi_n(x) = \sqrt{\frac{2}{a}} \sin\left(\frac{n \pi x}{a}\right), \quad n = 1, 2, 3, \dots, with corresponding energies E_n = \frac{n^2 \pi^2 \hbar^2}{2 m a^2}. These quantized energies demonstrate how boundary conditions impose discreteness, and the wave functions vanish at the walls, satisfying the boundary conditions. This model, derived from Schrödinger's eigenvalue approach, highlights nodes and probability distributions that vary with n.
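The infinite-well energies can also be recovered numerically. The sketch below (natural units ħ = m = 1 and well width a = 1, chosen for illustration) discretizes the time-independent Schrödinger equation with finite differences and compares the lowest eigenvalues against the analytic E_n:

```python
# Finite-difference solution of the infinite square well (hbar = m = 1, a = 1).
import numpy as np

hbar = m = a = 1.0
N = 800                              # interior grid points (illustrative resolution)
dx = a / (N + 1)

# Discretized kinetic-energy operator with psi = 0 enforced at the walls.
main = np.full(N, 2.0)
off = np.full(N - 1, -1.0)
H = (hbar**2 / (2.0 * m * dx**2)) * (np.diag(main) + np.diag(off, 1) + np.diag(off, -1))

numeric = np.linalg.eigvalsh(H)[:4]
analytic = np.array([n**2 * np.pi**2 * hbar**2 / (2.0 * m * a**2) for n in (1, 2, 3, 4)])

print(numeric)     # close to the analytic values below
print(analytic)    # 4.9348, 19.739, 44.413, 78.957
```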

Operators and Observables

In quantum mechanics, physical observables, such as position, momentum, and energy, are represented by Hermitian operators acting on the Hilbert space of the system. A Hermitian operator \hat{A} satisfies \hat{A}^\dagger = \hat{A}, ensuring that its eigenvalues are real numbers, which correspond to the possible outcomes of measurements, and its eigenstates form a complete orthogonal set, allowing for the expansion of any state in terms of these basis vectors. The expectation value of an observable \hat{A} in a state |\psi\rangle is given by the inner product \langle \hat{A} \rangle = \langle \psi | \hat{A} | \psi \rangle, representing the average result of many measurements on identically prepared systems. The variance, which quantifies the spread of measurement outcomes, is \Delta A^2 = \langle \hat{A}^2 \rangle - \langle \hat{A} \rangle^2, where \langle \hat{A}^2 \rangle = \langle \psi | \hat{A}^2 | \psi \rangle; for Hermitian operators, this variance is always non-negative, reflecting the probabilistic nature of quantum measurements.

Two observables \hat{A} and \hat{B} are compatible if their operators commute, i.e., [\hat{A}, \hat{B}] = \hat{A}\hat{B} - \hat{B}\hat{A} = 0, in which case they share a common set of eigenstates, allowing simultaneous precise measurements. Non-commuting operators, however, lead to inherent uncertainties in joint measurements, as their eigenstates do not align. A fundamental example involves the position operator \hat{x} and momentum operator \hat{p} in one dimension. In the position representation, \hat{x} \psi(x) = x \psi(x) acts by multiplication, while \hat{p} = -i \hbar \frac{d}{dx}; these satisfy the canonical commutation relation [\hat{x}, \hat{p}] = i \hbar, underscoring their incompatibility.
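As a concrete finite-dimensional illustration (a spin-1/2 example in units of ħ/2, added here for clarity and not part of the original text), the Pauli matrices are Hermitian operators whose expectation values, variances, and commutators can be computed directly:

```python
# Expectation values, variances, and non-commutativity for spin-1/2 observables.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

psi = np.array([1, 1], dtype=complex) / np.sqrt(2.0)   # |+x>, an eigenstate of sx

def expectation(op, state):
    return (state.conj() @ op @ state).real

def variance(op, state):
    return expectation(op @ op, state) - expectation(op, state) ** 2

print(expectation(sx, psi), variance(sx, psi))   # 1.0, 0.0: a definite outcome
print(expectation(sz, psi), variance(sz, psi))   # 0.0, 1.0: maximally uncertain

# Incompatible observables: [sx, sz] = -2i * sy is nonzero, so no common eigenbasis.
print(sx @ sz - sz @ sx)
```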

Interpretations and Philosophical Implications

Copenhagen Interpretation

The Copenhagen interpretation of quantum mechanics emerged in the late 1920s and early 1930s, primarily through the collaborative efforts of Niels Bohr and Werner Heisenberg at Bohr's institute in Copenhagen. Bohr, building on his earlier atomic model from 1913, integrated Heisenberg's matrix mechanics formulation of 1925 with the emerging wave mechanics to address the foundational challenges of quantum theory. Heisenberg contributed key ideas on the limits of measurement, while Max Born provided the probabilistic framework in 1926, solidifying the interpretation's emphasis on the role of observation in quantum phenomena. This development occurred amid rapid advances in quantum theory, culminating in Bohr's seminal 1927 Como lecture, where he outlined the core principles.

Central to the Copenhagen interpretation is Bohr's principle of complementarity, which posits that quantum entities exhibit mutually exclusive aspects, such as wave-like and particle-like behaviors, that cannot be observed simultaneously in a single experimental setup but are complementary for a complete description of the system. For instance, the double-slit experiment reveals interference patterns indicative of waves, yet detecting which slit a particle passes through yields particle trajectories, rendering the wave picture inapplicable. This principle resolves apparent paradoxes by asserting that the choice of measurement apparatus defines the classical context, limiting the applicability of any one description. Complementarity underscores the irreducible role of the observational context in quantum mechanics, where the uncertainty principle sets fundamental limits on simultaneous knowledge of complementary variables like position and momentum.

In this framework, the wave function \psi does not represent a physical entity but a symbolic tool encoding potential outcomes, and measurement causes an abrupt transition, or "collapse of the wave function," to one of the eigenstates of the measured observable, yielding a definite result. Upon interaction with a classical measuring device, the quantum system irrevocably selects an eigenvalue, updating the observer's knowledge of the system. This collapse is not a dynamical process governed by the Schrödinger equation but a non-unitary update tied to the act of observation, distinguishing quantum from classical descriptions.

The interpretation embraces the inherent probabilistic nature of quantum predictions, rejecting deterministic hidden variables in favor of statistical outcomes as dictated by the Born rule: the probability P(a) of measuring outcome a is given by P(a) = |\langle a | \psi \rangle|^2, where |a\rangle is the eigenstate corresponding to a and |\psi\rangle is the pre-measurement state. Introduced by Max Born in 1926, this rule interprets the squared modulus of the wave function as a probability density, making quantum mechanics fundamentally indeterministic without underlying mechanisms to resolve uncertainties. Bohr reinforced this view, arguing that quantum mechanics provides complete, context-dependent predictions without need for hidden realities beyond observable effects.

Many-Worlds Interpretation

The many-worlds interpretation (MWI) was originally proposed by physicist Hugh Everett III in his 1957 Ph.D. thesis at Princeton University, where he introduced the relative-state formulation of quantum mechanics. This interpretation posits that the universe is described by a single, universal wave function that evolves deterministically according to the Schrödinger equation, without any need for wave function collapse during measurements. Everett's ideas gained widespread attention and were popularized by Bryce DeWitt in a 1970 article in Physics Today, which emphasized the interpretation's resolution of longstanding paradoxes in quantum measurement.

In the MWI, a measurement does not cause the wave function to collapse to a single outcome; instead, it entangles the observer with the quantum system, resulting in a superposition of all possible states that decoheres into distinct branches of the universal wave function. Each branch corresponds to a different possible outcome of the measurement, where the observer perceives only one eigenvalue of the observable being measured, while all other outcomes occur in other branches. These branches become effectively independent due to decoherence, rendering them mutually unobservable, yet equally real components of the overall quantum state. The probabilistic nature of quantum outcomes arises from the squared amplitudes of the wave function components in the superposition, matching the Born rule without invoking special postulates.

The implications of the MWI are profound: at the level of the entire universe, the evolution is fully deterministic, as the universal wave function follows a unitary, causal evolution without discontinuities. This formulation addresses the measurement problem by eliminating the need for a special role for observers or conscious intervention, treating measurement as just another physical interaction that amplifies quantum superpositions into macroscopic branches. By doing so, it provides a consistent, observer-independent description of quantum theory, where the appearance of collapse is subjective to observers within each branch.

Debates on Reality and Determinism

Quantum mechanics challenges classical notions of an objective reality and strict determinism, prompting debates on whether the theory describes a mind-independent world governed by predictable causal laws. Central to these discussions is the tension between quantum predictions, which incorporate inherent probabilities, and the desire for a complete, local description of physical phenomena that aligns with intuitive realism. These issues gained prominence through thought experiments and theorems that expose apparent non-local influences and irreducible randomness, forcing physicists and philosophers to reconsider the foundations of realism and determinism.

The Einstein-Podolsky-Rosen (EPR) paradox, introduced in 1935, argued that quantum mechanics is incomplete because it permits entangled particles to exhibit correlations that imply instantaneous influences, violating the principle of locality without faster-than-light signaling. Einstein famously described this as "spooky action at a distance," suggesting the need for hidden variables to restore a deterministic, realistic framework where particle properties exist independently prior to measurement. This paradox highlighted quantum entanglement, where the state of one particle instantly determines the state of another, regardless of spatial separation, raising profound questions about the locality of physical influences.

Building on the EPR argument, John Bell's 1964 theorem provided a rigorous test by deriving inequalities that any local hidden variable theory must satisfy; quantum mechanics, however, predicts violations of these inequalities for entangled systems. These inequalities arise from assumptions of locality (no faster-than-light influences) and realism (pre-existing values for observables), showing that quantum correlations exceed what local realism allows. Experimental verification came in 1982 through Alain Aspect's photon-based tests, which confirmed the violations with high statistical significance and used time-varying analyzers to close the locality loophole, providing strong evidence against local hidden variable theories. However, the detection loophole remained due to low photon detection efficiency. This was closed in 2015 through independent loophole-free Bell tests by teams led by Ronald Hanson (Delft University of Technology), Lynden Shalm (NIST), and Marissa Giustina (University of Vienna), which simultaneously addressed both the locality and detection loopholes, empirically ruling out local hidden variable theories and affirming the non-local features of quantum mechanics. These results were recognized in the 2022 Nobel Prize in Physics awarded to John Clauser, Alain Aspect, and Anton Zeilinger for their pioneering experiments with entangled photons that established the violation of Bell inequalities and advanced quantum information science.

Quantum indeterminacy further critiques classical determinism, rendering Pierre-Simon Laplace's hypothetical demon, an intellect that could predict all future states from complete present knowledge, impossible due to the fundamental limits imposed by the uncertainty principle. In quantum mechanics, outcomes are probabilistic rather than predetermined, with measurement collapsing wave functions in ways that introduce irreducible randomness that can be amplified in complex systems. This undermines the Laplacian ideal of a clockwork universe, as even perfect initial information cannot yield exact predictions, shifting physics from strict determinism to statistical laws. Philosophers argue that such indeterminacy implies a world where events lack definite causes at the quantum level, challenging the notion of a fully causal, predictable universe.

The broader realism debate questions whether quantum mechanics depicts an observer-independent reality or if properties emerge only upon measurement. Proponents of realism contend that the theory's success demands an underlying objective structure, yet Bell's results suggest that maintaining both realism and locality is untenable, as entanglement implies holistic, non-separable systems. Critics of naive realism argue that the theory's reliance on probabilities and context-dependent outcomes indicates a departure from classical objectivity, where reality is not fully determinate without measurement. This tension persists, with ongoing efforts to reconcile quantum predictions with intuitive notions of an independent physical world through refined interpretations.

Applications and Extensions

Quantum Tunneling and Field Theory

Quantum tunneling is a quantum mechanical phenomenon in which a particle can traverse a barrier that, according to classical mechanics, it lacks the energy to surmount. This occurs because the wave function of the particle extends into and beyond the classically forbidden region, yielding a non-zero probability for transmission.

A seminal application of quantum tunneling explains alpha decay in radioactive nuclei, where an alpha particle escapes the Coulomb barrier surrounding the nucleus. In 1928, George Gamow developed a model treating the alpha particle as a wave that tunnels through the barrier, predicting decay rates that match experimental observations for elements such as uranium and radium. The probability of tunneling is quantified by the transmission coefficient T, approximated using the WKB method as T \approx \exp\left( -2 \int_{x_1}^{x_2} \kappa(x) \, dx \right), where \kappa(x) = \sqrt{2m(V(x) - E)} / \hbar, m is the particle mass, V(x) is the barrier potential, E is the particle energy, and the integral spans the turning points x_1 and x_2 where E = V(x). This exponential dependence on barrier width and height accurately reproduces the Geiger-Nuttall law relating decay half-lives to the energy of the emitted alpha particles.

One practical application harnessing quantum tunneling is the scanning tunneling microscope (STM), invented in 1981 by Gerd Binnig and Heinrich Rohrer at IBM's Zurich research laboratory. The STM measures the tunneling current between a sharp metallic tip and a sample surface, enabling atomic-resolution imaging of conductive materials by raster-scanning the tip. This breakthrough earned Binnig and Rohrer the 1986 Nobel Prize in Physics and revolutionized surface science and nanotechnology.

Quantum tunneling phenomena extend naturally into quantum field theory (QFT), which unifies quantum mechanics with special relativity by treating particles as excitations of underlying quantum fields. In QFT, the non-relativistic many-body framework evolves into second quantization, where fields are expanded in terms of creation and annihilation operators. These operators, a^\dagger_k and a_k, add or remove a particle in momentum state k, satisfying commutation relations [a_k, a^\dagger_{k'}] = \delta_{kk'} for bosons (or anticommutation relations for fermions), enabling the description of variable particle number and interactions.

A striking consequence of QFT is vacuum fluctuations, arising from the Heisenberg uncertainty principle applied to fields, which imbue the vacuum with zero-point energy from all possible modes. The Casimir effect, predicted by Hendrik Casimir in 1948, manifests these fluctuations as an attractive force between two uncharged, parallel conducting plates in vacuum. The force originates from the boundary conditions suppressing certain electromagnetic modes between the plates compared to outside, leading to a pressure imbalance; for plates separated by distance d, the force per unit area is F/A = -\pi^2 \hbar c / (240 d^4). This effect has been experimentally verified and underscores the physical reality of quantum vacuum fluctuations.
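The exponential sensitivity of the WKB transmission factor is easy to see numerically. The sketch below uses illustrative values for the barrier height, electron energy, and gap widths (loosely mimicking an STM tip-sample vacuum gap); for a flat barrier the WKB integral reduces to κ times the width:

```python
# WKB tunneling probability through a flat barrier, T ~ exp(-2 * kappa * width).
import math

hbar = 1.054571817e-34   # J s
m_e = 9.1093837015e-31   # kg
eV = 1.602176634e-19     # J

def wkb_transmission(V0_eV, E_eV, width_m):
    """For a constant barrier, the WKB integral is just kappa * width."""
    kappa = math.sqrt(2.0 * m_e * (V0_eV - E_eV) * eV) / hbar
    return math.exp(-2.0 * kappa * width_m)

# A 4 eV barrier seen by a 1 eV electron: the transmission falls by roughly a
# factor of six for each additional 0.1 nm of gap, the basis of STM's sensitivity.
for width_nm in (0.4, 0.5, 0.6):
    print(width_nm, wkb_transmission(4.0, 1.0, width_nm * 1e-9))
```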

Quantum Information and Computing

Quantum information science leverages the principles of quantum mechanics to process and transmit information in ways that surpass classical limitations, enabling tasks such as secure communication and efficient computation through phenomena like superposition and entanglement. Unlike classical bits, which represent either 0 or 1, a qubit, the fundamental unit of quantum information, exists in a superposition of states described by the wave function |\psi\rangle = \alpha |0\rangle + \beta |1\rangle, where \alpha and \beta are complex amplitudes satisfying |\alpha|^2 + |\beta|^2 = 1, allowing a register of qubits to encode exponentially many amplitudes as the number of qubits increases. Entanglement further connects multiple qubits such that the state of one is instantaneously correlated with another, regardless of distance, as first highlighted in the Einstein-Podolsky-Rosen paradox.

Quantum circuits manipulate qubits using unitary operators known as quantum gates, analogous to logic gates but reversible and capable of exploiting quantum effects. The Hadamard gate, a single-qubit operation, creates superposition by transforming the basis state |0\rangle into \frac{|0\rangle + |1\rangle}{\sqrt{2}}, enabling parallel exploration of multiple computational paths. The controlled-NOT (CNOT) gate, a two-qubit gate, flips the target qubit if the control qubit is in state |1\rangle, preserving superposition while generating entanglement; for instance, applying CNOT to |0\rangle \otimes |0\rangle after a Hadamard on the first qubit yields the entangled state \frac{|00\rangle + |11\rangle}{\sqrt{2}}. These gates form universal sets for quantum computation, allowing simulation of any quantum process.

Seminal quantum algorithms demonstrate practical advantages over classical methods. Shor's algorithm, introduced in 1994, factors large integers in polynomial time on a quantum computer by using the quantum Fourier transform to find periodicities in the function x^a \mod N, offering an exponential speedup for problems central to cryptography such as RSA. Grover's search algorithm, proposed in 1996, identifies a marked item in an unsorted database of N entries with high probability using O(\sqrt{N}) queries, providing a quadratic speedup over the classical O(N) bound through amplitude amplification. These algorithms highlight quantum computing's potential to solve otherwise intractable problems efficiently.

A cornerstone of quantum information theory is the no-cloning theorem, proven in 1982, which states that it is impossible to create an identical copy of an arbitrary unknown quantum state due to the linearity of quantum evolution; attempting to clone |\psi\rangle into a blank state |0\rangle distorts the output unless |\psi\rangle is known classically. This theorem underpins quantum security protocols, such as quantum key distribution, by preventing eavesdropping without detection, as any measurement collapses the state and introduces errors.
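The gate sequence described above can be written out explicitly in a few lines (a minimal sketch using plain matrices rather than any particular quantum-computing library): a Hadamard on the first qubit followed by a CNOT maps |00⟩ to the Bell state, and the Born rule gives the measurement statistics.

```python
# Hadamard + CNOT acting on |00> produces the entangled Bell state (|00> + |11>)/sqrt(2).
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2.0)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0                                  # |00> in the computational basis

state = CNOT @ (np.kron(H, I) @ ket00)
print(np.round(state, 6))                       # [0.7071, 0, 0, 0.7071]

# Born-rule probabilities: only 00 and 11 occur, each with probability 1/2,
# and the two qubits' outcomes are perfectly correlated.
print(np.abs(state) ** 2)
```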

Experimental Realizations and Technologies

Quantum dots, nanoscale semiconductor particles typically 2–10 nm in diameter, exhibit discrete energy levels arising from three-dimensional quantum confinement of electrons and holes, which allows precise control over their optical and electronic properties. This confinement effect, first described theoretically in the early 1980s and observed experimentally in colloidal systems in the same decade, enables quantum dots to emit light at tunable wavelengths by varying their size, with smaller dots producing higher-energy (blue-shifted) emission. In light-emitting diodes (LEDs), quantum dots serve as color converters or active emitters, achieving narrow emission spectra (full width at half maximum around 20–30 nm) and quantum yields exceeding 90%, which enhances color purity and efficiency in displays and lighting. Semiconductor quantum dot lasers leverage these discrete levels to realize low-threshold operation and high temperature stability, outperforming traditional quantum well lasers in telecommunications and sensing applications. For instance, InAs/GaAs quantum dot lasers operating at 1.3 μm wavelengths demonstrate threshold current densities below 100 A/cm² and operate continuously up to 100°C without cooling, due to the delta-like density of states minimizing carrier leakage. These devices have been integrated into vertical-cavity surface-emitting lasers (VCSELs) for data centers, achieving modulation speeds over 50 Gbps.

The Bose-Einstein condensate (BEC), a state of matter in which a dilute gas of bosons is cooled to temperatures near absolute zero and occupies the lowest quantum state, was first experimentally realized in 1995 by Eric A. Cornell and Carl E. Wieman at JILA using laser cooling and evaporative cooling of rubidium-87 atoms to 170 nK, resulting in a macroscopic wavefunction with coherence lengths on the order of millimeters. This achievement, independently replicated by Wolfgang Ketterle with sodium-23 atoms later that year, demonstrated quantum coherence on a macroscopic scale, enabling phenomena like matter-wave interference and superfluidity in dilute gases. The 2001 Nobel Prize in Physics was awarded to Cornell, Wieman, and Ketterle for this work, which has since facilitated applications in precision atom interferometry for measuring gravitational fields with sensitivities below 10^{-9} g and in simulating quantum many-body systems.

Ion traps provide a robust platform for quantum computing prototypes by confining charged atoms (ions) in electromagnetic fields, allowing precise control of their internal electronic states as qubits with coherence times exceeding 1 second and gate fidelities above 99.9%. The seminal proposal for scalable ion-trap quantum computing came from J. Ignacio Cirac and Peter Zoller in 1995, who described entangling operations via shared motional modes in linear ion chains, a method now implemented in systems holding up to 50 ions. Companies like IonQ have developed trapped-ion processors, such as the 2023 Aria system with 25 algorithmically stable qubits, demonstrating two-qubit gate errors below 0.1% and enabling small-scale quantum simulations. Nuclear magnetic resonance (NMR) techniques served as early prototypes for liquid-state quantum computing in the late 1990s, using molecular spins in solution as qubits manipulated by radiofrequency pulses, with demonstrations of Shor's algorithm for factoring 15 on 7 qubits by 2001. IBM's 2022 Osprey processor, a 433-qubit superconducting system, exemplifies scaling efforts in quantum hardware, though it builds on principles akin to those in ion-trap and NMR prototypes for error-corrected computation.
Superconducting quantum interference devices (SQUIDs) exploit the quantum mechanical phenomenon of flux quantization in superconducting loops interrupted by Josephson junctions to achieve ultrasensitive magnetometry, detecting magnetic fields as weak as 10^{-15} T. Developed in the mid-1960s by James Zimmerman and others, following Brian Josephson's prediction of supercurrent tunneling between superconductors, SQUIDs operate via quantum interference of supercurrents, where the critical current modulates periodically with applied flux \Phi according to I_c(\Phi) = I_0 \left| \cos\left(\pi \Phi / \Phi_0 \right) \right|, with the flux quantum \Phi_0 = h / 2e \approx 2.07 \times 10^{-15} Wb. DC SQUIDs, using two junctions in a superconducting loop, provide noise levels below 1 fT/√Hz at 4.2 K, enabling applications in biomagnetism such as magnetoencephalography for brain activity mapping and in geophysical surveying for mineral exploration.
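The flux-periodic response quoted above is easy to tabulate. The sketch below (with an arbitrary illustrative junction critical current I_0) evaluates I_c(Φ) over two flux quanta, showing the Φ_0-periodic modulation that makes the SQUID a flux-to-current transducer:

```python
# DC SQUID critical-current modulation I_c(Phi) = I_0 |cos(pi * Phi / Phi_0)|.
import numpy as np

h = 6.62607015e-34
e = 1.602176634e-19
Phi_0 = h / (2.0 * e)                 # magnetic flux quantum, ~2.07e-15 Wb

I_0 = 10e-6                           # junction critical current, 10 uA (illustrative)
flux = np.linspace(0.0, 2.0 * Phi_0, 9)
I_c = I_0 * np.abs(np.cos(np.pi * flux / Phi_0))

for phi, ic in zip(flux, I_c):
    print(f"{phi / Phi_0:4.2f} Phi_0 -> {ic * 1e6:5.2f} uA")
```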
