
Modern Quantum Mechanics

Modern quantum mechanics is the branch of physics that provides a mathematical framework for understanding the behavior of matter and energy at the atomic and subatomic scales, where classical physics fails to explain phenomena such as the stability of atoms and the discrete nature of energy levels. It emerged in the mid-1920s as a complete and consistent theory, building on the incomplete "old quantum theory" of the early 1900s by incorporating probabilistic descriptions of particle states through wave functions and operators. The development of modern quantum mechanics was driven by key breakthroughs between 1925 and 1926, including Werner Heisenberg's matrix mechanics, which used non-commuting operators to represent observables like position and momentum, and Erwin Schrödinger's wave mechanics, which introduced the Schrödinger equation to describe the evolution of quantum states. These formulations were shown to be equivalent by Schrödinger in 1926, with further unification by Paul Dirac and others, while Dirac and others extended the theory to relativistic contexts and quantum field theory. The theory resolved longstanding puzzles, such as the photoelectric effect explained by Einstein in 1905 and blackbody radiation addressed by Planck in 1900, providing a probabilistic interpretation where outcomes are predicted by probabilities rather than certainties.

Central principles of modern quantum mechanics include wave-particle duality, where entities like electrons and photons exhibit both wave-like (interference) and particle-like (localized detection) properties; superposition, allowing quantum systems to exist in multiple states simultaneously until measured; and entanglement, in which particles become correlated such that the state of one instantly influences the other, regardless of distance. The uncertainty principle, formulated by Heisenberg in 1927, states that certain pairs of properties, like position and momentum, cannot be simultaneously known with arbitrary precision, quantified as \Delta x \Delta p \geq \hbar/2, where \hbar is the reduced Planck's constant. Quantization ensures that energy, angular momentum, and other observables take discrete values in bound systems, underpinning the structure of atoms and molecules.

Modern quantum mechanics has profoundly shaped contemporary science and technology, enabling the design of semiconductors, lasers, and MRI machines, and forming the basis for emerging fields like quantum computing, which leverages superposition and entanglement for exponential computational speedups in specific tasks. Despite its empirical success—predicting phenomena with unprecedented accuracy, such as the hydrogen atom's spectral lines to parts per million—it continues to inspire debates over interpretations, including the Copenhagen view emphasizing measurement's role and more recent realist approaches seeking hidden variables. Ongoing research extends the theory to quantum gravity and many-body systems, highlighting its enduring relevance in probing the universe's fundamental nature.

Introduction

Definition and Scope

Modern quantum mechanics is the foundational physical theory that describes the behavior of matter and energy at atomic and subatomic scales, employing probabilistic interpretations of wave functions and the mathematical structure of linear algebra to predict outcomes of microscopic systems. This framework emerged in the mid-1920s as a revolutionary departure from classical physics, resolving inconsistencies in describing phenomena like atomic spectra and blackbody radiation through inherently non-deterministic principles. The theory originated with Werner Heisenberg's formulation of matrix mechanics in 1925, which prioritized observable quantities over classical trajectories. This was complemented by Erwin Schrödinger's wave mechanics in 1926, introducing continuous wave functions as an equivalent representation of quantum states. Paul Dirac unified these approaches through his transformation theory in 1927, which integrated both into a coherent abstract formalism, as formalized in his 1930 book The Principles of Quantum Mechanics. The scope of modern quantum mechanics primarily encompasses non-relativistic systems, where particle velocities are negligible compared to the speed of light, providing the standard toolkit for analyzing atomic, molecular, and solid-state phenomena. It deliberately excludes classical mechanics, which applies deterministically to macroscopic scales, and pre-1925 quantum concepts, such as Max Planck's introduction of energy quanta, that lacked a full theoretical basis. Although extensions like relativistic quantum mechanics address high-speed particles—exemplified by Dirac's 1928 equation for electrons—the core theory remains focused on non-relativistic dynamics. A defining feature of modern quantum mechanics is its probabilistic nature, where physical states yield only statistical predictions for measurements, in stark contrast to the precise, deterministic trajectories of classical mechanics. The theory operates most directly at quantum scales, from atoms to subatomic particles, though its principles underpin broader applications in fields like quantum field theory, which extend to cosmic phenomena while retaining the foundational probabilistic essence.

Historical Overview

The development of modern quantum mechanics began in the mid-1920s amid efforts to resolve inconsistencies in the old quantum theory, particularly in explaining atomic spectra. In 1925, Werner Heisenberg formulated the first complete quantum theory through matrix mechanics, a non-commutative algebraic framework that abandoned classical trajectories in favor of directly computing observable quantities like transition probabilities between energy levels. This approach, motivated by the need to align quantum postulates with empirical spectral data from atoms, marked a radical departure from classical mechanics by introducing matrices whose elements represented amplitudes for quantum jumps. In 1926, Erwin Schrödinger introduced wave mechanics as an alternative formulation, describing quantum systems via wave functions governed by a linear differential equation that emphasized continuous waves propagating in configuration space. Schrödinger's method drew inspiration from Louis de Broglie's hypothesis of matter waves and achieved success in solving the hydrogen atom problem, yielding quantized energy levels. Later that year, Max Born proposed the probabilistic interpretation, asserting that the square of the wave function's modulus represents the probability density of finding a particle in a given region. In 1926, Schrödinger and Carl Eckart independently proved the mathematical equivalence between wave mechanics and matrix mechanics, unifying the two approaches and solidifying the foundations of the theory. Experimental confirmations of wave-particle duality bolstered these theoretical advances. The Compton effect, initially observed in 1923 but further verified through scattering experiments in the late 1920s, demonstrated the particle-like behavior of photons, while the 1927 Davisson-Germer experiment revealed diffraction patterns consistent with de Broglie's predicted wavelengths, affirming the wave nature of matter. That same year, the fifth Solvay Conference in Brussels featured intense debates on quantum interpretation, with participants like Niels Bohr defending the Copenhagen view against critiques from Einstein and others. Heisenberg's 1927 paper introduced the uncertainty principle, quantifying the inherent limits on simultaneous knowledge of position and momentum. By 1930, Paul Dirac synthesized these developments in his seminal book The Principles of Quantum Mechanics, formalizing the theory using bra-ket notation in Hilbert space and incorporating relativity for electrons. In 1935, Einstein, Podolsky, and Rosen published their EPR paradox paper, arguing that quantum mechanics' predictions for entangled systems implied "spooky action at a distance" and thus incompleteness, sparking ongoing debates about the theory's foundations. Key milestones included Nobel Prizes in Physics awarded to Heisenberg in 1932 for the creation of quantum mechanics, and jointly to Schrödinger and Dirac in 1933 for their foundational contributions to the theory's discovery and development.

Mathematical Foundations

Hilbert Space and State Vectors

In quantum mechanics, the mathematical framework for describing physical states relies on the concept of a Hilbert space, which is a complete inner product space over the complex numbers that is separable, meaning it has a countable orthonormal basis. This structure ensures that limits of Cauchy sequences of vectors converge within the space, providing the necessary rigor for handling infinite-dimensional systems. Quantum states are represented as normalized vectors |\psi\rangle in such a space, often taken as the square-integrable functions L^2(\mathbb{R}^3) for particles in three dimensions, where the normalization condition is \langle \psi | \psi \rangle = 1. State vectors, or kets denoted |\psi\rangle, describe pure quantum states, but due to phase invariance, physically distinct states correspond to rays in the Hilbert space rather than individual vectors, as multiplying |\psi\rangle by a complex phase factor e^{i\theta} yields an equivalent state. For mixed states, which arise when the system is in an ensemble of pure states with classical probabilities, the density operator \rho = \sum_i p_i |\psi_i\rangle \langle \psi_i | is used, where p_i \geq 0, \sum_i p_i = 1, and \rho is Hermitian, positive semi-definite, and trace-normalized. Basis representations of states are crucial, with common choices including the position basis | \mathbf{r} \rangle and momentum basis | \mathbf{p} \rangle, where any state expands as |\psi\rangle = \int d^3 r \, \psi(\mathbf{r}) | \mathbf{r} \rangle or similarly for momentum. The Dirac notation facilitates computations, defining the inner product or overlap as \langle \phi | \psi \rangle, a complex scalar that measures the projection of one state onto another and satisfies linearity in the second argument and antilinearity in the first, with \langle \phi | \phi \rangle > 0 for non-zero states. Key properties of the inner product underpin quantum phenomena: orthogonality \langle \phi | \psi \rangle = 0 holds for distinct eigenstates of the same observable, ensuring that different measurement outcomes for that observable are mutually exclusive. Superposition allows states to form linear combinations |\psi\rangle = \sum_n c_n |n\rangle, where the |n\rangle are orthonormal basis states, and the probabilities of measuring eigenvalue n are given by |c_n|^2, ensuring the probabilistic interpretation. This infinite-dimensional formulation assumes familiarity with finite-dimensional linear algebra but is essential because finite-dimensional spaces cannot accommodate observables with continuous spectra, such as position or momentum, which require integrals over uncountably many eigenstates rather than discrete sums.
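In finite dimensions, the bra-ket algebra above reduces to ordinary linear algebra on column vectors. The following minimal Python/NumPy sketch (a two-dimensional toy standing in for the generally infinite-dimensional space) checks normalization, computes an overlap, applies the Born rule, and confirms that a global phase leaves all probabilities unchanged:

```python
import numpy as np

# Basis states |0> and |1> stand in for two eigenstates |n> of some observable.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

# Superposition |psi> = c0|0> + c1|1>, normalized so sum |c_n|^2 = 1.
c0, c1 = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = c0 * ket0 + c1 * ket1
assert np.isclose(np.vdot(psi, psi).real, 1.0)   # <psi|psi> = 1

# Inner product <phi|psi>: np.vdot conjugates its first argument,
# matching antilinearity in the bra.
phi = ket0
overlap = np.vdot(phi, psi)

# Born-rule probabilities |c_n|^2 for measuring each basis state.
probs = np.abs(psi) ** 2
print(probs)                                      # [0.5 0.5]

# A global phase e^{i theta} leaves all probabilities unchanged (same ray).
psi_phase = np.exp(1j * 0.7) * psi
assert np.allclose(np.abs(psi_phase) ** 2, probs)
```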

Operators and Observables

In quantum mechanics, physical observables such as position, momentum, and energy are represented by operators acting on the Hilbert space of state vectors. These operators, denoted as Â, are Hermitian, meaning Â† = Â, which guarantees that their eigenvalues are real numbers corresponding to the possible outcomes of measurements. The expectation value of an observable  in a normalized state |ψ⟩ is computed as ⟨Â⟩ = ⟨ψ|Â|ψ⟩, providing the average value one would obtain from many measurements on identically prepared systems. The spectral theorem for Hermitian operators underpins the structure of observables, asserting that  admits a decomposition into its eigenvalues and projectors. For a discrete spectrum,  has eigenvalues a_n with corresponding orthonormal eigenstates |a_n⟩ satisfying Â|a_n⟩ = a_n |a_n⟩, and these eigenstates form a complete basis for the Hilbert space, allowing any state to be expanded as |ψ⟩ = ∑ c_n |a_n⟩. In cases of continuous spectra, such as for the position operator, the decomposition involves integrals over a continuous range of eigenvalues, with eigenstates forming a rigged Hilbert space to handle the non-normalizability. Commutation relations between operators capture the compatibility of observables. The commutator of two operators  and B̂ is defined as [Â, B̂] = ÂB̂ - B̂Â. If [Â, B̂] = 0, the observables commute and can be simultaneously diagonalized in a common basis of eigenstates; otherwise, they are incompatible, meaning precise simultaneous measurements are impossible. A canonical example is the pair of position and momentum operators in one dimension. In the position representation, the position operator x̂ acts by multiplication: (x̂ ψ)(x) = x ψ(x), while the momentum operator is p̂ = -iℏ d/dx. These satisfy the fundamental commutation relation [x̂, p̂] = iℏ, which originates from the quantization of classical Poisson brackets and ensures the non-commutativity inherent to quantum mechanics. Unitary operators play a key role in describing symmetries of quantum systems, preserving the norm and inner products of states via U† U = U U† = I. For instance, spatial rotations are represented by unitary operators that commute with the Hamiltonian if the system is rotationally invariant, leading to conservation of angular momentum. The time evolution operator, which propagates states forward in time, is also unitary, ensuring probability conservation.
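To make the operator machinery concrete, here is a small NumPy sketch using the Pauli matrices as Hermitian observables on a two-level system; it verifies Hermiticity, reconstructs an operator from its spectral decomposition, evaluates an expectation value, and exhibits a non-vanishing commutator:

```python
import numpy as np

# Pauli matrices: Hermitian observables on a qubit (a two-level system).
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Hermiticity A^dagger = A guarantees real eigenvalues.
assert np.allclose(sx, sx.conj().T)
evals, evecs = np.linalg.eigh(sz)         # eigenvalues and eigenvectors
print(evals)                              # [-1.  1.]

# Spectral theorem: A = sum_n a_n |a_n><a_n|.
proj = sum(a * np.outer(v, v.conj()) for a, v in zip(evals, evecs.T))
assert np.allclose(proj, sz)

# Expectation value <A> = <psi|A|psi> in the state |+> = (|0>+|1>)/sqrt(2).
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
print(np.vdot(psi, sx @ psi).real)        # 1.0, since |+> is an sx eigenstate

# Non-commuting observables: [sx, sy] = 2i sz != 0, so sx and sy
# cannot be simultaneously diagonalized.
comm = sx @ sy - sy @ sx
assert np.allclose(comm, 2j * sz)
```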

Core Principles

Superposition and Interference

The superposition principle in quantum mechanics states that a quantum system can exist in a linear combination of multiple states simultaneously, as long as each state individually satisfies the governing equations of the theory. This arises from the linearity of the Schrödinger equation, allowing any superposition of its solutions to also be a solution. Formally, if {|n\rangle} represents a basis of eigenstates, the general state of the system is described by |\psi\rangle = \sum_n c_n |n\rangle, where the coefficients c_n are complex numbers satisfying \sum_n |c_n|^2 = 1 to ensure normalization, and the probability of measuring the system in state |n\rangle is |c_n|^2. This principle, first articulated in foundational treatments of quantum mechanics, fundamentally distinguishes quantum systems from classical ones, where states are definite and mutually exclusive. Interference effects emerge directly from the superposition principle due to the relative phases of the complex coefficients c_n, leading to constructive or destructive interference in the probability amplitudes. In the double-slit experiment, for instance, electrons or photons fired at a barrier with two slits produce an interference pattern on a detection screen, as if the particles propagate as waves of probability that overlap and interfere, even when sent one at a time. This wave-like behavior was experimentally confirmed for electrons in 1927, where the diffraction pattern matched the de Broglie wavelength predictions, demonstrating that matter exhibits interference characteristic of waves rather than classical particles following definite trajectories. The interference pattern arises from phase differences between paths through the two slits, resulting in regions of high and low detection probability that defy particle-only intuition. Environmental interactions can suppress these interference effects through a process known as decoherence, where coupling to an external environment causes the off-diagonal terms in the system's density matrix to decay rapidly, effectively localizing the superposition into classical-like mixtures without altering the underlying quantum evolution. This mechanism explains why macroscopic objects rarely exhibit observable superpositions, as the environment acts as an information sink, making interference unresolvable in practice. A classic demonstration of superposition in spin systems is the Stern-Gerlach experiment, where silver atoms in a spatially varying magnetic field separate into discrete beams corresponding to spin-up and spin-down states along the field direction; an atom prepared in a superposition splits between the two paths, illustrating the quantization of the spin components. The Aharonov-Bohm effect further illustrates phase-induced interference, where charged particles encircling a region of confined magnetic flux—without entering the field—experience a shift in the interference pattern due to the vector potential's influence on the wave function phase, even in field-free regions. Classically, particles and waves are distinct entities, but quantum mechanics bridges this duality, with particles behaving as localized excitations of underlying probability waves that interfere like classical waves yet localize upon detection. This incompatibility with classical physics is underscored by no-go theorems, such as the Kochen-Specker theorem, which proves that non-contextual hidden-variable theories—assigning definite values to observables independently of the measurement context—cannot reproduce quantum predictions for systems with Hilbert space dimension greater than two, ruling out deterministic classical underpinnings for superposition.
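The arithmetic behind interference is simply the addition of complex amplitudes before squaring. The sketch below uses hypothetical, equally weighted path amplitudes (not a full double-slit simulation) to contrast the quantum prediction with the classical sum of path probabilities:

```python
import numpy as np

# Two-path interference toy model: the detection probability is the squared
# modulus of the SUM of path amplitudes, not the sum of path probabilities.
def detection_probability(phase_difference):
    amp1 = 0.5                                   # amplitude via slit 1
    amp2 = 0.5 * np.exp(1j * phase_difference)   # amplitude via slit 2
    return abs(amp1 + amp2) ** 2                 # amplitudes interfere

for dphi in (0.0, np.pi / 2, np.pi):
    print(f"phase {dphi:.2f}: quantum P = {detection_probability(dphi):.3f}, "
          f"classical P = {abs(0.5) ** 2 + abs(0.5) ** 2:.3f}")

# The quantum probability swings between 1 (constructive) and 0 (destructive)
# as the relative phase varies, while the classical particle picture predicts
# a constant 0.5 -- the signature of interference fringes.
```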

Uncertainty Principle

The Heisenberg uncertainty principle quantifies the fundamental limit on the simultaneous knowledge of certain pairs of physical properties in quantum mechanics, arising from the non-commutativity of the corresponding operators. For position and momentum, it states that the product of their standard deviations in any quantum state satisfies \Delta x \Delta p \geq \frac{\hbar}{2}, where \hbar = h / 2\pi is the reduced Planck's constant and h is Planck's constant. This inequality was first proposed heuristically by Werner Heisenberg in 1927 and rigorously proven by Earle Hesse Kennard later that year using the canonical commutation relation [\hat{x}, \hat{p}] = i\hbar. The principle highlights that quantum systems cannot have definite values for both position and momentum simultaneously, reflecting the wave-like nature of particles. The general form of the uncertainty principle, derived by Howard Percy Robertson in 1929, applies to any pair of Hermitian operators \hat{A} and \hat{B} representing observables: \Delta A \Delta B \geq \frac{1}{2} \left| \left\langle [\hat{A}, \hat{B}] \right\rangle \right|, where \Delta A = \sqrt{\left\langle (\hat{A} - \langle \hat{A} \rangle)^2 \right\rangle} is the standard deviation, \langle \cdot \rangle denotes the expectation value in the state, and [\hat{A}, \hat{B}] = \hat{A}\hat{B} - \hat{B}\hat{A} is the commutator. For position and momentum, the commutator yields the minimal bound \frac{\hbar}{2}. This relation stems from the algebraic structure of quantum mechanics, where non-zero commutators enforce intrinsic uncertainties. The derivation proceeds from the positivity of norms in Hilbert space. Consider the deviations \Delta \hat{A} = \hat{A} - \langle \hat{A} \rangle and \Delta \hat{B} = \hat{B} - \langle \hat{B} \rangle. Applying the Cauchy-Schwarz inequality to the states \Delta \hat{A} |\psi\rangle and \Delta \hat{B} |\psi\rangle gives (\Delta A)^2 (\Delta B)^2 \geq \left| \left\langle \Delta \hat{A} \, \Delta \hat{B} \right\rangle \right|^2; splitting the product \Delta \hat{A} \, \Delta \hat{B} into its Hermitian anticommutator part and its anti-Hermitian commutator part, whose expectation values are respectively real and imaginary, and discarding the non-negative anticommutator contribution yields 4 (\Delta A)^2 (\Delta B)^2 \geq \left| \left\langle [\hat{A}, \hat{B}] \right\rangle \right|^2. This yields the Robertson relation directly from the commutation properties. The proof assumes a pure state but extends to mixed states via the Schrödinger-Robertson relation. A related form involves energy and time, \Delta E \Delta t \geq \frac{\hbar}{2}, where \Delta E is the uncertainty in energy and \Delta t represents either the lifetime of an unstable quantum state or the duration over which a measurement is performed. Unlike position-momentum, time is not an observable with a Hermitian operator, so this arises from the time-dependent Schrödinger equation and the rate of change of expectation values: the Mandelstam-Tamm form takes \Delta t = \Delta A / \left| \frac{d \langle \hat{A} \rangle}{dt} \right| for any observable \hat{A} evolving under a time-independent Hamiltonian. For unstable states, it manifests as the natural linewidth \Gamma of spectral lines, where \Delta E = \frac{\hbar \Gamma}{2} and \Delta t is the excited-state lifetime, as verified in atomic spectroscopy experiments measuring exponential decay rates. The uncertainty principle has profound implications for quantum systems, prohibiting classical-like trajectories with precise position and momentum at all times and enforcing delocalization in bound states.
It connects to wave-particle duality: a wave function \psi(x) localized to width \Delta x in position space spreads over \Delta p \sim \hbar / \Delta x in momentum space, as the Fourier transform of a narrow function is broad. This underlies the spreading of free wave packets and limits the resolution in quantum measurements, such as in Heisenberg's microscope thought experiment, where localizing a particle disturbs its momentum. Experimental verifications confirm the principle's predictions. In neutron interferometry during the 1980s, interference patterns from thermal neutrons passing through a silicon crystal interferometer demonstrated position-momentum uncertainties consistent with \Delta x \Delta p \geq \hbar / 2, with phase shifts induced by magnetic fields altering the momentum spread while localizing the beam path. Single-photon experiments, such as those using attenuated laser light in double-slit setups, show that localizing a photon's position at the slits increases its transverse momentum uncertainty, broadening the diffraction pattern on the detection screen in agreement with the inequality. These tests, along with Mössbauer spectroscopy for time-energy forms, rule out classical explanations and affirm the quantum limit.
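The position-momentum bound can also be checked numerically. The following sketch (with \hbar set to 1) discretizes a Gaussian wave packet, which saturates the inequality, and evaluates \Delta x and \Delta p on a grid:

```python
import numpy as np

# Numerical check of Delta-x * Delta-p >= hbar/2 for a Gaussian wave packet,
# which saturates the bound (a minimal sketch with hbar = 1 assumed).
hbar = 1.0
sigma = 0.8                                        # position-space width
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]

psi = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))
psi = psi / np.sqrt(np.sum(np.abs(psi) ** 2) * dx)   # enforce normalization

# <x> and Delta x from the probability density |psi|^2.
prob = np.abs(psi) ** 2
x_mean = np.sum(x * prob) * dx
dx_std = np.sqrt(np.sum((x - x_mean) ** 2 * prob) * dx)

# <p^2> = hbar^2 * integral |dpsi/dx|^2 dx (psi is real here, so <p> = 0).
dpsi = np.gradient(psi, dx)
p2_mean = hbar**2 * np.sum(np.abs(dpsi) ** 2) * dx
dp_std = np.sqrt(p2_mean)

print(dx_std * dp_std, hbar / 2)    # ~0.5 and 0.5: the bound is saturated
```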

Quantum Measurement and Dynamics

Measurement Postulate and Collapse

In quantum mechanics, the measurement postulate specifies the outcomes and probabilities associated with observing a physical quantity, or observable, represented by a Hermitian operator \hat{A}. The possible results of such a measurement are the eigenvalues a_n of \hat{A}, and the probability of obtaining a specific eigenvalue a_n is given by the Born rule: P(a_n) = |\langle n | \psi \rangle|^2, where |\psi\rangle denotes the pre-measurement state of the system and |n\rangle is the corresponding normalized eigenstate of \hat{A}. This probabilistic interpretation, introduced by Max Born, transforms the squared modulus of the wave function's projection onto the eigenbasis into a measure of likelihood, marking a departure from classical determinism. Upon obtaining the measurement outcome a_n, the quantum state undergoes an abrupt change known as wave function collapse, or the projection postulate, wherein the system projects onto the eigenstate |n\rangle (up to normalization). This process is irreversible and non-unitary, contrasting sharply with the continuous, unitary evolution governing the system between measurements. John von Neumann formalized this postulate in his axiomatic framework, emphasizing that measurement induces a discontinuous transition that selects one outcome from the superposition, thereby actualizing a definite state. The collapse ensures that subsequent measurements of the same observable yield the same eigenvalue with certainty, reflecting the system's new state post-interaction. Von Neumann further distinguished between idealized, or "detailed," measurements—where the apparatus precisely projects the state onto a single eigenstate—and "fuzzy" measurements, which involve partial information gain without full projection onto a pure eigenstate. This distinction highlights the role of the measurement apparatus in the process and underscores limitations in achieving perfect ideality in practice. In the context of the Einstein-Podolsky-Rosen (EPR) paradox, the projection postulate exacerbates tensions by implying that a measurement on one entangled particle instantaneously collapses the state of a distant counterpart, challenging locality while preserving the formalism's consistency. An alternative perspective on apparent collapse arises from decoherence theory, which attributes the loss of quantum coherence to unavoidable interactions with the environment rather than an intrinsic projection. In this view, the system becomes entangled with environmental degrees of freedom, and tracing over the latter yields a mixed state that mimics classical probabilities without invoking a true, objective collapse; however, decoherence does not fully resolve the measurement problem, as it explains the emergence of preferred bases but not the definitive selection of outcomes. Wojciech Zurek's early work established the pointer basis—stable states robust against decoherence—as key to this environmental selection mechanism. The measurement postulate and collapse give rise to the measurement problem, a foundational challenge questioning why and how the quantum-to-classical transition occurs amid unitary evolution. Eugene Wigner's "Wigner's friend" thought experiment illustrates this issue: a friend measures a superposition inside a lab, collapsing it from their perspective, yet the external observer (Wigner) views the entire setup—including the friend—as still superposed until their own measurement, raising paradoxes about the role and timing of collapse. This scenario amplifies the tension between subjective collapse and objective reality, fueling ongoing debates without a consensus resolution within the postulate itself.
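The postulate translates directly into a simulation recipe: project onto the eigenbasis, sample an outcome with the Born rule, then replace the state by the selected eigenstate. A minimal sketch (the helper name measure is ours, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)

def measure(psi, eigenstates):
    """Simulate measuring an observable with the given orthonormal eigenstates."""
    amps = np.array([np.vdot(e, psi) for e in eigenstates])  # <n|psi>
    probs = np.abs(amps) ** 2                                # Born rule P(a_n)
    n = rng.choice(len(eigenstates), p=probs)                # sample an outcome
    collapsed = eigenstates[n]                               # projection postulate
    return n, collapsed

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
psi = np.sqrt(0.2) * ket0 + np.sqrt(0.8) * ket1

outcomes = [measure(psi, [ket0, ket1])[0] for _ in range(10_000)]
print(np.bincount(outcomes) / 10_000)    # ~[0.2, 0.8], matching |c_n|^2

# Repeating the measurement on the collapsed state yields the same outcome
# with certainty, as the postulate requires.
n, post = measure(psi, [ket0, ket1])
assert measure(post, [ket0, ket1])[0] == n
```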

Time Evolution and the Schrödinger Equation

In quantum mechanics, the time evolution of a closed system's state is governed by the time-dependent Schrödinger equation, which describes how the wave function changes deterministically over time. This fundamental equation, postulated by Erwin Schrödinger in 1926, takes the form i \hbar \frac{\partial}{\partial t} |\psi(t)\rangle = \hat{H} |\psi(t)\rangle, where |\psi(t)\rangle is the state vector in Hilbert space, \hbar is the reduced Planck's constant, and \hat{H} is the Hamiltonian operator representing the total energy of the system. The equation ensures that the evolution is unitary, preserving the normalization of the wave function and thus the probabilities of measurement outcomes. The Hamiltonian \hat{H} typically comprises the kinetic energy operator \hat{T} and the potential energy operator \hat{V}, so \hat{H} = \hat{T} + \hat{V}. For many systems in non-relativistic quantum mechanics, \hat{T} = -\frac{\hbar^2}{2m} \nabla^2 in position representation, while \hat{V} depends on the specific potential. When \hat{H} is time-independent, the Schrödinger equation admits solutions separable into spatial and temporal parts: \psi(\mathbf{r}, t) = \psi_n(\mathbf{r}) e^{-i E_n t / \hbar}, where \psi_n(\mathbf{r}) are stationary states satisfying the time-independent Schrödinger equation \hat{H} \psi_n = E_n \psi_n, with E_n as the discrete energy eigenvalues. This separation reveals the quantized energy levels inherent to bound systems. The solution to the time-dependent Schrödinger equation can be expressed using the unitary time evolution operator U(t) = e^{-i \hat{H} t / \hbar}, which propagates the initial state as |\psi(t)\rangle = U(t) |\psi(0)\rangle. This operator is unitary (U^\dagger U = I) because \hat{H} is Hermitian, ensuring that inner products and norms remain invariant under evolution, which upholds the probabilistic interpretation of quantum mechanics. Unlike measurement processes, which cause abrupt collapse, this evolution is continuous and reversible for isolated systems. A key consequence of the Schrödinger equation is the Ehrenfest theorem, which connects quantum expectation values to classical equations of motion. It states that for any operator \hat{A}, \frac{d}{dt} \langle \hat{A} \rangle = \frac{i}{\hbar} \langle [\hat{H}, \hat{A}] \rangle + \left\langle \frac{\partial \hat{A}}{\partial t} \right\rangle, where \langle \cdot \rangle denotes the expectation value in the state |\psi(t)\rangle, and [\cdot, \cdot] is the commutator. For position \hat{x} and momentum \hat{p}, this yields m \frac{d}{dt} \langle \hat{x} \rangle = \langle \hat{p} \rangle and \frac{d}{dt} \langle \hat{p} \rangle = -\left\langle \frac{\partial V}{\partial x} \right\rangle, mirroring Newton's laws in the classical limit when wave packets are localized. Illustrative examples highlight these principles. For the quantum harmonic oscillator, with \hat{H} = \frac{\hat{p}^2}{2m} + \frac{1}{2} m \omega^2 \hat{x}^2, the energy eigenvalues are E_n = \hbar \omega \left( n + \frac{1}{2} \right) for n = 0, 1, 2, \dots, and the stationary states are Hermite-Gaussian wave functions, demonstrating equidistant spacing and zero-point energy. Similarly, the hydrogen atom's \hat{H} = -\frac{\hbar^2}{2m_e} \nabla^2 - \frac{e^2}{4\pi \epsilon_0 r} yields exact solutions with energy levels E_n = -\frac{13.6 \, \mathrm{eV}}{n^2} for principal quantum number n = 1, 2, \dots, and wave functions involving associated Laguerre polynomials and spherical harmonics, explaining the discrete atomic spectrum. These cases underscore how the Schrödinger equation resolves classical paradoxes like the radiative collapse of orbiting electrons and discrete atomic spectra through quantized dynamics.
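These statements can be verified numerically for the harmonic oscillator example above. The sketch below (with \hbar = m = \omega = 1 assumed) discretizes the Hamiltonian on a grid, approximately recovers the E_n = n + 1/2 spectrum, and builds the evolution operator from the spectral decomposition to confirm unitarity:

```python
import numpy as np

# Discretize H = p^2/2m + (1/2) m w^2 x^2 on a grid (hbar = m = w = 1).
hbar = m = w = 1.0
N = 300
x = np.linspace(-10, 10, N)
dx = x[1] - x[0]

# Kinetic energy via the standard three-point second-derivative stencil.
T = -(hbar**2 / (2 * m * dx**2)) * (
    np.diag(np.full(N - 1, 1.0), -1)
    - 2 * np.eye(N)
    + np.diag(np.full(N - 1, 1.0), 1)
)
V = np.diag(0.5 * m * w**2 * x**2)
H = T + V

E, W = np.linalg.eigh(H)
print(E[:4])    # ~[0.5, 1.5, 2.5, 3.5]: equidistant, with zero-point energy

# U(t) = exp(-i H t / hbar) from the spectral decomposition: each stationary
# state simply acquires the phase e^{-i E_n t / hbar}.
t = 0.5
U = W @ np.diag(np.exp(-1j * E * t / hbar)) @ W.conj().T
assert np.allclose(U.conj().T @ U, np.eye(N), atol=1e-8)   # unitarity
```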

Advanced Formulations

Path Integral Formulation

The path integral formulation provides an alternative to the operator formalism of quantum mechanics, expressing the evolution of quantum states as a sum over all possible paths weighted by a phase factor derived from the action. Developed by Richard P. Feynman, this approach originated in his 1942 PhD thesis, The Principle of Least Action in Quantum Mechanics, where he explored summing amplitudes over paths to resolve issues in relativistic electrodynamics, and was formally introduced in his 1948 paper. The formulation emphasizes the "sum-over-histories" interpretation, where quantum interference arises from contributions of paths near the classical trajectory in the stationary phase limit. The core of the formulation is the expression for the transition amplitude from an initial position x_i at time t_i to a final position x_f at time t_f: \langle x_f, t_f | x_i, t_i \rangle = \int \mathcal{D}x \, \exp\left( \frac{i}{\hbar} S \right), where the functional integral is taken over all paths x(t) satisfying the boundary conditions x(t_i) = x_i and x(t_f) = x_f, and S = \int_{t_i}^{t_f} L(x, \dot{x}, t) \, dt is the classical action functional with Lagrangian L. This amplitude squared gives the probability density for the transition, capturing quantum effects through the oscillatory nature of the integrand. The derivation proceeds by slicing time into N small intervals \epsilon = (t_f - t_i)/N, approximating the propagator as a product of short-time evolution operators via the Trotter formula: e^{-i (T + V) \epsilon / \hbar} \approx e^{-i T \epsilon / 2\hbar} e^{-i V \epsilon / \hbar} e^{-i T \epsilon / 2\hbar}, where T is the kinetic energy operator and V the potential energy operator. Integrating over intermediate positions yields the discretized path sum, which becomes the continuum functional integral as \epsilon \to 0 and N \to \infty. This limit is equivalent to the time-dependent Schrödinger equation, as verified by expanding the short-time propagator for small \epsilon and recovering the standard equation. A key advantage of the path integral is its intuitive depiction of quantum interference: paths with actions differing by multiples of 2\pi \hbar contribute coherently, while others cancel, naturally explaining wave-like behavior without explicit wave functions. It also lends itself to relativistic extensions, as a Lorentz-invariant action replaces the non-relativistic form seamlessly. Later reformulations by Freeman Dyson and Gian-Carlo Wick connected the path integral to operator methods in quantum field theory, enabling rigorous perturbative treatments via Wick contractions. Applications include perturbative expansions in quantum field theory, where the expansion generates Feynman diagrams as discretized worldline histories for scattering processes, simplifying calculations of higher-order corrections. Non-perturbatively, it describes quantum tunneling through barriers using instanton solutions—saddle points in the Euclidean action—providing exponential prefactors for transition rates in systems like the double-well potential.
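Time slicing is easiest to demonstrate in imaginary time (t \to -i\tau), where the oscillatory weight e^{iS/\hbar} becomes a positive one and each slice acts as a transfer matrix. The sketch below (a Euclidean variant of the construction above, with \hbar = m = \omega = 1 assumed) discretizes the short-time kernel for the harmonic oscillator on a grid; since composing N slices gives e^{-N\epsilon H}, the dominant eigenvalue of one slice is e^{-\epsilon E_0}, from which the ground-state energy \hbar\omega/2 is recovered:

```python
import numpy as np

# Euclidean short-time kernel for one time slice of width eps:
#   K(x', x; eps) = sqrt(m / (2 pi hbar eps)) *
#                   exp(-[m (x' - x)^2 / (2 eps) + eps V(x)] / hbar)
hbar = m = w = 1.0
eps = 0.05                               # imaginary-time slice width
x = np.linspace(-6, 6, 401)
dx = x[1] - x[0]

V = 0.5 * m * w**2 * x**2
K = np.sqrt(m / (2 * np.pi * hbar * eps)) * np.exp(
    -(m * (x[:, None] - x[None, :])**2 / (2 * eps) + eps * V[None, :]) / hbar
)
T_slice = K * dx        # dx implements the integral over intermediate positions

# Dominant eigenvalue of one slice is exp(-eps * E_0); extracting E_0
# recovers the harmonic oscillator ground-state energy hbar*w/2 = 0.5.
lam = np.max(np.linalg.eigvals(T_slice).real)
print(-np.log(lam) / eps)                # ~0.5
```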

Second Quantization for Many Particles

Second quantization provides a powerful framework for describing systems of many identical particles in quantum mechanics, extending the single-particle formalism to handle variable particle numbers and indistinguishability naturally. In this approach, quantum states are represented in terms of occupation numbers for single-particle modes, rather than explicit wavefunctions for each particle. The creation operator a^\dagger_k adds a particle to the mode labeled by k, while the annihilation operator a_k removes one, acting on basis states |n_k\rangle where n_k denotes the number of particles in that mode. For bosons, the commutation relations [a_k, a^\dagger_l] = \delta_{kl} and [a_k, a_l] = 0 hold, allowing arbitrary occupation numbers, whereas for fermions, the anticommutation relations \{a_k, a^\dagger_l\} = \delta_{kl} and \{a_k, a_l\} = 0 enforce the Pauli exclusion principle by restricting n_k to 0 or 1. The underlying state space in second quantization is the Fock space, which is the direct sum over all possible particle numbers n of the n-particle Hilbert spaces, accommodating systems where the total number of particles is not fixed. For identical particles, the states in the n-particle sector are symmetrized for bosons or antisymmetrized for fermions to ensure indistinguishability; for fermions, this is often achieved through Slater determinants constructed from single-particle orbitals. This structure resolves the overcounting issues in first quantization, where permuting identical particles would yield equivalent states, by building the correct symmetry directly into the operator algebra. Fock space extends the standard Hilbert space to include the vacuum state |0\rangle (with no particles) and allows for processes like particle creation and annihilation, which are essential for phenomena involving variable particle numbers. A key application of second quantization is in formulating the many-body Hamiltonian for interacting particles. In a non-interacting basis of single-particle energies \varepsilon_k, the free term is \hat{H}_0 = \sum_k \varepsilon_k a^\dagger_k a_k, where a^\dagger_k a_k counts the occupation number of mode k. Interactions are incorporated via two-body potentials as \hat{H}_\text{int} = \frac{1}{2} \sum_{k,l,m,n} V_{klmn} a^\dagger_k a^\dagger_l a_m a_n, with V_{klmn} the matrix elements of the interaction in the chosen basis; this form ensures proper antisymmetrization for fermions or symmetrization for bosons through the operator ordering. The full Hamiltonian \hat{H} = \hat{H}_0 + \hat{H}_\text{int} governs the dynamics of the system in Fock space, enabling calculations of ground states and excitations in condensed matter and atomic physics. For identical particles, second quantization inherently captures their quantum statistics. Fermions obey the Pauli exclusion principle due to anticommutation, prohibiting multiple occupancy of the same mode and leading to phenomena like shell filling in atoms. In contrast, bosons can condense into the lowest-energy mode, as exemplified by Bose-Einstein condensation, where a macroscopic number of particles occupy the ground state at low temperatures, described by a Fock state with large n_0 in the occupation-number basis. This condensation is observable in dilute gases of alkali atoms and underlies superfluidity in liquid helium. The transition from first to second quantization involves promoting single-particle wavefunctions to field operators that create or annihilate particles at specific positions. In first quantization, the many-particle wavefunction \Psi(\mathbf{r}_1, \dots, \mathbf{r}_N) describes fixed N particles, but in second quantization, the field operator \psi^\dagger(\mathbf{r}) creates a particle at position \mathbf{r}, with \psi(\mathbf{r}) annihilating one, satisfying [\psi(\mathbf{r}), \psi^\dagger(\mathbf{r}')] = \delta(\mathbf{r} - \mathbf{r}') for bosons or the anticommuting analogue for fermions. These operators expand in a single-particle basis as \psi(\mathbf{r}) = \sum_k \phi_k(\mathbf{r}) a_k, bridging the coordinate representation to the occupation-number basis and facilitating the treatment of translationally invariant systems.
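The ladder-operator algebra can be realized explicitly with truncated matrices. This sketch builds a single bosonic mode with an occupation cutoff n_max (the truncation being the only approximation) and checks the commutator, the number operator, and the free Hamiltonian spectrum:

```python
import numpy as np

# Single bosonic mode truncated at n_max occupation.
n_max = 20
n = np.arange(n_max)
a = np.diag(np.sqrt(n[1:]), k=1)        # annihilation: a|n> = sqrt(n)|n-1>
adag = a.conj().T                       # creation: a^dag|n> = sqrt(n+1)|n+1>

# Canonical commutator [a, a^dag] = 1, exact away from the truncated top state.
comm = a @ adag - adag @ a
assert np.allclose(comm[:-1, :-1], np.eye(n_max - 1))

# Number operator a^dag a counts the occupation of the mode.
num = adag @ a
assert np.allclose(np.diag(num), n)

# Free Hamiltonian H0 = eps_k * a^dag a has eigenvalues eps_k * n, so a
# bosonic mode can hold any occupation number (eps_k is an arbitrary energy).
eps_k = 2.0
H0 = eps_k * num
print(np.diag(H0)[:5].real)             # [0. 2. 4. 6. 8.]
```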

Interpretations and Philosophical Aspects

Copenhagen Interpretation

The Copenhagen interpretation emerged in the late 1920s through the collaborative efforts of Niels Bohr and Werner Heisenberg at Bohr's Institute for Theoretical Physics in Copenhagen, providing the first comprehensive philosophical framework for quantum mechanics by emphasizing the limitations of classical concepts in describing atomic phenomena. This approach, often termed the "Copenhagen spirit," rejected realist interpretations of the quantum world in favor of a pragmatic view where physical theory serves to predict observable outcomes rather than depict hidden realities. Central to its tenets is the treatment of the wave function as a symbolic tool encoding probabilities about future measurements, rather than an objective description of physical states, aligning with Max Born's statistical interpretation of quantum amplitudes. A key element is Bohr's principle of complementarity, articulated in his September 1927 lecture at the Como conference commemorating Alessandro Volta, which asserts that mutually exclusive experimental arrangements—such as those revealing wave-like interference or particle-like localization—offer complementary aspects of quantum phenomena that together provide a complete description, but cannot be observed simultaneously due to the apparatus's role in the setup. This principle underscores the context-dependent nature of quantum descriptions, where wave and particle views are not contradictory but complementary limits of the theory. Complementarity extends to the foundational uncertainty principle, interpreting it not as an epistemic limitation but as an intrinsic feature of quantum interactions that precludes simultaneous precise knowledge of complementary variables like position and momentum. The observer effect further defines the interpretation: any measurement entails an irreversible interaction between the quantum system and a classical apparatus, causing the wave function to "collapse" to a definite outcome and demarcating the quantum-classical boundary, without invoking underlying deterministic mechanisms or hidden variables. Bohr integrated these ideas with his earlier correspondence principle, formalized in works from 1918 to 1923, which demands that quantum predictions asymptotically approach classical results in the limit of high quantum numbers or large scales, ensuring continuity between the new quantum theory and established physics while guiding the theory's development. This philosophical stance was prominently debated at the October 1927 Solvay Conference on electrons and photons, where proponents like Bohr and Heisenberg defended the interpretation against skeptics, solidifying its dominance in the physics community. Werner Heisenberg later reflected on these developments in his 1958 book Physics and Philosophy: The Revolution in Modern Science, coining the term "Copenhagen interpretation" and elaborating its implications for epistemology and causality in quantum dynamics. Criticisms of the interpretation arose early, notably from Albert Einstein, who at the 1927 Solvay Conference and in subsequent debates argued that quantum mechanics' probabilistic predictions indicated an incomplete theory, famously remarking in a 1926 letter to Max Born that "God does not play dice with the universe" to express his aversion to inherent indeterminism. Einstein's 1935 paper with Boris Podolsky and Nathan Rosen sharpened this critique, contending that the theory's allowance for "spooky action at a distance" through entangled states violated locality and realism, implying the need for hidden variables to restore completeness—a view the Copenhagen framework explicitly rejects by prioritizing observable phenomena over unmeasurable elements. Despite such objections, the interpretation's lack of a rigorous, formal definition for the measurement process and the precise conditions for collapse has persisted as an internal point of contention, though it remains the orthodox lens for applying quantum mechanics.

Many-Worlds Interpretation

The many-worlds interpretation (MWI), originally formulated as the relative-state interpretation by Hugh Everett III in his 1957 doctoral thesis, posits that the universal wave function of the entire universe evolves deterministically and unitarily according to the Schrödinger equation, without any collapse upon measurement. In this framework, quantum superpositions persist indefinitely, and what appears as a single outcome in a measurement is instead the observer becoming entangled with the quantum system, leading to a branching of the universal state into multiple, non-interacting components or "worlds," each corresponding to a definite outcome relative to the observer's state. This entanglement with the environment causes decoherence, effectively isolating these branches and giving the illusion of classical outcomes while preserving the unitary evolution across all branches. Everett's thesis emphasized that there is no need for a special postulate governing measurement, as the dynamical laws suffice for all processes, including those involving observers; the preferred basis for branching emerges naturally from interactions with the environment rather than being imposed a priori. The interpretation gained prominence through Bryce DeWitt's efforts in the 1970s, who popularized it by coining the term "many-worlds" and editing a collection of key papers that highlighted its implications for resolving foundational issues in quantum mechanics. An important extension came with Robert B. Griffiths' development of the consistent histories approach in 1984, which applies Everettian ideas to assign probabilities to sequences of events (histories) in a way that remains consistent with the unitary evolution, providing a framework for reasoning about quantum predictions without collapse. One key advantage of the MWI is that it resolves the measurement problem by treating measurement as an ordinary physical process fully governed by unitary dynamics, eliminating the need for a non-unitary collapse mechanism. It also maintains that interference effects between branches could in principle be observed under controlled conditions that reverse decoherence, aligning with the full predictive content of quantum mechanics. Critics have argued that the MWI incurs significant ontological extravagance by positing an immense proliferation of parallel worlds to account for all possible outcomes, raising questions about parsimony in physical theory. Additionally, deriving the Born rule for probabilities within a purely unitary framework has been contentious, though proponents have addressed this through decision-theoretic arguments, such as those formalized by David Wallace, which show that rational agents in branching worlds would effectively follow Born-rule probabilities to maximize utility.

Applications and Modern Extensions

Quantum Information Theory

Quantum information theory applies quantum mechanics to the processing, storage, and transmission of information, leveraging phenomena such as superposition and entanglement to perform tasks impossible with classical systems. This field emerged prominently in the 1980s, with foundational ideas proposed by Richard Feynman on simulating quantum systems using quantum computers. Key concepts include the qubit, the basic unit of quantum information, which extends the classical bit by allowing superpositions of states. A qubit is represented by a two-level quantum system in a superposition state |\psi\rangle = \alpha |0\rangle + \beta |1\rangle, where \alpha and \beta are complex amplitudes satisfying |\alpha|^2 + |\beta|^2 = 1, enabling probabilistic outcomes upon measurement. Pure qubit states can be visualized on the Bloch sphere, a unit sphere in three-dimensional real space where the north pole corresponds to |0\rangle, the south pole to |1\rangle, and equatorial points to equal superpositions; the position on the sphere encodes the expectation values of the Pauli operators. This geometric representation, adapted from magnetic resonance contexts, facilitates understanding of state evolution under unitary operations. Quantum gates manipulate qubits via unitary transformations, analogous to classical logic gates but operating on superpositions to produce entanglement. The Hadamard gate H creates superposition from a basis state, transforming |0\rangle to \frac{1}{\sqrt{2}} (|0\rangle + |1\rangle) and enabling balanced probabilistic outcomes. The controlled-NOT (CNOT) gate entangles two qubits by flipping the target qubit if the control is |1\rangle, producing states like the Bell state \frac{1}{\sqrt{2}} (|00\rangle + |11\rangle) from the product state |+\rangle|0\rangle. A universal gate set, such as {H, T, CNOT}, where T applies a \pi/4 phase shift, can approximate any multi-qubit unitary to arbitrary precision, forming the basis for quantum circuits. The no-cloning theorem demonstrates a fundamental limit: it is impossible to create an identical copy of an arbitrary unknown quantum state, due to the linearity of quantum evolution. If cloning were possible, applying a unitary to |\psi\rangle |0\rangle to yield |\psi\rangle |\psi\rangle would contradict the distinct evolutions of non-orthogonal states like |0\rangle and |+\rangle, as linearity preserves only linear combinations. This theorem underpins quantum cryptographic security, preventing perfect eavesdropping on unknown qubits. Bell inequalities quantify correlations in entangled states, testing local hidden variable theories against quantum predictions. The Clauser-Horne-Shimony-Holt (CHSH) inequality states that for local realistic models, the correlation parameter S = E(a,b) - E(a,b') + E(a',b) + E(a',b') satisfies |S| \leq 2, where E are expectation values of joint measurements. Quantum mechanics allows |S| \leq 2\sqrt{2} \approx 2.828, as achieved with entangled photons. Alain Aspect's 1982 experiments violated the CHSH inequality by over 5 standard deviations using time-varying analyzers on entangled photons separated by 12 meters, addressing the locality loophole. Loophole-free violations followed in 2015 with electron spins separated by 1.3 km, achieving S = 2.42 \pm 0.20. By 2022, multiple experiments, including those with superconducting circuits and high-dimensional photons, confirmed violations without detection or locality loopholes, solidifying quantum nonlocality. Recent experimental progress includes Google's Willow quantum processor, announced in December 2024, which features 105 qubits and demonstrates quantum error correction with logical error rates that fall as the code is scaled up, a key step toward fault-tolerant computation.
Quantum key distribution exploits these principles for secure communication, with the BB84 protocol enabling two parties to share a secret key using polarized photons. Alice sends qubits in one of two bases (rectilinear or diagonal), chosen randomly, encoding bits as |0\rangle, |1\rangle or |+\rangle, |-\rangle; Bob measures in a random basis, discarding mismatches after public basis comparison. Security arises from the no-cloning theorem and the uncertainty principle: eavesdropping disturbs the states, detectable via error rates of about 25% under intercept-resend attacks, ensuring information-theoretic security against quantum adversaries.
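The gate and entanglement machinery above fits in a few lines of linear algebra. This sketch prepares the Bell state with a Hadamard and a CNOT, then evaluates the CHSH parameter S at measurement angles that maximize the quantum value, reproducing 2\sqrt{2}:

```python
import numpy as np

# Prepare the Bell state (|00> + |11>)/sqrt(2) with a Hadamard and a CNOT.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket00 = np.zeros(4, dtype=complex); ket00[0] = 1
bell = CNOT @ np.kron(H, np.eye(2)) @ ket00

def spin_obs(theta):
    """Spin observable cos(theta) Z + sin(theta) X, with eigenvalues +/- 1."""
    Z = np.diag([1.0, -1.0])
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    return np.cos(theta) * Z + np.sin(theta) * X

def E(theta_a, theta_b):
    """Correlation <A(a) (x) B(b)> in the Bell state."""
    AB = np.kron(spin_obs(theta_a), spin_obs(theta_b))
    return np.vdot(bell, AB @ bell).real

# Angles maximizing the quantum CHSH value for this sign convention.
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(S)    # ~2.828 = 2*sqrt(2) > 2: violates the classical CHSH bound
```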

Relativistic Quantum Mechanics and Quantum Field Theory

Relativistic quantum mechanics emerged in the late 1920s to reconcile quantum mechanics with Einstein's special relativity, addressing the limitations of the non-relativistic Schrödinger equation for high-speed particles. The first attempt was the Klein-Gordon equation, proposed independently by Oskar Klein and Walter Gordon in 1926, which describes relativistic spin-0 particles but suffers from issues like negative probability densities in its single-particle interpretation. These problems, including negative-energy solutions without clear physical meaning, highlighted the need for a more robust framework, ultimately resolved by interpreting the equation within quantum field theory (QFT), where particles are excitations of underlying fields. For spin-1/2 particles like electrons, Paul Dirac formulated a relativistic wave equation in 1928 that incorporates both quantum mechanics and special relativity while naturally accounting for spin. The Dirac equation is given by i \hbar \frac{\partial \psi}{\partial t} = c \boldsymbol{\alpha} \cdot \mathbf{p} \psi + \beta m c^2 \psi, where \psi is a four-component spinor, \boldsymbol{\alpha} and \beta are 4x4 matrices, \mathbf{p} is the momentum operator, m is the particle mass, c is the speed of light, and \hbar is the reduced Planck's constant. This equation predicts the existence of antimatter, specifically the positron as the antiparticle of the electron, which was experimentally confirmed by Carl Anderson in 1932 through cosmic-ray observations. Despite its successes, the Dirac equation also features negative energy states, leading to further challenges that QFT addresses by treating particles and antiparticles as creations and annihilations in a field. Quantum field theory represents the culmination of modern quantum mechanics by quantizing relativistic fields, where fields such as the scalar field \phi(x) or the Dirac field are promoted to operator-valued distributions that create and annihilate particles at points x. This approach resolves the interpretive issues of single-particle relativistic equations by viewing particles as excitations of these fields, enabling a consistent description of multi-particle processes including pair creation and annihilation. In QFT, interactions are handled via the S-matrix, which computes amplitudes and probabilities for particle transitions, providing a perturbative framework for relativistic scattering. A landmark development was quantum electrodynamics (QED), the QFT for electromagnetic interactions, independently advanced in the 1940s by Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga, who shared the 1965 Nobel Prize in Physics for their fundamental work. Their formulations, unified by Freeman Dyson, introduced renormalization techniques to eliminate infinities arising in perturbative calculations, allowing precise predictions such as the anomalous magnetic moment of the electron, verified to high accuracy. Building on this, the Standard Model of particle physics, formulated in the 1970s as a gauge QFT incorporating the strong, weak, and electromagnetic forces, unifies these interactions among quarks, leptons, and gauge bosons. The model's Higgs mechanism, proposed by Peter Higgs and others in 1964, explains particle masses via interactions with the Higgs field, with the Higgs boson discovered at CERN's Large Hadron Collider in 2012, confirming a key prediction and completing the Standard Model's particle content. Despite these triumphs, QFT faces ongoing challenges, particularly in reconciling it with general relativity to form a theory of quantum gravity, where non-renormalizable divergences and the lack of a complete ultraviolet fixed point hinder progress. Efforts like string theory and loop quantum gravity aim to address these issues, but no fully consistent framework has yet been experimentally verified. Path integrals, originally developed by Feynman, play a central role in QFT formulations but require further refinement for gravitational contexts.
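The 4x4 matrix structure quoted above for the Dirac equation can be verified directly: the \boldsymbol{\alpha} and \beta matrices must anticommute appropriately and square to the identity so that iterating the equation reproduces the relativistic relation E^2 = (pc)^2 + (mc^2)^2. A short sketch in the standard Dirac representation:

```python
import numpy as np

# Standard Dirac representation: alpha_i = [[0, sigma_i], [sigma_i, 0]],
# beta = diag(I, -I), built from the 2x2 Pauli matrices.
I2 = np.eye(2, dtype=complex)
sig = [np.array([[0, 1], [1, 0]], dtype=complex),
       np.array([[0, -1j], [1j, 0]], dtype=complex),
       np.array([[1, 0], [0, -1]], dtype=complex)]

zero2 = np.zeros((2, 2), dtype=complex)
alpha = [np.block([[zero2, s], [s, zero2]]) for s in sig]
beta = np.block([[I2, zero2], [zero2, -I2]])

acomm = lambda A, B: A @ B + B @ A

# {alpha_i, alpha_j} = 2 delta_ij I, {alpha_i, beta} = 0, beta^2 = I:
# exactly the relations needed for the equation to square to
# E^2 = (pc)^2 + (mc^2)^2.
for i in range(3):
    for j in range(3):
        assert np.allclose(acomm(alpha[i], alpha[j]), 2 * (i == j) * np.eye(4))
    assert np.allclose(acomm(alpha[i], beta), np.zeros((4, 4)))
assert np.allclose(beta @ beta, np.eye(4))
print("Dirac algebra verified")
```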
As of 2025, new theoretical proposals include a gauge theory of gravity by Mikko Partanen and Jukka Tulkki at Aalto University, designed to be compatible with the Standard Model, potentially resolving singularities in black holes and the Big Bang. Additionally, recent research demonstrates that classical theories of gravity can produce entanglement through virtual matter exchange, indicating that observing entanglement in gravitational experiments may not conclusively prove quantum gravitational effects.

    ... Standard Model to supersymmetry ... The existence of this mass-giving field was confirmed in 2012, when the Higgs boson particle was discovered at CERN.What's so special about the... · How does the Higgs boson...