
Quantum information science

Quantum information science (QIS) is an interdisciplinary field that merges quantum mechanics with information and computation theory to process, store, and transmit information using quantum systems, such as particles or photons in states of superposition and entanglement, enabling transformative technologies in computing, communication, and sensing that surpass classical limitations. At its core, QIS relies on foundational quantum concepts: the qubit, the basic unit of quantum information, analogous to a classical bit but capable of existing in multiple states simultaneously due to superposition; entanglement, where qubits become interconnected such that the state of one instantly influences another regardless of distance; and properties like coherence and squeezing that allow for enhanced precision and security in information handling. These principles differ fundamentally from classical information processing, where bits are strictly 0 or 1, by exploiting quantum parallelism to perform complex calculations exponentially faster for certain problems, such as simulating molecular interactions or optimizing large datasets.

Key applications of QIS span multiple domains. In quantum computing, it promises to solve intractable problems in cryptography, chemistry, and optimization by leveraging algorithms that harness entanglement for massive parallelism. Quantum communication employs techniques like quantum key distribution to enable encryption whose security is guaranteed by physical law, protecting data against eavesdropping through the measurement-disturbance and no-cloning properties of quantum states. Quantum sensing and metrology utilize entangled particles for ultra-precise measurements, advancing fields like timekeeping, navigation, and medical imaging with sensitivities far beyond classical sensors. Additionally, quantum networking aims to connect quantum devices globally, forming the basis for a quantum internet that integrates with classical infrastructure.

The field has accelerated since the 1990s, building on foundational work on quantum algorithms and error correction. Policy milestones include the U.S. National Quantum Initiative Act of 2018, which coordinates federal investments to advance QIS research and commercialization, and ongoing international efforts such as the United Nations declaring 2025 the International Year of Quantum Science and Technology. Institutions like NIST and the Department of Energy have led standards development and prototypes, such as quantum logic gates and atomic clocks, while international efforts focus on scaling qubits and mitigating decoherence challenges. Despite hurdles like qubit instability, QIS holds potential to revolutionize industries, bolster national security, and drive economic growth through innovations estimated to contribute up to $2 trillion in value to key industries by 2035.

Foundations of Quantum Information

Quantum Bits and States

A qubit, or quantum bit, serves as the fundamental unit of quantum information in quantum information science, defined as a two-level quantum mechanical system that can exist in states analogous to the classical bit's 0 and 1, but with the capacity for quantum superpositions. Unlike a classical bit, which is strictly in one of two mutually exclusive states, a qubit's state is described within a two-dimensional complex Hilbert space, enabling richer informational properties. The mathematical representation of a qubit state employs Dirac notation, where a general pure state is expressed as |\psi\rangle = \alpha |0\rangle + \beta |1\rangle, with \alpha, \beta \in \mathbb{C} satisfying the normalization condition |\alpha|^2 + |\beta|^2 = 1. This vector in Hilbert space can be geometrically visualized using the Bloch sphere, a unit sphere in \mathbb{R}^3 where the north and south poles represent the basis states |0\rangle and |1\rangle, respectively, and equatorial points denote equal superpositions. The Bloch vector's coordinates, derived from the expectation values of the Pauli operators, fully parameterize pure states, providing an intuitive tool for state manipulation and analysis.

Qubits generalize to qudits, d-dimensional quantum systems that extend the two-level structure to higher dimensions, offering increased information capacity per unit for certain tasks in quantum information processing. For systems comprising multiple qubits, the total state resides in the tensor product of individual Hilbert spaces; for n qubits, this yields a 2^n-dimensional space, allowing composite states to be formed as |\psi\rangle \otimes |\phi\rangle for separable cases.

Physical realizations of qubits draw from diverse quantum systems to encode these two-level states. Spin-1/2 particles, such as electrons, implement qubits via their spin states aligned with an external magnetic field, where |0\rangle and |1\rangle correspond to spin-up and spin-down projections. Photonic qubits utilize polarization, with horizontal and vertical polarizations serving as the computational basis, enabling low-loss transmission in optical quantum networks. Superconducting circuits realize qubits through nonlinear elements like Josephson junctions, which create anharmonic energy ladders approximating two-level systems at microwave frequencies for scalable on-chip integration.
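
The state-vector description above translates directly into a few lines of linear algebra. The following sketch (illustrative only; the chosen angles and variable names are not from this article) represents a pure qubit in NumPy, checks the normalization condition, computes Bloch-sphere coordinates from Pauli expectation values, and forms a two-qubit product state with the Kronecker product.

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Example pure state: alpha = cos(theta/2), beta = e^{i phi} sin(theta/2)
theta, phi = 1.0, 0.5
psi = np.array([np.cos(theta / 2), np.exp(1j * phi) * np.sin(theta / 2)])

# Normalization condition |alpha|^2 + |beta|^2 = 1
assert np.isclose(np.vdot(psi, psi).real, 1.0)

# Bloch vector components <sigma_i> = <psi| sigma_i |psi>
bloch = [np.vdot(psi, P @ psi).real for P in (X, Y, Z)]
print("Bloch vector:", np.round(bloch, 4))  # lies on the unit sphere for pure states

# Two-qubit composite state via the tensor (Kronecker) product: dimension 2^2 = 4
phi0 = np.array([1.0, 0.0])          # |0>
product_state = np.kron(psi, phi0)   # |psi> tensor |0>, a separable state
print("Composite dimension:", product_state.shape[0])
```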

Superposition and Entanglement

In quantum information science, the superposition principle allows a quantum system, such as a qubit, to exist in a linear combination of multiple states simultaneously, enabling phenomena like interference that have no classical analog. This arises from the linearity of the Schrödinger equation, which governs the evolution of quantum states. For instance, a qubit state can be expressed as |\psi\rangle = \alpha |0\rangle + \beta |1\rangle, where \alpha and \beta are complex amplitudes satisfying |\alpha|^2 + |\beta|^2 = 1, and the probabilities of measurement outcomes are given by |\alpha|^2 and |\beta|^2. Interference effects manifest when superposed states combine, as illustrated by the quantum double-slit experiment, where single particles like electrons produce an interference pattern on a screen, indicating each particle explores multiple paths in superposition before detection.

Entanglement represents a stronger form of quantum correlation, where the joint state of a composite system cannot be decomposed into a product of individual states, even if the subsystems are spatially separated. Formally, for a bipartite system, a pure state |\Psi\rangle is entangled if it cannot be written as |\Psi\rangle = |\phi_A\rangle \otimes |\phi_B\rangle, leading to non-local correlations that violate classical intuitions. This concept was highlighted in the Einstein-Podolsky-Rosen (EPR) paradox, which questioned the completeness of quantum mechanics by arguing that measuring one particle instantaneously determines the state of a distant partner, implying "spooky action at a distance." John Bell's theorem formalized this by showing that local hidden variable theories, which assume predetermined outcomes independent of distant measurements, must satisfy specific inequalities for correlation functions. A prominent form is the Clauser-Horne-Shimony-Holt (CHSH) inequality: \left| \langle AB \rangle + \langle AB' \rangle + \langle A' B \rangle - \langle A' B' \rangle \right| \leq 2, where A, A' and B, B' are measurement settings (observables with outcomes \pm 1) on subsystems A and B, respectively; quantum mechanics predicts violations up to 2\sqrt{2} for maximally entangled states.

To quantify entanglement in bipartite systems, measures such as concurrence and entanglement entropy are employed. Concurrence, for two qubits in a pure state |\psi\rangle, is defined as C(|\psi\rangle) = |\langle \psi | \tilde{\psi} \rangle|, where |\tilde{\psi}\rangle = \sigma_y \otimes \sigma_y |\psi^*\rangle is the spin-flipped state, providing a value between 0 (separable) and 1 (maximally entangled); for mixed states, it involves an optimization over decompositions. Entanglement entropy, based on the von Neumann entropy of the reduced density matrix \rho_A = \mathrm{Tr}_B (|\Psi\rangle\langle\Psi|), is S(\rho_A) = -\mathrm{Tr}(\rho_A \log_2 \rho_A), which quantifies the entanglement as the uncertainty in subsystem A alone, reaching 1 ebit for a maximally entangled qubit pair. These metrics highlight entanglement's role as a resource distinct from classical correlations.

Experimental confirmation of these non-classical features came through violations of Bell inequalities using entangled photon pairs. In 1982, Alain Aspect and collaborators demonstrated a clear violation of the CHSH inequality by over 5 standard deviations in an experiment with time-varying polarization analyzers on photons separated by 12 meters, addressing the locality loophole and aligning with quantum predictions.
This landmark result underscored the irreconcilability of quantum mechanics with local realism, paving the way for entanglement's central role in quantum information protocols.
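
The CHSH violation can be reproduced numerically. The sketch below (an illustrative calculation, not an experiment; the specific measurement settings are a standard textbook choice assumed here) evaluates the CHSH combination for the maximally entangled state (|00⟩ + |11⟩)/√2 and recovers the quantum value 2√2.

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

# Maximally entangled state |Phi+> = (|00> + |11>)/sqrt(2)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def corr(A, B, state):
    """Correlation E = <state| A tensor B |state> for +/-1-valued observables."""
    return np.vdot(state, np.kron(A, B) @ state).real

A, Ap = Z, X                                          # Alice's two settings
B, Bp = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)    # Bob's two settings

S = corr(A, B, phi_plus) + corr(A, Bp, phi_plus) \
    + corr(Ap, B, phi_plus) - corr(Ap, Bp, phi_plus)
print(S, 2 * np.sqrt(2))   # both approximately 2.828, violating |S| <= 2
```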

Quantum Measurement and Decoherence

In quantum information science, the measurement process extracts classical information from a quantum system, fundamentally altering its state according to the measurement postulate. Upon measurement of an observable with eigenstates |i\rangle, the system in state |\psi\rangle projects onto one of the eigenstates, with the probability of outcome i given by the Born rule: P(i) = |\langle i | \psi \rangle|^2. This probabilistic outcome arises from the inner product in Hilbert space, ensuring that repeated measurements on identical preparations yield statistics matching the squared amplitudes. The projection leads to the collapse of the wavefunction, where the superposition prior to measurement—such as a coherent combination of states—reduces to a single definite state post-measurement. This phenomenon raises the measurement problem, questioning why and how collapse occurs, as the Schrödinger equation predicts unitary evolution without such abrupt changes. The Copenhagen interpretation, developed by Niels Bohr and Werner Heisenberg, posits that measurement involves an irreversible interaction with a classical apparatus, resolving the problem by distinguishing the quantum system from the macroscopic measuring device, though it leaves the boundary between quantum and classical ambiguous.

Decoherence provides a dynamical explanation for the apparent collapse without invoking special rules, arising from the unavoidable coupling of a quantum system to its environment. Starting from a pure state \rho = |\psi\rangle\langle\psi|, environmental interactions suppress the off-diagonal elements of the density matrix, transforming the system into a mixed state that mimics classical probabilities. This loss of coherence occurs rapidly, as the environment entangles with the system, leading to the effective selection of pointer states robust against environmental monitoring. The evolution of the density matrix under open-system dynamics is described by the Lindblad master equation, the most general form for Markovian processes preserving complete positivity and trace: \frac{d\rho}{dt} = -i[H, \rho] + \sum_k \left( L_k \rho L_k^\dagger - \frac{1}{2} \{ L_k^\dagger L_k, \rho \} \right), where H is the Hamiltonian and \{L_k\} are Lindblad operators representing environmental jumps. In quantum information contexts, decoherence limits the coherence time of qubits, necessitating isolation techniques to maintain superpositions for processing.

In real systems, decoherence timescales are extraordinarily brief for macroscopic objects due to strong environmental coupling; for instance, a 1 g object at room temperature with a 1 cm spatial superposition decoheres in approximately 10^{-23} s, vastly shorter than thermal relaxation times. Such rapid rates underscore why quantum superpositions are observable only in carefully controlled microscopic settings, like trapped ions or superconducting circuits, while macroscopic systems appear classically definite.
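
The Lindblad master equation above can be integrated numerically for a simple dephasing model. The sketch below (a minimal illustration assuming a single Lindblad operator L = sqrt(gamma) Z and a trivial Hamiltonian, choices made here for clarity rather than taken from the article) shows the off-diagonal coherence of an initial |+⟩ state decaying while the populations stay fixed.

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
gamma, dt, steps = 0.5, 0.01, 500

# Start from the pure superposition |+> = (|0> + |1>)/sqrt(2)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

L = np.sqrt(gamma) * Z
H = np.zeros((2, 2), dtype=complex)   # trivial Hamiltonian for clarity

def lindblad_rhs(rho):
    """Right-hand side of the Lindblad equation for a single jump operator L."""
    comm = -1j * (H @ rho - rho @ H)
    diss = L @ rho @ L.conj().T - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L)
    return comm + diss

for _ in range(steps):                 # simple Euler integration
    rho = rho + dt * lindblad_rhs(rho)

print("final rho:\n", np.round(rho, 4))
# Populations stay at 0.5; the off-diagonal coherence decays roughly as exp(-2*gamma*t),
# so after t = 5 it is strongly suppressed, mimicking a classical mixture.
```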

Quantum Information Processing

Quantum Computing Paradigms

Quantum computing paradigms provide the architectural frameworks for implementing quantum information processing, distinct from classical models by leveraging quantum mechanical principles such as superposition and entanglement of qubits, which serve as the fundamental building blocks. The most widely studied paradigm is the gate (circuit) model, where computation proceeds through a sequence of unitary operations applied to an initial state, followed by measurement. In this model, universal quantum computation can be achieved using a set of elementary gates: single-qubit gates including the Pauli gates (X, Y, Z), the Hadamard gate H that creates superposition, and the two-qubit controlled-NOT (CNOT) gate that enables entanglement. Any single-qubit unitary operation can be decomposed into a rotation form U = e^{-i \theta \mathbf{n} \cdot \boldsymbol{\sigma}/2}, where \theta is an angle, \mathbf{n} is a unit vector, and \boldsymbol{\sigma} represents the vector of Pauli matrices; combined with CNOT, this set suffices for arbitrary quantum circuits.

Alternative paradigms offer different approaches to universality. Adiabatic quantum computing relies on the adiabatic theorem, evolving a quantum system slowly from an initial Hamiltonian with a known ground state to a final Hamiltonian encoding the problem solution; if the evolution is sufficiently gradual, the system remains in the instantaneous ground state, solving optimization tasks like satisfiability. This paradigm, proposed in 2000, contrasts with the discrete gate sequence of the circuit model by using continuous-time dynamics. Measurement-based quantum computing, or the one-way model, uses a highly entangled resource state, such as a cluster state, where computation is performed solely through adaptive single-qubit measurements that propagate information unidirectionally through the lattice, consuming the resource without requiring further unitary gates. Introduced in 2001, this approach simplifies hardware demands by offloading entanglement generation to preprocessing.

Physical implementations realize these paradigms using diverse hardware platforms, each exploiting specific quantum systems to encode and manipulate qubits. Trapped-ion systems confine charged atoms, like calcium or ytterbium ions, in electromagnetic traps, using laser pulses for precise operations; they achieve long coherence times (seconds) and high-fidelity gates (>99.9%) but face challenges in ion shuttling. Superconducting qubits, based on Josephson junctions in superconducting circuits cooled to millikelvin temperatures, enable fast gates (nanoseconds) and integrate with established microfabrication techniques, as in transmon designs, though they suffer shorter coherence times (microseconds) due to decoherence. Topological qubits leverage anyons—exotic quasiparticles in two-dimensional materials like fractional quantum Hall states or Majorana fermions in nanowires—for inherent error protection via non-local encoding, protecting against local errors without active correction; while experimental realization remains nascent, Microsoft in 2025 announced Majorana 1, an eight-qubit topological quantum processor using topoconductors, though the claim faces community scrutiny. Photonic systems encode qubits in polarization, path, or time-bin states, using linear optics, beam splitters, and detectors for gates; they offer room-temperature operation and long-distance transmission via fiber but rely on probabilistic, non-deterministic elements like single-photon sources.

These platforms have driven key milestones, such as the first experimental two-qubit CNOT gate in 1995 using trapped ions at NIST, demonstrating controlled entanglement between qubits. More recently, IBM's Nighthawk processor, announced in 2025, features 120 qubits with 218 next-generation tunable couplers for reduced crosstalk, marking progress toward scalable, fault-tolerant systems.
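
As a concrete instance of the circuit model, the sketch below (gate matrices and qubit-ordering conventions are assumptions made for the example, not specified by the article) applies a Hadamard followed by a CNOT to |00⟩, producing the entangled Bell state used throughout quantum information protocols.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I2 = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.array([1, 0, 0, 0], dtype=complex)      # |00>
state = np.kron(H, I2) @ state                     # Hadamard on the first qubit
state = CNOT @ state                               # entangling two-qubit gate
print(np.round(state, 4))                          # [0.7071, 0, 0, 0.7071] = (|00>+|11>)/sqrt(2)
```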

Quantum Algorithms and Complexity

Quantum algorithms exploit the principles of quantum mechanics, such as superposition and entanglement, to solve certain computational problems more efficiently than classical algorithms. These algorithms operate within the gate-based model of quantum computation, where universal quantum gates manipulate states to achieve computational tasks. Seminal examples demonstrate exponential or polynomial speedups over classical counterparts for specific problems believed to be intractable classically.

Shor's algorithm, introduced by Peter Shor in 1994, provides an exponential speedup for integer factorization and discrete logarithm problems, which are foundational to public-key cryptography. The algorithm reduces the problem to finding the period of a modular exponential function using the quantum Fourier transform (QFT), defined as transforming a state |j\rangle to \frac{1}{\sqrt{N}} \sum_k e^{2\pi i j k / N} |k\rangle, enabling polynomial-time execution on a quantum computer with sufficient qubits. This approach factors an n-bit integer in O(n^3) time, contrasting with the best known classical sub-exponential algorithms. Grover's algorithm, developed by Lov Grover in 1996, offers a quadratic speedup for unstructured search problems, such as finding a marked item in an unsorted database of N entries. It achieves this through amplitude amplification, iteratively increasing the probability of measuring the target state from O(1/N) to O(1) in O(\sqrt{N}) queries, compared to the classical \Theta(N) requirement. This makes it optimal in the black-box query model, where no further speedup is possible without additional structure. The Harrow-Hassidim-Lloyd (HHL) algorithm, proposed in 2009, solves linear systems of equations Ax = b exponentially faster for sparse, well-conditioned matrices, providing a solution vector in O(\kappa \log N) time, where \kappa is the condition number and N the matrix dimension, versus classical O(N) or worse. It uses phase estimation and controlled operations to invert eigenvalues in the matrix's eigendecomposition, applicable to tasks like optimization and machine learning. However, practical implementations require low \kappa and sparse access, limiting broad applicability.

Quantum complexity classes formalize the power of these algorithms relative to classical ones. The class BQP, introduced by Ethan Bernstein and Umesh Vazirani in 1993, encompasses decision problems solvable in polynomial time with bounded error on a quantum computer; it contains P and BPP, but its relation to NP remains open. BQP includes problems like factoring, believed to lie outside P, yet oracles exist separating BQP from classical classes such as BPP (e.g., via the Bernstein-Vazirani and related problems), and black-box results suggest BQP does not contain NP-complete problems. Relative to NP, quantum algorithms provide speedups for some NP-hard problems but no general polynomial-time solution, and BQP ⊆ PSPACE.

Theoretical limits constrain quantum advantages, with no proven exponential speedups beyond structured cases like Shor's algorithm for factoring. In query complexity models, quantum algorithms achieve at most quadratic speedups for unstructured problems (Grover's bound), and broader gains require problem structure, as unstructured search requires \Omega(\sqrt{N}) queries even on a quantum computer. These bounds highlight that while quantum computers excel in parallel exploration of solution spaces, they do not universally surpass classical limits exponentially.
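
Grover's amplitude amplification is small enough to simulate exactly for a few qubits. The following sketch (an idealized state-vector simulation with an illustrative choice of N = 8 and a marked index, not a hardware implementation) applies the oracle and the inversion-about-the-mean operator for about (π/4)√N iterations and shows the probability concentrating on the marked item.

```python
import numpy as np

n, marked = 3, 5
N = 2 ** n

state = np.full(N, 1 / np.sqrt(N))            # uniform superposition over N items

oracle = np.eye(N)
oracle[marked, marked] = -1                   # phase flip on the marked item

diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about the mean

iterations = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state = diffusion @ (oracle @ state)      # one Grover iteration

probs = np.abs(state) ** 2
print("iterations:", iterations)
print("P(marked):", round(probs[marked], 4))  # close to 1 (about 0.945 for N = 8)
```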

Quantum Error Correction and Fault Tolerance

Quantum error correction is essential for protecting fragile quantum information from noise and enabling reliable quantum computation, as quantum states are susceptible to errors from environmental interactions. The no-cloning theorem, proved by Wootters and Zurek in 1982 (and independently by Dieks), establishes that it is impossible to create a perfect copy of an arbitrary unknown quantum state due to the linearity of quantum evolution. This prohibition rules out direct replication-based redundancy used in classical error correction, necessitating alternative strategies that encode logical qubits across multiple physical qubits to detect and correct errors through syndrome measurements without disturbing the encoded information. Decoherence, primarily caused by unwanted coupling to the environment, represents the dominant error mechanism driving the need for such robust encoding schemes.

Early quantum error-correcting codes focused on protecting against single-qubit errors by mapping logical states into entangled multi-qubit subspaces. In 1995, Peter Shor introduced the first such code, the 9-qubit Shor code, which encodes one logical qubit into nine physical qubits by concatenating a three-qubit repetition code for bit-flip errors with a transformed version for phase-flip errors, allowing correction of any single Pauli error on the physical qubits. This construction demonstrated that quantum errors could be identified and reversed via transversal operations and stabilizer measurements, preserving the logical state. Building on this, Andrew Steane proposed the 7-qubit Steane code in 1996, a Calderbank-Shor-Steane (CSS) code derived from the classical [7,4,3] Hamming code, which efficiently encodes one logical qubit into seven physical qubits and corrects arbitrary single-qubit errors using fewer resources while supporting transversal gates for fault-tolerant operations.

For scalable implementations, surface codes have emerged as a leading architecture due to their high error thresholds and compatibility with planar qubit arrays. Introduced by Alexei Kitaev in 1997 as part of topological quantum error correction, surface codes encode logical qubits on a two-dimensional lattice where errors manifest as violations of local stabilizer checks, enabling efficient nearest-neighbor decoding and error rates that scale favorably with code distance. These codes protect against both bit-flip and phase-flip errors through a dual lattice structure, making them ideal for near-term hardware with limited connectivity.

The viability of large-scale quantum computing hinges on the quantum threshold theorem, which guarantees that fault-tolerant computation can be achieved with arbitrary accuracy if the underlying physical error rate remains below a constant threshold. Formally proved by Aharonov and Ben-Or in 1997, the theorem shows that by recursively concatenating error-correcting codes, the logical error rate can be suppressed exponentially provided the gate-level error probability is below a constant threshold (commonly estimated around 10^{-3} for practical schemes), allowing polynomial resource overhead for computations of arbitrary length.

Experimental progress has validated these concepts in various platforms. In trapped-ion systems, fault-tolerant Steane code operations were demonstrated in 2023, achieving multi-round syndrome extraction and error correction with logical error rates below physical rates, confirming the code's efficacy against realistic noise; more recently, Quantinuum unveiled its third-generation trapped-ion system in 2025 that simplifies error correction for better scalability.
Similarly, in superconducting qubits, Google reported in 2023 the first demonstration of error suppression in a surface code logical qubit, where increasing the code distance from 3 to 5 modestly reduced the logical error rate; building on this, Google's Willow processor in 2024 demonstrated below-threshold surface code memories with distances up to 7, exponentially improving error-corrected qubits as they scale. Additionally, in neutral-atom platforms, a Harvard-led collaboration reported a fault-tolerance milestone with logical qubits in atom arrays in November 2025, advancing error correction in array-based systems. These milestones underscore the transition from theoretical constructs to practical implementations essential for reliable quantum information processing.
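
The syndrome-measurement idea behind these codes can be illustrated with the three-qubit bit-flip repetition code, the ingredient that the Shor code concatenates. The sketch below (qubit ordering, stabilizer choice, and the injected error location are assumptions made for the example) encodes α|000⟩ + β|111⟩, applies a single X error, reads out the Z₀Z₁ and Z₁Z₂ stabilizer eigenvalues, and corrects the identified qubit without learning α or β.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_all(ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def on_qubit(op, k, n=3):
    """Operator acting as `op` on qubit k and identity elsewhere."""
    return kron_all([op if i == k else I2 for i in range(n)])

alpha, beta = 0.6, 0.8
logical = np.zeros(8, dtype=complex)
logical[0b000], logical[0b111] = alpha, beta      # encoded logical state

error_qubit = 1
corrupted = on_qubit(X, error_qubit) @ logical    # single bit-flip error

# Stabilizer eigenvalues (deterministic +/-1 for a definite X error)
s1 = int(round(np.vdot(corrupted, on_qubit(Z, 0) @ on_qubit(Z, 1) @ corrupted).real))
s2 = int(round(np.vdot(corrupted, on_qubit(Z, 1) @ on_qubit(Z, 2) @ corrupted).real))
syndrome = (s1, s2)

lookup = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}  # syndrome -> faulty qubit
faulty = lookup[syndrome]
recovered = on_qubit(X, faulty) @ corrupted if faulty is not None else corrupted

print("syndrome:", syndrome, "corrected qubit:", faulty)
print("recovered equals original:", np.allclose(recovered, logical))
```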

Quantum Communication and Cryptography

Quantum Key Distribution

Quantum key distribution (QKD) is a cryptographic technique that allows two parties, typically called Alice and Bob, to produce a shared random secret key for symmetric encryption, with security rooted in the fundamental laws of quantum mechanics rather than computational hardness assumptions. Unlike classical key distribution methods, QKD provides information-theoretic security, ensuring that any attempt by an eavesdropper, Eve, to intercept the key introduces detectable disturbances in the transmitted quantum states due to the no-cloning theorem and Heisenberg's uncertainty principle. This enables the parties to verify the key's integrity and abort if tampering is suspected, achieving unconditional security against all possible attacks, including those from quantum computers.

The seminal BB84 protocol, developed by Charles Bennett and Gilles Brassard in 1984, forms the basis of most prepare-and-measure QKD systems and uses polarization states of single photons to encode classical bits. In this scheme, Alice randomly selects between two bases—rectilinear (horizontal/vertical polarizations for bits 0/1) or diagonal (45°/135° polarizations)—to prepare and transmit individual photons to Bob over an optical fiber or free-space channel. Bob independently chooses a measurement basis for each photon, also randomly selecting rectilinear or diagonal. After transmission, Alice and Bob publicly announce their basis choices via a classical channel but not the measurement outcomes, retaining only the bits where their bases matched during a process called sifting, which typically yields about half the transmitted bits as raw key material. To detect eavesdropping, they sacrifice a random subset of the sifted bits for error estimation, computing the quantum bit error rate (QBER); if the QBER exceeds a predefined threshold (often around 11% for optimal security), indicating potential interception, they discard the key and restart. The remaining sifted bits undergo error correction and privacy amplification to distill a secure final key.

In contrast, entanglement-based protocols like the E91 scheme, proposed by Artur Ekert in 1991, distribute pairs of entangled photons from a central source to Alice and Bob, leveraging quantum correlations for both key generation and security verification. Alice and Bob each measure their photon in one of three randomly chosen bases, with coinciding bases used for key sifting (similar to BB84) and the remaining combinations used for testing violations of Bell's inequalities. Significant violations confirm the absence of eavesdropping, as any interception would degrade the entanglement and reduce the CHSH inequality value below the classical limit of 2 (quantum maximum approaching 2\sqrt{2}). The sifted key bits, correlated due to entanglement, are then processed through error correction and privacy amplification. This approach provides device-independent security in principle, as it detects side-channel attacks on measurement devices.

Rigorous security proofs for protocols like BB84 and E91 against general attacks rely on the Csiszár-Körner bound from classical information theory, which quantifies the maximum secret key rate as the difference between the mutual information I(A:B) shared by Alice and Bob and the information accessible to Eve. For collective attacks—where Eve attacks identical copies of the quantum states—the bound ensures a positive asymptotic key rate R \geq I(A:B) - I(A:E) (or symmetrically I(A:B) - I(B:E)) when channel noise is below a critical threshold, typically derived via entropic relations. These proofs, advanced in the mid-2000s, confirm that secure keys can be extracted at rates scaling with the sifted key length minus leaked information, even under realistic imperfections like photon loss. Entanglement variants, such as E91, further tighten these bounds by incorporating Bell-test outcomes to upper-bound Eve's knowledge.
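
The sifting and error-estimation steps of BB84 are easy to mimic classically. The sketch below (an idealized, noise-free simulation with no eavesdropper; sample sizes and the random seed are arbitrary choices, not from the article) generates random bits and bases for Alice and Bob, keeps only the matching-basis rounds, and estimates the QBER from a sacrificed subset.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)     # 0 = rectilinear, 1 = diagonal
bob_bases = rng.integers(0, 2, n)

match = alice_bases == bob_bases
# Matching basis -> Bob recovers Alice's bit; mismatched basis -> random outcome
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

# Sifting: keep only rounds where the bases agreed (about half of them)
sifted_alice = alice_bits[match]
sifted_bob = bob_bits[match]

# Sacrifice a random subset to estimate the quantum bit error rate (QBER)
sample = rng.choice(len(sifted_alice), size=len(sifted_alice) // 10, replace=False)
qber = np.mean(sifted_alice[sample] != sifted_bob[sample])

print("sifted fraction:", round(match.mean(), 3))   # ~0.5
print("estimated QBER:", qber)                      # 0.0 in this idealized run
```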
Practical QKD systems have transitioned from laboratory demonstrations to commercial and long-distance deployments. In 2007, ID Quantique achieved the world's first real-life installation, securing a 2-km fiber link between a ballot-counting station and a government data center in Geneva during Swiss federal elections, using its Cerberis system, which integrated BB84-based QKD with AES-256 encryption for continuous key refresh at rates up to 1 Mbps. This marked the first operational use of QKD for government applications, demonstrating robustness over metropolitan networks despite photon loss and dark counts in detectors. On a global scale, China's Micius satellite enabled entanglement-based QKD over 1200 km in 2017, distributing photon pairs from orbit to ground stations in Delingha and Lijiang, achieving a CHSH value of 2.37 ± 0.09 and sifted key rates of 1.1 kbps after atmospheric turbulence compensation, paving the way for intercontinental quantum networks. By 2025, QKD has seen broader adoption, with innovations in hybrid systems combining QKD and post-quantum cryptographic algorithms for enhanced security.

Quantum Teleportation and Dense Coding

Quantum teleportation is a protocol that enables the transfer of an arbitrary quantum state from one party (Alice) to another (Bob) without physically transporting the quantum carrier, relying instead on shared entanglement and a classical communication channel. Proposed in 1993, the process begins with Alice and Bob sharing an entangled Bell state, such as the Einstein-Podolsky-Rosen (EPR) pair \frac{1}{\sqrt{2}}(|00\rangle + |11\rangle). Alice, holding the unknown qubit state |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, performs a joint Bell-state measurement on her qubit and one half of the EPR pair, yielding one of four possible outcomes with equal probability. She then transmits the two classical bits corresponding to this measurement result to Bob over a classical channel. Based on these bits, Bob applies one of the Pauli operators (I, X, Z, or XZ) to his half of the EPR pair, reconstructing the original state |\psi\rangle with perfect fidelity in the ideal case. This method requires two classical bits to teleport one qubit, highlighting the no-cloning theorem's role in preventing direct copying of unknown quantum states.

The mathematical foundation of quantum teleportation rests on the properties of maximally entangled states and the completeness of the Bell basis. The four Bell states are \left|\Phi^\pm\right\rangle = \frac{1}{\sqrt{2}}(|00\rangle \pm |11\rangle) and \left|\Psi^\pm\right\rangle = \frac{1}{\sqrt{2}}(|01\rangle \pm |10\rangle). When Alice measures in this basis, the post-measurement state on Bob's side becomes one of four unitarily transformed versions of |\psi\rangle, correctable by the corresponding Pauli operations. To verify the shared entanglement necessary for reliable teleportation, the Clauser-Horne-Shimony-Holt (CHSH) inequality serves as a key test, quantifying Bell nonlocal correlations. The CHSH parameter S = E(\theta_1, \phi_1) + E(\theta_1, \phi_2) + E(\theta_2, \phi_1) - E(\theta_2, \phi_2), where E are correlation functions, satisfies |S| \leq 2 for local hidden variable theories but can reach 2\sqrt{2} for maximally entangled states, confirming the pair's quality. In practice, the classical channel can be secured, for instance with keys from quantum key distribution, to prevent tampering with the two transmitted bits and preserve the protocol's integrity.

Experimental demonstrations of quantum teleportation have advanced rapidly since its inception. The first realization occurred in 1997 by a team at the University of Innsbruck, using entangled photons generated via parametric down-conversion; they achieved an average fidelity of 0.70, surpassing the classical limit of 0.67 and confirming the protocol's viability. A notable early long-distance fiber-optic experiment in 2004 transmitted a teleported state over 600 meters across the Danube River in Vienna, attaining a fidelity of 0.84 with time-bin-encoded photons, demonstrating robustness against fiber dispersion. These achievements underscore teleportation's potential for quantum networks, though challenges like decoherence over extended distances persist.

Superdense coding, also known as dense coding, complements teleportation by allowing the transmission of two classical bits using a single qubit, effectively doubling the communication capacity when entanglement is available. Introduced in 1992, the protocol assumes Alice and Bob share an entangled pair \frac{1}{\sqrt{2}}(|00\rangle + |11\rangle). To encode two classical bits (00, 01, 10, or 11), Alice applies the identity I for 00, X for 01, Z for 10, or XZ (up to a global phase) for 11 to her qubit, then sends it to Bob via a quantum channel. Bob performs a Bell-state measurement on the received qubit and his entangled partner, decoding the two bits directly from the outcome.
This efficiency arises because the four possible operations map uniquely to the four Bell basis states, exploiting entanglement to convey more than a classical bit with a single transmitted qubit. Without prior entanglement, the same qubit could carry only one bit, illustrating quantum communication's advantage. A CHSH test can again verify the shared pair's entanglement before encoding. Experimental implementations, such as early photonic demonstrations in the 1990s, have confirmed fidelities above 90%, paving the way for enhanced information transfer in entangled networks.
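
Superdense coding can be verified with a four-dimensional state-vector calculation. The sketch below (assuming the standard Bell pair and a decoding circuit in which Bob's Bell measurement is implemented as CNOT followed by Hadamard) shows that each of Alice's four local operations leads to a distinct, deterministic two-bit outcome for Bob.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # shared (|00>+|11>)/sqrt(2)
encodings = {"00": I2, "01": X, "10": Z, "11": X @ Z}       # Alice's local operation

for bits, U in encodings.items():
    state = np.kron(U, I2) @ bell               # Alice acts on her qubit, then sends it
    state = CNOT @ state                        # Bob's Bell-basis measurement circuit
    state = np.kron(H, I2) @ state
    outcome = np.argmax(np.abs(state) ** 2)     # deterministic computational-basis result
    print(bits, "->", format(outcome, "02b"))   # decoded bits match the encoded ones
```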

Quantum Networks and Repeaters

Quantum networks aim to interconnect quantum processors and memories over long distances, enabling secure communication, distributed quantum computing, and sensing applications beyond the capabilities of classical networks. These networks face significant challenges due to photon loss in optical fibers and decoherence, which limit direct entanglement distribution to short ranges of about 100 kilometers. Quantum repeaters address this by dividing the channel into shorter segments, generating entanglement locally, and extending it through purification and swapping protocols. The vision of a quantum internet encompasses a hybrid classical-quantum infrastructure where quantum links handle entanglement distribution and classical channels manage control signals, facilitating tasks like blind quantum computing and global sensor arrays. This architecture supports distributed processing, with quantum repeaters serving as core nodes to mitigate exponential photon loss over distance. Early demonstrations, such as the DARPA Quantum Network operational in 2003, connected 10 nodes across Cambridge and Boston, Massachusetts, using fiber optics for quantum key distribution, laying groundwork for scalable entanglement-based systems.

Quantum repeaters operate via two main steps: entanglement purification to improve the quality of noisy links and entanglement swapping to connect segments without direct end-to-end transmission. The seminal Duan-Lukin-Cirac-Zoller (DLCZ) protocol, proposed in 2001, uses atomic ensembles as quantum memories and linear optics to generate and store entanglement through spontaneous Raman scattering, where a heralded Stokes photon signals successful entanglement creation. Purification refines imperfect Bell pairs via local operations and classical communication, while swapping combines memories from adjacent segments to form end-to-end entanglement, enabling polynomial scaling of communication rates with distance. Repeater architectures differ in how they handle qubit transmission: memory-based designs store stationary qubits in matter systems like atoms or defects, using photons for short-range entanglement distribution, whereas flying-qubit approaches encode information directly in photonic degrees of freedom such as polarization, time bins, or frequency, often requiring advanced error correction. Heralded entanglement generation, common to both, relies on detection of ancillary photons to confirm successful distribution, boosting efficiency in probabilistic setups. Quantum teleportation serves as a building block for entanglement swapping in these architectures, transferring states between memories without physical transport.

Recent advances leverage nitrogen-vacancy (NV) centers in diamond as robust quantum memories due to their long coherence times, exceeding one second at room temperature in associated nuclear-spin memories, and compatibility with telecom wavelengths via hybrid integration. In 2023, experiments demonstrated programmable arrays of NV spins for scalable repeaters, achieving high-fidelity entanglement storage and retrieval essential for multi-node networks. These developments, combined with improved photon extraction efficiencies, bring practical quantum repeaters closer to realizing a global quantum internet. As of 2025, further progress includes Caltech's demonstration of multiplexed entanglement using ytterbium ions in nanophotonic cavities to increase entanglement rates and Purdue University's quantum network testbed distributing photonic entanglement between multiple nodes, advancing scalable implementations.
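
The motivation for repeaters can be made quantitative with a back-of-the-envelope loss calculation. The sketch below (illustrative numbers only: an attenuation length of roughly 22 km corresponding to about 0.2 dB/km fiber, and a 1000 km link, neither taken from the article) compares the direct photon-transmission probability with the per-segment success probability when the link is divided among repeater nodes.

```python
import numpy as np

attenuation_length_km = 22        # ~0.2 dB/km fiber corresponds to roughly this 1/e length
total_distance_km = 1000

def transmission(distance_km):
    """Probability that a photon survives a fiber span of the given length."""
    return np.exp(-distance_km / attenuation_length_km)

p_direct = transmission(total_distance_km)
for segments in (1, 4, 10, 20):
    p_segment = transmission(total_distance_km / segments)
    print(f"{segments:2d} segment(s): per-segment success probability = {p_segment:.3e}")

print(f"direct 1000 km success probability = {p_direct:.3e}")
# With 20 segments each span is 50 km (p ~ 0.1), versus ~2e-20 for direct transmission;
# heralded repeaters trade this exponential loss for overhead in memories and swapping.
```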

Mathematical and Theoretical Framework

Hilbert Space Formalism

The Hilbert space formalism provides the mathematical foundation for quantum information science, representing quantum states and operations within complex vector spaces equipped with an inner product that induces a norm, with completeness ensuring physical realizability. These spaces are typically separable, meaning they possess a countable orthonormal basis, which aligns with the discrete nature of quantum measurements in finite-dimensional systems relevant to information processing. For instance, the state of a single qubit is described in a two-dimensional Hilbert space spanned by basis vectors |0\rangle and |1\rangle. Observables in this formalism are represented by Hermitian operators on the Hilbert space, whose spectral decomposition yields the possible outcomes as eigenvalues and the associated projection operators for post-measurement states. The expectation value of an observable A for a state |\psi\rangle is given by \langle A \rangle = \langle \psi | A | \psi \rangle, reflecting the probabilistic nature of quantum measurements. The time evolution of closed quantum systems is governed by unitary operators derived from the Schrödinger equation, specifically U(t) = e^{-iHt/\hbar}, where H is the Hamiltonian, the Hermitian operator describing the system's energy. This unitary transformation preserves the inner product and thus measurement probabilities, ensuring reversible dynamics for isolated systems.

To describe mixed states, arising from incomplete knowledge or ensembles of pure states, the density operator \rho is employed, defined as a convex combination \rho = \sum_i p_i |\psi_i\rangle\langle\psi_i|, where p_i \geq 0, \sum_i p_i = 1, and \rho is Hermitian, positive semi-definite, and trace-normalized with \operatorname{Tr}(\rho) = 1. The expectation value of an observable A then becomes \langle A \rangle = \operatorname{Tr}(\rho A), generalizing the pure-state case. For composite systems, such as multiple qubits or entangled parties, the total Hilbert space is the tensor product of individual spaces, \mathcal{H} = \mathcal{H}_A \otimes \mathcal{H}_B, allowing representation of correlated states like |\Phi^+\rangle = \frac{1}{\sqrt{2}} (|00\rangle + |11\rangle). To obtain the reduced description for subsystem A, the partial trace over B is taken: \rho_A = \operatorname{Tr}_B (\rho_{AB}) = \sum_i \langle i_B | \rho_{AB} | i_B \rangle, where \{ |i_B\rangle \} is an orthonormal basis for \mathcal{H}_B, yielding a valid density operator on \mathcal{H}_A. This operation captures the local statistics while marginalizing over the traced-out degrees of freedom.
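
The density-operator and partial-trace definitions above are straightforward to compute. The sketch below (reshaping conventions and the example state are choices made for illustration) builds the density operator of |Φ⁺⟩ and traces out subsystem B, recovering the maximally mixed reduced state I/2.

```python
import numpy as np

phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_AB = np.outer(phi_plus, phi_plus.conj())          # pure-state density operator

def partial_trace_B(rho, dA=2, dB=2):
    """Trace out subsystem B of a (dA*dB) x (dA*dB) density matrix."""
    rho4 = rho.reshape(dA, dB, dA, dB)                # indices: (i_A, j_B, k_A, l_B)
    return np.einsum('ijkj->ik', rho4)                # sum over the B indices

rho_A = partial_trace_B(rho_AB)
print("Tr(rho_AB) =", np.trace(rho_AB).real)          # 1.0, trace normalization
print("rho_A =\n", np.round(rho_A.real, 3))           # [[0.5, 0], [0, 0.5]] = I/2
```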

Quantum Information Measures

Quantum information measures quantify uncertainty, correlations, and information content in quantum systems, extending classical concepts to density operators that describe quantum states in Hilbert space. These measures are essential for analyzing the efficiency of quantum protocols, channel capacities, and entanglement resources. The von Neumann entropy, S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho), serves as the primary measure of quantum uncertainty for a density operator \rho, generalizing the Shannon entropy to mixed quantum states. Introduced by John von Neumann, this entropy vanishes for pure states and reaches its maximum \log_2 d for a d-dimensional maximally mixed state, capturing both classical and quantum contributions to disorder. Building on the von Neumann entropy, the quantum mutual information between subsystems A and B of a bipartite state \rho_{AB} is defined as I(A:B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}), measuring total correlations including entanglement. This quantity is symmetric and non-negative, with equality to twice the entanglement entropy for pure bipartite states. The coherent information, I_c(A \to B) = S(\rho_B) - S(\rho_{AB}), further quantifies quantum correlations preserved through a channel from A to B, playing a key role in determining the quantum capacity of noisy quantum channels. The Holevo bound provides an upper limit on the classical information transmittable over a quantum channel, given by \chi = S(\bar{\rho}) - \sum_i p_i S(\rho_i) \leq \log_2 d, where \bar{\rho} = \sum_i p_i \rho_i is the average output state for an ensemble \{p_i, \rho_i\} with dimension d. This bound, achievable in the asymptotic limit, highlights the advantage of quantum encoding over classical for certain channels. In entanglement theory, these measures determine the distillable entanglement from mixed states, with the entanglement distillation rate bounded by the coherent information; for example, Bennett et al. showed that from noisy Bell pairs, near-maximal ebits can be extracted using local operations and classical communication, approaching the yields of the hashing and breeding protocols.
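
These entropic quantities can be evaluated numerically for small systems. The sketch below (an illustrative calculation for the Bell state, with a simple eigenvalue-based entropy routine assumed here) computes S(ρ_A), S(ρ_B), and S(ρ_AB) and confirms that the quantum mutual information of a maximally entangled pair is 2 bits, twice its entanglement entropy of 1 ebit.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]                     # convention: 0 log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

def partial_trace(rho, keep, dA=2, dB=2):
    rho4 = rho.reshape(dA, dB, dA, dB)
    return np.einsum('ijkj->ik', rho4) if keep == 'A' else np.einsum('ijil->jl', rho4)

phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_AB = np.outer(phi_plus, phi_plus.conj())
rho_A, rho_B = partial_trace(rho_AB, 'A'), partial_trace(rho_AB, 'B')

S_AB, S_A, S_B = map(von_neumann_entropy, (rho_AB, rho_A, rho_B))
print("S(AB) =", round(S_AB, 6), " S(A) =", round(S_A, 6), " S(B) =", round(S_B, 6))
print("I(A:B) =", round(S_A + S_B - S_AB, 6), "bits")   # 2.0 for a maximally entangled pair
```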

Connections to Classical Information Theory

Quantum information science extends classical information theory by incorporating quantum mechanical principles, particularly in how uncertainty and information transmission are quantified. In classical information theory, the Shannon entropy H(X) = -\sum_i p_i \log_2 p_i measures the average uncertainty in a random variable X with distribution \{p_i\}, providing a foundation for lossless data compression and channel coding limits. This entropy determines the minimal number of bits needed to encode messages from a source reliably. The quantum generalization replaces probability distributions with density operators, where the von Neumann entropy serves as the quantum analog, enabling similar but superposition-aware measures of information content.

Channel capacities in quantum settings build directly on Shannon's classical capacity, which for a noisy classical channel is the maximum mutual information C = \max_{p(x)} I(X;Y) achievable with reliable transmission. The quantum channel capacity quantifies the rate at which quantum information (qubits) can be transmitted reliably over a noisy quantum channel and is characterized by the regularized coherent information. For instance, the entanglement-assisted classical capacity further enhances achievable rates by allowing pre-shared entanglement between sender and receiver, as formalized by Bennett et al., where the capacity equals the quantum mutual information maximized over input states. These quantum capacities highlight how entanglement—a purely quantum resource—can surpass classical bounds, such as doubling the classical capacity of certain channels compared to unassisted transmission.

Noisy quantum channels model decoherence effects that degrade information, analogous to classical noise but with quantum-specific dynamics. The depolarizing channel, a common model for symmetric errors, replaces the input state \rho with the maximally mixed state with probability p, effectively erasing information isotropically: \mathcal{N}(\rho) = (1-p)\rho + p \frac{I}{2}. Its quantum capacity, bounded by the maximum coherent information, vanishes well before the channel becomes entanglement-breaking at p \geq \frac{2}{3}. In contrast, the amplitude damping channel captures energy relaxation, such as spontaneous decay in qubits, where the excited state decays to the ground state with probability \gamma, modeled by Kraus operators that bias the Bloch vector toward the ground-state pole. This channel's quantum capacity decreases monotonically with \gamma, vanishing at full damping, and illustrates how quantum noise can be asymmetric, unlike uniform classical noise.

A key contrast arises in data compression: while Shannon's theorem compresses classical sources to the entropy rate, Schumacher compression achieves asymptotic quantum data compression for ensembles of quantum states, projecting onto typical subspaces at a rate given by the von Neumann entropy with asymptotically negligible distortion. This quantum scheme preserves superpositions and entanglement across blocks, enabling reliable recovery, but requires quantum encoding operations that have no direct classical counterpart, underscoring how quantum information's non-commutative nature extends classical limits while introducing new challenges like no-cloning.
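
Quantum channels act on density operators through Kraus decompositions, which are easy to apply directly. The sketch below (using the textbook Kraus operators for amplitude damping and an illustrative γ = 0.3, parameters not taken from the article) verifies the completeness relation and shows excited-state population relaxing toward the ground state.

```python
import numpy as np

gamma = 0.3
K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)

def amplitude_damping(rho):
    """Apply the amplitude damping channel via its Kraus operators."""
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# Start in the excited state |1><1|
rho = np.array([[0, 0], [0, 1]], dtype=complex)
out = amplitude_damping(rho)

print("completeness:", np.allclose(K0.conj().T @ K0 + K1.conj().T @ K1, np.eye(2)))
print("output rho:\n", np.round(out.real, 3))   # [[0.3, 0], [0, 0.7]]: population moved to |0>
print("trace preserved:", np.isclose(np.trace(out).real, 1.0))
```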

History and Societal Impact

Historical Development

The foundations of quantum information science trace back to the early development of quantum mechanics in the 1930s, when key concepts like superposition and entanglement were introduced. In 1935, Erwin Schrödinger highlighted the paradoxical implications of quantum superposition through his famous thought experiment known as Schrödinger's cat, illustrating how quantum states could exist in multiple configurations simultaneously until observed. That same year, Albert Einstein, Boris Podolsky, and Nathan Rosen proposed the EPR paradox, questioning the completeness of quantum mechanics via the phenomenon of entanglement, where distant particles remain correlated regardless of separation. These ideas laid the groundwork for understanding quantum systems as carriers of information, though their information-theoretic implications were not immediately explored.

A significant theoretical milestone came in 1973 with Alexander Holevo's theorem, which established the upper bound on the amount of classical information that can be transmitted through a quantum channel, demonstrating inherent limits analogous to but distinct from classical bounds. This result, published in Problems of Information Transmission, marked the beginning of formal quantum information theory by quantifying how quantum measurement restricts reliable communication.

The field's emergence as a distinct discipline accelerated in the late 1970s and early 1980s with proposals for quantum computation. In 1980, Paul Benioff constructed the first quantum mechanical model of computation by adapting a classical Turing machine to operate under quantum dynamics, proving that quantum systems could perform universal computation without energy dissipation beyond physical limits. This was followed in 1982 by Richard Feynman's seminal proposal that quantum computers could efficiently simulate any quantum physical process, a task intractable for classical computers due to the exponential growth of the state space.

The mid-1990s saw pivotal advances that solidified quantum information science as a viable research area. In 1995, Peter Shor introduced the first quantum error-correcting codes, showing how redundancy in quantum states could protect fragile information from decoherence, a breakthrough essential for practical quantum devices. In 2000, David DiVincenzo outlined criteria for physically realizing a scalable quantum computer, specifying requirements such as qubit initialization, long coherence times, a universal set of gates, and measurement, which became a foundational checklist for experimental efforts. These developments spurred rapid growth, leading to institutional support. In 2000, the Institute for Quantum Information (IQI) was established at Caltech as a dedicated center for quantum research, evolving into the Institute for Quantum Information and Matter (IQIM) in 2011 with NSF Physics Frontiers Center funding to integrate theory and experiment. On the international scale, the European Union launched the Quantum Flagship initiative in 2018, a €1 billion, decade-long program to advance quantum technologies across communication, computing, and sensing.

Key Figures and Milestones

Richard Feynman is recognized as a foundational figure in quantum information science for proposing in 1982 that quantum systems could be efficiently simulated using quantum computers, highlighting the limitations of classical computers in modeling quantum phenomena. David Deutsch advanced this vision in 1985 by formalizing the concept of a universal quantum computer, establishing a theoretical framework for quantum computation equivalent to classical Turing machines but leveraging superposition and entanglement. In quantum cryptography, Artur Ekert introduced the E91 protocol in 1991, utilizing entangled particles and Bell's theorem to enable secure key distribution resistant to eavesdropping. Charles Bennett, collaborating with others, proposed quantum teleportation in 1993, a protocol for transferring quantum states using entanglement and classical communication, which underpins many quantum networking applications. Peter Shor's 1994 algorithm for factoring large integers exponentially faster than classical methods demonstrated quantum computing's potential to disrupt public-key cryptography. Lov Grover's 1996 search algorithm provided a quadratic speedup for unstructured database searches, showcasing practical advantages in optimization problems.

Key experimental milestones include the 1998 demonstration of the first two-qubit quantum computer using nuclear magnetic resonance, which implemented Deutsch's algorithm and validated basic quantum operations. In 2019, Google's Sycamore processor achieved quantum supremacy by performing a specific sampling task in 200 seconds that was estimated to take a classical supercomputer 10,000 years, marking a threshold where quantum devices outperform classical ones for certain problems. The 2022 Nobel Prize in Physics was awarded to Alain Aspect, John F. Clauser, and Anton Zeilinger for their pioneering experiments confirming Bell inequality violations with entangled photons, which laid the groundwork for quantum information protocols. In December 2024, Google Quantum AI announced the Willow processor, a 105-qubit superconducting chip that demonstrated error rates below the fault-tolerance threshold, enabling scalable error correction and marking progress toward practical quantum computing.

Current Challenges and Future Directions

One of the primary challenges in quantum information science arises from the tension between qubit coherence times and quantum operation speeds. In superconducting qubit systems, typical coherence times, measured by relaxation (T₁) and dephasing (T₂) times, range from 50 μs to over 1 ms, with fluxonium qubits achieving up to 1.48 ms (as of 2023); as of 2025, the best superconducting qubits have demonstrated coherence times exceeding 1 ms. However, single-qubit gates operate in approximately 50 ns, while two-qubit gates require 250–450 ns, necessitating thousands of operations within the coherence window to execute meaningful algorithms before errors accumulate. This disparity limits circuit depth and overall computational utility, as decoherence introduces noise that propagates through multi-qubit interactions.

Cryogenic requirements exacerbate these scalability issues, as most leading qubit modalities, such as superconducting circuits, demand temperatures near 10–20 mK to minimize thermal noise and maintain quantum states. Scaling to thousands or millions of qubits requires advanced dilution refrigerators and cryogenic control electronics, but current systems struggle with heat dissipation from control wiring and amplifiers, leading to heating of the qubit stage and increased error rates. Efforts to integrate cryogenic CMOS electronics for on-chip control aim to reduce wiring complexity, yet achieving efficient operation at millikelvin temperatures remains a significant engineering hurdle for large-scale deployment.

Standardization in quantum algorithms poses another key challenge, particularly for optimization methods like the Quantum Approximate Optimization Algorithm (QAOA) and quantum machine learning (QML) frameworks. QAOA, which alternates cost and mixer Hamiltonians to approximate solutions for combinatorial problems, suffers from barren plateaus in its parameter landscape, where gradients vanish exponentially with qubit count, complicating classical optimization of variational parameters. In QML, the lack of unified software protocols and data encoding schemes across platforms hinders reproducibility and benchmarking, with hardware constraints like limited connectivity and noise further amplifying algorithmic complexity. These issues impede the development of robust, vendor-agnostic tools, as diverse architectures demand tailored adaptations without a common benchmark for evaluation.

Societally, the migration to quantum-secure cryptography presents urgent challenges, as quantum algorithms like Shor's threaten to break widely used public-key systems such as RSA and elliptic-curve cryptography, potentially exposing encrypted data harvested today. NIST's post-quantum cryptography standards, including lattice-based algorithms like ML-KEM (based on CRYSTALS-Kyber), which were finalized in August 2024, require comprehensive inventorying of cryptographic assets, protocol updates, and performance testing, a process projected to span decades due to integration complexity and compatibility risks. Ethical concerns surrounding quantum advantage—demonstrated in sampling tasks beyond classical capabilities—include exacerbating the digital divide, as access to quantum resources concentrates in affluent nations, and raising privacy risks from unintended decryption of historical data. In June 2024, the United Nations General Assembly proclaimed 2025 as the International Year of Quantum Science and Technology to celebrate 100 years of quantum mechanics and promote its societal benefits.

Looking ahead, fault-tolerant quantum computers, capable of error correction below thresholds around 1% per gate, are anticipated in the 2030s, with IBM targeting a roughly 200-logical-qubit system by 2029 and Quantinuum aiming for universal fault tolerance by 2030.
These milestones could unlock practical applications in quantum simulation and optimization, driving a global quantum economy with up to $1.3 trillion in value across industries like finance, chemicals, and pharmaceuticals by 2035, according to McKinsey analysis.

  44. [44]
    [quant-ph/0412029] The DARPA Quantum Network - arXiv
    Dec 3, 2004 · The DARPA Quantum Network became fully operational on October 23, 2003 in BBN's laboratories, and in June 2004 was fielded through dark fiber ...
  45. [45]
    Recent progress in hybrid diamond photonics for quantum ... - Nature
    May 8, 2025 · This review discusses recent progress and challenges in the hybrid integration of diamond color centers on cutting-edge photonic platforms.
  46. [46]
    [PDF] A Mathematical Theory of Communication
    N(t) = N(t -t1) +N(t -t2) +•••+N(t -tn). 0 where X0 is the largest real solution of the characteristic equation: X,t1 +X,t2 +•••+X,tn = 1 3 Page 4 and ...
  47. [47]
    Quantum coding | Phys. Rev. A - Physical Review Link Manager
    Quantum coding. Benjamin Schumacher. Department of Physics, Kenyon College, Gambier, Ohio ... A 51, 2738 – Published 1 April, 1995. DOI: https://doi.org/ ...
  48. [48]
    [quant-ph/9604015] The capacity of the noisy quantum channel - arXiv
    Apr 19, 1996 · Access Paper: View a PDF of the paper titled The capacity of the noisy quantum channel, by Seth Lloyd (MIT Mechanical Engineering). View PDF ...
  49. [49]
    Entanglement-assisted capacity of a quantum channel and ... - arXiv
    Jun 8, 2001 · The entanglement-assisted classical capacity of a noisy quantum channel is the amount of information per channel use that can be sent over the channel.Missing: original | Show results with:original
  50. [50]
    [quant-ph/0204172] The capacity of the quantum depolarizing channel
    Apr 30, 2002 · It is shown that this capacity can be achieved by encoding messages as products of pure states belonging to an orthonormal basis of the state space.<|separator|>
  51. [51]
    The Story of IQIM: Institute for Quantum Information and Matter
    Jul 26, 2022 · Then, in 2000, Preskill and Kimble received a grant from the National Science Foundation, which they used to form the Institute for Quantum ...Missing: date | Show results with:date
  52. [52]
    Introduction to the Quantum Flagship
    The Quantum Flagship is a large-scale, 1 billion euro, 10-year research initiative to expand European leadership in quantum technologies.
  53. [53]
    Simulating physics with computers | International Journal of ...
    Feynman, RP Simulating physics with computers. Int J Theor Phys 21, 467–488 (1982). https://doi.org/10.1007/BF02650179
  54. [54]
    Quantum theory, the Church–Turing principle and the universal ...
    Deutsch David. 1985Quantum theory, the Church–Turing principle and the universal quantum computerProc. R. Soc. Lond. A40097–117http://doi.org/10.1098/rspa ...
  55. [55]
    Algorithms for quantum computation: discrete logarithms and factoring
    This paper gives Las Vegas algorithms for finding discrete logarithms and factoring ... 20-22 November 1994. Date Added to IEEE Xplore: 06 August 2002. ISBN ...
  56. [56]
    Quantum supremacy using a programmable superconducting ...
    Oct 23, 2019 · ... Quantum supremacy is demonstrated using a programmable superconducting processor known as Sycamore, taking approximately 200 seconds to ...
  57. [57]
    Press release: The Nobel Prize in Physics 2022 - NobelPrize.org
    Oct 4, 2022 · Alain Aspect, John Clauser and Anton Zeilinger have each conducted groundbreaking experiments using entangled quantum states.
  58. [58]
    Millisecond Coherence in a Superconducting Qubit | Phys. Rev. Lett.
    A superconducting fluxonium qubit achieved a coherence time of 1.43 milliseconds, exceeding the state of the art for transmons by an order of magnitude.Abstract · Article Text
  59. [59]
    Experimental comparison of two quantum computing architectures
    Whereas the superconducting system offers faster gate clock speeds and a solid-state platform, the ion-trap system features superior qubits and reconfigurable ...Missing: anyons | Show results with:anyons
  60. [60]
    Cryo-CMOS Mixed-Signal Circuits for Scalable Quantum Computing
    The critical challenge in this venture is to achieve efficient cryogenic operation at low temperatures, i.e., close to the qubit around 4 K, simultaneously ...
  61. [61]
    Cryogenic quantum computer control signal generation using high ...
    Oct 15, 2024 · In this work, we integrate capacitors with cryogenic high-electron mobility transistor (HEMT) arrays and demonstrate quasi-static bias generation using gate ...<|separator|>
  62. [62]
    Toward a linear-ramp QAOA protocol: evidence of a scaling ... - Nature
    Aug 4, 2025 · The barren plateau is another challenge in QAOA. It refers to regions in the cost function landscape where the gradient is nearly zero, making ...Missing: machine | Show results with:machine
  63. [63]
    Challenges with Adopting Post-Quantum Cryptographic Algorithms
    Apr 28, 2021 · This paper introduces challenges associated with adopting and using post-quantum cryptography once new algorithms and new standards using them are ready.
  64. [64]
  65. [65]
    Ethical and Security Implications of Quantum Computing - NHSJS
    Jul 31, 2025 · This review aims to assess the ethical implications of quantum advancements, focusing on privacy, security, and the digital divide. It ...
  66. [66]
    IBM lays out clear path to fault-tolerant quantum computing
    Jun 10, 2025 · IBM lays out a clear, rigorous, comprehensive framework for realizing a large-scale, fault-tolerant quantum computer by 2029.Missing: 2030s | Show results with:2030s<|control11|><|separator|>
  67. [67]
    Quantinuum Unveils Accelerated Roadmap to Achieve Universal ...
    Quantinuum Unveils Accelerated Roadmap to Achieve Universal, Fully Fault-Tolerant Quantum Computing by 2030. With thousands of physical qubits, hundreds of ...