
Information transfer

Information transfer is the process by which meaningful data or signals are conveyed from a source to a receiver via a communication channel, potentially in the presence of noise or interference, forming the cornerstone of information theory as established by Claude Shannon in 1948. In this framework, the efficiency of transfer is quantified by mutual information, which measures the reduction in uncertainty about the source message upon observing the channel output, enabling reliable communication up to the channel's capacity—the upper limit on the rate of error-free information transmission determined by bandwidth and signal-to-noise ratio. For noisy channels, the Shannon-Hartley theorem specifies this capacity as C = B \log_2(1 + \frac{S}{N}), where B is bandwidth, S is signal power, and N is noise power, guiding the design of modern telecommunications systems. Beyond classical channels, advanced measures like transfer entropy, introduced by Thomas Schreiber in 2000, extend the concept to dynamical systems by quantifying directed information flow from one process to another while accounting for intrinsic dynamics and common influences, revealing directed couplings in fields such as neuroscience and climate modeling. This measure, defined as T_{Y \to X} = H(X_t \mid X_{t-1}^\infty) - H(X_t \mid X_{t-1}^\infty, Y_{t-1}^\infty) for processes X and Y, where X_{t-1}^\infty and Y_{t-1}^\infty denote the past histories of the two processes, detects asymmetries in coupling and has been pivotal in analyzing complex networks. More broadly, information transfer principles underpin data compression, error-correcting codes, and even biological signaling, where they model phenomena like neural information propagation or genetic exchanges, though applications vary by domain. Later developments, such as unified theories integrating transmission and identification tasks, further refine bounds on transfer rates for probabilistic computing and beyond-Shannon paradigms.
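
As a worked illustration of the Shannon-Hartley bound above, the short Python sketch below evaluates C = B \log_2(1 + S/N) for a couple of representative channels; the bandwidth and signal-to-noise figures are illustrative assumptions rather than values taken from the literature.

```python
import math

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity in bits per second for an AWGN channel,
    C = B * log2(1 + S/N), with S/N given as a linear power ratio."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

def db_to_linear(snr_db: float) -> float:
    """Convert a signal-to-noise ratio from decibels to a linear power ratio."""
    return 10.0 ** (snr_db / 10.0)

if __name__ == "__main__":
    # Illustrative channels (parameter values are assumptions, not from the text):
    # a 3.1 kHz voice-grade line at 30 dB SNR and a 20 MHz wireless channel at 20 dB.
    for label, b_hz, snr_db in [("voice-grade line", 3100.0, 30.0),
                                ("20 MHz wireless channel", 20e6, 20.0)]:
        c = shannon_hartley_capacity(b_hz, db_to_linear(snr_db))
        print(f"{label}: B = {b_hz:.0f} Hz, SNR = {snr_db:.0f} dB -> C ~ {c:,.0f} bit/s")
```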

Definition and Fundamentals

Basic Definition

Information transfer is the process of communicating data or knowledge from one entity to another, often through a medium that facilitates the conveyance of signals or messages across physical or abstract pathways. This encompasses both tangible forms, such as electrical or optical signals in telecommunications, and intangible exchanges, like the sharing of ideas in human discourse. A key distinction exists between data and information: data refers to raw, unprocessed symbols or facts lacking inherent meaning, while information arises when data is processed and contextualized to provide meaning or reduce uncertainty. For instance, in a verbal conversation, the sequence of spoken words constitutes data, but the interpreted intent or message forms the information; similarly, during file sharing, the binary digits represent data, whereas the resulting readable document provides information. Fundamental to this process are components including the communication channel, which serves as the pathway for transmission; source encoding, where the originating message is formatted for the medium; and receiver decoding, which reconstructs the message at the destination. These elements, as conceptualized in Claude Shannon's foundational model, underscore the structured nature of effective transfer.

Historical Development

The concept of information transfer has roots in ancient philosophical inquiries into communication and semiotics, where early thinkers explored how meaning is conveyed from one entity to another. In the 4th century BCE, Aristotle laid foundational ideas in his work Rhetoric, emphasizing the role of the speaker, speech, and audience in effective communication, which influenced subsequent understandings of persuasive and informational exchange. These early contributions framed communication as a deliberate process of encoding and decoding messages, predating modern technical models. The 19th century introduced practical precursors to systematic information transfer through advancements in electrical communication technologies. Samuel Morse's development of the electromagnetic telegraph and Morse code in 1837 enabled the rapid transmission of coded messages over long distances, revolutionizing how discrete information could be sent and received. This was soon complemented by telephony, with Alexander Graham Bell's patent for the telephone in 1876 allowing for the analog transfer of voice signals, further demonstrating the potential for real-time information conveyance across wires. These inventions shifted information transfer from manual or visual methods to engineered electrical systems, setting the stage for scalable communication networks. In the 20th century, the field evolved toward theoretical formalization, particularly with the advent of cybernetics and information theory. Norbert Wiener's 1948 book Cybernetics: Or Control and Communication in the Animal and the Machine introduced the interdisciplinary study of control and communication processes through feedback and the statistical treatment of signals, treating information as a measurable quantity in both mechanical and biological systems. This work bridged engineering and the life sciences, highlighting how information transfer underpins adaptive systems. A landmark in this progression was Claude E. Shannon's seminal 1948 paper "A Mathematical Theory of Communication," published in the Bell System Technical Journal, which established information theory by quantifying the reliability and efficiency of message transmission from source to receiver. Widely recognized as the birth of modern information theory, Shannon's framework provided the analytical tools to model information transfer amid noise and uncertainty, influencing subsequent developments in communication engineering.

Theoretical Foundations in Information Theory

Shannon's Model of Communication

Claude Shannon introduced a foundational mathematical model for communication in his seminal 1948 paper, which conceptualizes information transfer as a process involving the encoding, transmission, and decoding of messages through a potentially noisy channel. The model, often referred to as the Shannon-Weaver model following Warren Weaver's interpretive additions in their 1949 book, delineates a linear flow of information while emphasizing engineering efficiency over linguistic meaning. At its core, the model addresses the technical challenge of reliably reproducing a message at a destination despite distortions, laying the groundwork for quantifying channel capacity and error rates in communication systems. The model's key components include the information source, which generates a message as a sequence of symbols or a continuous function of time, such as speech or telegraph signals; the transmitter, which encodes the message into a suitable signal for transmission, for example, through modulation or quantization; the channel, the physical medium (e.g., wire or radio waves) that carries the signal; the receiver, which decodes the incoming signal to reconstruct the message; and the destination, the intended recipient of the reconstructed message. A noise source is also incorporated, representing external disturbances like thermal noise or interference that corrupt the signal during transit. The model's flow can be summarized schematically: the source outputs a message to the transmitter, which produces a signal passed through the channel (where noise may intervene), arriving at the receiver for decoding and delivery to the destination; the whole process is treated as a probabilistic transformation of ensembles rather than of individual messages. Central assumptions underpin the model's operation, treating information as arising from probabilistic events where messages are selected from a finite set of possibilities, with uncertainty measured logarithmically in bits (base-2). It posits symbol independence in discrete memoryless channels, allowing statistical analysis of sequences without temporal dependencies, and deliberately focuses on technical fidelity—accurate signal reproduction—while excluding semantic or effectiveness levels of communication, as later clarified by Weaver. These assumptions enable the model to prioritize engineering metrics like bandwidth and noise resilience over interpretive contexts. Published in the Bell System Technical Journal in July and October 1948, the paper revolutionized telecommunications by providing a rigorous framework that directly influenced standards for digital encoding, error correction, and capacity optimization in systems like telephony and data networks. Its principles underpin modern protocols, from internet routing to satellite communications, establishing information theory as the bedrock of reliable data transfer.
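
A minimal sketch of this source-transmitter-channel-receiver-destination flow is given below, with the channel modelled as a binary symmetric channel and a crude repetition code standing in for the transmitter's encoding; the message, flip probability, and repetition factor are illustrative assumptions, not anything prescribed by Shannon's paper.

```python
import random

def source(message: str) -> str:
    """Information source: emits the message to be communicated."""
    return message

def transmitter(message: str, repeat: int = 3) -> list[int]:
    """Encode the message into a channel signal: 8-bit character codes with
    simple bit repetition as a crude protection against noise."""
    bits = [int(b) for ch in message for b in format(ord(ch), "08b")]
    return [bit for bit in bits for _ in range(repeat)]

def noisy_channel(signal: list[int], flip_prob: float = 0.05) -> list[int]:
    """Binary symmetric channel standing in for the noise source: each
    transmitted bit is flipped with probability flip_prob."""
    return [bit ^ 1 if random.random() < flip_prob else bit for bit in signal]

def receiver(signal: list[int], repeat: int = 3) -> str:
    """Decode by majority vote over each repetition group, then reassemble
    8-bit groups back into characters for delivery to the destination."""
    voted = [1 if sum(signal[i:i + repeat]) * 2 > repeat else 0
             for i in range(0, len(signal), repeat)]
    return "".join(chr(int("".join(map(str, voted[i:i + 8])), 2))
                   for i in range(0, len(voted), 8))

if __name__ == "__main__":
    random.seed(1)
    sent = source("information transfer")
    # The repetition code corrects most, but not necessarily all, bit flips,
    # which is exactly the gap stronger error-correcting codes address.
    received = receiver(noisy_channel(transmitter(sent)))
    print("sent:    ", sent)
    print("received:", received)
```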

Measures of Information: Entropy and Mutual Information

In information theory, entropy serves as a fundamental measure of the uncertainty or information content associated with a random variable, quantifying the average number of bits required to encode the outcomes of a discrete source. Introduced by Claude Shannon, the entropy H(X) of a discrete random variable X with possible values \{x_1, x_2, \dots, x_n\} and probability mass function p(x_i) is defined as H(X) = -\sum_{i=1}^n p(x_i) \log_2 p(x_i), where the logarithm is base 2 to express the result in bits. This formula arises from the need to minimize the average code length in source coding, where rarer outcomes are assigned longer codewords and more probable outcomes shorter ones, so that the average code length approaches the entropy. For instance, a fair coin flip has H(X) = 1 bit, reflecting maximal uncertainty between two equally likely outcomes, while a biased coin with probability 0.9 for heads yields H(X) \approx 0.47 bits, indicating lower uncertainty and thus less information on average. Mutual information extends entropy to measure the amount of information one random variable contains about another, capturing the reduction in uncertainty about one variable upon observing the other. For two discrete random variables X and Y, the mutual information I(X; Y) is given by I(X; Y) = H(X) - H(X \mid Y), where H(X \mid Y) is the conditional entropy, representing the average uncertainty in X given knowledge of Y. This expression derives from the chain rule for entropy: the joint entropy H(X, Y) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y), which rearranges to I(X; Y) = H(X) + H(Y) - H(X, Y), equivalent to the uncertainty-reduction forms I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X). Mutual information is symmetric, non-negative, and zero if X and Y are independent, highlighting shared information without assuming causality. A key application in assessing information transfer appears in channel capacity calculations, such as for the binary symmetric channel (BSC), where input bits are flipped with crossover probability p. The channel capacity C, the maximum mutual information I(X; Y) over input distributions, simplifies to C = 1 - H(p) for binary inputs, with H(p) = -p \log_2 p - (1-p) \log_2 (1-p). For p = 0, C = 1 bit per use (perfect transmission), while p = 0.5 yields C = 0 (no reliable transfer), illustrating how entropy quantifies noise-induced information loss. These measures assume discrete, memoryless sources, limiting their direct applicability to continuous or dependent processes without extensions like differential entropy. Moreover, entropy and mutual information capture syntactic structure but ignore semantic content, treating all bits equally regardless of meaning.
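
These quantities are easy to check numerically. The sketch below computes the binary entropies quoted above, the capacity C = 1 - H(p) of the binary symmetric channel, and the mutual information I(X; Y) = H(X) + H(Y) - H(X, Y) for a uniform input through a BSC, confirming that the uniform input attains the capacity; it is a worked example rather than a general information-theory library.

```python
import math

def entropy(probs: list[float]) -> float:
    """Shannon entropy H = -sum p_i log2 p_i, in bits (0 log 0 taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def binary_entropy(p: float) -> float:
    """Binary entropy function H(p) of a Bernoulli(p) source."""
    return entropy([p, 1.0 - p])

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

def mutual_information(joint: list[list[float]]) -> float:
    """I(X;Y) = H(X) + H(Y) - H(X,Y), computed from a joint probability table
    with rows indexed by X and columns by Y."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return entropy(px) + entropy(py) - entropy([p for row in joint for p in row])

if __name__ == "__main__":
    print(f"fair coin:         H = {binary_entropy(0.5):.3f} bits")   # 1.000
    print(f"biased coin (0.9): H = {binary_entropy(0.9):.3f} bits")   # ~0.469
    p = 0.1  # BSC crossover probability
    # Joint distribution of (input, output) for a uniform input through the BSC.
    joint = [[(1 - p) / 2, p / 2], [p / 2, (1 - p) / 2]]
    print(f"BSC p = {p}: I(X;Y) = {mutual_information(joint):.3f} bits/use, "
          f"C = {bsc_capacity(p):.3f} bits/use")   # both ~0.531
```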

Applications in Communication and Computing

Classical Data Transmission

Classical data transmission involves the conversion of discrete binary data into continuous analog signals for propagation over physical channels, such as wires or air, while adhering to principles from information theory to maximize reliable throughput. This process relies on modulation to encode bits onto carrier waves, error control mechanisms to mitigate noise-induced distortions, and awareness of channel constraints like bandwidth. In practical systems, these elements ensure that information is transferred with minimal loss, bridging the gap between digital sources and analog media. Modulation techniques are fundamental to classical data transmission, transforming binary sequences into varying signal properties for efficient carriage over analog channels. Amplitude Shift Keying (ASK) modulates the amplitude of a carrier wave, where binary '1' corresponds to a higher amplitude and '0' to a lower or zero amplitude, making it simple but susceptible to noise variations. Frequency Shift Keying (FSK) varies the carrier frequency instead, assigning distinct frequencies to each bit value, which offers better noise immunity at the cost of wider bandwidth usage. Phase Shift Keying (PSK), particularly Binary PSK (BPSK), shifts the phase of the carrier (e.g., 0° for '0' and 180° for '1'), providing robust performance in noisy environments due to constant amplitude and frequency. These methods, rooted in early digital communication engineering, enable the mapping of bits to detectable signal changes while optimizing for channel characteristics. Error detection and correction are essential in classical transmission to combat impairments like noise and interference, ensuring data integrity without retransmission in many cases. Hamming codes, introduced by Richard Hamming, are linear error-correcting codes that add parity bits to detect and correct single-bit errors in blocks of data. For instance, the (7,4) Hamming code encodes 4 data bits into 7 total bits using 3 parity bits, where each parity bit checks a specific combination of data and other parity positions; if a single error occurs, its position is identified by the syndrome pattern formed by recalculating parities. Cyclic Redundancy Checks (CRC), developed by W. Wesley Peterson and D. T. Brown, provide efficient error detection for larger blocks by treating data as a polynomial and appending a remainder from division by a generator polynomial, capable of detecting all burst errors no longer than the degree of the generator polynomial. These techniques add redundancy—typically 10-30% overhead—while maintaining high detection rates, often exceeding 99.9% for common error patterns in digital links. Bandwidth limitations impose fundamental constraints on transmission rates, as articulated by the Nyquist theorem, which states that the maximum data rate is C = 2B \log_2 M bits per second, where B is the channel bandwidth in Hz and M is the number of distinct signal levels. For binary signaling (M=2), this simplifies to 2B bits per second, highlighting how higher-level modulation (e.g., 4-ASK with M=4) can increase capacity but requires greater signal-to-noise ratio to distinguish levels. In practice, this theorem guides system design to avoid intersymbol interference, though real channels fall short due to noise, as bounded by Shannon's capacity.
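
To make the (7,4) Hamming code described above concrete, the sketch below encodes four data bits into seven, flips one bit to mimic channel noise, and uses the recomputed parities (the syndrome) to locate and correct the error; the particular parity-bit layout is one common convention, chosen here purely for illustration.

```python
def hamming74_encode(data: list[int]) -> list[int]:
    """Encode 4 data bits into a 7-bit Hamming codeword.
    Positions are 1-indexed; parity bits sit at positions 1, 2 and 4."""
    d1, d2, d3, d4 = data
    code = [0] * 8                          # index 0 unused, for 1-indexed clarity
    code[3], code[5], code[6], code[7] = d1, d2, d3, d4
    code[1] = code[3] ^ code[5] ^ code[7]   # parity over positions with bit 0 set
    code[2] = code[3] ^ code[6] ^ code[7]   # parity over positions with bit 1 set
    code[4] = code[5] ^ code[6] ^ code[7]   # parity over positions with bit 2 set
    return code[1:]

def hamming74_decode(received: list[int]) -> tuple[list[int], int]:
    """Recompute the parities; the syndrome is the (1-indexed) position of a
    single-bit error, or 0 if none is detected. Returns (data bits, position)."""
    r = [0] + list(received)
    s1 = r[1] ^ r[3] ^ r[5] ^ r[7]
    s2 = r[2] ^ r[3] ^ r[6] ^ r[7]
    s4 = r[4] ^ r[5] ^ r[6] ^ r[7]
    syndrome = s1 + 2 * s2 + 4 * s4
    if syndrome:                            # correct the single-bit error in place
        r[syndrome] ^= 1
    return [r[3], r[5], r[6], r[7]], syndrome

if __name__ == "__main__":
    data = [1, 0, 1, 1]
    codeword = hamming74_encode(data)
    corrupted = codeword.copy()
    corrupted[4] ^= 1                       # flip one bit (position 5, 1-indexed)
    decoded, err_pos = hamming74_decode(corrupted)
    print("codeword:", codeword, " corrupted:", corrupted)
    print("decoded :", decoded, " error at position:", err_pos)  # recovers [1, 0, 1, 1]
```

The syndrome reading off the error position directly is what makes the code attractive for low-latency links: correction needs no retransmission, only three parity recomputations.
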
Wired transmission, exemplified by Ethernet standards like 1000BASE-T, faces challenges primarily from signal attenuation, where high-frequency components degrade over distance in twisted-pair cables, limiting reliable links to 100 meters. Attenuation increases with frequency and cable length, necessitating equalization circuits to compensate for losses up to 20-30 dB at gigabit rates. In contrast, wireless transmission via Wi-Fi (IEEE 802.11) encounters more severe attenuation from path loss, multipath fading, and obstacles like walls, demanding adaptive modulation and power control to sustain data rates. Recent advancements, such as Wi-Fi 7 (IEEE 802.11be, finalized in 2025), introduce multi-link operation and wider channels to achieve theoretical speeds up to 46 Gbps while improving reliability in challenging environments.
| Technique | Signal Parameter Varied | Key Advantage | Key Limitation | Example Application |
|---|---|---|---|---|
| ASK | Amplitude | Simple implementation | Noise-sensitive | Optical links |
| FSK | Frequency | Good noise immunity | Bandwidth-intensive | Early modems |
| PSK | Phase | Efficient bandwidth use | Phase synchronization required | Satellite comms |

Protocols and Efficiency Metrics

The Open Systems Interconnection (OSI) model establishes a layered architecture for network communication, with the Physical, Data Link, and Network layers directly underpinning information transfer. The Physical layer manages the transmission and reception of unstructured bit streams over a physical transmission medium, such as cables or wireless signals. The Data Link layer facilitates reliable transfer between directly connected nodes, incorporating framing, error detection via cyclic redundancy checks, and medium access control to prevent collisions. The Network layer oversees end-to-end delivery across multiple links, using logical addressing (e.g., IP addresses) and routing algorithms to forward packets toward their destination. These layers form the basis for the TCP/IP protocol stack, which maps the OSI framework to internetworking practices and enables interoperable data exchange in diverse environments. Key protocols at the transport and application layers implement these foundational mechanisms for specific transfer needs. The Transmission Control Protocol (TCP), operating at the transport layer, provides reliable, ordered, and error-checked delivery of byte streams through a three-way handshake for connection setup, sequence numbering, acknowledgments, and selective retransmissions of lost segments. In scenarios requiring minimal overhead, the User Datagram Protocol (UDP) supports connectionless, low-latency datagram transmission without built-in reliability, making it ideal for applications like video streaming where occasional packet loss is tolerable. For file transfer, the File Transfer Protocol (FTP) builds on TCP to enable command-based retrieval, storage, and management of files between hosts, supporting both active and passive modes for data connections. The SSH File Transfer Protocol (SFTP) offers file access, transfer, and management over a reliable data stream, typically run over the Secure Shell (SSH) protocol, which provides encryption, server authentication, and data integrity as a secure alternative to FTP. Performance of these protocols is assessed through standardized efficiency metrics that quantify transfer quality and reliability. Throughput represents the actual rate of successful data delivery from source to destination, measured in bits per second and influenced by available bandwidth, protocol overhead, and network conditions. Latency measures the end-to-end delay for a packet to traverse the network, encompassing propagation, transmission, queuing, and processing times, with lower values essential for interactive applications. Jitter, the variation in packet arrival times, can disrupt real-time communications like voice over IP, where consistent low jitter (typically under 30 ms) is required to avoid perceptible distortions. These metrics are often evaluated using benchmarks like those in RFC 2544, which test device performance under controlled traffic loads. A critical derivative metric is goodput, which focuses on the rate of useful application-level data delivered, excluding headers, retransmissions, and errors. It provides a more accurate gauge of effective information transfer in lossy environments. Protocols face significant challenges in maintaining efficiency, particularly congestion control and scalability in large networks.
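
Before turning to those challenges, the distinction between throughput and goodput drawn above can be made concrete with a back-of-the-envelope calculation; the packet size, header overhead, retransmission rate, and timing below are invented solely for illustration.

```python
def transfer_metrics(payload_bytes: int,
                     header_bytes: int,
                     packets_sent: int,
                     retransmitted: int,
                     elapsed_s: float) -> dict[str, float]:
    """Compute raw throughput (all bits that crossed the link) and goodput
    (unique application payload only), both in bits per second."""
    total_bits = packets_sent * (payload_bytes + header_bytes) * 8
    useful_bits = (packets_sent - retransmitted) * payload_bytes * 8
    return {
        "throughput_bps": total_bits / elapsed_s,
        "goodput_bps": useful_bits / elapsed_s,
        "efficiency": useful_bits / total_bits,
    }

if __name__ == "__main__":
    # Hypothetical transfer: 10,000 segments of 1,460 B payload with 40 B of
    # TCP/IP headers each, 2% retransmissions, completed in 10 seconds.
    m = transfer_metrics(payload_bytes=1460, header_bytes=40,
                         packets_sent=10_000, retransmitted=200, elapsed_s=10.0)
    print(f"throughput ~ {m['throughput_bps'] / 1e6:.2f} Mbit/s")
    print(f"goodput    ~ {m['goodput_bps'] / 1e6:.2f} Mbit/s")
    print(f"efficiency ~ {m['efficiency']:.1%}")
```
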
Congestion occurs when network resources are overwhelmed, leading to packet drops and reduced throughput; TCP Reno addresses this through an additive-increase/multiplicative-decrease strategy, incorporating slow start (exponential congestion window growth until a threshold), congestion avoidance (linear growth post-threshold), fast retransmit (triggered by three duplicate acknowledgments), and fast recovery (temporary window inflation to probe capacity without full reset). In expansive networks with thousands of nodes, scalability issues emerge from routing table bloat, increased propagation delays, and amplified congestion effects, necessitating adaptive algorithms like Scalable TCP that adjust window growth more aggressively for high-bandwidth-delay product links to sustain performance without instability.
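
The additive-increase/multiplicative-decrease behaviour described above can be visualized with a toy trace of the congestion window. The sketch below applies a deliberately simplified Reno model (it ignores timeouts, the fast-recovery window inflation, and delayed acknowledgments), with loss rounds and the initial threshold chosen arbitrarily for illustration.

```python
def reno_cwnd_trace(rounds: int, loss_rounds: set[int],
                    initial_ssthresh: float = 32.0) -> list[float]:
    """Return the congestion window (in segments) at each RTT round under a
    simplified TCP Reno model: slow start below ssthresh, additive increase
    above it, and multiplicative decrease (halving) when a loss is signalled."""
    cwnd, ssthresh = 1.0, initial_ssthresh
    trace = []
    for rtt in range(rounds):
        trace.append(cwnd)
        if rtt in loss_rounds:
            # Triple duplicate ACK: halve the window and continue in
            # congestion avoidance (fast retransmit / fast recovery, simplified).
            ssthresh = max(cwnd / 2.0, 2.0)
            cwnd = ssthresh
        elif cwnd < ssthresh:
            cwnd *= 2.0          # slow start: exponential growth per RTT
        else:
            cwnd += 1.0          # congestion avoidance: linear growth per RTT
    return trace

if __name__ == "__main__":
    trace = reno_cwnd_trace(rounds=25, loss_rounds={10, 18})
    for rtt, w in enumerate(trace):
        print(f"RTT {rtt:2d}: cwnd = {w:5.1f} segments")
```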

Biological Contexts

Genetic Information Flow

Genetic information flow refers to the processes by which genetic instructions encoded in DNA are transferred and expressed within cells, primarily following the central dogma of molecular biology, which posits a unidirectional flow from DNA to RNA to proteins. This framework was first proposed by Francis Crick in 1958, building on the double-helix structure of DNA elucidated by James Watson and Francis Crick in 1953. The central dogma outlines that genetic information is stored in DNA, transcribed into messenger RNA (mRNA), and then translated into proteins, ensuring the faithful propagation of heritable traits. Reverse transcription from RNA to DNA, as seen in retroviruses, represents a notable exception but does not alter the core principle for most cellular contexts. The initial step, transcription, involves the synthesis of mRNA from a DNA template by the enzyme RNA polymerase, which binds to promoter regions on the DNA and unwinds the double helix to read one strand as a template. RNA polymerase adds complementary ribonucleotides to form a single-stranded mRNA molecule, which is then processed—capped, polyadenylated, and spliced in eukaryotes—to become mature mRNA capable of exiting the nucleus. This mRNA carries the genetic code to ribosomes, where translation occurs: the ribosome, composed of ribosomal RNA and proteins, scans the mRNA sequence in triplets called codons, recruiting transfer RNA (tRNA) molecules that match each codon via their anticodon loops to deliver specific amino acids. These amino acids are linked into a polypeptide chain, folding into functional proteins that execute cellular functions. To maintain the integrity of this information transfer, fidelity mechanisms minimize errors. During DNA replication prior to cell division, DNA polymerases incorporate proofreading exonucleases that detect and excise mismatched nucleotides, achieving an error rate as low as 10^{-9} to 10^{-11} per base pair in bacteria like Escherichia coli. In translation, tRNA anticodon-codon base pairing is monitored by the ribosome, with kinetic proofreading ensuring mismatches reduce binding efficiency by up to 1,000-fold, thus preserving sequence accuracy. These safeguards are crucial for preventing mutations that could disrupt protein function. Beyond vertical inheritance during replication, horizontal gene transfer (HGT) enables direct exchange of genetic material between cells, accelerating adaptation. In bacteria, conjugation involves the transfer of plasmids—circular DNA molecules—via a pilus bridge between donor and recipient cells, often carrying advantageous genes. Transformation allows competent bacteria to uptake free DNA from the environment, while transduction occurs when bacteriophages (viruses) inadvertently package and deliver host DNA to new hosts during infection. A prominent example is the spread of antibiotic resistance genes among bacterial populations, where HGT via plasmids and phages has facilitated the rapid dissemination of resistance to drugs like penicillin in pathogens such as Staphylococcus aureus. This process underscores HGT's role in microbial evolution and public health challenges.
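
The transcription and translation steps can be mimicked in a few lines of code. The sketch below converts a short coding-strand DNA sequence into mRNA and translates it codon by codon using a small excerpt of the standard genetic code; the sequence and the truncated codon table are illustrative only.

```python
# Excerpt of the standard genetic code (codon -> one-letter amino acid);
# '*' marks a stop codon. Only the codons used in the example are included.
CODON_TABLE = {
    "AUG": "M",  # methionine (start)
    "GCU": "A",  # alanine
    "UCU": "S",  # serine
    "GAA": "E",  # glutamate
    "UAA": "*",  # stop
}

def transcribe(coding_strand_dna: str) -> str:
    """Transcription (simplified): the mRNA carries the same sequence as the
    coding strand of DNA, with uracil (U) in place of thymine (T)."""
    return coding_strand_dna.upper().replace("T", "U")

def translate(mrna: str) -> str:
    """Translation: read the mRNA in triplets (codons) from the start codon
    and append amino acids until a stop codon is reached."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE[mrna[i:i + 3]]
        if amino_acid == "*":
            break
        protein.append(amino_acid)
    return "".join(protein)

if __name__ == "__main__":
    dna = "ATGGCTTCTGAATAA"             # illustrative coding-strand sequence
    mrna = transcribe(dna)
    print("mRNA:   ", mrna)             # AUGGCUUCUGAAUAA
    print("protein:", translate(mrna))  # MASE
```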

Neural Information Transfer

Neural information transfer in biological systems primarily occurs through the propagation of electrical signals along neurons and their transmission across synapses. This process enables the nervous system to process and relay sensory inputs, motor commands, and cognitive functions. At the core of this transfer is the action potential, a rapid electrochemical event that travels along the axon of a neuron. Action potentials are generated when a neuron is sufficiently depolarized, triggering the opening of voltage-gated sodium (Na⁺) channels, which allow an influx of Na⁺ ions and cause the membrane potential to rise sharply from approximately -70 mV to +30 mV, producing a spike of about 100 mV in amplitude. This depolarization is followed by the activation of voltage-gated potassium (K⁺) channels, leading to K⁺ efflux and repolarization, while the sodium-potassium pump (Na⁺/K⁺-ATPase) maintains long-term ion gradients using ATP. The seminal Hodgkin-Huxley model, derived from experiments on squid giant axons, mathematically describes this ionic mechanism, showing how conductance changes in Na⁺ and K⁺ channels propagate the signal. These spikes travel along the axon at speeds ranging from 1 m/s in unmyelinated fibers to up to 100 m/s in myelinated ones, facilitated by saltatory conduction where the myelin sheath insulates the axon, allowing the action potential to "jump" between nodes of Ranvier. Upon reaching the axon terminal, the action potential triggers synaptic transmission, the process by which information is passed from the presynaptic neuron to the postsynaptic cell. Depolarization opens voltage-gated calcium (Ca²⁺) channels, causing Ca²⁺ influx that promotes the fusion of synaptic vesicles with the presynaptic membrane and the release of neurotransmitters, such as acetylcholine at neuromuscular junctions or glutamate in central nervous system synapses. These neurotransmitters diffuse across the synaptic cleft and bind to receptors on the postsynaptic membrane, generating excitatory postsynaptic potentials (EPSPs) or inhibitory postsynaptic potentials (IPSPs). For instance, glutamate binding to ionotropic receptors like AMPA allows Na⁺ influx, depolarizing the postsynaptic neuron and increasing the likelihood of firing an action potential (EPSP), while inhibitory neurotransmitters like GABA open Cl⁻ channels, hyperpolarizing the membrane (IPSP). This quantal release mechanism, where neurotransmitters are packaged in vesicles and released in discrete packets, was pioneered by Bernard Katz's work on the neuromuscular junction. Neurons encode information in the pattern of action potentials rather than their amplitude, which remains all-or-nothing. Two primary schemes are rate coding, where the frequency of spikes represents stimulus intensity—for example, sensory neurons in the somatosensory system increase firing rates from 10 to 100 Hz as touch pressure intensifies—and temporal coding, where the precise timing or intervals between spikes convey information, such as in auditory processing where spike timing synchronizes to sound wave phases. Rate coding is efficient for sustained signals like pain intensity, while temporal coding supports high-fidelity transmission for dynamic stimuli, with evidence from retinal ganglion cells showing that timing patterns can double information capacity compared to average rates alone. These codes often coexist in neural populations, allowing robust information transfer across networks. 
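
As a much simpler stand-in for the Hodgkin-Huxley description above, the sketch below simulates a leaky integrate-and-fire neuron, a reduced model that is not the one discussed in the text, to illustrate rate coding: stronger input currents yield higher spike rates while each spike itself remains all-or-nothing. All parameter values are generic textbook-style choices, not measurements.

```python
def lif_firing_rate(input_current_na: float,
                    sim_ms: float = 1000.0,
                    dt_ms: float = 0.1) -> float:
    """Simulate a leaky integrate-and-fire neuron and return its firing rate (Hz).
    Membrane: tau * dV/dt = -(V - V_rest) + R * I; a spike is emitted when V
    crosses threshold, after which V is reset. Parameters are illustrative."""
    tau_ms, r_mohm = 10.0, 10.0                        # time constant, resistance
    v_rest, v_thresh, v_reset = -70.0, -55.0, -75.0    # membrane potentials in mV
    v, spikes = v_rest, 0
    for _ in range(int(sim_ms / dt_ms)):
        dv = (-(v - v_rest) + r_mohm * input_current_na) / tau_ms
        v += dv * dt_ms                                # forward-Euler integration
        if v >= v_thresh:                              # all-or-nothing spike, then reset
            spikes += 1
            v = v_reset
    return spikes / (sim_ms / 1000.0)

if __name__ == "__main__":
    # Stronger stimuli drive higher firing rates: a simple rate code.
    for current in (1.2, 1.6, 2.0, 2.5, 3.0):          # input currents in nA (illustrative)
        print(f"I = {current:.1f} nA -> {lif_firing_rate(current):6.1f} Hz")
```
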
Synaptic plasticity modulates information transfer, enabling learning and adaptation. A key mechanism is long-term potentiation (LTP), a persistent strengthening of synaptic efficacy following high-frequency stimulation, first demonstrated in the hippocampus by Bliss and Lømo in 1973. LTP induction requires NMDA receptor activation, which, upon coincident presynaptic glutamate release and postsynaptic depolarization, permits Ca²⁺ influx; this triggers signaling cascades that insert additional AMPA receptors into the postsynaptic membrane, enhancing excitatory transmission. For example, in CA1 pyramidal cells, LTP can increase synaptic strength by 50-100% for hours or longer, underpinning associative memory formation. This receptor trafficking distinguishes LTP from short-term plasticity and highlights its role in refining neural circuits for efficient information flow.

Quantum and Advanced Contexts

Quantum Teleportation

Quantum teleportation is a process in quantum information theory that enables the transfer of an unknown quantum state from one location to another without physically transporting the quantum system itself, relying instead on pre-shared entanglement and a classical communication channel. This technique addresses the challenges posed by the no-cloning theorem, which prohibits the perfect duplication of arbitrary quantum states, thereby necessitating the destruction of the original state during transfer. The foundational insight stems from the Einstein-Podolsky-Rosen (EPR) paradox, sharpened by John Bell's 1964 theorem, which demonstrates that quantum mechanics permits non-local correlations via entangled particles that violate classical local realism, allowing for the exploitation of these correlations in teleportation. The protocol, as originally proposed, involves two parties, Alice and Bob, who share an entangled pair of qubits in a Bell state, such as the maximally entangled state \frac{1}{\sqrt{2}} (|00\rangle + |11\rangle). Alice, holding the qubit to be teleported (denoted \psi) and one half of the entangled pair, performs a joint measurement on \psi and her entangled qubit in the Bell basis, yielding one of four possible outcomes corresponding to the four Bell states. This measurement projects the qubits into one of these states and collapses Bob's distant entangled qubit into a state that equals the original \psi up to one of four Pauli transformations determined by the measurement outcome. Alice then transmits the two classical bits encoding her measurement outcome to Bob via a classical channel. Upon receiving the classical bits, Bob applies conditional unitary corrections to his qubit: specifically, the Pauli operators X (for a bit flip) or Z (for a phase flip), or both XZ, depending on the two-bit value (00: identity; 01: Z; 10: X; 11: XZ). These operations reconstruct the original state \psi on Bob's qubit, while the measurement at Alice's end destroys the information in her qubits, ensuring compliance with the no-cloning theorem. The no-cloning theorem, which states that no unitary operation can perfectly copy an unknown quantum state onto a blank state, implies that such destructive measurement is essential, as direct transmission or copying of quantum information is impossible. This process transfers the quantum information non-locally, complementing classical mutual information by leveraging quantum superposition and entanglement. The success of quantum teleportation is quantified by the fidelity, defined as the overlap F = \langle \psi | \rho | \psi \rangle between the original pure state \psi and the teleported mixed state \rho, averaged over possible input states. In the ideal case with perfect entanglement and noiseless channels, the fidelity achieves F = 1, fully reconstructing the state. However, in the presence of noise or imperfect entanglement, the fidelity degrades; notably, any protocol limited to classical resources (without entanglement) cannot exceed an average fidelity of 2/3 for qubit states, serving as a benchmark to distinguish quantum from classical teleportation.
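
The protocol can be traced step by step with a small statevector simulation. The NumPy sketch below prepares the Bell pair, performs Alice's Bell-basis measurement as a CNOT and Hadamard followed by a computational-basis measurement, and applies Bob's conditional Pauli corrections; the bit ordering and gate constructions are implementation choices made for illustration.

```python
import numpy as np

# Single-qubit gates and projectors.
I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
P0 = np.array([[1, 0], [0, 0]], dtype=complex)
P1 = np.array([[0, 0], [0, 1]], dtype=complex)

def kron_all(ops):
    """Tensor product of a list of 2x2 operators (qubit 0 is leftmost)."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def gate_on(gate, qubit, n=3):
    """Embed a single-qubit gate acting on `qubit` in an n-qubit register."""
    return kron_all([gate if q == qubit else I2 for q in range(n)])

def cnot(control, target, n=3):
    """CNOT as |0><0|_c (x) I + |1><1|_c (x) X_t, embedded in n qubits."""
    term0 = kron_all([P0 if q == control else I2 for q in range(n)])
    term1 = kron_all([P1 if q == control else X if q == target else I2
                      for q in range(n)])
    return term0 + term1

def teleport(alpha, beta, rng):
    """Teleport the state alpha|0> + beta|1> from qubit 0 to qubit 2."""
    # Register |q0 q1 q2>: q0 holds psi, q1/q2 start in |00>.
    psi = np.zeros(8, dtype=complex)
    psi[0b000], psi[0b100] = alpha, beta
    # Share a Bell pair between Alice (q1) and Bob (q2).
    psi = cnot(1, 2) @ (gate_on(H, 1) @ psi)
    # Alice's Bell-basis measurement on (q0, q1): CNOT, H, then measure.
    psi = gate_on(H, 0) @ (cnot(0, 1) @ psi)
    probs = np.abs(psi) ** 2
    outcome = rng.choice(8, p=probs / probs.sum())
    m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1
    # Project onto the observed values of q0 and q1 and renormalize.
    mask = np.array([((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1
                     for i in range(8)])
    psi = np.where(mask, psi, 0)
    psi /= np.linalg.norm(psi)
    # Bob's conditional Pauli corrections: X if m1 = 1, then Z if m0 = 1.
    if m1:
        psi = gate_on(X, 2) @ psi
    if m0:
        psi = gate_on(Z, 2) @ psi
    # Read out Bob's qubit amplitudes (q0, q1 are now fixed at m0, m1).
    base = (m0 << 2) | (m1 << 1)
    return (m0, m1), np.array([psi[base], psi[base | 1]])

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    alpha, beta = 0.6, 0.8j            # an arbitrary normalized input state
    bits, bob = teleport(alpha, beta, rng)
    fidelity = abs(np.conj(alpha) * bob[0] + np.conj(beta) * bob[1]) ** 2
    print("classical bits sent:", bits)
    print("Bob's state:        ", np.round(bob, 6))
    print("fidelity:           ", round(fidelity, 6))   # 1.0 in the noiseless case
```

Because the simulated entanglement and channel are noiseless, the reported fidelity is 1; introducing noise on the shared pair would degrade it toward the classical benchmark discussed above.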

Recent Developments in Quantum Networks

In 2024, researchers at Northwestern University achieved a milestone in integrating quantum communication with existing infrastructure by demonstrating quantum teleportation over more than 30 kilometers of fiber-optic cable simultaneously carrying classical internet traffic. This experiment utilized wavelength division multiplexing to separate quantum signals at 1550 nm from classical data channels, maintaining high-fidelity state transfer with an average quantum bit error rate below 5%. The breakthrough addresses key scalability challenges by showing that quantum networks can coexist with conventional telecom systems without requiring dedicated fibers, paving the way for hybrid quantum-classical infrastructures. Building on this, a 2025 experiment at the University of Oxford marked the first successful implementation of a distributed quantum algorithm across multiple processors connected via optical links. By employing heralded remote entanglement and gate teleportation—specifically, teleporting a controlled-Z gate between two trapped-ion quantum processors separated by 2 meters—the team executed a distributed Grover's search algorithm, with gate-teleportation fidelities exceeding 80%. This approach leverages entanglement swapping to enable modular quantum computing, where independent processors function as a unified system, overcoming limitations in scaling single-device architectures and demonstrating potential for fault-tolerant distributed computation. Scalability remains challenged by noise accumulation and synchronization, but the work highlights progress toward interconnecting remote quantum modules over longer distances. Quantum repeaters have emerged as a critical technology for extending entanglement distribution beyond direct transmission limits, particularly for long-distance quantum networks. Recent advancements focus on protocols involving entanglement purification to enhance link fidelity and swapping to chain elementary segments, mitigating photon loss in optical fibers. For instance, experimental progress includes entanglement distribution over 100 km using quantum memories in single segments, with rates up to several Hz in lab settings employing solid-state quantum memories and atomic ensembles for storage. These developments address exponential loss scaling, enabling viable quantum internet backbones, though challenges like memory coherence times (currently limited to milliseconds) and multi-node integration persist. In applications, quantum key distribution (QKD) networks have advanced significantly, exemplified by China's Micius satellite launched in 2017 and extended through the 2020s with ground-to-satellite and intercontinental links. Follow-on missions, including a 2025 microsatellite demonstration, have enabled real-time QKD over intercontinental distances exceeding 12,000 km, with secret key rates up to 1 Mbit per satellite pass (as of March 2025). Error rates have improved dramatically, with quantum bit error rates reduced to below 1% in optimized setups, supporting final key security parameters below 10^{-10} through advanced error correction and privacy amplification. These networks, now spanning Asia-Africa connections, underscore QKD's role in secure global communication, with ongoing efforts to integrate repeaters for terrestrial extensions.
