A radio access network (RAN) is a fundamental component of wireless telecommunications systems that connects end-user devices, such as smartphones and computers, to the core network through radio links, enabling voice, data, and multimedia services.[1][2] It forms the outermost layer of a cellular network, managing radio resources and facilitating seamless communication across coverage areas divided into cells.[3][4]
The RAN architecture typically includes key elements such as base stations (e.g., eNodeB in 4G LTE or gNodeB in 5G), which house radio transceivers; remote radio heads (RRHs) for signal amplification; baseband units (BBUs) for digital signal processing; and antennas that transmit and receive radio waves.[1][2] These components work together to handle functions like resource allocation, mobility management, error detection, and encryption, ensuring efficient spectrum use and handover between cells as users move.[3][4] In modern implementations, RAN supports advanced technologies such as multiple-input multiple-output (MIMO) antennas for higher data rates and beamforming for targeted signal direction.[2][4]
Historically, RAN evolved from analog 1G systems introduced in 1979 to digital 2G networks in 1991, with subsequent generations like 3G (2001) enabling mobile internet, 4G LTE (2009) providing IP-based all-packet connectivity, and 5G New Radio (NR) from 2018 offering speeds exceeding 1 Gbps via sub-6 GHz and millimeter-wave bands.[1] This progression has shifted RAN toward virtualization and openness, including cloud RAN (C-RAN) for centralized processing and open RAN (O-RAN) standards to promote interoperability among vendors.[3][4]
In contemporary networks, particularly 5G, RAN plays a pivotal role in supporting diverse applications like enhanced mobile broadband, ultra-reliable low-latency communications for autonomous vehicles, and massive machine-type communications for IoT devices, while integrating with core networks via fronthaul and backhaul links using fiber optics or microwave.[3][2] Its scalability and energy efficiency are crucial for meeting growing demands, with innovations like software-defined networking (SDN) and network functions virtualization (NFV) enabling network slicing for customized services.[2][3]
Overview
Definition
A radio access network (RAN) is the component of a mobile telecommunication system that connects user equipment (UE), such as smartphones and Internet of Things (IoT) devices, to the core network through radio access technology (RAT). It implements protocols for radio transmission and reception, managing the allocation and release of specific radio resources to establish connections between the UE and the network.[5] The RAN ensures efficient use of the radio spectrum by handling transmission and reception within a set of cells.[6]
Positioned between the UE and the core network (CN), the RAN manages the air interface for wireless communication and performs initial signal processing, such as modulation and error correction. In contrast, the CN focuses on non-radio functions like packet routing, session management, and billing.[6] This separation allows the RAN to optimize radio performance independently from the broader network operations.[1]
Central to the RAN are radio access technologies (RATs), which define the specific methods for wireless transmission; examples include GSM for second-generation (2G) cellular networks and NR (New Radio) for fifth-generation (5G) systems.[6] The term "radio access network" was formalized within the 3GPP standards in 1998, as part of the development of third-generation (3G) mobile systems based on evolved GSM core networks and the Universal Terrestrial Radio Access (UTRA).[7]
Role in Mobile Networks
The radio access network (RAN) serves as the essential interface in mobile networks, enabling wireless connectivity by bridging user equipment such as smartphones and IoT devices to the core network (CN). It facilitates the conversion of radio signals from user devices into digital data packets that can be routed through the CN for further processing and transmission. This integration occurs primarily via backhaul connections, such as fiber optic or microwave links, which transport the digitized traffic from base stations to the CN, ensuring seamless end-to-end communication.[3][2]
A key function of the RAN is supporting user mobility through handover mechanisms, which allow active connections to transfer between adjacent cells or base stations without interruption, maintaining service continuity as users move across coverage areas. This is managed through direct coordination between base stations via interfaces like X2 (in LTE) or Xn (in 5G), based on signal strength and load conditions. In terms of performance, the RAN directly influences critical metrics including latency, throughput, and coverage; for instance, advanced features like massive MIMO enable up to 10 times higher downlink throughput compared to previous generations, while beamforming improves coverage in dense urban environments. Additionally, the RAN enforces initial quality of service (QoS) parameters, such as prioritizing voice traffic over data to minimize latency for real-time applications and ensuring reliable packet delivery for services like video streaming.[3][2][8]
Within the broader mobile ecosystem, the RAN interacts closely with the CN to support essential functions like user authentication and session routing, where it forwards authentication requests and routing information to the core network via interfaces such as N2 (in 5G) or S1 (in LTE), and coordinates with other RAN nodes via X2/Xn. It also accommodates multi-radio access technology (multi-RAT) scenarios, including dual connectivity in 5G networks, where devices simultaneously connect to 4G LTE and 5G NR base stations to aggregate resources for enhanced reliability and capacity. Economically, the RAN represents a major portion of mobile operators' capital expenditures, typically accounting for 45-70% of total network capex due to the costs of deploying and upgrading base stations and antennas, which drive overall infrastructure investments.[3][9][10][11]
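The handover logic outlined above can be illustrated with a small sketch of a measurement-based trigger, loosely modeled on the LTE/NR A3 event in which a neighbor cell must exceed the serving cell by an offset plus hysteresis; the function name and threshold values are illustrative assumptions rather than 3GPP defaults.
```python
# Minimal sketch of a measurement-based handover decision, loosely modeled on
# the LTE/NR "A3" event (neighbor becomes offset-better than the serving cell).
# Hysteresis and offset values below are illustrative, not 3GPP defaults.

def should_hand_over(serving_rsrp_dbm: float,
                     neighbor_rsrp_dbm: float,
                     hysteresis_db: float = 2.0,
                     offset_db: float = 3.0) -> bool:
    """Trigger a handover when the neighbor cell is sufficiently stronger."""
    return neighbor_rsrp_dbm > serving_rsrp_dbm + offset_db + hysteresis_db

# Example: serving cell at -100 dBm, neighbor at -94 dBm -> hand over.
print(should_hand_over(-100.0, -94.0))   # True
print(should_hand_over(-100.0, -98.0))   # False
```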
Components
Base Stations and Radio Units
Base stations serve as the primary hardware at cell sites in a radio access network (RAN), responsible for transmitting and receiving radio signals to and from user equipment. In 4G LTE networks, these are known as evolved Node Bs (eNodeBs), which integrate radio frequency (RF) and baseband functions to manage air interface communications. In 5G New Radio (NR), they are termed gNodeBs (gNBs), supporting enhanced capabilities such as higher data rates and lower latency through advanced RF technologies.[6] These base stations are classified by coverage area and transmit power according to 3GPP specifications, including wide area base stations (macrocells) for broad coverage, medium range (microcells) for intermediate urban densities, local area (picocells) for indoor or hotspot scenarios, and home base stations (femtocells) for residential use.[12]
Macrocells provide wide-area coverage, typically deployed on tall towers or rooftops to serve large populations, with no upper limit on output power for wide area base stations in 3GPP TS 38.104, allowing configurations up to several hundred watts total across sectors to achieve cell radii of several kilometers.[12] Microcells and picocells target denser environments like urban streets or buildings, with maximum conducted output powers of 38 dBm (about 6.3 W) for medium range and 24 dBm (250 mW) for local area classes, enabling smaller cell sizes of tens to hundreds of meters.[12] Femtocells, limited to 20 dBm (100 mW), are designed for in-home deployment to offload traffic from macro networks while minimizing interference.[12]
Radio units (RUs) within base stations handle the analog RF signal processing, including up-conversion, amplification, filtering, and modulation of signals before transmission via antennas.[3] In 5G, RUs often integrate with massive multiple-input multiple-output (MIMO) antenna arrays, supporting configurations like 64 transmit and 64 receive elements (64T64R) to enable beamforming, where directional beams focus energy toward users for improved spectral efficiency and reduced interference.[13] These units operate across frequency ranges defined in 3GPP, including sub-6 GHz bands (FR1, 410 MHz to 7.125 GHz) for balanced coverage and capacity, and millimeter-wave (mmWave) bands (FR2, 24.25 GHz to 71 GHz) for ultra-high throughput in short-range, dense deployments.[12] Typical power amplifiers in macro RUs deliver 20-60 W per sector to support these bands, with active antenna systems combining RF components directly with the antenna array for compact integration.[14]
Deployment of base stations involves site acquisition, where operators secure land or rooftop leases compliant with local zoning and environmental regulations to ensure line-of-sight propagation.[15] For macrocells, tower mounting elevates antennas 30-100 meters to maximize coverage, using monopoles, lattice towers, or rooftops with sector frames to support multiple directional antennas.[16]
Interference mitigation is achieved through sectorization, dividing the cell into 3-6 sectors with narrow-beam antennas (e.g., 65-120 degrees horizontal beamwidth), which reduces co-channel interference by limiting signal overlap between adjacent cells and allows frequency reuse factors as low as 1.[17] This configuration enhances capacity in high-traffic areas while the RUs connect to baseband units via fronthaul for digital processing.[12]
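To relate the dBm power classes quoted above to watts, the following sketch applies the standard conversion P[W] = 10^(dBm/10) / 1000; the class labels simply restate the figures given in the text.
```python
# Sketch converting the 3GPP TS 38.104 base station power classes quoted above
# from dBm to watts: P[W] = 10 ** (dBm / 10) / 1000.

def dbm_to_watts(dbm: float) -> float:
    return 10 ** (dbm / 10) / 1000.0

power_classes_dbm = {
    "medium range (microcell)": 38,   # ~6.3 W
    "local area (picocell)": 24,      # ~250 mW
    "home (femtocell)": 20,           # 100 mW
}

for name, dbm in power_classes_dbm.items():
    print(f"{name}: {dbm} dBm ~ {dbm_to_watts(dbm):.3g} W")
```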
Baseband Units and Processing
The baseband unit (BBU) serves as the core processing element in a radio access network (RAN), handling the digital signal processing required to manage communication between user equipment and the network core. It performs critical functions such as modulation and demodulation of signals, converting digital data streams into formats suitable for radio transmission and vice versa. Additionally, the BBU manages coding and decoding operations, including forward error correction (FEC) techniques like turbo codes or low-density parity-check (LDPC) codes, to ensure reliable data transmission by detecting and correcting errors introduced during propagation. In traditional RAN architectures, the BBU is typically integrated at the base station site, providing localized processing for one or more radio units.[3][18][19]
At the physical (PHY) layer, the BBU executes Layer 1 processing tasks essential for modern wireless standards, particularly those employing orthogonal frequency-division multiplexing (OFDM). This includes fast Fourier transform (FFT) for demodulation in the uplink and inverse fast Fourier transform (IFFT) for modulation in the downlink, along with cyclic prefix addition to mitigate inter-symbol interference. Resource allocation is managed through schedulers within the BBU, which dynamically assign time-frequency resources to users based on factors like channel quality and traffic demands, optimizing overall network efficiency. These functions enable the BBU to support high spectral efficiency and low-latency communications in dense environments.[20][21]
Virtualization of the BBU, known as the virtual BBU (vBBU), leverages network function virtualization (NFV) principles to run baseband software on commercial off-the-shelf (COTS) hardware, such as general-purpose servers, rather than proprietary equipment. This approach enhances scalability by allowing dynamic resource pooling and orchestration across multiple sites, reducing capital expenditures and enabling faster deployment of new features through software updates. In cloud RAN (C-RAN) deployments, vBBUs are centralized in data centers, supporting functional splits like Option 7-2 where higher-layer processing is separated from lower-layer tasks.[22]
Regarding capacity, a single 5G BBU can handle thousands of users per sector while delivering aggregate throughput on the order of tens of Gbps, scaling to 100 Gbps or more in advanced configurations with wide bandwidths and massive MIMO. For instance, with 100 MHz carrier bandwidth and multiple antenna streams, BBUs achieve peak sector throughputs exceeding 10 Gbps, accommodating high-density scenarios like urban mobility or IoT applications. This processing power is crucial for meeting 5G performance targets, though it demands efficient hardware acceleration for real-time operations.[23][24]
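A minimal numpy sketch of the downlink PHY steps described above (QPSK mapping, IFFT, cyclic prefix insertion, and their reversal at the receiver) is shown below; the FFT size and cyclic prefix length are illustrative choices, not values taken from any 3GPP numerology.
```python
# Minimal numpy sketch of the downlink Layer-1 steps described above:
# map bits to QPSK symbols, apply an IFFT across subcarriers, and prepend a
# cyclic prefix. FFT size and CP length are illustrative, not standard values.
import numpy as np

fft_size, cp_len = 64, 16
rng = np.random.default_rng(0)

# Random bits -> QPSK symbols (2 bits per symbol), one per subcarrier.
bits = rng.integers(0, 2, size=2 * fft_size)
symbols = (1 - 2 * bits[0::2] + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

# IFFT turns the frequency-domain symbols into a time-domain OFDM symbol.
time_domain = np.fft.ifft(symbols, n=fft_size)

# Cyclic prefix: copy the tail of the symbol to its front to absorb multipath.
ofdm_symbol = np.concatenate([time_domain[-cp_len:], time_domain])

# Receiver side: strip the CP and apply an FFT to recover the QPSK symbols.
recovered = np.fft.fft(ofdm_symbol[cp_len:], n=fft_size)
print(np.allclose(recovered, symbols))   # True over an ideal channel
```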
Backhaul and Fronthaul Connections
In radio access networks (RAN), backhaul refers to the transport links connecting the baseband unit (BBU) or distributed unit (DU) to the core network (CN), aggregating and routing user data, control signaling, and management traffic from multiple radio sites.[25] These links typically employ fiber optic infrastructure in dense urban environments, supporting Ethernet-based capacities ranging from 10 Gbps to 100 Gbps or higher to handle the surge in data throughput driven by mobile traffic growth.[26] In rural or remote areas where fiber deployment is cost-prohibitive, microwave radio systems serve as an alternative, delivering up to 20 Gbps over line-of-sight paths of several kilometers, though with limitations in adverse weather conditions.[27]
Fronthaul, in contrast, provides the high-bandwidth, low-delay interconnection between the remote radio unit (RU) and the BBU/DU, transporting digitized in-phase and quadrature (IQ) samples or processed baseband signals essential for centralized radio processing.[28] For 5G deployments, the Common Public Radio Interface (CPRI) and its enhanced version, eCPRI, standardize these links, with eCPRI enabling packet-based Ethernet transport to reduce bandwidth overhead compared to the circuit-switched CPRI.[29] Depending on the functional split in the RAN architecture, fronthaul capacity requirements can exceed 25 Gbps per sector for lower-layer splits involving high-resolution signal sampling, such as those with multiple antennas and wide bandwidths.[30] These connections integrate with baseband units to support disaggregated processing in split RAN designs.[25]
Key performance requirements for fronthaul emphasize ultra-low latency to ensure timely signal reconstruction, typically under 1 ms one-way (often targeting 100–250 µs), which is critical for maintaining air interface timing in coordinated multipoint operations.[29] Backhaul latencies are more tolerant, generally below 10 ms, to accommodate aggregation without significantly impacting end-to-end quality of service.[30]
Synchronization across these links is vital for phase alignment and timing accuracy in RAN, achieved primarily through the Precision Time Protocol (PTP) as specified in IEEE 1588, which distributes sub-microsecond precision over packet networks, often combined with Synchronous Ethernet for frequency stability.[31]
The evolution of backhaul and fronthaul has progressed from time-division multiplexing (TDM)-based systems in 3G, which offered limited scalability, to IP/Multi-Protocol Label Switching (MPLS) packet transport in 4G and 5G, facilitating statistical multiplexing, virtualization, and cost efficiencies in shared infrastructure.[25] This shift, alongside the adoption of wavelength-division multiplexing (WDM) on fiber, has addressed escalating capacity demands, but challenges persist, including fiber scarcity in underserved regions that drives up deployment costs and limits centralized RAN viability.[25]
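The following back-of-envelope sketch shows why low-layer (CPRI-style) fronthaul splits demand such high capacity, multiplying sample rate, IQ bit width, antenna count, and overhead factors; the bit widths and overhead ratios used here are commonly cited illustrative assumptions rather than normative values.
```python
# Rough sketch of why fronthaul bandwidth explodes for low-layer splits:
# a CPRI-style (split 8) link carries raw time-domain IQ samples per antenna.
# Sample rates, IQ bit widths, and overhead factors below are commonly cited
# illustrative values, not figures taken from the specifications.

def cpri_style_rate_bps(sample_rate_sps: float, iq_bits: int, antennas: int,
                        control_overhead: float = 16 / 15,
                        line_coding: float = 10 / 8) -> float:
    """Raw IQ bit rate: samples/s x (I+Q) x bits x antennas x overheads."""
    return sample_rate_sps * 2 * iq_bits * antennas * control_overhead * line_coding

# 20 MHz LTE, one antenna, 15-bit IQ -> roughly the classic 1.2288 Gbps lane.
print(cpri_style_rate_bps(30.72e6, 15, 1) / 1e9)    # ~1.23 Gbps

# 100 MHz 5G NR carrier with a 64-antenna massive MIMO array -> hundreds of Gbps,
# which is why packetized eCPRI and higher-layer splits (e.g., 7-2) are preferred.
print(cpri_style_rate_bps(122.88e6, 15, 64) / 1e9)  # ~315 Gbps
```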
Architectures
Traditional RAN
The traditional radio access network (RAN) architecture is characterized by a monolithic design, where the baseband unit (BBU) and radio unit (RU) are tightly integrated into a single hardware unit provided by a single vendor. This integration typically occurs at the cell site, with standardized interfaces such as the Common Public Radio Interface (CPRI) facilitating communication between the components. Major vendors like Ericsson and Nokia have historically supplied these end-to-end solutions, ensuring optimized performance but limiting interoperability with third-party equipment.[32][33]
In this architecture, exemplified by the NodeB in 3G Universal Terrestrial Radio Access Network (UTRAN), the base station combines radio frequency transmission/reception and baseband processing functions, connected to the radio network controller via the Iub interface. Key features include site-specific installations tailored to local conditions and a high degree of vendor-specific customization, which often results in vendor lock-in for operators. This closed ecosystem prioritizes seamless integration over openness, making upgrades or expansions dependent on the original supplier.[34]
The advantages of traditional RAN lie in its proven reliability and stability, with mature hardware delivering consistent performance in established networks, alongside simplified deployment and dedicated vendor support for maintenance. However, disadvantages include high costs for scaling due to proprietary equipment and the need for full replacements during upgrades, as well as limited multi-vendor interoperability that hinders flexibility. Historically, this architecture dominated 2G through 4G deployments worldwide until the early 2010s, forming the backbone of global mobile infrastructure before shifts toward more disaggregated designs.[33][35]
Virtualized RAN
Virtualized radio access network (vRAN) represents a paradigm shift in RAN architecture by leveraging network function virtualization (NFV) and software-defined networking (SDN) to execute RAN software functions on commercial off-the-shelf (COTS) general-purpose servers, rather than dedicated hardware appliances. This approach decouples software from proprietary hardware, enabling greater flexibility in deployment and management. As a precursor, cloud RAN (C-RAN) centralized baseband unit (BBU) processing in data centers to pool resources across multiple cell sites, reducing redundancy and improving efficiency in handling traffic variations.[36][37][38]
A key enabler of vRAN is the use of functional splits defined by the 3rd Generation Partnership Project (3GPP), which divide RAN processing between centralized and distributed units to optimize transport requirements and resource sharing. For instance, 3GPP Option 2 splits the packet data convergence protocol (PDCP) and radio link control (RLC) layers, allowing higher-layer functions to be virtualized and pooled in a central cloud while lower layers remain closer to the radio units. These splits facilitate efficient resource pooling for multiple sites, supporting dynamic allocation and scalability in response to varying network demands.[39][32][40]
The benefits of vRAN include significant cost reductions through the adoption of COTS hardware, with potential total cost of ownership (TCO) savings of up to 25% for centralized RAN architectures compared to traditional deployments. Additionally, virtualization simplifies network upgrades and enables orchestration using container platforms like Kubernetes, allowing automated scaling, deployment, and management of RAN functions across cloud-native environments. This integration also supports edge computing by distributing virtualized functions closer to the network edge for low-latency applications.[41][42][43]
Major mobile network operators have adopted vRAN to modernize their infrastructures, with AT&T initiating virtualization efforts as early as 2013 and progressing to commercial 5G Cloud RAN deployments in partnership with vendors like Ericsson by 2024. Similarly, Verizon has incorporated vRAN in its 5G expansions, leveraging centralized processing for enhanced performance and cost efficiency. These deployments demonstrate vRAN's role in enabling scalable, software-driven networks that align with evolving demands for 5G and beyond.[44][45][46]
Open RAN
Open RAN represents a disaggregated and interoperable approach to radio access networks, emphasizing open interfaces to foster vendor diversity and innovation in telecommunications infrastructure. The O-RAN Alliance, established in 2018, defines Open RAN as an open and intelligent RAN architecture that promotes a broad industry ecosystem through standardized specifications for components and interfaces.[47] This paradigm builds on virtualization principles by introducing openness, allowing multi-vendor integration while enabling intelligent control via the RAN Intelligent Controller (RIC), which incorporates AI and machine learning for real-time optimization and automation.[48]
Key components in Open RAN include the disaggregated Radio Unit (RU), Distributed Unit (DU), and Centralized Unit (CU), which separate hardware and software functions to enhance flexibility. The RU handles radio frequency transmission and reception, while the DU processes lower-layer baseband functions; the CU, often split into Control Plane (O-CU-CP) and User Plane (O-CU-UP) components, manages higher-layer protocols. Connectivity between these elements relies on open interfaces such as eCPRI for the fronthaul link between RU and DU, and the F1 interface for the split between CU and DU, ensuring interoperability across vendors.[48] The RIC further integrates via interfaces like E2 for near-real-time decisions, supporting AI/ML-driven policies for traffic management and resource allocation.[49]
One primary advantage of Open RAN is the reduction of vendor lock-in, enabling operators to select best-of-breed components from multiple suppliers, which lowers costs and accelerates innovation through faster deployment cycles and ecosystem participation.[50] For instance, since 2020, Verizon has conducted extensive trials and deployments of multi-vendor Open RAN elements, including the first commercial multi-vendor O-RAN-based Distributed Antenna System in 2024, demonstrating improved network resilience and adaptability.[51] Similarly, DISH Network pioneered a large-scale Open RAN 5G buildout starting in 2020, integrating components from various vendors to create a cloud-native network, though it faced financial difficulties, regulatory pressures, and scaling challenges, culminating in a spectrum sale to AT&T and the start of decommissioning in late 2025 (ongoing as of November 2025).[52][53]
Despite these benefits, Open RAN faces challenges, particularly in interoperability testing, where ensuring seamless performance across diverse vendor components requires rigorous validation of interfaces and protocols.[54] As of 2025, Open RAN accounts for approximately 5-10% of the overall RAN market, with growing incorporation in new 5G deployments driven by operators seeking enhanced flexibility.[55]
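As a simplified illustration of the disaggregation just described, the sketch below models a subset of O-RAN nodes and the open interfaces connecting them as plain data structures; it is a topology illustration only, not an implementation of any O-RAN Alliance specification.
```python
# Illustrative data model of the disaggregated O-RAN nodes and open interfaces
# described above. Only a subset of links is shown, and this is purely a
# topology sketch, not an implementation of the O-RAN specifications.
from dataclasses import dataclass

@dataclass
class Link:
    interface: str   # e.g. "Open Fronthaul (eCPRI)", "F1", "E2"
    a: str
    b: str

links = [
    Link("Open Fronthaul (eCPRI)", "O-RU", "O-DU"),
    Link("F1-C", "O-DU", "O-CU-CP"),
    Link("F1-U", "O-DU", "O-CU-UP"),
    Link("E2", "Near-RT RIC", "O-DU"),
    Link("E2", "Near-RT RIC", "O-CU-CP"),
]

for link in links:
    print(f"{link.a} <-{link.interface}-> {link.b}")
```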
Generations
2G and 3G RAN
The second-generation (2G) radio access network (RAN), primarily based on the Global System for Mobile Communications (GSM) and its enhancement General Packet Radio Service (GPRS), was structured as the Base Station Subsystem (BSS). The BSS comprised Base Transceiver Stations (BTS) for handling radio transmission and reception, and Base Station Controllers (BSC) for managing radio resource allocation, handover, and signaling across multiple BTS units.[56] GSM employed Time Division Multiple Access (TDMA) combined with Frequency Division Multiple Access (FDMA) as its primary multiple access schemes, enabling efficient spectrum use for circuit-switched voice services.[57] With the introduction of GPRS, packet-switched data capabilities were added, achieving a theoretical maximum data rate of approximately 171 kbit/s using eight time slots in the downlink.[58]
The third-generation (3G) RAN evolved to the UMTS Terrestrial Radio Access Network (UTRAN), featuring Node B elements—analogous to BTS—for radio frequency transmission and Radio Network Controllers (RNC) for centralized control of radio resources, mobility management, and interfacing with the core network.[59] UTRAN utilized Wideband Code Division Multiple Access (WCDMA), a form of direct-sequence CDMA, to support higher capacity and better interference management compared to 2G approaches.[60] Basic UMTS provided circuit- and packet-switched services with a peak data rate of up to 2 Mbit/s, while enhancements like High-Speed Downlink Packet Access (HSDPA) improved packet data efficiency through adaptive modulation and faster scheduling, enabling downlink speeds beyond initial capabilities.[61]
Key evolutions from 2G to 3G shifted the RAN focus from predominantly voice-centric, circuit-switched operations in GSM to integrated voice and higher-speed data services in UMTS, with GPRS marking the initial packet data transition and UTRAN enabling broadband-like mobile internet access.[62] A significant advancement in 3G was the introduction of soft handover, where a mobile device maintains simultaneous connections to multiple Node Bs during transitions, reducing call drops and improving mobility in dense environments through macro-diversity combining.[63]
By 2025, 2G and 3G RAN deployments have been largely phased out in most urban and developed regions to reallocate spectrum for 4G and 5G, with 61 networks scheduled for shutdown that year alone as part of 131 total retirements by 2030.[64] However, these legacy networks continue to support low-bandwidth Internet of Things (IoT) applications, such as remote metering and asset tracking, in rural and underserved areas where advanced infrastructure rollout remains limited.[65]
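The GPRS figure cited above can be sanity-checked with a one-line calculation, assuming all eight TDMA time slots carry the CS-4 coding scheme's roughly 21.4 kbit/s each.
```python
# Back-of-envelope check of the GPRS figure quoted above: eight TDMA time
# slots, each carrying about 21.4 kbit/s with the CS-4 coding scheme.
slots = 8
cs4_rate_kbps = 21.4
print(slots * cs4_rate_kbps)   # ~171.2 kbit/s theoretical maximum
```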
4G LTE RAN
The 4G Long-Term Evolution (LTE) Radio Access Network (RAN), formally known as the Evolved Universal Terrestrial Radio Access Network (E-UTRAN), marks a pivotal shift to an all-IP, packet-switched architecture optimized for high-speed mobile broadband, contrasting with the circuit-switched emphasis of prior generations. This design enables efficient data delivery for multimedia applications, with theoretical peak downlink speeds surpassing 100 Mbps under typical configurations. The E-UTRAN simplifies the network hierarchy by integrating functions traditionally handled by separate Base Station Controller (BSC) and Radio Network Controller (RNC) entities into a single logical node, the evolved Node B (eNodeB), which manages radio resource control, handover, and scheduling directly.[66] This flattened structure reduces latency and enhances scalability, supporting seamless mobility across cells.[67]
At the physical layer, E-UTRAN employs Orthogonal Frequency-Division Multiplexing (OFDM) for the downlink to combat multipath fading and achieve high spectral efficiency, while Single-Carrier Frequency-Division Multiple Access (SC-FDMA) is used for the uplink to minimize peak-to-average power ratio for user equipment battery efficiency.[68] These modulation schemes, combined with flexible bandwidth options from 1.4 MHz to 20 MHz, facilitate downlink throughputs over 100 Mbps in 20 MHz channels.[69] Key features include Multiple Input Multiple Output (MIMO) technology, supporting up to 8x8 configurations in the downlink for enhanced capacity and reliability through spatial multiplexing and diversity.[70] Carrier aggregation further boosts performance by combining multiple component carriers—up to five in LTE-Advanced—allowing effective bandwidths up to 100 MHz and proportional increases in data rates.[71] Additionally, Self-Organizing Networks (SON) automate network management through self-configuration of new eNodeBs, self-optimization of parameters like handover thresholds, and self-healing for fault recovery, reducing operational costs.[72]
LTE-Advanced, standardized in 3GPP Release 10, extends these capabilities with peak data rates approaching 1 Gbps in the downlink via advanced MIMO, carrier aggregation, and coordinated multipoint transmission, enabling gigabit-class mobile broadband. Voice services transitioned to Voice over LTE (VoLTE), which leverages IP Multimedia Subsystem (IMS) for high-definition voice delivery over the LTE packet core, supporting simultaneous voice and data without circuit-switched fallback.[73] LTE dominated global mobile networks through the 2010s and into the 2020s, achieving approximately 90% population coverage worldwide by 2025 as the foundational technology for 4G broadband.[74]
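A rough upper-bound estimate of the downlink rates discussed above can be derived from the LTE resource grid, ignoring control channels, reference signals, and coding overhead; the sketch below uses illustrative parameter choices (20 MHz, 64QAM, two or four spatial layers).
```python
# Rough upper-bound estimate of LTE downlink throughput from the resource grid,
# ignoring control channels, reference signals, and coding overhead; the
# parameter choices are illustrative.

def lte_peak_rate_bps(bandwidth_rbs: int = 100,   # 100 resource blocks ~ 20 MHz
                      bits_per_symbol: int = 6,   # 64QAM
                      mimo_layers: int = 2) -> float:
    subcarriers = bandwidth_rbs * 12          # 12 subcarriers per resource block
    symbols_per_second = 14 * 1000            # 14 OFDM symbols per 1 ms subframe
    return subcarriers * symbols_per_second * bits_per_symbol * mimo_layers

print(lte_peak_rate_bps() / 1e6)                # ~201.6 Mbps raw, 2x2 MIMO
print(lte_peak_rate_bps(mimo_layers=4) / 1e6)   # ~403 Mbps raw, 4x4 MIMO
```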
5G NR RAN
The 5G New Radio (NR) Radio Access Network (RAN), known as NG-RAN, represents a significant evolution in wireless infrastructure, designed to support a wide array of services through enhanced flexibility and efficiency. At its core, NG-RAN consists of gNodeB (gNB) base stations, which can be disaggregated into a Centralized Unit (CU) and Distributed Unit (DU) to optimize resource allocation and scalability. The CU handles higher-layer functions such as Radio Resource Control (RRC) and Packet Data Convergence Protocol (PDCP), while the DU manages lower-layer processing like Medium Access Control (MAC) and Physical Layer (PHY), connected via the F1 interface. This split architecture, standardized by 3GPP in Release 15, enables virtualization and cloud-native deployments, improving operational efficiency in diverse environments.[6][75][76]
The NR air interface underpins NG-RAN's capabilities, featuring flexible numerology that allows subcarrier spacing to vary from 15 kHz to 240 kHz, accommodating different frequency bands and service requirements. This adaptability supports channel bandwidths up to 400 MHz in mmWave spectrum, enabling peak downlink data rates of up to 20 Gbps under ideal conditions. Key enablers include Massive MIMO, which deploys hundreds of antennas at the base station to serve multiple users simultaneously with spatial multiplexing; advanced beamforming techniques that direct signals precisely to improve coverage and reduce interference; and mmWave frequencies (above 24 GHz) for ultra-high throughput in dense urban areas. Additionally, network slicing allows the creation of isolated logical networks on shared infrastructure, each tailored with specific Quality of Service (QoS) parameters like latency and reliability, as defined in 3GPP TS 23.501.[77][78][79][80][81]
NG-RAN supports three primary use cases outlined by ITU-R M.2410: enhanced Mobile Broadband (eMBB) for high-data-rate applications like 4K/8K video streaming, achieving user-experienced speeds up to 100 Mbps; Ultra-Reliable Low-Latency Communications (URLLC) targeting end-to-end latency below 1 ms and reliability over 99.999% for mission-critical scenarios such as autonomous vehicles and industrial automation; and massive Machine-Type Communications (mMTC) enabling connectivity for up to 1 million devices per square kilometer in IoT deployments like smart cities. By November 2025, standalone (SA) 5G deployments have surpassed 60 commercial networks globally, with projections for over a dozen additional launches, marking significant progress toward full 5G core integration. Furthermore, NG-RAN facilitates seamless integration with non-3GPP accesses like Wi-Fi through the Non-3GPP Interworking Function (N3IWF), which establishes secure IPsec tunnels to the 5G core, enhancing coverage in hybrid environments.[82][83][84][85]
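The flexible numerology described above follows a simple scaling rule, with subcarrier spacing equal to 15 kHz × 2^μ and correspondingly shorter slots; the sketch below tabulates this for μ = 0 through 4, assuming the normal cyclic prefix's 14 symbols per slot.
```python
# Sketch of 5G NR flexible numerology: subcarrier spacing scales as
# 15 kHz * 2**mu, and slot duration shrinks accordingly (14 OFDM symbols per
# slot with the normal cyclic prefix).

for mu in range(5):                       # mu = 0..4 -> 15 kHz .. 240 kHz
    scs_khz = 15 * 2 ** mu
    slot_ms = 1.0 / 2 ** mu               # 2**mu slots per 1 ms subframe
    print(f"mu={mu}: SCS={scs_khz:>3} kHz, slot={slot_ms:.4f} ms, "
          f"slots/subframe={2 ** mu}")
```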
Beyond 5G Developments
The vision for 6G radio access networks (RANs) emphasizes terahertz (THz) frequencies to enable unprecedented data rates and connectivity densities, building on the foundational spectrum extensions explored in 5G. THz bands, spanning 0.1 to 10 THz, offer vast bandwidths potentially exceeding hundreds of GHz, facilitating applications like holographic communications and immersive extended reality. Research indicates that sub-THz communications (100-300 GHz) could achieve peak rates of up to 1 Tbps over short distances, addressing the exponential growth in data demands from AI-driven services and massive IoT ecosystems.[86][87]
AI-native RAN architectures represent a core pillar of 6G, where artificial intelligence is embedded from the design phase to enable autonomous optimization of network resources, beamforming, and interference management. Unlike prior generations, AI-native designs integrate machine learning models directly into the RAN stack for real-time decision-making, such as predictive traffic routing and energy-efficient spectrum allocation, potentially improving spectral efficiency by 20-50% in dynamic environments. Integrated sensing and communication (ISAC) further enhances this by merging radar-like sensing with data transmission, allowing RANs to simultaneously detect environmental changes and communicate, which supports applications in autonomous vehicles and smart cities while utilizing the same THz spectrum for dual purposes.[88][89][90]
Advanced antenna technologies like holographic MIMO and orbital angular momentum (OAM) multiplexing are poised to revolutionize 6G spatial multiplexing. Holographic MIMO employs metasurface-based arrays to create dynamic, programmable radiation patterns, enabling ultra-massive MIMO with thousands of elements for precise beam control in THz bands. OAM, which exploits the helical phase structure of electromagnetic waves, provides an additional degree of freedom for multiplexing orthogonal modes, potentially increasing capacity by integrating with sub-THz channels without additional bandwidth. Experimental demonstrations have shown OAM achieving multi-Gbps rates in near-field scenarios, with potential scalability to Tbps in 6G backhaul links.[91][92]
Ongoing research initiatives are steering 6G RAN development toward commercialization by 2030. The 3GPP's Release 18 and subsequent releases (Rel-19 onward) lay the groundwork for 6G by enhancing 5G-Advanced features like AI/ML integration in the air interface, with early 6G studies focusing on THz feasibility and non-terrestrial networks (NTN) starting in Rel-20 around 2026. The European Union's Hexa-X project, a flagship initiative involving over 60 partners, targets a sustainable 6G platform with key enablers like sub-THz transceivers and ISAC, aiming for deployment readiness by 2030 through collaborative trials on AI-optimized RAN fabrics.[93][94][95]
Emerging trends in 6G RAN include AI-driven optimization for self-healing networks and seamless satellite-terrestrial integration to achieve ubiquitous coverage. AI algorithms will enable predictive maintenance and resource orchestration across hybrid architectures, reducing latency to microsecond levels for time-sensitive applications. Satellite-terrestrial convergence, leveraging low-Earth orbit (LEO) constellations with terrestrial RANs, promises global coverage with integrated backhaul, where AI facilitates handovers and spectrum sharing between space and ground segments, targeting zero-coverage gaps in remote areas.[96][97][98]
Protocols and Standards
Key Protocols
The radio access network (RAN) employs a layered protocol stack to manage the air interface between user equipment (UE) and base stations, ensuring reliable data transmission, resource allocation, and connection control. This stack, defined by the 3rd Generation Partnership Project (3GPP), is divided into physical (PHY), medium access control (MAC), radio link control (RLC), packet data convergence protocol (PDCP), and radio resource control (RRC) layers, with adaptations across generations for enhanced efficiency and performance.[99][100]
The PHY layer (Layer 1) handles the physical transmission of data over the radio channel, including channel coding for error detection and correction, as well as modulation schemes that map digital data to analog signals. In 5G New Radio (NR), supported modulation formats include quadrature phase shift keying (QPSK), 16-quadrature amplitude modulation (16QAM), 64QAM, and 256QAM, enabling higher spectral efficiency in favorable channel conditions while QPSK provides robustness in poor signal environments.[101] Hybrid automatic repeat request (HARQ) operates at this layer to combine forward error correction with retransmissions; upon detecting errors via cyclic redundancy checks, the receiver requests retransmissions of specific transport blocks, improving throughput in fading channels compared to pure ARQ.[102]
Layer 2 encompasses the MAC, RLC, and PDCP sublayers, which manage data flow, reliability, and security. The MAC sublayer performs scheduling to allocate radio resources dynamically based on UE needs and channel quality, multiplexes logical channels into transport channels, and handles priority queuing for diverse traffic types. The RLC sublayer provides segmentation and reassembly of data units, ensuring in-sequence delivery through acknowledged mode operations, while also supporting automatic repeat request (ARQ) for error recovery beyond PHY-level HARQ. The PDCP sublayer adds security via ciphering and integrity protection, performs header compression to reduce overhead (e.g., Robust Header Compression for IP packets), and supports in-order delivery and retransmission of packets during handover for seamless mobility.[99][100]
The RRC layer, part of Layer 3, oversees connection management, including establishment, reconfiguration, and release of radio bearers, as well as broadcast of system information and mobility procedures. It coordinates UE states (e.g., idle, connected) and triggers measurements for handover decisions, ensuring efficient resource utilization across the network.[99]
Key air interface protocols facilitate initial access and mobility. The random access channel (RACH) procedure allows UEs to synchronize with the network and request resources; in LTE, it involves a four-step contention-based process where the UE sends a preamble, receives a random access response with timing advance, submits an identity, and resolves contention via a unique identifier. Handover procedures maintain connectivity during mobility, involving measurement reporting by the UE, decision by the source base station, and execution through reconfiguration messages to minimize interruption (typically under 50 ms in LTE).
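The four-step contention-based random access procedure described above can be summarized schematically as follows; message names follow common Msg1-Msg4 usage, and the contents are simplified placeholders rather than actual protocol encodings.
```python
# Schematic of the four-step contention-based random access procedure described
# above. Message names (Msg1-Msg4) follow common usage; the contents are
# simplified placeholders, not actual protocol encodings.

def contention_based_rach(ue_identity: str, preamble_index: int):
    return [
        ("Msg1: UE -> base station", f"random access preamble #{preamble_index}"),
        ("Msg2: base station -> UE", "random access response with timing advance and uplink grant"),
        ("Msg3: UE -> base station", f"connection request carrying identity '{ue_identity}'"),
        ("Msg4: base station -> UE", f"contention resolution echoing '{ue_identity}'"),
    ]

for step, content in contention_based_rach("ue-1234", preamble_index=17):
    print(f"{step}: {content}")
```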
For inter-base station coordination in LTE, the X2 interface enables direct signaling between eNodeBs for handover preparation, resource status updates, and interference management, reducing latency compared to core network routing.[103][100][104]
RAN protocols have evolved significantly from 3G Universal Mobile Telecommunications System (UMTS), where access stratum user plane protocols emphasized circuit-switched services with dedicated channels, to 5G NR's service-based architecture that prioritizes all-IP packet-switched transport, flexible numerology, and cloud-native integration for ultra-reliable low-latency communications. This progression, spanning 3GPP Releases 99 (3G) through 15+ (5G), has shifted from asynchronous transfer mode influences in 3G to OFDM-based waveforms and network slicing in 5G, enhancing scalability for diverse applications.[105][6]
Standardization Bodies and Interfaces
The primary standardization body for radio access networks (RAN) is the 3rd Generation Partnership Project (3GPP), a collaborative organization comprising seven telecommunications standards development entities that has defined specifications for mobile technologies from GSM evolution to 6G. 3GPP's work ensures interoperability and global deployment of RAN architectures, starting with early enhancements to 2G systems and progressing through 3G, 4G LTE, and 5G NR.[106] Complementing 3GPP, the International Telecommunication Union Radiocommunication Sector (ITU-R) manages global radio-frequency spectrum allocation and harmonization, defining International Mobile Telecommunications (IMT) requirements that guide RAN spectrum usage for international compatibility and efficient resource sharing.[107] For open and disaggregated RAN, the O-RAN Alliance develops specifications that promote vendor-neutral interfaces and intelligent control, building on 3GPP standards to enable multi-vendor interoperability.[48]
Key interfaces standardized by these bodies facilitate RAN connectivity and management. In 3GPP specifications, the S1 interface connects the RAN (e.g., eNB in LTE) to the core network (CN), supporting control and user plane signaling for mobility and session management. The Xn interface enables direct communication between RAN nodes (e.g., gNBs in 5G) for handover and coordination, enhancing inter-RAN efficiency without core network involvement. Within the O-RAN framework, the E2 interface links the near-real-time RAN Intelligent Controller (RIC) to RAN elements like distributed units (DUs) and central units (CUs), allowing dynamic policy enforcement and optimization.[48]
3GPP's release evolution has shaped RAN standards since Release 99 (R99), which introduced initial 3G UMTS features including the UTRAN architecture for wideband CDMA.[108] Subsequent releases progressed through enhancements in Releases 4-8 for 3G evolution, Releases 8-14 for 4G LTE with OFDMA-based RAN, and Releases 15-17 for 5G NR, incorporating massive MIMO, beamforming, and URLLC support.[106] These releases include harmonization efforts, such as aligned spectrum bands and protocol profiles under ITU-R IMT guidelines, to enable seamless global roaming and device compatibility across operators.[109]
As of 2025, 3GPP Release 18 (Rel-18), branded as 5G-Advanced, was completed with specification freeze in mid-2024, focusing on enhancing RAN capabilities with AI/ML integration for resource optimization, extended reality support, and non-terrestrial network integration, while laying groundwork for 6G through study items on new use cases.[110] Ongoing work in 2025 centers on Release 19, the second phase of 5G-Advanced, which introduces further enhancements such as AI/ML at the physical layer for improved efficiency and spectrum use, enhanced positioning, and additional features for industrial IoT and automotive applications, continuing to bridge 5G toward future IMT-2030 systems under ITU-R.[111]
Deployments and Variations
Global Deployments
By the end of 2025, 5G radio access networks (RAN) have been deployed in 191 countries and territories worldwide, with 647 operators actively investing in the technology.[112] Globally, the number of operational 5G base stations exceeds 5 million, enabling coverage for approximately one-third of the world's population.[113] China leads this expansion, operating nearly 4.71 million 5G base stations as of September 2025, which accounts for a significant portion of the global total and supports widespread urban and rural connectivity.[114]
A prominent case study in urban 5G deployment is Verizon's use of millimeter-wave (mmWave) spectrum in the United States, targeting dense city environments to deliver high-speed, low-latency services for applications like fixed wireless access and enterprise solutions.[115] This approach has enhanced user experiences in major metropolitan areas by leveraging mmWave's capacity for ultra-high throughput, though it requires dense small-cell infrastructure to mitigate propagation challenges.[116] In contrast, European deployments emphasize shared infrastructure models to extend 4G and 5G coverage to rural areas, where population sparsity increases costs; neutral host networks, for instance, allow multiple operators to share towers and backhaul, reducing deployment expenses by 10-50% and improving broadband access in underserved regions.[117] Such models have been evaluated in countries like Finland and Switzerland, demonstrating higher profitability through collaborative site sharing.[118]
The global RAN vendor landscape remains concentrated, with Huawei and Ericsson dominating approximately 60% of the market share in the first half of 2025, particularly in regions outside China where geopolitical factors influence selections.[119] Huawei holds the top position in three of five major geographical regions, while Ericsson leads in business performance metrics.[120] A notable shift toward multi-vendor environments is occurring through Open RAN adoption, which has stabilized after initial challenges, enabling operators to integrate equipment from diverse suppliers and reduce dependency on single vendors.[121]
Spectrum allocation via auctions has profoundly shaped 5G RAN deployments, with the C-band (3.7-4.2 GHz) emerging as a critical mid-band resource for balancing coverage and capacity.[122] In the United States, the Federal Communications Commission's C-band auction generated over $81 billion, funding extensive mid-band 5G rollouts that enhanced nationwide performance.[123] Globally, harmonized C-band allocations in Europe and Asia have promoted economies of scale, though variations in auction designs—such as those in Germany and France—have led to differing efficiencies in spectrum utilization and operator investments.[124]
Regional Variations
In the United States, the radio access network (RAN) market is characterized by a diverse vendor landscape dominated by Ericsson, which holds over 50% market share as of 2025, alongside significant contributions from Nokia and Samsung.[125] This mix reflects strategic operator choices amid geopolitical restrictions on certain foreign vendors, with Ericsson and Nokia leading deployments for major carriers like Verizon and AT&T. A notable innovation is the Open RAN pilots led by DISH Wireless, which received a $50 million U.S. Department of Commerce grant in 2024 to establish an integration and deployment center, enabling multi-vendor, cloud-native 5G networks that reduce dependency on proprietary hardware.[126] The Federal Communications Commission (FCC) has further supported high-capacity deployments through policies favoring millimeter wave (mmWave) spectrum, including a 2025 overhaul of rules for bands like 24 GHz, 28 GHz, and 37 GHz to facilitate sharing between federal and non-federal users, promoting rapid 5G rollout in urban areas.[127]
Europe emphasizes collaborative and regulated RAN approaches, with the European Union promoting shared infrastructure through neutral host models to optimize costs and coverage. Neutral hosts, independent entities that deploy and manage shared RAN assets like indoor 5G small cells, have been pioneered in projects such as Ericsson's 2023 rollout with Proptivity, the world's first neutral host-led shared indoor 5G RAN, enabling multiple operators to access unified networks without ownership.[128] These models can reduce 5G densification costs by up to 47% in urban settings, aligning with EU goals for efficient spectrum use.[129] Security in European RANs is heavily influenced by the General Data Protection Regulation (GDPR), which mandates robust processing safeguards for personal data in telecommunications, including encryption and breach notifications that extend to 5G network elements to protect user privacy across shared infrastructures.[130] Nokia has emerged as a leader in 5G Standalone (SA) deployments here, powering networks for operators like Telia across the Nordics and Baltics with cloud-native cores, and launching Europe's first private 5G SA hospital network with Boldyn in 2025, supporting advanced services like low-latency IoT.[131][132]
In Asia, RAN variations are shaped by national priorities, with China exemplifying state-led 5G acceleration where Huawei maintains dominance, securing 52% of China Mobile's $1.1 billion 5G contract from 2023 to 2024 through government-backed investments in domestic technology.[133] This approach has enabled widespread Huawei-powered 5G base stations, contributing to nearly 4.7 million sites nationwide as of September 2025.
In contrast, India's deployments focus on affordability, led by Reliance Jio's homegrown 5G stack using indigenous gear for low-cost rollout, serving 234 million subscribers with speeds up to 1 Gbps on the 700 MHz band for deep coverage.[134][135][136] Jio's end-to-end in-house development reduces operational expenses by minimizing foreign vendor reliance, positioning it for export to emerging markets.[137]
Beyond these major regions, Africa's RAN landscape relies heavily on microwave backhaul due to geographic and infrastructural challenges, with approximately 80% of base stations using it for cost-effective connectivity in rural areas where fiber is uneconomical.[138] This technology supports 4G and early 5G extensions across vast terrains, as seen in South Africa's 176,000 microwave links generating R8.3 billion annually. In Latin America, economic factors have sustained 4G dominance, with slower 5G adoption driven by high deployment costs and modest GDP growth projected at 2.2% for 2025, limiting investments in spectrum auctions and infrastructure upgrades.[139][140] Mobile technologies still contribute 8.2% to regional GDP, but 4G networks remain the backbone for the majority of connections amid fiscal constraints.[141]
Challenges and Future Trends
Technical Challenges
One of the primary technical challenges in radio access network (RAN) design is achieving balanced coverage and capacity across diverse environments. In urban areas, densification through small cells and mmWave frequencies enables high-capacity deployments but exacerbates interference due to reduced inter-cell distances and susceptibility to blockage by obstacles like buildings. This interference in dense mmWave setups requires advanced mitigation techniques, such as coordinated beamforming, to maintain signal quality amid overlapping transmissions.[142] Conversely, rural areas face persistent coverage gaps stemming from sparse population densities and challenging backhaul requirements, where achieving 5-10 Gbps over distances of 20-60 km with limited channels proves difficult without costly fiber alternatives.[143]
Scalability poses another significant hurdle as mobile traffic surges, with global data volumes projected to increase fourfold by 2025 driven by 5G adoption and new applications.[144] The RAN, which consumes approximately 73-80% of a mobile network's total energy, struggles to handle this growth without proportional rises in power usage, as base stations alone account for 80% of RAN power draw.[145][146] This inefficiency is compounded by the need for massive MIMO and small cell proliferation, which, while boosting capacity, elevate operational costs and environmental impact unless offset by dynamic power-saving mechanisms.[145]
Integration challenges arise from interworking legacy systems, particularly in 5G non-standalone (NSA) deployments reliant on 4G LTE cores, which can lead to suboptimal performance and coverage inconsistencies due to mismatched radio resource management.[147] Enhancing spectrum efficiency through dynamic time division duplexing (TDD) in 5G helps allocate uplink and downlink resources flexibly, but it demands precise synchronization to avoid cross-link interference between neighboring cells.[148]
Supply chain disruptions from the early 2020s, exacerbated by semiconductor shortages and geopolitical tensions, led to significant contraction in the global RAN market in 2024 due to delayed equipment availability. However, as of mid-2025, the market has stabilized, with growth observed outside China.[55][149][150][151] Concurrently, preparations for quantum-safe encryption in RAN are underway, with 3GPP and ETSI advancing standards for post-quantum cryptography to protect against future quantum threats to key exchange protocols in 5G architectures.[152][153]
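The dynamic TDD behavior mentioned in this subsection can be illustrated with a toy slot-allocation rule that splits a frame between downlink and uplink in proportion to queued traffic; the frame length and minimum-guarantee rule are assumptions made purely for illustration.
```python
# Toy sketch of the dynamic TDD idea: pick how many slots in a 10-slot frame
# to assign to downlink vs. uplink from the current traffic mix. The frame
# length and minimum-guarantee rule are illustrative assumptions.

def choose_tdd_pattern(dl_bytes_queued: int, ul_bytes_queued: int,
                       slots_per_frame: int = 10) -> tuple[int, int]:
    total = dl_bytes_queued + ul_bytes_queued
    if total == 0:
        return slots_per_frame // 2, slots_per_frame - slots_per_frame // 2
    dl_slots = round(slots_per_frame * dl_bytes_queued / total)
    dl_slots = min(max(dl_slots, 1), slots_per_frame - 1)  # keep >= 1 slot each way
    return dl_slots, slots_per_frame - dl_slots

print(choose_tdd_pattern(dl_bytes_queued=8_000_000, ul_bytes_queued=2_000_000))  # (8, 2)
print(choose_tdd_pattern(dl_bytes_queued=1_000_000, ul_bytes_queued=1_000_000))  # (5, 5)
```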
Security and Innovations
Radio access networks (RANs) face significant security threats from legacy protocols and modern architectures. In legacy systems, the SS7 signaling protocol, designed without authentication or encryption, enables attacks such as location tracking via HLR/VLR queries, call and SMS interception, and fraud through service abuse.[154][155][156] These vulnerabilities persist in transitional networks, where 5G elements may still interface with SS7-based core systems. In 5G RAN, network slicing introduces risks including isolation failures, cross-slice attacks, and lateral movement due to configuration errors or device vulnerabilities, potentially compromising multiple virtual networks simultaneously.[157][158][159] Supply chain risks further exacerbate these issues, as seen in international bans on Huawei equipment citing national security concerns over potential backdoors and espionage in RAN hardware and software.[160][161][133]
Mitigation strategies have evolved through standardization and architectural shifts. The 3GPP has enhanced security in Releases 15 and beyond with features like Subscription Concealed Identifier (SUCI) encryption, which conceals the Subscription Permanent Identifier (SUPI) using public-key cryptography to prevent eavesdropping and identity mapping attacks during registration.[162][163][164] These protections extend to integrity and confidentiality for user data in RAN interfaces. In Open RAN deployments, zero-trust architectures (ZTA) are increasingly implemented, enforcing continuous verification, micro-segmentation, and least-privilege access across disaggregated components to counter insider threats and supply chain compromises.[165][166][167] ZTA maps controls to O-RAN functions like the RIC and near-RT RIC, enabling incremental security adoption.[168]
Innovations in RAN leverage artificial intelligence (AI) and emerging technologies to enhance reliability and performance. AI and machine learning (ML) enable predictive maintenance by analyzing telemetry data from base stations to forecast failures, optimize resource allocation, and minimize outages, potentially reducing operational costs by detecting anomalies in real time.[169][170][171] Edge AI integration within the RAN Intelligent Controller (RIC) supports near-real-time decisions, such as dynamic spectrum management and traffic steering, by processing data at the network edge for low-latency automation.[172][173][174] Quantum computing integration, focused on post-quantum cryptography (PQC), is projected for widespread RAN adoption by 2030 to safeguard against quantum attacks on current encryption; 3GPP and GSMA recommend transitioning network equipment to quantum-resistant algorithms starting in 2026, with full compliance by 2030.[175][152][176] This includes hybrid schemes combining classical and PQC keys for RAN air interfaces and backhaul.[177]
Emerging trends emphasize sustainability and expanded coverage in RAN evolution. Green RAN initiatives incorporate advanced sleep modes for base stations, activating during low-traffic periods to achieve power reductions of up to 30%, thereby lowering the carbon footprint of dense 5G deployments.[178][179][180] These modes, combined with AI-driven optimization, target overall energy efficiency gains in O-RAN architectures.
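The sleep-mode savings described above can be approximated with simple arithmetic; the site power, sleep-power fraction, and low-traffic hours below are illustrative assumptions, not measured figures.
```python
# Back-of-envelope sketch of base-station sleep-mode savings: assume an
# illustrative 1 kW site that drops to 30% power while sleeping during 10
# low-traffic hours per day. All figures are assumptions for illustration.

active_power_w = 1000.0
sleep_power_fraction = 0.3
sleep_hours_per_day = 10

baseline_kwh = active_power_w * 24 / 1000
with_sleep_kwh = (active_power_w * (24 - sleep_hours_per_day)
                  + active_power_w * sleep_power_fraction * sleep_hours_per_day) / 1000

saving = 1 - with_sleep_kwh / baseline_kwh
print(f"daily energy: {baseline_kwh:.1f} kWh -> {with_sleep_kwh:.1f} kWh "
      f"({saving:.0%} saving)")
```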
The fusion of non-terrestrial networks (NTN) with terrestrial RAN, as standardized by 3GPP Release 17, integrates satellite and high-altitude platforms to extend seamless coverage, using O-RAN interfaces for hybrid topologies that support IoT and broadband in remote areas.[181][182][183] This convergence enables regenerative payloads for processing at orbital nodes, reducing latency in global networks.[184]