Cellular frequencies
Cellular frequencies are the designated ranges of radio frequencies within the ultra-high frequency (UHF) and super-high frequency (SHF) bands allocated for cellular mobile telecommunications, enabling wireless voice, data, and internet services through networks of base stations and mobile devices.[1] These frequencies, typically spanning from sub-1 GHz low-band for wide coverage to millimeter-wave bands above 24 GHz for high-speed data, are standardized by the 3rd Generation Partnership Project (3GPP) to ensure global interoperability across generations of cellular technology from 2G to 5G.[2] Regulatory bodies such as the Federal Communications Commission (FCC) in the United States and the International Telecommunication Union (ITU) manage spectrum allocation to prevent interference and promote efficient use.[3]

The evolution of cellular frequencies began with early analog systems like the Advanced Mobile Phone System (AMPS) in the 800 MHz band, transitioning to digital 2G Global System for Mobile Communications (GSM) standards primarily using 900 MHz and 1800 MHz bands for improved capacity and security.[4] Subsequent generations expanded the spectrum: 3G Universal Mobile Telecommunications System (UMTS) introduced bands around 2100 MHz, while 4G Long-Term Evolution (LTE) utilized a broader set of frequency-division duplexing (FDD) and time-division duplexing (TDD) bands defined in 3GPP TS 36.101, such as Band 1 (uplink 1920–1980 MHz, downlink 2110–2170 MHz) and Band 40 (2300–2400 MHz).[5][6]

In modern 5G New Radio (NR) networks, cellular frequencies are categorized into Frequency Range 1 (FR1, 410 MHz to 7.125 GHz) for sub-6 GHz mid-band coverage and speed balance, and Frequency Range 2 (FR2, 24.25–71 GHz) for mmWave high-band ultra-fast throughput, as specified in 3GPP TS 38.101-1 and TS 38.101-2.[7] Low-band frequencies below 1 GHz, like 600–900 MHz, prioritize rural and indoor penetration, while mid-band (1–6 GHz) supports urban deployments, and high-band enables
applications like augmented reality.[8] These bands also accommodate emerging technologies such as Narrowband Internet of Things (NB-IoT) within existing LTE spectrum for low-power, wide-area IoT connectivity.[8] Ongoing spectrum auctions and harmonization efforts continue to expand available frequencies to meet growing data demands.[3]
Fundamentals
Definition and Principles
Cellular frequencies refer to specific segments of the electromagnetic spectrum allocated for use in cellular networks to facilitate wireless transmission of voice, data, and multimedia services between mobile devices and base stations; 3GPP standards for 5G New Radio (NR) define operating bands from 410 MHz up to 71 GHz, though current deployments typically span 600 MHz to 40 GHz.[9] These frequencies operate within the radio frequency (RF) portion of the spectrum, enabling non-wired communication over the air interface in mobile telephony systems.[10]

At their core, cellular frequencies rely on the principles of radio wave propagation, where electromagnetic waves travel from base stations to user equipment, influenced by factors such as frequency, terrain, and atmospheric conditions. The wavelength \lambda of these radio waves is inversely proportional to their frequency f, governed by the equation \lambda = \frac{c}{f}, where c is the speed of light in vacuum (approximately 3 \times 10^8 m/s); higher frequencies thus correspond to shorter wavelengths, affecting propagation characteristics like penetration through obstacles and susceptibility to attenuation.[11] In cellular contexts, this relationship allows for spatial reuse of frequencies across a network of cells arranged in hexagonal patterns, where the same frequency can be reused in non-adjacent cells to maximize capacity while minimizing interference.[12]

Within the cellular architecture, these frequencies support multiple access techniques to accommodate numerous users simultaneously, including time-division multiplexing (TDM), which allocates distinct time slots to different signals; frequency-division multiplexing (FDM), which divides the spectrum into sub-bands for parallel transmission; and code-division multiplexing (CDM), which uses unique codes to separate signals across the shared bandwidth.[13] These methods enable efficient handling of multiple voice calls or data streams within a
single cell, forming the basis for scalable network operations.[9] Cellular spectrum is broadly classified into categories based on frequency ranges and their propagation trade-offs: sub-1 GHz low-band frequencies prioritize extensive coverage and building penetration for wide-area services; 1-6 GHz mid-band offers a balance between coverage and capacity suitable for urban environments; and high-band above 24 GHz, often termed millimeter wave (mmWave), provides high data throughput but limited range due to higher attenuation.[14] This classification guides the deployment of cellular technologies to optimize performance across diverse scenarios.[9]
Key Characteristics of Cellular Spectrum
Cellular spectrum exhibits varying bandwidth availability depending on the frequency range, with typical channel widths ranging from 5 MHz in lower bands to 200 MHz or more in higher bands, enabling scalable capacity for mobile networks.[15] Wider bandwidths directly support higher data rates, as described by the Shannon-Hartley theorem, which states that the channel capacity C is given by C = B \log_2(1 + \text{SNR}), where B is the bandwidth and SNR is the signal-to-noise ratio; this relationship underscores how, at a given SNR, wider bandwidth B raises achievable throughput in cellular systems.[16] For instance, 5G deployments leverage up to 100 MHz channels in mid-band spectrum to achieve multi-gigabit speeds, contrasting with narrower 5-20 MHz allocations in earlier generations.[17]

Propagation behaviors differ markedly across frequency bands, influencing coverage and performance trade-offs in cellular networks. Low frequencies below 1 GHz provide superior penetration through obstacles like buildings and foliage, supporting cell ranges up to 30-50 km in rural areas due to lower attenuation over distance.[8] Mid-band frequencies (1-6 GHz) offer a balance, with typical urban cell radii of 0.5–2 km, combining reasonable coverage with higher capacity than low bands.[8][18] High frequencies above 24 GHz, such as mmWave, enable gigabit-per-second speeds through ample bandwidth but are constrained by high path loss, limiting ranges to under 1 km and requiring dense small-cell deployments.[8]

Attenuation factors further shape spectrum suitability, with free-space path loss (FSPL) quantifying frequency-dependent signal degradation via the formula \text{FSPL (dB)} = 20 \log_{10}(d) + 20 \log_{10}(f) + 20 \log_{10}\left(\frac{4\pi}{c}\right), where d is distance in meters, f is frequency in Hz, and c is the speed of light; this shows loss increasing logarithmically with both distance and frequency, making higher bands more susceptible to rapid signal decay.[19] In mmWave
bands, rain fade exacerbates attenuation, with heavy precipitation causing 10-20 dB or more of loss over short paths according to ITU-R models, necessitating robust link margins. Urban environments introduce multipath fading, where signals reflect off structures, creating interference patterns that cause rapid fluctuations in received power, particularly pronounced at mid- and high-band frequencies.[20]

Spectrum licensing introduces scarcity challenges, as cellular operations predominantly rely on exclusive licensed bands to ensure interference-free use, unlike unlicensed spectrum shared via contention protocols. Auction-based allocations, common in many regions, drive up costs, with billions of dollars bid for mid-band blocks, potentially straining operator investments and delaying network rollout.[21] This exclusivity supports reliable quality of service for cellular networks but contrasts with unlicensed options like 5 GHz Wi-Fi, which face congestion despite lower acquisition costs.[22]
Historical Development
Origins in Analog Systems
The roots of cellular frequencies lie in early 20th-century mobile radio experiments, particularly those conducted by Bell Laboratories in the 1940s. These efforts focused on developing car-mounted radiotelephone systems to enable voice communication while in motion, utilizing very high frequency (VHF) bands in the 30-50 MHz range, such as the 35-44 MHz allocation for initial services.[23] In 1946, the Bell System launched the first commercial Mobile Telephone Service (MTS) in St. Louis, Missouri, which operated on these low VHF channels to connect vehicles to the public switched telephone network via operator-assisted calls, marking a foundational step toward widespread mobile connectivity despite limited capacity due to spectrum scarcity.[4]

The conceptual foundation for modern cellular systems emerged in the 1970s with Bell Labs' development of the Advanced Mobile Phone System (AMPS), the first true cellular telephone standard. AMPS introduced frequency division multiple access (FDMA) in the 800 MHz band, specifically allocating 824-849 MHz for uplink (mobile-to-base) transmissions and 869-894 MHz for downlink (base-to-mobile), divided into two 12.5 MHz blocks known as the A and B bands to support competing carriers.[4] This design enabled hexagonal cell layouts and frequency reuse, allowing the same spectrum to be shared across non-adjacent cells to expand coverage and capacity beyond the constraints of earlier point-to-point mobile radio systems. Key regulatory milestones accelerated analog cellular deployment.
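The hexagonal reuse design described for AMPS can be quantified with the classical cluster-size relation D/R = \sqrt{3N}, where N is the number of cells per cluster, D the co-channel reuse distance, and R the cell radius. A minimal sketch under that assumption (function names are illustrative, not from any standard API):

```python
import math

def reuse_ratio(cluster_size: int) -> float:
    """Co-channel reuse ratio D/R for a hexagonal cluster of N cells."""
    return math.sqrt(3 * cluster_size)

# Valid hexagonal cluster sizes satisfy N = i^2 + i*j + j^2 for integers i, j.
for n in (3, 4, 7, 12):
    print(f"N = {n:2d}: D/R = {reuse_ratio(n):.2f}")
# A 7-cell cluster gives D/R = sqrt(21) ~ 4.58, and only 1/7 of the
# channel set is usable in any one cell.
```

For the 7-cell layout this yields D/R ≈ 4.58, consistent with the reuse constraint quoted for analog systems, and it makes the capacity penalty explicit: larger clusters suppress interference but shrink the per-cell channel count.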
In the United States, the Federal Communications Commission (FCC) finalized approvals for the 800 MHz cellular band in 1983, authorizing the construction and operation of AMPS networks, with the first commercial service launching in Chicago on October 13, 1983.[24] Paralleling this, Europe saw the rollout of the Nordic Mobile Telephone (NMT) system in 1981, an analog standard operating initially in the 450 MHz band (453-468 MHz) across Nordic countries and later extended to 900 MHz (890-960 MHz) for higher capacity, promoting cross-border roaming through harmonized specifications developed by the European Conference of Postal and Telecommunications Administrations (CEPT).[25][26][27]

Analog systems like AMPS and NMT faced inherent limitations due to their reliance on frequency reuse patterns to mitigate interference. A common 7-cell cluster pattern was employed, where frequencies were reused every seventh cell to maintain a sufficient co-channel reuse ratio (typically D/R ≥ 4.6, where D is the reuse distance and R is the cell radius), minimizing co-channel interference from distant cells using the same frequencies.[28] This approach, while innovative, constrained overall system capacity, as only about one-seventh of available channels could be used per cell, leading to rapid saturation in high-demand areas and prompting the eventual transition to digital technologies in subsequent generations.
Transition to Digital and Beyond
The transition from first-generation (1G) analog cellular systems to digital technologies in the late 1980s and early 1990s revolutionized mobile communications by enabling efficient spectrum use, enhanced security, and the introduction of data services beyond voice telephony. The Global System for Mobile Communications (GSM), developed as a pan-European standard by the European Conference of Postal and Telecommunications Administrations (CEPT) starting in 1982, with the first agreed technical specifications in 1987, and later managed by the European Telecommunications Standards Institute (ETSI) from 1989, represented the cornerstone of this shift.[29] It primarily utilized the 900 MHz band, known as GSM-900, with uplink frequencies spanning 890-915 MHz and downlink frequencies from 935-960 MHz, allowing for digital voice encoding and basic short message service (SMS) capabilities.[30] To meet growing capacity demands in urban environments, the Digital Cellular System at 1800 MHz (DCS-1800) was subsequently integrated, operating on uplink 1710-1785 MHz and downlink 1805-1880 MHz, which supported more simultaneous users through smaller cell sizes while maintaining compatibility with the core GSM protocol.[30]

By the mid-1990s, GSM had expanded globally, with commercial networks launched in over 200 countries and subscriber numbers surpassing 500 million by 2000, driven by its open standardization that facilitated interoperability.[29] Parallel to GSM's time-division multiple access (TDMA) approach, code-division multiple access (CDMA) gained traction as a competing 2G technology, particularly in North America.
The Interim Standard 95 (IS-95), standardized by the Telecommunications Industry Association (TIA) in 1993, operated in the 800 MHz cellular band with uplink frequencies of 824-849 MHz and downlink of 869-894 MHz, leveraging spread-spectrum modulation to achieve higher spectral efficiency and resistance to interference.[31] This enabled voice quality improvements and circuit-switched data rates up to 14.4 kbps, addressing analog-era limitations such as susceptibility to fading.

As 2G evolved into 3G, CDMA principles underpinned the Universal Mobile Telecommunications System (UMTS), adopted under the ITU's International Mobile Telecommunications-2000 (IMT-2000) framework using wideband CDMA (W-CDMA). UMTS targeted bands in the 1900-2100 MHz range, such as the widely used 2100 MHz allocation (uplink 1920-1980 MHz, downlink 2110-2170 MHz), which supported packet-switched data rates up to 2 Mbps for emerging multimedia applications like mobile web browsing.

Key milestones underscored this digital progression: the world's first GSM commercial call occurred on July 1, 1991, in Helsinki, Finland, placed by former Prime Minister Harri Holkeri using a Nokia handset on the Radiolinja network.[32] This event symbolized the viability of digital cellular infrastructure. In 1999, the ITU approved the key characteristics for the IMT-2000 radio interfaces through its Radiocommunication Sector, defining core frequency bands including 1885-2025 MHz and 2110-2200 MHz to standardize 3G deployments worldwide and enable higher-speed services.[33][34]

Early digital systems encountered significant challenges with fragmented frequency allocations across regions, which hindered seamless global roaming and necessitated dual- or tri-band devices for travelers.
These issues prompted international standardization bodies like ETSI and the ITU to prioritize harmonized "preferential" bands, such as GSM's 900/1800 MHz pairings and IMT-2000's core allocations, fostering greater interoperability and accelerating adoption.[35] Such efforts in 2G and 3G established critical precedents for spectrum efficiency that informed later expansions in 4G systems.
Regulatory Framework
International Standards
The International Telecommunication Union (ITU), through its Radiocommunication Sector (ITU-R), plays a central role in establishing global frameworks for cellular spectrum allocation via the World Radiocommunication Conferences (WRC), which occur every three to four years to review and update the Radio Regulations. These conferences identify frequency bands suitable for International Mobile Telecommunications (IMT), ensuring international coordination and preventing interference. For instance, at WRC-15, the 694-790 MHz band was identified for IMT use, particularly in ITU Region 1, to support mobile broadband expansion.[36] ITU designates core spectrum bands for successive IMT generations to facilitate technological evolution. IMT-Advanced, corresponding to 4G systems, encompasses bands in the range of approximately 450-3800 MHz, as outlined in Recommendation ITU-R M.2012, enabling global deployment of advanced mobile services. For IMT-2020, which defines 5G requirements, additional high-frequency bands from 24.25-86 GHz were studied and identified to meet demands for enhanced mobile broadband, ultra-reliable low-latency communications, and massive machine-type communications, per Recommendation ITU-R M.2083. At WRC-23, additional frequency bands including 6425–7025 MHz and 7025–7125 MHz were identified for IMT in various regions to support further 5G deployment and emerging use cases.[37] Harmonization principles under ITU aim to align spectrum footprints across regions, promoting economies of scale, global roaming, and reduced device complexity by standardizing key parameters despite local variations. 
For example, while the global 700 MHz band plan exhibits regional differences in block allocations, a common duplex spacing of 55 MHz is widely adopted to ensure interoperability in equipment design.[38] Complementing ITU's efforts, the 3rd Generation Partnership Project (3GPP) develops technical standards that integrate allocated IMT spectrum into practical implementations. In Release 15, 3GPP specified the 5G New Radio (NR) air interface, defining operating bands and channel arrangements to align with ITU-identified spectrum for both sub-6 GHz and millimeter-wave frequencies.
National and Regional Allocations
National and regional allocations of cellular frequencies adapt international standards set by bodies like the ITU to local needs, considering factors such as population density, existing infrastructure, and spectrum availability. In North America, the Federal Communications Commission (FCC) manages allocations, with the 700 MHz band divided into specific sub-bands for commercial and public safety use. Band 12, covering the lower 700 MHz A, B, and C blocks (699–716 MHz uplink and 729–746 MHz downlink), is licensed to major carriers for nationwide LTE coverage, enabling broad rural penetration due to its low-frequency propagation. Band 13, the 700 MHz Upper C block (uplink 777–787 MHz, downlink 746–756 MHz), is licensed to Verizon, while Band 14 (uplink 788–798 MHz, downlink 758–768 MHz) is dedicated to public safety communications and integrated with FirstNet for nationwide broadband services. Additionally, the 600 MHz band (663–698 MHz uplink and 617–652 MHz downlink) was repurposed from television broadcasting following the 2017 incentive auction, auctioned to carriers like T-Mobile for enhanced low-band 5G deployment. Following WRC-23, regulators have continued to pursue additional mid-band spectrum for 5G enhancements.[39]

In Europe, the European Conference of Postal and Telecommunications Administrations (CEPT) promotes harmonized spectrum use across member states to facilitate cross-border roaming and efficient deployment. The 800 MHz band (Band 20: 832–862 MHz uplink and 791–821 MHz downlink) is allocated for LTE in rural areas, with licenses issued nationally but under CEPT guidelines for frequency division duplexing (FDD) to ensure compatibility. Similarly, the 2600 MHz band (Band 7: 2500–2570 MHz uplink and 2620–2690 MHz downlink) supports urban LTE capacity, harmonized for both FDD and time division duplexing (TDD) modes.
For 5G, the 3.5 GHz band (3400–3800 MHz, Band n78) has been identified for trials and deployment following World Radiocommunication Conference (WRC-19) decisions, with national variations in exact sub-band assignments to avoid interference with satellite services.

In the Asia-Pacific region, allocations reflect diverse national priorities, with significant auctions driving 4G and 5G rollout. China has allocated 2.6 GHz spectrum (2575–2635 MHz and 2635–2690 MHz, within Bands 38/41) for TD-LTE, emphasizing TDD for high-capacity urban networks operated by state carriers like China Mobile. In India, the 700 MHz band (Band n28: 703–748 MHz uplink and 758–803 MHz downlink) was auctioned in 2022 to bolster coverage, while the 3.3–3.6 GHz mid-band (Band n78) supports 5G sub-6 GHz deployments by operators like Reliance Jio and Bharti Airtel. Japan has pioneered mmWave allocations, assigning 28 GHz spectrum (27.4–29.5 GHz, Band n257) for 5G since 2019, focusing on high-speed applications in dense urban areas like Tokyo.

Variations across regions introduce challenges in global device compatibility and network planning, including differences in duplex modes: FDD is predominant in North America and Europe for symmetric traffic, versus TDD in parts of Asia for flexible asymmetry. Guard bands are implemented variably to mitigate interference, such as 5–10 MHz separations around 700 MHz edges in the US to protect adjacent services. Refarming of legacy bands, such as migrating 2G GSM spectrum at 900 MHz and 1800 MHz to 4G LTE, is ongoing in Europe and Asia, requiring coordinated national policies to phase out legacy networks while preserving coverage.
Frequency Bands by Generation
2G and 3G Bands
Second-generation (2G) cellular networks, standardized under the Global System for Mobile Communications (GSM) by the 3rd Generation Partnership Project (3GPP), relied on a set of harmonized frequency bands to enable digital voice telephony and introductory data services like Short Message Service (SMS). These bands were allocated to balance coverage, capacity, and regional availability, with lower frequencies (below 1 GHz) favored for rural penetration and higher frequencies for urban density. The core 2G bands included GSM 850 MHz (3GPP Band 5, uplink: 824–849 MHz, downlink: 869–894 MHz), primarily used in the Americas for enhanced coverage; Personal Communications Service (PCS) 1900 MHz (Band 2, uplink: 1850–1910 MHz, downlink: 1930–1990 MHz), also common in North America; Digital Cellular System (DCS) 1800 MHz (Band 3, uplink: 1710–1785 MHz, downlink: 1805–1880 MHz), widely deployed in Europe, Asia, and Africa for higher capacity; and Primary GSM (PGSM) 900 MHz (Band 8, uplink: 890–915 MHz, downlink: 935–960 MHz), the foundational band for global roaming in most regions outside the Americas. Each band supported typical channel widths of 200 kHz, allowing up to 124 carriers in a 25 MHz allocation, which optimized spectrum efficiency for circuit-switched voice and data at rates up to 9.6 kbps, later augmented by packet-switched services through the General Packet Radio Service (GPRS).

Third-generation (3G) networks, built on Universal Mobile Telecommunications System (UMTS) using Wideband Code Division Multiple Access (WCDMA), refarmed existing 2G spectrum while introducing dedicated allocations for higher-speed packet data up to 384 kbps and early multimedia services.
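The 200 kHz carrier raster can be made concrete. For the primary GSM 900 band, carrier number n (the Absolute Radio Frequency Channel Number, ARFCN) maps to an uplink frequency of 890 + 0.2·n MHz, with the paired downlink 45 MHz higher; a minimal sketch (the function name is illustrative, not part of any standard API):

```python
def pgsm_frequencies(arfcn: int) -> tuple[float, float]:
    """Uplink/downlink carrier frequencies (MHz) for a P-GSM 900 ARFCN."""
    if not 1 <= arfcn <= 124:
        raise ValueError("P-GSM ARFCN must be in 1..124")
    uplink = round(890.0 + 0.2 * arfcn, 1)   # 200 kHz raster above 890 MHz
    downlink = round(uplink + 45.0, 1)       # fixed 45 MHz duplex spacing
    return uplink, downlink

print(pgsm_frequencies(1))    # (890.2, 935.2) -- first carrier
print(pgsm_frequencies(124))  # (914.8, 959.8) -- last carrier inside 915/960 MHz
```

The 124-carrier limit follows directly from the allocation: 25 MHz divided into 200 kHz channels gives 125 slots, one of which is sacrificed as a guard offset at the band edge.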
Key expansions included UMTS Band I at 2100 MHz (uplink: 1920–1980 MHz, downlink: 2110–2170 MHz), allocated internationally for initial 3G rollouts in Europe, Asia, and Japan to support 5 MHz channels; Band VIII, refarming the 900 MHz spectrum (uplink: 880–915 MHz, downlink: 925–960 MHz) for improved indoor and suburban coverage by reusing 2G infrastructure; and Band IV for Advanced Wireless Services (AWS) in the Americas (uplink: 1710–1755 MHz, downlink: 2110–2155 MHz), which combined mid-band capacity with lower uplink frequencies for efficient data delivery. These bands served as precursors to carrier aggregation in 4G by enabling multi-band operations and dual-carrier High-Speed Downlink Packet Access (HSDPA); 3G channels, at 5 MHz, were considerably wider than 2G's 200 kHz carriers, prioritizing data throughput over voice channel count.

| Generation | Band | Uplink (MHz) | Downlink (MHz) | Typical Channel Bandwidth | Primary Regions | Key Use Case |
|---|---|---|---|---|---|---|
| 2G (GSM) | Band 5 (GSM 850) | 824–849 | 869–894 | 200 kHz | Americas | Rural voice coverage |
| 2G (GSM) | Band 2 (PCS 1900) | 1850–1910 | 1930–1990 | 200 kHz | North America | Urban capacity |
| 2G (GSM) | Band 3 (DCS 1800) | 1710–1785 | 1805–1880 | 200 kHz | Europe, Asia, Africa | High-density voice |
| 2G (GSM) | Band 8 (PGSM 900) | 890–915 | 935–960 | 200 kHz | Global (excl. Americas) | Baseline roaming |
| 3G (UMTS) | Band I (2100) | 1920–1980 | 2110–2170 | 5 MHz | Europe, Asia, Japan | Initial data services |
| 3G (UMTS) | Band VIII (900 refarm) | 880–915 | 925–960 | 5 MHz | Global | Enhanced coverage |
| 3G (UMTS) | Band IV (AWS) | 1710–1755 | 2110–2155 | 5 MHz | Americas | Balanced data/voice |
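The FDD pairings in the table can be cross-checked programmatically: each band's uplink and downlink blocks have equal widths separated by a fixed duplex offset. A short sketch over a few of the bands above (band data transcribed from the table; the dictionary layout is our own):

```python
# Uplink/downlink edges (MHz) for some FDD bands tabulated above.
BANDS = {
    "GSM 850 (Band 5)":  ((824, 849), (869, 894)),
    "PCS 1900 (Band 2)": ((1850, 1910), (1930, 1990)),
    "PGSM 900 (Band 8)": ((890, 915), (935, 960)),
    "UMTS Band I":       ((1920, 1980), (2110, 2170)),
}

for name, ((ul_lo, ul_hi), (dl_lo, dl_hi)) in BANDS.items():
    spacing = dl_lo - ul_lo
    # Paired FDD blocks must have equal widths and a constant duplex offset.
    assert dl_hi - ul_hi == spacing and ul_hi - ul_lo == dl_hi - dl_lo
    print(f"{name}: {ul_hi - ul_lo} MHz paired, duplex spacing {spacing} MHz")
```

Running this confirms, for example, the 45 MHz duplex spacing shared by GSM 850 and PGSM 900 and the 190 MHz spacing of UMTS Band I.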
4G LTE Bands
The 4G LTE frequency bands, standardized by the 3rd Generation Partnership Project (3GPP) in Technical Specification (TS) 36.101, encompass a wide range of spectrum allocations designed to deliver high-speed mobile broadband with improved spectral efficiency over prior generations. These bands support frequency division duplexing (FDD) and time division duplexing (TDD) modes, with channel bandwidths up to 20 MHz per carrier, enabling peak downlink data rates of approximately 100 Mbps under 2x2 multiple-input multiple-output (MIMO) configurations in a single 20 MHz channel.[5][42] In total, 53 E-UTRA operating bands are defined to accommodate regional variations and spectrum availability, though around 40 serve as core bands for global interoperability and deployment.[43]

LTE bands are broadly classified into low-band (sub-1 GHz), mid-band (1-3 GHz), and high-band (above 3 GHz) categories, each optimized for different performance trade-offs in coverage, capacity, and propagation. Low-band frequencies prioritize extensive coverage for voice and basic data services, while mid- and high-bands emphasize higher throughput for urban broadband demands through wider channels and aggregation techniques.
Low-Band LTE
Low-band LTE operates below 1 GHz, leveraging longer wavelengths for superior propagation and building penetration, which is essential for rural and suburban coverage. Representative examples include Band 8 (FDD, uplink: 880-915 MHz, downlink: 925-960 MHz) and Band 28 (FDD, uplink: 703-748 MHz, downlink: 758-803 MHz; also known as the Asia-Pacific Telecommunity or APT band). These support channel bandwidths of 5-20 MHz, allowing operators to achieve reliable connectivity over large areas with minimal infrastructure density.[5][43]
Mid-Band LTE
Mid-band LTE, spanning 1-3 GHz, offers a balance between coverage and capacity, making it suitable for dense urban environments where moderate propagation distances suffice for high-data-rate services. Key bands include Band 3 (FDD, uplink: 1710-1785 MHz, downlink: 1805-1880 MHz), Band 7 (FDD, uplink: 2500-2570 MHz, downlink: 2620-2690 MHz), and Band 40 (TDD, 2300-2400 MHz). With 20 MHz channels, these enable downlink speeds up to 100 Mbps via 64 quadrature amplitude modulation (QAM) and 2x2 MIMO, supporting applications like video streaming and mobile internet.[5][42][43]
High-Band LTE
High-band LTE focuses on capacity-intensive scenarios, utilizing upper mid-band and higher frequencies for denser deployments, though with reduced range compared to sub-1 GHz bands. Notable allocations are Band 41 (TDD, 2496-2690 MHz) and Advanced Wireless Services (AWS) bands such as Band 4 (FDD, uplink: 1710-1755 MHz, downlink: 2110-2155 MHz) and Band 66 (FDD, uplink: 1710-1780 MHz, downlink: 2110-2200 MHz). Carrier aggregation enhances performance here; for instance, two-component-carrier (2CC) aggregation combines two 20 MHz carriers into an effective 40 MHz bandwidth, potentially doubling throughput to 200 Mbps in ideal conditions.[5][44][43]

Global adoption of these bands has been facilitated by refarming legacy spectrum from 3G UMTS networks, such as reallocating 2100 MHz spectrum to LTE Band 1 (FDD, uplink: 1920-1980 MHz, downlink: 2110-2170 MHz), which has accelerated LTE rollout in regions like Europe and Asia by repurposing existing infrastructure without new auctions.[45][5] This evolution from 3G builds on harmonized allocations to enable seamless broadband upgrades.

| Band | Duplex Mode | Frequency Range (MHz) | Typical Bandwidths (MHz) | Primary Use Case |
|---|---|---|---|---|
| 8 | FDD | UL: 880-915, DL: 925-960 | 5, 10, 15, 20 | Wide coverage |
| 28 | FDD | UL: 703-748, DL: 758-803 | 5, 10, 15, 20 | Rural extension |
| 3 | FDD | UL: 1710-1785, DL: 1805-1880 | 5, 10, 15, 20 | Urban balance |
| 7 | FDD | UL: 2500-2570, DL: 2620-2690 | 5, 10, 15, 20 | Capacity boost |
| 40 | TDD | 2300-2400 | 5, 10, 15, 20 | Symmetric data |
| 41 | TDD | 2496-2690 | 5, 10, 15, 20 | High throughput |
| 4/66 | FDD | UL: 1710-1780, DL: 2110-2200 | 5, 10, 15, 20 | Aggregation |
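The carrier-aggregation arithmetic from the high-band discussion can be sketched numerically. This assumes the roughly 100 Mbps per 20 MHz carrier (2x2 MIMO, 64-QAM) figure cited above as a rule of thumb, i.e. about 5 Mbps per MHz; actual rates depend on modulation order, MIMO rank, and protocol overhead:

```python
PEAK_MBPS_PER_MHZ = 5.0  # ~100 Mbps / 20 MHz with 2x2 MIMO and 64-QAM (rule of thumb)

def peak_downlink_mbps(carrier_bandwidths_mhz: list[float]) -> float:
    """Rough peak rate for a set of aggregated LTE component carriers."""
    return sum(carrier_bandwidths_mhz) * PEAK_MBPS_PER_MHZ

print(peak_downlink_mbps([20.0]))        # single carrier: 100.0
print(peak_downlink_mbps([20.0, 20.0]))  # 2CC aggregation: 200.0
```

The linear scaling reflects why operators prize contiguous wide blocks like Band 41: aggregated bandwidth, not any single carrier, sets the peak rate ceiling.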
Advanced and Emerging Bands
5G NR Bands
The fifth-generation New Radio (5G NR) standard, developed by the 3GPP, divides its frequency bands into two primary ranges to balance coverage, capacity, and speed: Frequency Range 1 (FR1) for sub-6 GHz operations and Frequency Range 2 (FR2) for millimeter-wave (mmWave) operations above 24 GHz.[46] FR1 bands, spanning 410 MHz to 7.125 GHz as of recent updates, enable wider coverage and penetration suitable for diverse environments, while FR2 bands, from 24.25 GHz to 71 GHz, offer ultra-high bandwidth but require advanced techniques like beamforming to mitigate propagation losses.[47] These bands support both standalone (SA) deployments, which use a full 5G core network for optimized latency and efficiency, and non-standalone (NSA) modes, which leverage existing 4G LTE infrastructure for faster initial rollouts.[48]

In FR1, key bands include n78 in the 3.3–3.8 GHz C-band, which operates in time-division duplex (TDD) mode with channel bandwidths up to 100 MHz, making it ideal for urban capacity enhancement through massive MIMO deployments.[46] Band n71 (downlink 617–652 MHz, uplink 663–698 MHz) provides extensive rural coverage due to its low-frequency propagation characteristics, offering reliable uplink performance in low-density areas with channel bandwidths up to 20 MHz.[49] Band n41 in the 2.5 GHz range (2496–2690 MHz) also uses TDD and facilitates high-capacity mid-band services with channel bandwidths up to 100 MHz, often aggregated for enhanced throughput.[46] Collectively, FR1 bands enable peak data rates up to 1 Gbps, prioritizing reliable connectivity over extreme speeds.

FR2 mmWave bands deliver exceptional bandwidth for dense, high-demand scenarios.
Band n257 covers 26.5–29.5 GHz (often referred to as the 28 GHz band) with TDD operation and channel bandwidths up to 400 MHz, though aggregations can exceed 800 MHz for multi-gigabit capacities.[46] Band n260 spans 37–40 GHz (the 39 GHz band), and n261 operates in 27.5–28.35 GHz; both require beamforming to focus signals and overcome short-range limitations, with channel bandwidths up to 400 MHz.[46] These bands support ultra-high speeds exceeding 10 Gbps, enabling applications like fixed wireless access in urban hotspots.[51]

The core 5G NR bands were defined in 3GPP Release 15 (completed in 2018), establishing initial FR1 and FR2 allocations up to 52.6 GHz. Releases 16 (frozen in 2020) and 17 (completed in 2022) expanded capabilities, including NR unlicensed (NR-U) operations in the 5–6 GHz bands and extensions to 7 GHz unlicensed spectrum for improved indoor and private network use, alongside extending FR2 to 71 GHz. Release 18, completed in June 2024, further enhances 5G NR with features such as improvements to the reduced-capability (RedCap) devices first introduced in Release 17 for IoT, and sidelink enhancements, bridging toward 6G developments.[52]

As of 2025, World Radiocommunication Conference (WRC-23) outcomes have identified the 6.425–7.125 GHz band for International Mobile Telecommunications (IMT), enhancing mid-band options across ITU regions.[53] Global 5G NR coverage now reaches approximately 51% of the world's population, driven by widespread FR1 deployments.[54]

| Band | Frequency Range (GHz) | Duplex Mode | Max Channel Bandwidth (MHz) | Typical Use Case |
|---|---|---|---|---|
| n71 | DL: 0.617–0.652; UL: 0.663–0.698 | FDD | 20 | Rural coverage |
| n41 | 2.496–2.690 | TDD | 100 | Mid-band capacity |
| n78 | 3.3–3.8 | TDD | 100 | Urban capacity |
| n257 | 26.5–29.5 | TDD | 400 | High-speed urban hotspots |
| n260 | 37–40 | TDD | 400 | Dense mmWave access |
| n261 | 27.5–28.35 | TDD | 400 | mmWave augmentation |
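The propagation gap between the FR1 and FR2 entries above can be illustrated with the free-space path-loss formula introduced earlier (an idealized model that ignores rain fade, blockage, and antenna gains):

```python
import math

C = 3.0e8  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss: 20log10(d) + 20log10(f) + 20log10(4*pi/c)."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

d = 500.0                  # metres from the base station
low = fspl_db(d, 700e6)    # n71-class low band
mmw = fspl_db(d, 28e9)     # n257-class mmWave
print(f"700 MHz: {low:.1f} dB, 28 GHz: {mmw:.1f} dB, gap: {mmw - low:.1f} dB")
# The gap equals 20*log10(28e9 / 700e6) = 20*log10(40) ~ 32 dB at any distance.
```

That roughly 32 dB penalty, before any rain or blockage loss, is why the mmWave rows above pair enormous channel bandwidths with sub-kilometre cells and beamforming, while n71 anchors wide-area coverage.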