
Accelerating change

Accelerating change denotes the empirically observed pattern of increase in the rate of technological progress, wherein advancements compound to yield successively faster innovations across multiple domains, from computing to biotechnology. This phenomenon manifests in historical timelines where the intervals between transformative inventions have dramatically shortened, evolving from millennia for early tools to mere years or months for contemporary breakthroughs in fields like artificial intelligence and genomics. Central to this concept is the Law of Accelerating Returns, articulated by Ray Kurzweil, which asserts that technology evolves through successive paradigms, each enabling the next to progress at an accelerating pace due to positive feedback loops where more capable tools accelerate further innovation. Empirical support includes the exponential doubling of computational power per constant dollar, sustained for over half a century, alongside similar trajectories in genome sequencing costs and solar energy efficiency. While proponents highlight its predictive power for forecasting rapid future shifts, skeptics question its universality beyond select metrics, though data from diverse technological epochs consistently reveal exponential rates of change rather than linear progression. The implications of accelerating change extend to societal transformation, driving unprecedented wealth creation and capability enhancements, yet posing challenges in governance, ethical oversight, and potential disruptions from superintelligent systems emerging from sustained exponential trends. Defining characteristics include the self-reinforcing nature of progress, where computational abundance fuels algorithmic improvements, exemplified by the transition from mechanical calculators to artificial intelligence pursuits within decades.

Conceptual Foundations

Definition and Core Principles

Accelerating change denotes the empirical observation that the pace of technological, scientific, and societal advancements has intensified over historical timescales, manifesting as an exponential rather than linear trajectory in key metrics of capability. This is evidenced by sustained doublings in computational performance, where processing power has increased by factors exceeding a billionfold since the mid-20th century, driven by iterative improvements in hardware and algorithms. The phenomenon implies that intervals between major innovations shorten, as each epoch of development builds cumulatively on prior achievements, yielding progressively greater capabilities in shorter periods. At its core, accelerating change operates through positive feedback loops, wherein advancements in information processing and computation enable more efficient discovery and implementation of subsequent innovations. For instance, enhanced computational resources facilitate complex simulations, data analysis, and automation of research processes, which in turn accelerate the generation of new knowledge and technologies. This self-amplifying mechanism contrasts with static or linear growth models, as returns on innovative efforts compound: a given input of ingenuity yields outsized outputs when leveraged atop exponentially growing infrastructural capabilities. Empirical support derives from long-term trends in transistor density and price-performance, which have adhered to predictable doubling patterns for decades, underpinning broader technological proliferation. Another foundational principle is paradigm-shift dynamism, where dominant technological regimes periodically yield to superior successors, each phase compressing the time required for equivalent leaps forward. Historical data indicate that while early paradigms, such as mechanical computing in the early twentieth century, advanced slowly, later ones like integrated circuits exhibit superexponential rates due to miniaturization and interconnectivity. This underscores causal realism in progress: change accelerates not randomly but through measurable efficiencies in R&D cycles, manufacturing, and knowledge diffusion, though it remains contingent on sustained investment and avoidance of systemic disruptions. Critics, including some econometric analyses, note that not all domains exhibit uniform acceleration, with sectors like pharmaceuticals showing punctuated rather than smooth exponentials, yet aggregate technological output metrics confirm the overarching trend.
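
As a rough check on this arithmetic, a minimal sketch (assuming a single constant growth rate, which the historical record only approximates) converts the billionfold gain into an implied doubling time:

```python
import math

def implied_doubling_time(growth_factor: float, years: float) -> float:
    """Constant doubling time (years) implied by a total growth factor
    observed over a given span."""
    return years / math.log2(growth_factor)

# A billionfold (1e9) increase over roughly 70 years, as cited above,
# implies a doubling roughly every 2.3 years.
print(implied_doubling_time(1e9, 70))  # -> ~2.34
```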

Distinction from Linear Progress Models

Linear progress models assume technological advancement occurs at a constant rate, akin to steady, additive increments where each unit of time yields a fixed amount of improvement, such as in simple extrapolations of historical trends without considering compounding effects. These models, often rooted in intuitive human expectations of uniform pacing, project future capabilities by extending past linear gains, implying predictable timelines without change in the underlying rate. Accelerating change, by contrast, posits that the pace of progress itself escalates over time, typically following exponential or double-exponential trajectories due to self-reinforcing mechanisms inherent in evolutionary processes. Proponents argue this arises from feedback loops, where advancements—such as increased computational power—enable more rapid design, testing, and iteration of subsequent technologies, thereby shortening development cycles and amplifying returns on prior investments. Unlike linear models, which break down beyond the initial "knee of the curve" in exponential growth phases, accelerating change accounts for paradigm shifts that redefine limits, as each epoch of technology builds upon and surpasses the previous one at an intensifying velocity. This conceptual divide has profound implications for forecasting: linear extrapolations underestimate long-term outcomes by ignoring how early-stage exponentials appear deceptively slow before surging, while accelerating models emphasize causal drivers like the exponential growth of information processing that fuels further paradigm transitions. Critics of linear assumptions, drawing from observations of historical growth patterns, note that such models overlook the non-linear nature of complex systems, where outputs grow disproportionately to inputs once critical thresholds are crossed. Empirical patterns, such as consistent doubling times in computational price-performance rather than fixed annual increments, underscore this distinction, though debates persist on whether universal laws govern the acceleration or if domain-specific limits apply.
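
The divide is easy to see numerically. The sketch below uses arbitrary illustrative parameters (unit starting value, a fixed increment of 1 per year versus a 2-year doubling time), not figures from any cited study:

```python
def linear(t: float, start: float = 1.0, increment: float = 1.0) -> float:
    """Fixed-increment (linear) projection."""
    return start + increment * t

def exponential(t: float, start: float = 1.0, doubling_time: float = 2.0) -> float:
    """Fixed-doubling-time (exponential) projection."""
    return start * 2 ** (t / doubling_time)

for t in (0, 2, 4, 10, 20, 40):
    print(f"year {t:2d}: linear={linear(t):8.1f}   exponential={exponential(t):12.1f}")
# Through year 4 the two projections look similar; by year 40 the
# exponential exceeds the linear one by a factor of roughly 25,000.
```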

Historical Development

Pre-Modern Observations

Early modern thinkers began to articulate notions of progress that implied an increasing pace of human advancement, driven by the accumulation and application of knowledge. Francis Bacon, in his 1620 work Novum Organum, highlighted three inventions—printing, gunpowder, and the magnetic compass—as medieval developments that exceeded the collective achievements of ancient Greece and Rome, suggesting that empirical inquiry could compound discoveries over time rather than merely replicate past glories. This view marked a shift from cyclical historical models to one of directional improvement, where prior innovations served as foundations for subsequent ones. By the mid-18th century, Joseph Priestley observed that scientific discoveries inherently generated new questions and opportunities, creating a self-reinforcing cycle. In his writings, Priestley noted, "In completing one discovery we never fail to get an imperfect knowledge of others of which we could have no idea before, so that we cannot solve one doubt without raising another," indicating that the process of inquiry accelerated the expansion of knowledge itself. His 1769 Chart of Biography visually represented history as a timeline of accelerating intellectual output, with denser clusters of notable figures and events in recent centuries compared to antiquity. The Marquis de Condorcet provided one of the earliest explicit formulations of accelerating change in his 1795 Sketch for a Historical Picture of the Progress of the Human Mind. He argued that advancements in science and education mutually reinforced each other: "The progress of the sciences secures the progress of the art of instruction, which again accelerates in its turn that of the sciences; and this reciprocal action is sufficient to explain the indefinite progress of human reason." Condorcet projected this dynamic into future epochs, envisioning exponential improvements in human capabilities through perfected methods of reasoning and social organization, unbound by biological limits. These observations, rooted in Enlightenment optimism, contrasted with earlier static or regressive views of history, emphasizing causal mechanisms like knowledge feedback that would later underpin modern theories of technological acceleration.

20th-Century Formulations

In 1938, R. Buckminster Fuller coined the term "ephemeralization" in his book Nine Chains to the Moon to describe the process by which technological advancements enable humanity to achieve progressively greater performance with diminishing inputs of energy and materials, potentially culminating in "more and more with less and less until eventually doing everything with nothing." Fuller grounded this formulation in empirical observations of 20th-century innovations, such as the shift from horse-drawn carriages to automobiles and early aircraft, which demonstrated exponential efficiency gains in transportation and resource utilization. He argued that this trend, driven by synergistic design and material science, represented a fundamental law of technological evolution rather than isolated inventions, predicting its continuation through global industrialization. By the 1950s, mathematician John von Neumann articulated concerns about the exponential pace of technological progress in informal discussions and writings, warning of its implications for human survival amid rapid transformation. As recounted by collaborator Stanislaw Ulam, von Neumann highlighted how the ever-accelerating progress of technology and changes in the mode of human life were approaching an "essential singularity"—a point beyond which forecasting future developments becomes infeasible due to the sheer velocity of transformation. In his 1955 essay "Can We Survive Technology?", von Neumann emphasized the unprecedented speed of postwar scientific and engineering breakthroughs, contrasting them with slower historical precedents and attributing the acceleration to feedback loops in knowledge production and application. He cautioned that this pace, unchecked by geographical or resource limits, could overwhelm societal adaptation, necessitating deliberate governance to mitigate risks. In 1965, statistician and cryptanalyst I. J. Good advanced these ideas with the concept of an "intelligence explosion" in his article "Speculations Concerning the First Ultraintelligent Machine," defining an ultraintelligent machine as one surpassing all human intellectual activities. Good posited a recursive self-improvement cycle: such a machine could redesign itself and subsequent iterations with superior efficiency, triggering an explosive growth in capability that outpaces biological evolution by orders of magnitude. He supported this with logical reasoning from early computing trends, noting that machines already excelled in specific tasks like calculation and pattern recognition, and projected that general superintelligence would amplify research across domains, potentially resolving humanity's existential challenges—or amplifying them—within years rather than millennia. Good's formulation emphasized probabilistic risks, estimating a non-negligible chance of misalignment between machine goals and human values, while advocating for proactive development under ethical oversight.

Major Theoretical Frameworks

Vernor Vinge's Exponentially Accelerating Change

Vernor Vinge, a mathematician, computer scientist, and science fiction author, articulated a framework for exponentially accelerating change in his 1993 essay "The Coming Technological Singularity: How to Survive in the Post-Human Era," presented at the VISION-21 Symposium sponsored by NASA Lewis Research Center. In this work, Vinge posited that the rapid acceleration of technological progress observed throughout the twentieth century foreshadowed a profound discontinuity, where human-level artificial intelligence would enable the creation of superhuman intelligences capable of recursive self-improvement. This process, he argued, would trigger an "intelligence explosion," resulting in technological advancement rates so rapid that they would render human prediction of future events impossible, marking the end of the human era as traditionally understood. Central to Vinge's model is the notion that exponential acceleration arises not merely from hardware improvements, such as those following Moore's Law, but from the feedback loop of intelligence enhancing itself. He described the singularity as a point beyond which extrapolative models fail due to the emergence of entities operating on timescales and cognitive levels incomprehensible to humans, leading to change comparable in magnitude to the evolution of life on Earth. Vinge emphasized that this acceleration would stem from superintelligences designing superior successors in days or hours, compounding improvements geometrically rather than linearly, thereby compressing centuries of progress into subjective moments from a human perspective. Vinge outlined four primary pathways to achieving the critical intelligence threshold: direct development of computational systems surpassing human cognition; large-scale computer networks exhibiting emergent superhuman intelligence; biotechnological or direct neural enhancements augmenting individual human intellects to superhuman levels; and reverse-engineering of the human brain to create superior digital analogs. He forecasted that the technological means to instantiate superhuman intelligence would emerge within 30 years of 1993, potentially as early as 2005, with the singularity following shortly thereafter, by 2030 at the latest. These predictions were grounded in contemporaneous trends, including accelerating computing power and early artificial intelligence research, though Vinge cautioned that societal or technical barriers could delay but not prevent the onset. His framework has influenced subsequent discussions on technological futures, distinguishing accelerating change as a causal outcome of recursive self-improvement rather than mere historical pattern extrapolation.
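
Vinge's geometric compression can be caricatured in a few lines. The toy model below is purely schematic—the doubling and halving factors are assumptions chosen for illustration, not values Vinge proposed:

```python
# Toy model of recursive self-improvement: each machine generation doubles
# capability, and (by assumption) the time to design the next generation
# halves in proportion.
capability = 1.0   # capability of the first superhuman system (arbitrary units)
design_time = 2.0  # years to produce the next generation
elapsed = 0.0

for generation in range(1, 11):
    elapsed += design_time
    capability *= 2.0
    design_time /= 2.0
    print(f"gen {generation:2d}: capability {capability:7.0f}x, elapsed {elapsed:.3f} yr")

# Elapsed time converges toward 4 years (a geometric series) while
# capability grows without bound -- the "explosion" in schematic form.
```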

Ray Kurzweil's Law of Accelerating Returns

Ray Kurzweil articulated the Law of Accelerating Returns in a 2001 essay, positing that technological evolution follows an exponential trajectory characterized by positive feedback loops, where each advancement generates more capable tools for the subsequent stage, thereby increasing the overall rate of progress. This law extends biological evolution's principles to human technology, asserting that paradigm shifts—fundamental changes in methods—sustain and amplify exponential growth by compressing the time required for equivalent improvements. Central to the law is the observation of double-exponential growth in computational power, driven by successive paradigms that yield shorter doubling durations but multiplicative gains. Historical data on calculations per second per $1,000 illustrate this: from the early 1900s, doubling occurred roughly every three years during the electromechanical era (circa 1900–1940), accelerating to every two years with relays and vacuum tubes (1940–1960), and reaching annual doublings by the integrated circuit era post-1970. Kurzweil identifies six major computing paradigms since 1900, each providing millions-fold improvements in efficiency, with the transistor-to-integrated-circuit shift exemplifying how economic incentives and computational feedback propel faster innovation cycles. The law generalizes beyond computing to domains reliant on information processing, such as genome sequencing, where costs have plummeted exponentially due to algorithmic and hardware advances, and brain reverse-engineering, projected to achieve human-level scanning at $1,000 per brain by 2023. Kurzweil contends that this acceleration equates to approximately 20,000 years of progress at early twenty-first-century rates compressed into the century, as the time between paradigm shifts halves roughly every decade. While empirically grounded in century-long trends, the law's projections assume uninterrupted paradigm succession, an assumption supported by historical patterns but subject to potential disruptions from resource constraints or unforeseen physical barriers.

Hans Moravec's Mind Children

Hans Moravec, a Canadian roboticist and researcher at Carnegie Mellon University, advanced theories of accelerating change through his 1988 book Mind Children: The Future of Robot and Human Intelligence, published by Harvard University Press. In it, Moravec argues that exponential growth in computing hardware, projected to continue at rates doubling computational power roughly every two years, will soon permit the emulation of neural processes at scale. This hardware trajectory, extrapolated from historical trends in transistor density and processing speed, underpins his forecast that machines will achieve human-equivalent intelligence by around 2040, enabling a transition from biological to digital cognition. Once realized, such systems—termed "mind children"—would serve as humanity's post-biological descendants, programmed with human-derived goals and capable of self-directed evolution. Central to Moravec's framework is the concept of recursive self-improvement, where intelligent machines redesign their own architectures, amplifying the rate of progress far beyond biological limitations. He describes feedback loops in which enhanced computational substrates allow faster simulation of complex systems, accelerating knowledge generation and problem-solving. For instance, Moravec calculates that replicating the human brain's estimated 10^14 synaptic operations per second requires advancements feasible within decades, given observed doublings in cheap computing power every few years. This leads to an "intelligence explosion," a phase of hyper-rapid progress where each iteration of machine intelligence exponentially shortens development cycles, outpacing linear biological evolution.
Moravec contends this process is causally driven by competitive economic pressures favoring incremental hardware and software gains, rendering deceleration improbable absent physical impossibilities. Moravec extends these ideas to mind uploading, positing that scanning and emulating neural structures onto durable digital media would grant effective immortality, with subjective time dilation in high-speed simulations permitting eons of experience within biological lifetimes. He anticipates robots displacing humans in all labor domains by 2040 due to superior speed, endurance, and scalability, yet views this as benevolent if machines inherit human values through careful initial design. Related notions include his earlier observation of "Moravec's paradox," noting that low-level perceptual-motor skills resist automation more than high-level reasoning, yet overall hardware scaling will overcome such hurdles via brute-force simulation. These predictions, rooted in Moravec's robotics expertise rather than speculative philosophy, emphasize empirical hardware metrics over abstract software debates, aligning with causal mechanisms of technological compounding observed in semiconductor history.
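
Moravec's forecast amounts to a crossover calculation: when does price-performance, doubling on a fixed schedule, reach his roughly 10^14 operations-per-second estimate for the brain? The baseline below is a hypothetical placeholder for illustration, not a data point from Mind Children:

```python
import math

def crossover_year(base_year: float, base_ops: float,
                   target_ops: float, doubling_years: float) -> float:
    """Year at which performance per $1,000, doubling on a fixed schedule,
    reaches a target operations-per-second level."""
    return base_year + math.log2(target_ops / base_ops) * doubling_years

# Hypothetical baseline (not Moravec's own data): 1e9 ops/s per $1,000
# around 1998, doubling every two years, against his ~1e14 ops/s
# estimate for the human brain.
print(crossover_year(1998, 1e9, 1e14, 2.0))  # -> ~2031
```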

Empirical Evidence

Growth in Computational Power

The growth in computational power forms a cornerstone of empirical evidence for accelerating change, primarily manifested through sustained advances in transistor density and performance metrics. Gordon Moore's 1965 observation, later formalized as Moore's Law, posited that the number of transistors per integrated circuit would double every 18 to 24 months, correlating with proportional gains in computing capability. This trend held robustly from the 1970s onward, transforming rudimentary processors into high-performance systems capable of trillions of operations per second. Supercomputer performance, as cataloged by the TOP500 project since 1993, exemplifies this trajectory, with aggregate and peak throughput increasing at rates exceeding Moore's Law in some periods. Recorded Rmax performance rose from 1,128 GFLOPS in June 1993 to 1.102 EFLOPS for Frontier in June 2025, a factor of nearly 10^6 improvement in 32 years, implying an effective doubling time of roughly 1.6 years. This growth stems from architectural innovations, parallelism, and scaling of chip counts, outpacing single-processor limits. In artificial intelligence applications, compute demands have accelerated beyond historical norms, with training computations for notable models doubling approximately every six months since 2010—a rate four times faster than pre-deep learning eras. Epoch AI's database indicates 4-5x annual growth in training FLOP through mid-2024, fueled by investments in specialized hardware like GPUs and TPUs, where FP32 performance has advanced at 1.35x per year. OpenAI analyses corroborate this, noting a 3.4-month compute doubling time post-2012, driven by algorithmic efficiencies and economic scaling rather than solely hardware density. These trends underscore causal linkages: denser transistors enable more parallel operations, reducing costs per FLOP and incentivizing larger-scale deployments, which in turn spur innovations in software and algorithms. While transistor scaling has decelerated due to physical constraints like quantum tunneling, aggregate system-level performance continues to expand via multi-chip modules, optical interconnects, and domain-specific accelerators. Empirical data from industry reports affirm no immediate cessation, with AI supercomputers achieving performance doublings every nine months as of 2025.
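
A quick check of the arithmetic, using only the two endpoint figures quoted above and assuming smooth exponential growth between them:

```python
import math

# Endpoints quoted above: 1,128 GFLOPS (June 1993) and 1.102 EFLOPS (June 2025).
start_flops, end_flops, span_years = 1.128e12, 1.102e18, 32
factor = end_flops / start_flops            # ~9.8e5, i.e. nearly 10^6
doubling = span_years / math.log2(factor)   # ~1.6 years per doubling
print(f"growth factor {factor:.3g}, doubling time {doubling:.2f} years")
```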

Shifts Across Technological Paradigms

Technological paradigms represent dominant frameworks for innovation and problem-solving within specific domains, characterized by core principles, tools, and methodologies that enable sustained progress until supplanted by more efficient alternatives. Shifts between paradigms often involve fundamental reorientations, such as moving from analog systems to digital ones, and empirical observations indicate these transitions have accelerated over time, with intervals shortening from centuries to decades or years. This acceleration aligns with broader patterns of accelerating change, where each paradigm builds on prior computational substrates, enabling exponential gains in capability and speed of subsequent shifts. Historical analysis reveals progressively shorter durations for paradigm dominance and replacement. Early paradigms, such as water- and animal-powered mechanics in pre-industrial eras, persisted for millennia with minimal shifts, as evidenced by stagnant per-capita energy use and output until the 18th century. The steam-powered industrial paradigm, emerging around 1760, dominated for roughly 80-100 years before yielding to electrochemical and internal combustion systems in the late 19th century, a transition spanning about 50-60 years per Kondratiev cycle phase. By the 20th century, electronics and computing paradigms shifted more rapidly: vacuum tubes to transistors (1940s-1960s, ~20 years) and then to integrated circuits (1960s-1980s, ~20 years but with intra-paradigm doublings every 18-24 months). Recent examples include the pivot from standalone computing to networked and AI-driven systems post-2000, where cloud computing and machine learning paradigms diffused globally within a decade. Empirical metrics underscore this compression: the time for groundbreaking technologies to achieve widespread adoption has plummeted, reflecting faster integration into economies and societies. Electricity reached 30% U.S. penetration in about 40 years (from ~1880), automobiles took ~50 years for similar diffusion, personal computers required 16 years (1980s-1990s), and the World Wide Web just 7 years (1990s). Generative AI tools, exemplifying a nascent paradigm, surpassed those adoption rates within two years of mass introduction in 2022-2023. Patent data corroborates acceleration, with AI-related filings growing steeply since 2010, driven by a surge in innovators and declining entry barriers, signaling a paradigm where software-defined innovation permeates multiple sectors. Ray Kurzweil's framework of six evolutionary epochs provides a structured lens for these shifts, positing paradigm transitions from physics/chemistry (pre-biological computation) to biology/DNA (~4 billion years ago), brains (~1 million years ago), human technology (recent centuries), human-machine merging (projected soon), and cosmic intelligence. Each epoch leverages prior outputs as inputs for higher-order processing, with the rate of paradigm change doubling roughly every decade since the 20th century, as measured by computational paradigms in electronics. While Kondratiev waves suggest quasi-regular 40-60 year cycles tied to paradigms like steam or information technology, proponents of acceleration argue intra-wave innovations compound faster, eroding fixed durations. Counter-evidence includes persistent infrastructural bottlenecks, yet diffusion metrics consistently show paradigms propagating more rapidly in knowledge-intensive economies.
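
The compression of diffusion intervals can be tabulated directly from the approximate spans quoted above (treating each figure as exact for illustration):

```python
# Approximate diffusion spans quoted above (years to mass U.S. adoption).
spans = {"electricity": 40, "automobile": 50, "personal computer": 16,
         "World Wide Web": 7, "generative AI": 2}
baseline = spans["electricity"]
for tech, years in spans.items():
    print(f"{tech:18s} {years:3d} years  (compression vs. electricity: {baseline / years:.1f}x)")
```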

Economic and Productivity Metrics

Global gross domestic product (GDP) has exhibited accelerating growth rates over the long term, transitioning from near-stagnation in pre-industrial eras to sustained increases following the Industrial Revolution. From 1 CE to 1820 CE, average annual global GDP growth was approximately 0.05%, reflecting limited technological and institutional advancements. This rate rose to about 0.53% annually between 1820 and 1870, driven by early industrialization and steam power adoption, and further accelerated to roughly 1.3% from 1913 to 1950 amid electrification and mass production. Post-1950, advanced economies experienced episodes of even higher growth, such as 2-3% annual rates in the postwar decades, attributable to shifts in technological paradigms and global economic integration. Total factor productivity (TFP), a metric isolating output growth beyond capital and labor inputs to reflect technological and organizational efficiency, provides direct evidence of acceleration in key sectors. In the United States, TFP growth averaged over 1% annually from 1900 to 1920 but surged to nearly 2% during the 1920s, coinciding with electrification and assembly-line innovations. A similar uptick occurred post-1995, with TFP rising by about 2.5% annually through the early 2000s, linked to information technology diffusion. Globally, agricultural TFP accelerated from the late 20th century onward, contributing over 1.5% annual growth in output while offsetting diminishing resource expansion, as measured in Conference Board datasets spanning 1950-2010. These patterns align with paradigm shifts where successive technologies compound efficiency gains. Labor productivity, output per hour worked, reinforces this trajectory with episodic accelerations tied to computational and organizational advances. U.S. nonfarm business sector labor productivity grew at an average 2.1% annual rate over the postwar period, but with marked surges: 2.8% in the 1995-2005 IT boom and a preliminary 3.3% in Q2 2025, potentially signaling a resurgence from post-2008 slowdowns below 1.5%. Globally, labor output per hour has risen from under $5,000 (2011 international dollars) in 1950 to over $20,000 by 2019, with accelerations in emerging economies post-1990 due to technology transfer and globalization. These metrics indicate that while growth rates fluctuate—dipping to 1% or less in stagnation periods like 1973-1995—the overarching trend features accelerating returns from technological paradigms, outweighing linear input expansions.
Period | U.S. TFP Annual Growth (%) | Key Driver
1900-1920 | ~1.0-1.5 | Electrification onset
1920s | ~2.0 | Assembly-line efficiencies
1995-2005 | ~2.5 | IT adoption
2010-2024 | ~1.0 (with recent uptick) | AI and automation integration
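
The compounding gap between these growth regimes is easy to quantify; a minimal sketch using the rates quoted above (and assuming constant growth within each era):

```python
import math

# Doubling times implied by the long-run growth rates cited above.
eras = [("1 CE-1820", 0.0005), ("1820-1870", 0.0053),
        ("1913-1950", 0.013), ("post-1950 advanced economies", 0.025)]
for label, rate in eras:
    doubling = math.log(2) / math.log(1 + rate)
    print(f"{label:30s} {rate:7.2%}/yr -> output doubles in ~{doubling:5.0f} years")
```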

Forecasts and Predictions

Timelines for Technological Singularities

Vernor Vinge, in his 1993 essay, forecasted the singularity—defined as the point where superhuman intelligence emerges and accelerates beyond human prediction—would likely occur between 2005 and 2030, with the upper bound reflecting a conservative estimate based on trends in computing hardware and intelligence research. Ray Kurzweil has consistently predicted the singularity by 2045, following human-level artificial general intelligence (AGI) around 2029, a timeline he attributes to exponential growth in computational capacity and reaffirmed in his 2024 publication The Singularity Is Nearer. Aggregated expert forecasts show a broader range, with many tying singularity timelines to AGI achievement. A meta-analysis of over 8,500 predictions from AI researchers indicates a median estimate for AGI (a prerequisite for the singularity in most models) between 2040 and 2050, with a 90% probability by 2075, though these draw from surveys predating rapid 2023–2025 scaling advances. Recent reviews of expert surveys report shrinking medians, such as 2047 for transformative AI among researchers, influenced by empirical progress in large language models and compute scaling, yet still longer than industry optimists like Kurzweil. Forecasting platforms like Metaculus aggregate community predictions placing AGI announcement around 2034, implying a potential singularity shortly thereafter under acceleration assumptions, though these remain probabilistic and sensitive to definitional ambiguities. Optimistic outliers, such as some industry leaders projecting superhuman capabilities by 2026–2027, contrast with conservative academic views extending beyond 2100, highlighting uncertainties in algorithmic breakthroughs and hardware limits; however, post-2020 developments have systematically shortened prior estimates across sources.
Predictor/Source | Singularity/AGI Timeline | Basis
Vernor Vinge (1993) | 2005–2030 | Extrapolation from computing trends and superhuman intelligence creation.
Ray Kurzweil (2024) | AGI 2029; singularity 2045 | Exponential returns in computing, biotech integration.
AI expert surveys (aggregated) | Median AGI 2040–2050 | Probabilistic forecasts from researchers, adjusted for recent scaling.
Metaculus community | AGI ~2034 | Crowdsourced predictions on general AI benchmarks.

Specific Domain Projections

In artificial intelligence, Ray Kurzweil projects that systems achieving human-level intelligence across all domains—artificial general intelligence (AGI)—will emerge by 2029, enabled by exponential growth in computational capacity reaching 10^16 calculations per second, matching the human brain's estimated performance. This milestone would trigger recursive self-improvement, accelerating AI capabilities toward superintelligence by 2045. Supporting this, recent advancements in large language models and hardware scaling have aligned with historical exponential trends in AI performance metrics, such as those tracked in benchmarks like GLUE and BIG-bench. Biotechnology projections anticipate integration with AI and nanotechnology to achieve "longevity escape velocity" by the early 2030s, where annual medical progress extends healthy lifespan by more than one year, effectively overcoming aging as a fixed limit. Kurzweil forecasts that by 2030, AI-driven analysis of the human proteome and epigenome will enable personalized interventions reversing cellular damage, building on current genomics advancements and AI-accelerated drug discovery that reduced development timelines from years to months in cases like mRNA vaccines. Such developments would cascade into broader healthspan extensions, with nanobots repairing DNA and tissues at molecular scales. Energy sector forecasts posit that solar photovoltaic capacity, following a decade of repeated doublings, will supply the majority of world energy demands by the late 2020s to early 2030s, augmented by nanotechnology-enhanced panels capturing sunlight at near-theoretical limits. Kurzweil's analysis extrapolates from solar's historical 29% compound annual growth rate in price-performance, predicting cost parity with fossil fuels already achieved in many regions by 2025, leading to decentralized, abundant clean energy that mitigates scarcity. Fusion energy, while farther out, could see acceleration via AI-optimized reactor designs, though projections remain contingent on breakthroughs in plasma confinement beyond current tokamak experiments like ITER. Nanotechnology is expected to enable molecular assemblers by the 2030s, facilitating bottom-up manufacturing that defies traditional resource constraints and accelerates material innovations across domains. This would underpin self-replicating systems for materials production and near-infinite scalability, with early evidence in carbon nanotube synthesis yielding materials 100 times stronger than steel at fractional weights. Transportation projections include fully autonomous vehicles dominating roadways by the late 2020s, reducing accidents by orders of magnitude through machine perception surpassing human reaction times and predictive modeling. Despite delays from regulatory hurdles, scaling laws in computing and neural networks suggest convergence with human-level driving capability, enabling autonomous fleets and hyperloop-scale efficiencies that compress global travel times exponentially.
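
The solar extrapolation rests on simple compounding; assuming the 29% price-performance CAGR cited above holds, a 100-fold gain takes under two decades:

```python
import math

# Years of 29% annual price-performance growth (the solar CAGR cited
# above) needed to multiply capability per dollar 100-fold.
cagr = 0.29
years = math.log(100) / math.log(1 + cagr)
print(f"~{years:.1f} years per 100x improvement")  # -> ~18 years
```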

Constraints and Counterarguments

Physical and Thermodynamic Limits

The exponential growth in computational density and speed faces fundamental constraints imposed by the laws of physics and thermodynamics, which establish irreducible minimums for information processing. The Landauer principle dictates that erasing one bit of information requires dissipating at least kT ln 2 of energy as heat, where k is Boltzmann's constant and T is absolute temperature; at room temperature (approximately 300 K), this equates to about 2.8 × 10^-21 joules per bit. Contemporary digital logic operates 10^10 to 10^12 times above this limit per operation, rendering it not an immediate barrier but a theoretical floor that intensifies dissipation challenges as transistor counts rise and feature sizes shrink below 5 nm. Power density in advanced chips already approaches 100-790 W/cm² under aggressive cooling, nearing sustainable limits around 1000 W/cm², beyond which thermal management becomes impractical without exotic solutions. Physical limits further constrain scaling: transistor gates cannot shrink indefinitely due to atomic scales (roughly 0.1 nm), with quantum tunneling and variability dominating below 2-3 nm, as observed in current 2 nm-class nodes where leakage erodes reliability. Signal propagation speed is capped by the speed of light (c ≈ 3 × 10^8 m/s), imposing minimum latencies; for a chip spanning 1 cm, round-trip signaling takes about 67 ps, limiting effective clock rates and parallelism in dense architectures. Ultimate bounds, derived from quantum mechanics and gravitation, cap a 1 kg system's operations at roughly 10^50 to 10^51 per second before collapse into a black hole, though practical energy and error-correction constraints reduce this to about 10^31 operations per joule for matter-based computers. These limits suggest that while paradigm shifts—such as reversible computing to approach Landauer efficiency, or photonic and quantum alternatives—may defer saturation, they cannot indefinitely sustain Moore-like exponentials without violating conservation laws. Reversible architectures theoretically minimize dissipation by avoiding irreversible state mergers, yet real implementations face overhead from error correction and cryogenic requirements, preserving thermodynamic costs. Empirical trends show slowing transistor scaling since the mid-2010s, with density gains dropping from 2x every two years to under 1.5x, partly due to these encroaching barriers rather than mere economic factors. Consequently, accelerating returns in silicon-based computing confront a horizon where physical finitude curtails unbounded growth, necessitating qualitative leaps in architecture to evade plateauing.
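
The Landauer floor itself is a one-line calculation; the sketch below also converts the quoted 10^10-10^12 gap into implied per-operation energies (the gap figures come from the text above, not an independent measurement):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

landauer = k_B * T * math.log(2)
print(f"Landauer limit at 300 K: {landauer:.2e} J/bit")  # ~2.87e-21 J

# The 10^10-10^12x gap quoted above implies current logic dissipates
# roughly 3e-11 to 3e-9 J per operation at the system level.
for gap in (1e10, 1e12):
    print(f"{gap:.0e}x above the limit -> {gap * landauer:.1e} J/op")
```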

Resource and Economic Barriers

The acceleration of computational power and related technologies encounters significant resource constraints, particularly in critical materials essential for hardware production. Rare earth elements, vital for magnets in electric motors, wind turbines, and electronic components, remain heavily concentrated in supply chains, with China controlling over 80% of global processing capacity as of 2025. Restrictions imposed by China in October 2025, expanding controls to additional elements and heightening scrutiny of semiconductor and defense applications, have exacerbated supply vulnerabilities, potentially delaying advancements in computing and AI hardware. These measures, aimed at strategic leverage, threaten to disrupt global manufacturing timelines, as alternative sourcing from regions like Australia or the United States requires years to scale due to environmental and extraction challenges. Energy demands pose another formidable barrier, as the scaling of AI models and data centers drives unprecedented consumption. Training a single frontier model can require electricity equivalent to the annual usage of hundreds of households, with global AI-related power usage projected to reach levels comparable to 22% of U.S. household electricity consumption by the late 2020s if growth continues unchecked. Data centers for advanced computing already strain electrical grids, contributing to price hikes and delays in grid expansions, particularly in regions pursuing renewable transitions where intermittent supply mismatches hinder reliability. In the United States, key constraints include permitting delays and insufficient transmission infrastructure, limiting net available power capacity expansions needed to support AI growth through 2030. These physical bottlenecks could cap the pace of iterative improvements in AI, as energy availability becomes the binding factor over algorithmic gains. Economic factors further impede sustained acceleration, with the cost of semiconductor fabrication escalating dramatically. Constructing a state-of-the-art fabrication facility (fab) for nodes below 3 nanometers now demands investments of $20-30 billion, a sharp rise from earlier generations due to requirements for extreme precision, massive scales, and specialized equipment. Operating costs compound this, as advanced nodes consume proportionally more materials and energy per wafer, while yields remain sensitive to nanoscale defects. These escalating expenditures, coupled with geopolitical subsidies distorting global competition, strain private investment and national budgets, potentially leading to consolidation among fewer firms and reduced innovation velocity. In adjacent contexts of progress, such as extensions of Moore's Law analogs, diminishing marginal returns emerge as R&D yields plateau against rising complexity, necessitating paradigm shifts that historical data suggest occur less frequently amid resource scarcity.

Empirical and Methodological Critiques

Critics contend that empirical data supporting accelerating technological change often overstates continuity by focusing on narrow metrics while broader indicators reveal plateaus or decelerations. For instance, Moore's Law, which posits a doubling of transistor density on integrated circuits approximately every two years, has empirically slowed since around 2010, with industry-wide advancements falling below the predicted pace due to challenges in further miniaturization. Transistor density growth rates have diminished, and clock frequency improvements have stagnated, contributing to reduced performance gains per watt. Similarly, despite proliferation of digital technologies, labor productivity growth in the United States decelerated to an average of 0.8 percent annually from 2010 to 2018, compared to higher rates in prior decades. This slowdown extends globally, affecting 29 of 30 countries, suggesting that technological diffusion has not translated into economy-wide acceleration. Methodological issues further undermine claims of sustained exponential acceleration. Proponents like Kurzweil rely on selective historical examples to construct curves fitting the "law of accelerating returns," omitting technologies that deviated from exponential patterns, such as certain information-based systems that underperformed predictions. Forecaster Theodore Modis has argued that such approaches cherry-pick data points across paradigms to force an overarching exponential trend, ignoring instances where growth stalled or reverted to linear progression. Analyses often fail to incorporate S-curve dynamics, where individual technologies exhibit initial exponential phases followed by saturation and the need for disruptive shifts, rather than seamless acceleration; this logistic pattern better explains historical transitions than unbounded exponentials. Moreover, extrapolations frequently prioritize computational metrics as proxies for overall progress without rigorous causal validation, overlooking dependencies on non-technical factors like regulatory hurdles or investment returns, which can cap apparent acceleration. These flaws risk overpredicting future rates by retrofitting data to narrative rather than deriving from falsifiable models.

Alternative Viewpoints

Advocates of Bounded or Decelerating Change

Economist Robert J. Gordon has argued that U.S. economic growth, driven by technological innovation, experienced an exceptional surge from 1870 to 1970 but has since decelerated significantly. In his analysis, productivity growth averaged 2.8% annually from 1920 to 1970, dropping to 1.6% from 1970 to the present, attributing this to the exhaustion of transformative inventions like electricity, indoor plumbing, and automobiles, which yielded persistent gains unlike the more limited impacts of information technology post-1970. Gordon forecasts future per capita growth at only 0.5% to 1% annually through 2040, constrained by "headwinds" including aging populations, plateauing educational attainment, rising inequality, environmental regulations, and fiscal burdens from entitlements. Tyler Cowen, in his 2011 book The Great Stagnation, posits that the U.S. economy has hit a technological plateau after reaping "low-hanging fruit" from earlier innovations such as scientific advances, population growth, and institutional improvements that fueled rapid progress from 1940 onward. He contends that subsequent innovations, while numerous, fail to deliver comparable economy-wide productivity boosts due to their niche applications and rising research costs amid diminishing marginal returns. Cowen highlights stagnant median wages and household incomes since 2000 as evidence, linking them to slower innovation diffusion rather than accelerating change. Empirical data supports claims of bounded progress in key domains; for instance, Moore's Law, describing exponential transistor density increases, has slowed, with growth rates halving from 40% annually pre-2000 to about 20% by the 2010s, approaching physical limits in silicon-based scaling. Critics of exponential paradigms, including Microsoft co-founder Paul Allen, argue that complexity in fields like neuroscience grows superlinearly relative to hardware advances, demanding exponentially more human effort and resources, thus capping acceleration. These views emphasize S-curve trajectories over unbounded exponentials, where technologies mature and yield diminishing returns absent paradigm shifts.

Cyclic and Non-Exponential Theories

Cyclic theories of technological and economic change posit that progress occurs in recurrent waves rather than uninterrupted acceleration, with periods of rapid innovation followed by stagnation or decline driven by saturation, resource constraints, or social adjustments. Nikolai Kondratieff's long wave theory, developed in the 1920s, describes supercycles lasting approximately 40 to 60 years, each propelled by clusters of basic innovations such as steam power in the first wave (roughly 1780s–1840s) and information technologies in the fifth (1970s–present). These waves feature an upswing phase of expansion through technological diffusion and investment, transitioning to a downswing of relative stagnation as returns diminish and structural rigidities emerge, challenging notions of perpetual exponential growth by emphasizing endogenous cyclical dynamics rooted in capital accumulation and innovation exhaustion. Joseph Schumpeter extended this framework by integrating creative destruction, arguing that entrepreneurial innovation disrupts established equilibria, generating boom-bust cycles where monopolistic complacency yields to new technological paradigms, as evidenced in historical shifts from railroads to automobiles. Empirical analyses of patent data and productivity metrics support cyclical patterns, with radical innovations triggering variance in technological trajectories that eventually converge on dominant designs, followed by incremental refinement and eventual disruption, as modeled in studies of industries like semiconductors and pharmaceuticals. Such models highlight how organizational and institutional adaptation, rather than linear acceleration, governs transitions, with downswings reflecting not failure but necessary reconfiguration before the next cycle. Non-exponential theories emphasize logistic or bounded trajectories, where individual technologies follow S-curves characterized by slow initial adoption, rapid mid-phase expansion, and eventual saturation due to physical limits or market fulfillment, precluding indefinite acceleration without paradigm shifts. For instance, analyses of historical trends in energy production and transportation reveal that efficiency improvements plateau as technology approaches thermodynamic bounds, with aggregate progress appearing only through discontinuous jumps to new S-curves, overall yielding sub-exponential rates when accounting for increasing complexity and input costs. Economic models incorporating non-exponential steady states argue that expanding variety in goods and knowledge follows logistic rather than exponential paths, constrained by finite resources and human cognitive limits, as simulated in frameworks that predict asymptotic stabilization rather than unbounded growth. These perspectives, grounded in empirical trend forecasting, underscore diminishing marginal returns in mature domains, where further advances demand exponentially greater effort, as observed in post-Moore's Law semiconductors.
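
The S-curve argument is easiest to see side by side. The sketch below uses arbitrary illustrative parameters; the logistic's early phase tracks a pure exponential closely before bending toward saturation:

```python
import math

def logistic(t: float, ceiling: float = 100.0, rate: float = 0.5,
             midpoint: float = 10.0) -> float:
    """Logistic S-curve: exponential-looking early, saturating late."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

def pure_exponential(t: float, start: float = 0.67, rate: float = 0.5) -> float:
    return start * math.exp(rate * t)

for t in (0, 3, 6, 10, 15, 20):
    print(f"t={t:2d}: logistic={logistic(t):6.1f}   exponential={pure_exponential(t):9.1f}")
# The curves nearly coincide before t~5, then the logistic bends toward
# its ceiling of 100 while the exponential grows without bound.
```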

Contemporary Manifestations

AI and Software Advancements Post-2020

The release of OpenAI's GPT-3 in June 2020 marked a pivotal advancement in large language models, featuring 175 billion parameters and demonstrating capabilities in few-shot learning for tasks like text generation and translation. This model exemplified scaling laws identified in prior research, where performance on benchmarks improved predictably with increased compute, data, and model size, setting the stage for subsequent exponential gains. Empirical results from post-2020 training runs validated these laws, with loss functions decreasing predictably along power-law curves as resources scaled, enabling models to generalize across diverse domains. The launch of ChatGPT on November 30, 2022, powered by GPT-3.5, accelerated public and commercial adoption of generative AI, reaching 100 million users within two months and catalyzing a surge in investments exceeding $100 billion annually by 2023. This interface democratized access to advanced language models, revealing emergent abilities such as coherent conversation and problem-solving, which outperformed prior benchmarks in areas like coding and reasoning. OpenAI's GPT-4, released on March 14, 2023, introduced multimodal processing of text and images, achieving human-level performance on exams like the Uniform Bar Exam (90th percentile) and surpassing GPT-3.5 on most metrics by margins of 20-50%. Subsequent iterations, including GPT-4o in May 2024, enhanced speed and cost-efficiency, processing multimodal inputs with 2x faster inference and 50% lower costs than GPT-4 Turbo while maintaining or exceeding benchmark scores in reasoning tasks. By 2025, models like OpenAI's GPT-4.1 and o1 demonstrated advanced chain-of-thought reasoning, solving complex problems in math and coding at levels rivaling expert humans, with o1 achieving 83% on International Math Olympiad qualifiers. The U.S. dominated model production, releasing 40 notable systems in 2024 alone per the Stanford AI Index, reflecting compute scaling that doubled effective training capacity every 6-9 months, outpacing traditional Moore's Law. In software development, AI tools like GitHub Copilot, integrated post-2021, automated code generation, boosting developer productivity by 55% in tasks such as writing boilerplate and debugging, as measured in controlled studies. Generative AI adoption led to average performance improvements of 66% in complex knowledge work, with automation extending to workflow orchestration via AI agents that handle multi-step processes autonomously. Projections indicate AI-driven gains could add 1.5% to annual GDP growth by 2035, driven by software efficiencies in sectors like programming and customer service, though empirical critiques note diminishing returns in scaling without algorithmic innovations. These advancements underscore a causal link between scaled compute and capability leaps, fueling debates on whether continued progress will sustain or plateau amid data and energy constraints.
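
Scaling laws of the kind referenced above are conventionally written as power laws in compute. The constants in this sketch are hypothetical placeholders, not fitted values from any published paper:

```python
# Schematic neural scaling law: loss falls as a power law in training
# compute, L(C) = a * C**(-alpha). The constants a and alpha here are
# illustrative assumptions.
def loss(compute_flop: float, a: float = 50.0, alpha: float = 0.05) -> float:
    return a * compute_flop ** -alpha

for c in (1e21, 1e23, 1e25):  # each step is 100x more training compute
    print(f"compute {c:.0e} FLOP -> loss {loss(c):.2f}")
# Each 100x step in compute multiplies loss by the same factor (~0.79
# here) -- a straight line on a log-log plot.
```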

Biotech, Materials, and Energy Innovations

In biotechnology, gene-editing technologies exemplified by CRISPR-Cas systems have demonstrated accelerated development, transitioning from foundational discoveries in the early 2010s to widespread clinical trials by 2025, with over 50 active trials addressing conditions like sickle cell disease, cancer, and inherited disorders. The integration of artificial intelligence has further hastened this progress; for instance, models like CRISPR-GPT enable rapid prediction and optimization of guide RNAs, reducing design timelines from weeks to hours and broadening accessibility beyond specialized labs. Market data underscores this momentum, with the global CRISPR and Cas gene editing sector projected to expand from $3.3 billion in 2023 to $8.8 billion by the end of the decade, driven by precision therapies and automation in drug discovery. Synthetic biology complements these advances, programming stem cells via CRISPR for tissue regeneration and therapeutic protein production, as seen in emerging cell-based treatments for degenerative diseases. Despite historical trends like Eroom's Law—indicating rising costs and slowing outputs in pharmaceutical R&D prior to 2020—post-pandemic accelerations in mRNA platforms and data analytics have reversed productivity declines, enabling faster iteration cycles akin to computational exponentials. Advanced materials science has witnessed a surge in discoveries leveraging computational methods, particularly in graphene and related 2D structures, yielding properties like room-temperature quantum effects and enhanced light-matter interactions. In twisted multilayers, magic-angle configurations induce superconductivity through slowed electron dynamics and quantum correlations, with experimental validations progressing from theoretical proposals in 2018 to observable effects in layered systems by 2025. Novel hybrid materials, such as graphene coupled with indium oxide superconductors, reveal multiple Dirac points that facilitate tunable charge neutrality, advancing potential applications in quantum devices and low-resistance electronics. Growth-directed stacking domains in multilayer graphene, identified in late 2024, enable self-organized ABA/ABC stacking, promising scalable production of films with programmable electronic behaviors. These breakthroughs, often computationally accelerated, parallel exponential performance gains in computing by enabling denser, more efficient material architectures, though reproducibility challenges in high-temperature superconductors persist. Energy innovations exhibit analogous accelerations, with photovoltaic efficiencies climbing through diverse material and manufacturing refinements, contributing to an 89% cost decline in solar systems from 2010 to 2020 and continued declines into 2025. Perovskite cells, achieving lab efficiencies exceeding 25% by 2025, integrate hybrid organic-inorganic structures for broader light absorption and flexibility, outpacing traditional silicon panels in deployment speed and cost metrics. Battery technologies follow trajectories reminiscent of Moore's Law, with lithium-ion energy densities doubling roughly every few years via solid-state electrolytes and silicon anodes, enabling electric vehicle ranges to surpass 500 miles in production models by mid-decade. Fusion energy efforts have compressed timelines, as evidenced by the U.S. Department of Energy's 2025 roadmap targeting grid-scale commercialization by the mid-2030s through inertial confinement and milestones like net energy gain demonstrations. These fields collectively reflect causal drivers of progress—improved simulation tools, modular prototyping, and cross-disciplinary synergies—outpacing linear expectations despite thermodynamic constraints.
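
The quoted 89% decline over ten years implies a steady annual rate of nearly 20%, assuming a constant year-over-year decline:

```python
# Annual decline rate implied by the 89% photovoltaic system cost drop
# over 2010-2020 cited above.
annual_decline = 1 - (1 - 0.89) ** (1 / 10)
print(f"~{annual_decline:.1%} cost reduction per year")  # -> ~19.8%
```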

Broader Implications

Societal and Economic Transformations

Accelerating change has driven significant economic growth through enhanced productivity, with AI-related capital expenditures contributing 1.1 percentage points to U.S. GDP growth in the first half of 2025. Studies project that AI adoption could increase global GDP by $7 trillion annually by augmenting labor across sectors, though realization depends on widespread implementation and complementary investments in skills and infrastructure. In optimistic scenarios, advanced AI might enable growth exceeding 30% annually by 2100, fundamentally altering economic scales through automation of cognitive tasks previously immune to mechanization. However, these transformations exacerbate inequality, as automation has accounted for most of the rise in U.S. income disparities since 1980 by displacing lower-skilled workers while rewarding high-skill labor and capital owners. Empirical analysis attributes 87% of between-group wage inequality increases to labor demand shifts from technological advancements, concentrating gains among top earners. AI development further widens gaps, with evidence showing stronger effects in regions with uneven access to education and retraining, as routine tasks vanish faster than new opportunities emerge for non-adapters. Societally, rapid change disrupts labor markets, with projections estimating 92 million jobs displaced globally by 2030 due to automation and AI, though offset by 170 million new roles in emerging fields like data analysis and green technologies. Skills demanded in AI-exposed occupations evolve 66% faster than in others, necessitating continuous upskilling to avoid obsolescence, as seen in sectors like customer service and administrative support where adoption rates have accelerated post-2020. This pace outstrips institutional adaptation, straining social structures through widened skills gaps and potential marginalization among demographics less equipped for digital transitions. Economic models indicate that without policy interventions like targeted retraining or income supports, accelerating change could amplify disparities, as historical patterns show gains favoring skilled labor and technology hubs over broad-based prosperity. Yet, complementary effects persist where AI augments human capabilities, boosting output in knowledge-intensive industries and potentially lifting living standards if governance mitigates displacement risks. Overall, these shifts demand reevaluation of work norms, from shorter career tenures to human-machine collaborative systems, reshaping societal expectations around employment and value creation.

Policy and Adaptation Challenges

Accelerating change poses significant challenges to policymakers, as the pace of innovation in fields such as artificial intelligence, biotechnology, and robotics outstrips the deliberative cycles of legislative and regulatory processes. Traditional governance structures, designed for linear progress, struggle to address exponential advancements, resulting in regulatory lag where outdated laws fail to mitigate risks like misuse or unintended harm while potentially stifling innovation through overly prescriptive rules. For instance, the U.S. Department of Defense has identified parallel revolutions across artificial intelligence, biotechnology, robotics, autonomy, and energy as necessitating rapid doctrinal shifts, yet bureaucratic inertia hampers timely adaptation. In regulating artificial intelligence, governments face dilemmas in balancing safety with competitiveness; for example, debates over autonomous weapons systems highlight ethical concerns about delegating lethal decisions to machines without human oversight, prompting calls for international moratoriums while adversaries may proceed unconstrained. Legal accountability remains unresolved for AI-driven decisions, with potential violations of international humanitarian law in unmanned systems, and disputes intensify as global competition erodes U.S. dominance in robotics, where foreign acquisitions of key firms in 2013 exemplified vulnerabilities. Despite a surge in AI-related regulations—U.S. federal agencies issued 59 in 2024, doubling the prior year—enforcement gaps persist, particularly in addressing deepfakes or cybersecurity threats amplified by rapid model evolution. Multilateral efforts, such as UN discussions on lethal autonomous weapons, underscore coordination challenges amid differing national priorities. Economic adaptation strains labor markets, where automation displaces routine tasks, contributing to 50-70% of the U.S. earnings inequality rise from 1980-2016 and skill-biased change that favors high-skilled workers while eroding middle-skill jobs. Governments grapple with reskilling initiatives amid fragmented education systems and insufficient funding, as AI accelerates task displacement, potentially shifting income toward capital owners and exacerbating wage polarization. Policies like tax adjustments or portable benefits lag behind labor market fluidity, with projections of increasingly insecure work as firms demand new competencies without traditional job security. Geopolitically, accelerating change heightens risks, with adversaries exploiting technologies like directed-energy weapons under fewer ethical constraints, necessitating foresight mechanisms such as horizon-scanning in defense planning. Institutional adaptation falters due to slow public-private collaboration and workforce skill shortages in government, compounded by market concentration where frontier firms capture disproportionate gains—45% since 2000 versus under 10% for laggards—demanding revamped competition policies. Overall, these challenges demand agile governance models, yet entrenched bureaucracies and political divisions impede flexible responses, risking unaddressed threats ranging from biotech ecological harms to weapons proliferation.

References

  1. [1]
    the Law of Accelerating Returns. - the Kurzweil Library
    Jan 1, 2025 · In exponential growth, we find that a key measurement such as computational power is multiplied by a constant factor for each unit of time (e.g. ...
  2. [2]
    Technology over the long run: zoom out to see how dramatically the ...
    Feb 22, 2023 · The timeline begins at the center of the spiral. The first use of stone tools, 3.4 million years ago, marks the beginning of this history of ...
  3. [3]
    A timeline of technology transformation: How has the pace changed?
    Feb 27, 2023 · The pace of technological change is much faster now than it has been in the past, according to Our World in Data. It took 2.4 million years ...
  4. [4]
    [PDF] Chapter 10 The Emergence and Impact of Intelligent Machines ...
    There are a great many examples of the exponential growth implied by the law of accelerating returns in technologies, as varied as DNA sequencing, communication ...Missing: evidence | Show results with:evidence
  5. [5]
    Law of Accelerating Returns - Edge.org
    Evolution applies positive feedback in that the more capable methods resulting from one stage of evolutionary progress are used to create the next stage.
  6. [6]
    Kurzweil Responds: Don't Underestimate the Singularity
    Oct 20, 2011 · Allen writes that “the Law of Accelerating Returns (LOAR)… is not a physical law.” I would point out that most scientific laws are not physical ...<|separator|>
  7. [7]
    What's Driving Exponential Technology Growth?
    Dec 24, 2018 · The exponential acceleration of technology is having an unprecedented impact on the way we live. The resulting pace of change is cause for both ...Missing: definition | Show results with:definition
  8. [8]
    III. Universal Accelerating Change - The Foresight Guide
    These exponential changes are in turn accelerating global wealth creation and certain social and political changes, and decelerating other changes, such as ...
  9. [9]
    Universality of accelerating change - ScienceDirect
    On large time scales the progress of human technology follows an exponential growth trend that is termed accelerating change. The exponential growth trend ...
  10. [10]
    (PDF) Kurzweil, Moore, and Accelerating Change - ResearchGate
    Nov 13, 2020 · process that leads to accelerating change. Kurzweil also argues that when a specific paradigm that provides exponential growth. exhausts its ...
  11. [11]
    The Techno-Optimist Manifesto - Andreessen Horowitz
    Oct 16, 2023 · Ray Kurzweil defines his Law of Accelerating Returns: Technological advances tend to feed on themselves, increasing the rate of further advance.
  12. [12]
    Universality of accelerating change - ADS
    On large time scales the progress of human technology follows an exponential growth trend that is termed accelerating change. The exponential growth trend ...
  13. [13]
    A Letter from Ray Kurzweil - IEEE Spectrum
    ... law of accelerating returns"), rather than the linear extrapolation, which represents most people's intuition. Rennie says that my predictions "border on ...
  14. [14]
    The Idea of Progress | The Institute for the Study of Western Civilization
    Bacon argues that man is the master of his own fate, that time is continuous, and that the authority of the ancients must be rejected if improvements in ...
  15. [15]
    TOP 25 QUOTES BY JOSEPH PRIESTLEY | A-Z Quotes
    In completing one discovery we never fail to get an imperfect knowledge of others of which we could have no idea before, so that we cannot solve one doubt ...
  16. [16]
    Joseph Priestley Created Revolutionary "Maps" of Time
    “The proper employment of men of letters,” he once wrote, “is either making new discoveries, in order to extend the bounds of human knowledge; or facilitating ...
  17. [17]
    Condorcet, 10th Epoch. Future Progress of Man (1796)
    The progress of the sciences secures the progress of the art of instruction, which again accelerates in its turn that of the sciences; and this reciprocal ...
  18. [18]
    [PDF] Nicolas de Condorcet and the First Intelligence Explosion Hypothesis
    Condorcet's Accelerating ... First, metaphorically, the book can be interpreted as a scatterplot with the general trend of progress accelerating.
  19. [19]
    Ephemerality: Another radical design concept for the climate revolution
    Sep 1, 2023 · Ephemeralization is a term invented by R. Buckminster Fuller in his 1938 book Nine Chains to the Moon to describe how, through technological advancement, we ...
  20. [20]
    Rebels of Construction—The Revolutionary Ideas of Buckminster ...
    Bucky coined his lifework “ephemeralization” meaning 'doing more with less.' For a man who held multiple honorary doctorate degrees and served as World ...
  21. [21]
    Buckminster Fuller and Systems Theory - UMSL
    Ephemeralization was another systems' term coined by Fuller to express the concept of accomplishing more with fewer resources. His third system philosophy is ...
  22. [22]
    The Coming Technological Singularity
    In the 1950s there were very few who saw it: Stan Ulam [27] paraphrased John von Neumann as saying: One conversation centered on the ever accelerating progress ...
  23. [23]
    [PDF] Can We Survive Technology?
    CAN WE SURVIVE TECHNOLOGY? by John von Neumann. Member, Atomic Energy Commission. "The great globe itself" is in a rapidly maturing crisis. — ...
  24. [24]
    Can We Survive Technology? — John von Neumann (1955)
    Oct 31, 2021 · The essay discusses the threats that may result from ever-expanding technological progress in a finite world.
  25. [25]
    [PDF] Speculations Concerning the First Ultraintelligent Machine
    This shows that highly intelligent people can overlook the "intelligence explosion." It is true that it would be uneconomical to build a machine capable ...
  26. [26]
    Irving John Good Originates the Concept of the Technological ...
    Originated the concept later known as "technological singularity Offsite Link ," which anticipates the eventual existence of superhuman intelligence.
  27. [27]
    [PDF] Speculations Concerning the First Ultraintelligent Machine*
    In order to design an ultraintelligent machine we need to understand more about the human brain or human thought or both. In the follow-.
  28. [28]
    The coming technological singularity: How to survive in the post ...
    Dec 1, 1993 · The coming technological singularity: How to survive in the post-human era The acceleration of technological progress has been the central ...
  29. [29]
    The Coming Technological Singularity, Vernor Vinge, 1993
    The acceleration of technological progress has been the central feature of this century. I argue in this paper that we are on the edge of change comparable ...
  30. [30]
    Vernor Vinge's Prophecies: Are we heading toward a technological ...
    Apr 14, 2025 · Vernor Vinge's predictions about the coming technological singularity emerged during this dynamic period of AI development. It's worth ...
  31. [31]
    Mind Children - Harvard University Press
    Jan 2, 1990 · Filled with fresh ideas and insights, this book is one of the most engaging and controversial visions of the future ever written by a serious scholar.
  32. [32]
    Mind Children: The Future of Robot and Human Intelligence
    Author, Hans P. Moravec ; Edition, illustrated ; Publisher, Harvard University Press, 1988 ; Original from, the University of Michigan ; Digitized, Sep 7, 2010.
  33. [33]
    When will computer hardware match the human brain?
    Hans Moravec argues that computers will soon outperform human intellects. 'The visceral sense of a thinking presence in machinery will become increasingly ...
  34. [34]
    Superhumanism | WIRED
    Oct 1, 1995 · According to Hans Moravec, by 2040 robots will become as smart as we are. And then they'll displace us as the dominant form of life on Earth.
  35. [35]
    Hans Moravec Documents | The Library of Consciousness
    He argues that technological advancement is inevitable, driven by competition and incremental improvements, shaping a future where machines may replace humans.
  36. [36]
    Accelerating Change Documents | The Library - organism.earth
    Hans Moravec explores the future of robotics and AI, predicting a gradual evolution toward intelligent, autonomous machines. He discusses how robotics is ...
  37. [37]
    Superhumanism: According to Hans Moravec, by 2040 Robots Will ...
    According to Hans Moravec, by 2040 robots will become as smart as we are. And then they'll displace us as the dominant form of life on Earth.
  38. [38]
    Moravec's Paradox of Artificial Intelligence and a Possible Solution ...
    Oct 29, 2017 · Aside from his paradox discovery, he is well-known for a book he wrote in 1990, Mind Children: The Future of Robot and Human Intelligence.
  39. [39]
    A Brief History of Intellectual Discussion of Accelerating Change
    The roboticist Hans Moravec also emerged on the public scene in this decade. Moravec is arguably the most important single pioneer and advocate of deep thinking ...
  40. [40]
    21st century progress in computing - ScienceDirect.com
    'Low-growth' denotes an index where we assume that the proportion of GPU use was 1% in 2006 and increased by 10% each year, 'high-growth' where we assume that ...
  41. [41]
    Performance Development | TOP500
    Performance Development ; Jun 1, 1993, 1,128.57, 59.7 ; Nov 1, 1993, 1,493.35, 124 ; Jun 1, 1994, 2,317.01, 143.4 ; Nov 1, 1994, 2,732.24, 170 ...
  42. [42]
    TOP500: Home -
    The 64th edition of the TOP500 reveals that El Capitan has achieved the top spot and is officially the third system to reach exascale computing after Frontier ...
  43. [43]
    The training compute of notable AI models has been ... - Epoch AI
    Jun 19, 2024 · The training compute of notable AI models has been doubling roughly every six months. Since 2010, the training compute used to create AI models ...
  44. [44]
    AI and compute | OpenAI
    May 16, 2018 · Two distinct eras of compute usage in training AI systems. Show ... historical results—with a 3.4-month doubling time starting in ~2012.
  45. [45]
    Machine Learning Trends - Epoch AI
    Jan 13, 2025 · The amount of FLOP/s for GPUs in FP32 precision grows by 1.35x per year. A similar trend is observed for FP16. 90% confidence interval: 1.31x ...
  46. [46]
    Trends in AI supercomputers | Epoch AI
    Apr 23, 2025 · The computational performance of leading AI supercomputers has doubled every 9 months, driven by deploying more and better AI chips (Figure 1).
  47. [47]
    Kurzweil's Law (aka “the law of accelerating returns”)
    Jan 12, 2004 · The paradigm shift rate (i.e., the overall rate of technical progress) is currently doubling (approximately) every decade; that is, paradigm ...
  48. [48]
    Kondratieff Wave - Definition, How It Works, and Past Cycles
    Economists estimate that the waves last for 40 to 60 years, with each cycle demonstrating alternate intervals of high and low growth rates.
  49. [49]
    Is artificial intelligence leading to a new technological paradigm?
    This study examines whether AI is initiating a technological revolution, signifying a new technological paradigm, using the perspective of evolutionary neo- ...
  50. [50]
    Chart: The Rising Speed of Technological Adoption - Visual Capitalist
    Feb 14, 2018 · Microwaves, cell phones, smartphones, social media, tablets, and other inventions from the modern era all show fast-rising adoption rates.
  51. [51]
    The Rapid Adoption of Generative AI | St. Louis Fed
    Sep 23, 2024 · We show that two years after the mass introduction of generative AI, its adoption rate already exceeds that of the PC and internet at ...
  52. [52]
    [PDF] AI as a new emerging technological paradigm: evidence from global ...
    AI patenting strongly accelerated over time, driven by an even steeper growth of the number of AI innovators, with a corresponding decline in the number of AI ...
  53. [53]
    Ray Kurzweil: The Six Epochs of Technology Evolution - Big Think
    Sep 20, 2011 · In tracking our progress in the technological-evolutionary journey, Kurzweil has identified six epochs, each of which is characterized by a major paradigm ...
  54. [54]
    GDP per capita, 2022 - Our World in Data
    GDP per capita is a comprehensive measure of people's average income. It helps compare income levels across countries and track how they change over time.
  55. [55]
    Economic Growth - Our World in Data
    Similarly, the history of economic growth is also the history of how large global inequalities emerged – in nutrition, health, education, basic ...
  56. [56]
    [PDF] Total Factor Productivity Growth in Historical Perspective
    After averaging somewhat more than 1 percent annually from 1900 to 1920, measured TFP growth accelerated to nearly 2 percent on average during the 1920s and ...
  57. [57]
    [PDF] The acceleration in U.S. total factor productivity after 1995
    The final two columns show that over time, labor has benefited substantially from TFP growth, with real wages rising 1.9 percent per year. Real rental rates, by ...
  58. [58]
    Accelerated Productivity Growth Offsets Decline in Resource ...
    Sep 1, 2010 · The results suggest that, rather than the rate of global productivity growth slowing, TFP has accelerated and accounts for an increasing share ...
  59. [59]
    Productivity Home Page : U.S. Bureau of Labor Statistics
    Total factor productivity increased 1.3 percent in the private nonfarm business sector in 2024 as output increased 2.9 percent and combined inputs increased 1. ...
  60. [60]
    Labor Productivity (Output per Hour) for All Workers (OPHNFB) | FRED
    Graph and download economic data for Nonfarm Business Sector: Labor Productivity (Output per Hour) for All Workers (OPHNFB) from Q1 1947 to Q2 2025 about ...
  61. [61]
    Productivity: output per hour worked - Our World in Data
    Productivity is calculated as GDP divided by the total number of hours worked in the economy. This data is adjusted for inflation and differences in living ...
  62. [62]
    Is High Productivity Growth Returning?
    Jan 15, 2025 · Economists find preliminary evidence that productivity may be on a higher growth trajectory, contributing to a faster pace of growth for the ...
  63. [63]
    Scientist Says Humans Will Reach the Singularity Within 20 Years
    Jun 30, 2025 · Ray Kurzweil predicts humans and AI will merge by 2045, boosting intelligence a millionfold with nanobots, bringing both hope and challenges ...
  64. [64]
    When Will AGI/Singularity Happen? 8,590 Predictions Analyzed
    The surveyed AI experts estimate that AGI will probably (over 50% chance) emerge between 2040 and 2050 and is very likely (90% chance) to appear by 2075. Once ...
  65. [65]
    Shrinking AGI timelines: a review of expert forecasts - 80,000 Hours
    Mar 21, 2025 · This article is an overview of what five different types of experts say about when we'll reach AGI, and what we can learn from them.
  66. [66]
    When Will the First General AI Be Announced? - Metaculus
    Starting from weak-AGI ~mid-2027, I expect ~3–4 years to robust cross-domain competence with strategic planning, stable autonomy, and reliable self-improvement ...
  67. [67]
    The Gentle Singularity - Sam Altman
    Jun 10, 2025 · 2026 will likely see the arrival of systems that can figure out novel insights. 2027 may see the arrival of robots that can do tasks in the real ...
  68. [68]
    The Singularity by 2045, Plus 6 Other Ray Kurzweil Predictions
    Jul 22, 2024 · Ray Kurzweil predicts that the singularity (artificial intelligence surpassing human intelligence) will happen by 2045.
  69. [69]
    Ray Kurzweil's 5 Major AI predictions for the future and ... - LinkedIn
    Nov 24, 2024 · AGI (2026-2029): AI will replicate and surpass human intelligence. 2. Longevity escape velocity (2030): AI-driven advancements will achieve life ...
  70. [70]
    AI scientist Ray Kurzweil: 'We are going to expand intelligence a ...
    Jun 29, 2024 · Now, nearly 20 years on, Kurzweil, 76, has a sequel, The Singularity Is Nearer – and some of his predictions no longer seem so wacky.
  71. [71]
    AI can radically lengthen your lifespan, says futurist Ray Kurzweil
    Jun 25, 2024 · Using these technologies, by the end of the 2030s we will largely be able to overcome diseases and the aging process. The 2020s will feature ...
  72. [72]
    Could AI extend your life indefinitely? Futurist Ray Kurzweil thinks so
    Jul 11, 2025 · "I don't guarantee immortality. I'm talking about longevity escape velocity, where we can keep going without getting older. We won't be aging in ...
  73. [73]
    Ray Kurzweil: Solar Will Power the World in 16 Years - Big Think
    Mar 17, 2011 · Solar power, driven by exponentially-increasing nanotechnology, will satisfy the entire world's energy needs in 16 years.
  74. [74]
    Ray Kurzweil: Solar Will Dominate Energy Within 12 Years | Fortune
    Apr 16, 2016 · Okay, technically, that would suggest solar would have a 128% share of the market in 12 years. Some might love that—but it highlights the fact ...
  75. [75]
    Ray Kurzweil: 2022-2025 Updates - LifeArchitect.ai
    AI is going to change everything all at once. Take energy, for example. The Earth receives 10,000 times the energy it needs from sunlight, and energy from solar ...
  76. [76]
    The Next 25 Years of Nanoscience and Nanotechnology: A Nano ...
    Aug 27, 2025 · We expect that nanoscale characterization will accelerate breakthroughs in QIST, computing platforms, energy storage architectures, and bio–nano ...
  77. [77]
    Ray Kurzweil Defends His 2009 Predictions - Forbes
    Mar 21, 2012 · For example, the prediction that we would have self-driving cars was regarded as wrong even though Google self-driving cars have logged over ...
  78. [78]
  79. [79]
    [PDF] Notes on Landauer's principle, reversible computation ... - cs.Princeton
    This is Landauer's principle. Typically the entropy increase takes the form of energy imported into the computer, converted to heat, and dissipated into the ...
  80. [80]
    Computation, Energy-Efficiency, and Landauer's Principle - Stanford
    Dec 5, 2016 · As noted by Landauer, the energy dissipation of most computational processes (including those performed by modern computers) has an "unavoidable ...
  81. [81]
    Physical Limits of Computation - Scientific Research Publishing
    The fundamental limit of the power density appears to be approximately 1000 W/cm2. A power density of 790 W/cm2 has already been achieved by using water cooling ...
  82. [82]
    (PDF) The physical limits of computing - ResearchGate
    Aug 6, 2025 · Many of the fundamental limits on information processing, from thermodynamics, relativity, and quantum mechanics, are only a few decades away.
  83. [83]
    The Fundamental Physical Limits of Computation - Scientific American
    Jun 1, 2011 · Any limits we find must be based solely on fundamental physical principles, not on whatever technology we may currently be using.
  84. [84]
    [PDF] Ultimate physical limits to computation - The Simulation Argument
    Physical limits to computation are determined by the speed of light, Planck's reduced constant, and the gravitational constant. Energy and degrees of freedom ...
  85. [85]
    Fundamental energy cost of finite-time parallelizable computing
    Jan 27, 2023 · We find that the energy cost per operation of a parallel computer can be kept close to the Landauer limit even for large problem sizes, whereas ...
  86. [86]
    [PDF] Fundamental Energy Limits and Reversible Computing Revisited
    Problem: Landauer's Principle teaches us that losing computational information (merging computational states) implies unavoidable energy dissipation. Solution: ...
  87. [87]
    The future of computing beyond Moore's Law - Journals
    Jan 20, 2020 · Moore's Law [1] is a techno-economic model that has enabled the IT industry to double the performance and functionality of digital electronics ...
  88. [88]
  89. [89]
    China expands rare earths restrictions, targets defense and chips ...
    Oct 10, 2025 · China dramatically expanded its rare earths export controls on Thursday, adding five new elements and extra scrutiny for semiconductor users ...
  90. [90]
    China's New Rare Earth and Magnet Restrictions Threaten ... - CSIS
    Oct 9, 2025 · China has imposed its most stringent rare earth and magnet export controls yet, restricting products with even trace Chinese content.
  91. [91]
    We did the math on AI's energy footprint. Here's the story you haven't ...
    May 20, 2025 · At that point, AI alone could consume as much electricity annually as 22% of all US households.
  92. [92]
    The multi-faceted challenge of powering AI | MIT Energy Initiative
    Jan 7, 2025 · Providing electricity to power-hungry data centers is stressing grids, raising prices for consumers, and slowing the transition to clean energy.
  93. [93]
    Expanding U.S. Net Available Power Capacity by 2030 - RAND
    Oct 13, 2025 · The authors of this report identified the key constraints limiting the grid's ability to expand capacity and support future AI growth. They also ...
  94. [94]
    A Strategy for The United States to Regain its Position in ... - CSIS
    Feb 13, 2024 · Today, 50 years later a fabrication facility for advanced semiconductor chips can cost between $20-30 billion.
  95. [95]
    Navigating the Costly Economics of Chip Making | BCG
    Sep 28, 2023 · In 2020, we explored the economics of various types of semiconductor fabs. Our goal then was to examine the categories of capital and costs ...
  96. [96]
    Exponential Laws of Computing Growth - Communications of the ACM
    Jan 1, 2017 · Diminishing returns then set in, signaling the need to jump to another technology, system design, or class of application or community. At the ...
  97. [97]
    [PDF] Moore's Law is dead, long live Moore's Law! - arXiv
    In short, the growth rate of transistor density is slowing down, and the frequency improvement is even harder, it is more difficult to increase performance.
  98. [98]
    The U.S. productivity slowdown: an economy-wide and industry ...
    The slow growth observed since 2010 has been even more striking: labor productivity grew just 0.8 percent from 2010 to 2018. As the slowdown in labor ...
  99. [99]
    Why Hasn't Technology Sped Up Productivity? - Chicago Booth
    Feb 5, 2018 · The productivity slowdown is indeed widespread: it's occurring in 29 of 30 countries in the Organisation for Economic Co-operation and ...
  100. [100]
    Review of Kurzweil, 'The Singularity is Near' - LessWrong
    Nov 24, 2011 · Plenty of technologies have violated his law of accelerating returns, and Kurzweil doesn't mention them. This cherry-picking is one of the two ...
  101. [101]
    All Exponentials are Eventually S-Curves - LessWrong
    Sep 3, 2025 · An exponent models things locally, at an appropriate level of detail for modeling them locally. An S-curve won't actually be an S-curve, there ...
  102. [102]
    How to solve the puzzle of missing productivity growth | Brookings
    May 21, 2021 · Indeed, the pace of productivity growth has decelerated in the past two decades—from an average of 2.8% per year in the decade ending in 2005, ...
  103. [103]
    [PDF] Is US Economic Growth Over? Faltering Innovation Confronts the Six ...
    Part of this leap forward was due to technological advances developed during the 1930s (Field, 2011), and ... Gordon, Robert J. (2000). “Does the New Economy ...
  104. [104]
    The Flying Cars We Never Got: Are We Wrong About What Caused ...
    Aug 31, 2022 · What caused the Great Stagnation, the slowing of measured US productivity and economic growth since the early 1970s?
  105. [105]
    Technological stagnation: Why I came around - The Roots of Progress
    Jan 23, 2021 · Gordon's own book points out that growth in output per hour has slowed from an average annual rate of 2.82% in the period 1920-1970, to 1.62% in ...
  106. [106]
    The Naïveté of "Exponential" Growth - Stanford Computer Science
    Critics of the technological singularity idea speak through a common thread of arguments: unbounded explosive growth in artificial intelligence is not ...
  107. [107]
    The development of Kondratieff's theory of long waves - Nature
    Feb 13, 2023 · The article discusses the fundamental issues of the emergence of a new theory related to the evolution of Kondratieff waves in the context of modern drivers of ...
  108. [108]
    Kondratieff Waves, Technological Modes, and the Theory of ...
    We show that the multifunctional character of the world economy is an important factor for the origin of innovation waves. Since the K-waves are associated only ...
  109. [109]
    [PDF] Cyclical phenomena in technological change - arXiv
    The process of technological change can be regarded as a non-deterministic system governed by factors of a cumulative nature that generate cyclical phenomena.
  110. [110]
    Technology's Favorite Curve: The S-Curve (and Why It Matters)
    the S curve. Technology starts out expensive, ...
  111. [111]
    S-Curves for Trend Forecasting - LessWrong
    Jan 23, 2019 · S-curves and s-curve patterns are a useful tool for quickly analyzing systems, particularly when looking at diffusion of trends and evolution of innovations.
  112. [112]
    [PDF] Non-Exponential Growth Theory - The Econometric Society
    Sep 5, 2024 · Definition 1. A non-exponential asymptotic steady state is a situation in which the num- ber of goods follows Equation (2), while the paths of ...
  113. [113]
    Technologies are not exponential - Medium
    Oct 11, 2018 · It turns out that most technology is not exponential at all. In fact, exponential productivity growth and exponential improvements of processors ...
  114. [114]
    OpenAI GPT-3, the most powerful language model: An Overview
    Mar 14, 2022 · On June 11, 2020, GPT-3 was launched as a beta version. The full version of GPT-3 has a capacity of 175 billion ML parameters. GPT-2 has 1.5 ...
  115. [115]
    The 2020 Breakthrough That Supercharged AI: Scaling Laws and ...
    Aug 25, 2025 · In 2020, an OpenAI research paper uncovered a simple but powerful recipe for making AI models much smarter: make them bigger, feed them more ...
  116. [116]
    Scaling laws literature review - Epoch AI
    Jan 26, 2023 · Hoffmann et al. (2022) revealed that the scaling laws found by Kaplan et al. (2020) were suboptimal, after finding better hyperparameter ...
  117. [117]
    The ChatGPT (Generative Artificial Intelligence) Revolution Has ...
    In November 2022, OpenAI publicly launched its large language model (LLM), ChatGPT, and reached the milestone of having over 100 million users in only 2 ...
  118. [118]
    How ChatGPT changed… well, almost everything - Cisco Newsroom
    Dec 6, 2024 · In just two years, artificial intelligence programs like OpenAI's lively chatbot have evolved into essential tools for transforming businesses.
  119. [119]
    GPT-4 - OpenAI
    Mar 14, 2023 · We are releasing GPT‑4's text input capability ... We also evaluated GPT‑4 on traditional benchmarks designed for machine learning models.
  120. [120]
    GPT-4o Guide: How it Works, Use Cases, Pricing, Benchmarks
    GPT-4o Release Date. As of July 19, 2024, many features of GPT-4o have been gradually rolled out. The text and image capabilities are added for many ...
  121. [121]
    Introducing GPT-4.1 in the API - OpenAI
    Apr 14, 2025 · GPT‑4.1 mini is a significant leap in small model performance, even beating GPT‑4o in many benchmarks. It matches or exceeds GPT‑4o in ...
  122. [122]
    The 2025 AI Index Report | Stanford HAI
    Nearly 90% of notable AI models in 2024 came from industry, up from 60% in 2023, while academia remains the top source of highly cited research. Model scale ...
  123. [123]
    The impact of artificial intelligence on organizational performance
    By automating mundane tasks and providing employees with more challenging and fulfilling work, AI can increase job satisfaction and motivation (Russel and ...
  124. [124]
    Key Benefits of AI in 2025: How AI Transforms Industries
    Apr 1, 2025 · Additionally, implementing generative AI tools leads to an average performance improvement of 66%, with even greater gains for complex tasks.
  125. [125]
    The Projected Impact of Generative AI on Future Productivity Growth
    Sep 8, 2025 · We estimate that AI will increase productivity and GDP by 1.5% by 2035, nearly 3% by 2055, and 3.7% by 2075. AI's boost to annual ...
  126. [126]
    AI Giants Rethink Model Training Strategy as Scaling Laws Break ...
    Nov 20, 2024 · Scaling law basics: A classic 2020 paper shows that, assuming a sufficient quantity of data, a transformer network's performance rises ...
  127. [127]
    CRISPR Clinical Trials: A 2025 Update - Innovative Genomics Institute
    Jul 9, 2025 · An update on the progress of CRISPR clinical trials with the latest data and a survey of the CRISPR landscape in 2025.
  128. [128]
    CRISPR Therapy Progress and Prospects | The Scientist
    Jun 6, 2025 · CRISPR therapy is a promising gene editing approach that uses genome-targeting tools such as the CRISPR-Cas9 system to treat diseases by correcting mutations.
  129. [129]
    AI-powered CRISPR could lead to faster gene therapies, Stanford ...
    Sep 16, 2025 · CRISPR-GPT, a large language model developed at Stanford Medicine, is accelerating gene-editing processes and increasing accessibility to CRISPR ...
  130. [130]
    Revolutionizing CRISPR technology with artificial intelligence - Nature
    Jul 31, 2025 · CRISPR technology has revolutionized the field through its simplicity and ability to target specific genome regions via guide RNA and Cas ...
  131. [131]
    Which trends are set to shape the biotech industry in 2025?
    Jan 13, 2025 · According to a report released in December 2024, the global CRISPR and Cas gene market is expected to grow from $3.3 billion in 2023 to $8.8 ...
  132. [132]
    [PDF] biotechnology - 2025 TECH TRENDS REPORT • 18TH EDITION
    Mar 10, 2025 · In medicine, synthetic biology is enhancing cell-based therapies by programming stem cells with CRISPR to regenerate damaged tissues or produce.
  133. [133]
    Top 10 Trends in Biotechnology in 2025 | StartUs Insights
    Mar 20, 2025 · The advancements in data analytics, automation, and precision treatments are transforming biotechnology in 2025. Trends in biotechnology ...
  134. [134]
    Exploring superconducting electrons in twisted graphene
    Mar 3, 2025 · “In twisted graphene, electrons slow down, and the interaction between them somehow mixes with quantum mechanics in a bizarre way to create a ' ...
  135. [135]
    Promotion of superconductivity in magic-angle graphene multilayers
    Sep 29, 2022 · Graphene bilayers and trilayers consisting of monolayers twisted at just the right angle have been shown to be superconducting.
  136. [136]
    Observation of a second Dirac point in a graphene/superconductor ...
    Aug 1, 2024 · Here we show that single layer graphene coupled to the low-density superconductor indium oxide (InO) exhibits two charge neutrality points.
  137. [137]
    Discovery of new growth-directed graphene stacking domains may ...
    Dec 10, 2024 · A new phenomenon in graphene research, observing growth-induced self-organized ABA and ABC stacking domains that could kick-start the development of advanced ...
  138. [138]
    Superconductor Discovery in the Emerging Paradigm of Materials ...
    In this work, we review the computationally driven discoveries and the recent developments in the field from various essential aspects.
  139. [139]
    Surprisingly diverse innovations led to dramatically cheaper solar ...
    Aug 11, 2025 · A new study reveals key innovations that contributed to the rapid decline of solar energy systems, showing that many of the most significant ...
  140. [140]
    The Best Solar Panel Innovations to Watch in 2025 - JMS Energy
    Sep 22, 2025 · One of the most revolutionary solar panel innovations in 2025 is the rise of perovskite solar cells. These advanced materials can absorb more ...
  141. [141]
    Beyond Moore's Law. Examining the Parallels Between Compute…
    Jan 23, 2025 · The parallels between Moore's Law, renewable energy, and battery technology reveal an inspiring story of human ingenuity and progress. All three ...
  142. [142]
    Interactive: Highlights in energy innovation – The State of ... - IEA
    This innovative battery offers energy density comparable to lead-acid batteries but without the toxic lead, providing a cost-effective and environmentally ...
  143. [143]
  144. [144]
    Is AI already driving U.S. growth? | J.P. Morgan Asset Management
    In the first half of 2025, AI-related capital expenditures contributed 1.1% to GDP growth, outpacing the U.S. consumer as an engine of expansion.
  145. [145]
    A new look at the economics of AI | MIT Sloan
    Jan 21, 2025 · AI will affect almost 40% of jobs around the world, according to the International Monetary Fund. It will increase global GDP by $7 trillion — ...
  146. [146]
    Could Advanced AI Drive Explosive Economic Growth?
    Jun 25, 2021 · This report evaluates the likelihood of 'explosive growth', meaning > 30% annual growth of gross world product (GWP), occurring by 2100.
  147. [147]
    Study: Automation drives income inequality | MIT News
    Nov 21, 2022 · New data suggest most of the growth in income inequality since 1980 comes from automation displacing less-educated workers.
  148. [148]
    Inequality and technological change - Macera - Wiley Online Library
    May 21, 2024 · Specifically, technological change in labor productivity explains 87% of the increase in between-group inequality and 66% and 52% of the ...
  149. [149]
    Inequality in the digital economy: The impact of artificial intelligence ...
    The findings show that AI development worsens the income gap, with this result confirmed by robustness tests. Heterogeneity analysis indicates a stronger effect ...
  150. [150]
    60+ Stats On AI Replacing Jobs (2025) - Exploding Topics
    Oct 3, 2025 · The 2025 Future of Jobs report found that 92 million roles could be displaced by 2030, although it forecast a net gain of 78 million new jobs.
  151. [151]
    The Fearless Future: 2025 Global AI Jobs Barometer - PwC
    Jun 3, 2025 · Skills for AI-exposed jobs are changing 66% faster than for other jobs: more than 2.5x faster than last year. The AI-driven skills earthquake is ...
  152. [152]
    The impact of rapid technological change on sustainable development
    Feb 17, 2020 · Rapid technological change involves, among others, technologies like big data, the Internet of things, machine learning, artificial intelligence, robotics, 3D ...
  153. [153]
    [PDF] Technology, growth, and inequality - Brookings Institution
    The pandemic has accelerated the shift. Booming technology but slowing productivity and rising inequality. Technology drives productivity and productivity ...
  154. [154]
    AI-induced job impact: Complementary or substitution? Empirical ...
    This study utilizes 3,682 full-time workers to examine perceptions of AI-induced job displacement risk and evaluate AI's potential complementary effects on ...
  155. [155]
    Technological change in five industries: Threats to jobs, wages, and ...
    Sep 28, 2022 · Our team of researchers conducted multi-year studies of each industry, examining how new technologies are changing work and why.
  156. [156]
    [PDF] Policy Challenges of Accelerating Technological Change
    This paper examines policy, legal, ethical, and strategy implications for national security of the accelerating science, technology, and engineering (ST&E) ...
  157. [157]
    New Tech, New Threats, and New Governance Challenges
    Aug 28, 2019 · The array of new technologies emerging on the world stage, the new threats they can pose, and the associated governance dilemmas highlight a set of common ...
  158. [158]
    [PDF] AN INCLUSIVE FUTURE? TECHNOLOGY, NEW DYNAMICS, AND ...
    Unfortunately, policies and institutions have been slow to rise to the challenges of technological change. The outcomes of rising inequality and slowing ...
  159. [159]
    AI and the Future of Workforce Training
    However, significant challenges persist in the current workforce development landscape. These include fragmented training systems, insufficient public funding, ...
  160. [160]
    [PDF] Technology and the Future of Work - DNI.gov
    The future workplace is likely to be increasingly flexible but also increasingly insecure as companies demand new skill sets while no longer providing employees ...
  161. [161]
    Adapting jobs policies to technological change - World Bank Blogs
    Nov 4, 2020 · The changing nature of work, including diverse and fluid forms of employment—that is, the “gig” economy and part-time work—challenges this model ...