
Theory of everything

A theory of everything (ToE) is a hypothetical, all-encompassing, coherent theoretical framework in physics that seeks to unify the four fundamental interactions—gravity, electromagnetism, the strong nuclear force, and the weak nuclear force—into a single consistent model capable of describing all physical phenomena across all scales, from subatomic particles to the entire universe. This ultimate goal stems from the longstanding challenge of reconciling general relativity, Albert Einstein's theory of gravitation describing large-scale structures like planets and galaxies, with quantum mechanics, which governs the other three forces and particle interactions at microscopic scales. The incompatibility arises because general relativity treats spacetime as a smooth, continuous fabric, while quantum mechanics introduces inherent uncertainties and discreteness, leading to mathematical inconsistencies, such as infinities, when attempts are made to quantize gravity directly. The quest for a ToE traces its modern roots to the early 20th century, building on Einstein's unsuccessful efforts to unify gravitation and electromagnetism into a unified field theory, and has intensified since the 1970s with the development of the Standard Model of particle physics, which successfully unifies electromagnetism, the weak force, and the strong force but excludes gravity. Key motivations include resolving paradoxes in extreme conditions, such as the Big Bang singularity or black hole interiors, where both quantum effects and strong gravity dominate, potentially revealing the fundamental nature of spacetime itself. Despite theoretical progress, no experimentally verified ToE exists as of 2025, partly due to the immense energies required—at the Planck scale (approximately 10^{-35} meters or 10^{19} GeV)—which are inaccessible to current particle accelerators like the Large Hadron Collider. Prominent candidate theories include string theory and M-theory, which propose that fundamental particles are tiny vibrating strings in higher (10 or 11) dimensions, naturally incorporating gravity, and loop quantum gravity (LQG), which quantizes spacetime into discrete structures without extra dimensions. These approaches face challenges, such as the vast landscape of possible vacua in string theory and difficulties incorporating the Standard Model in LQG.
Recent observational efforts—including a 2020 NASA analysis that constrained models such as axion-like particles, and a proposed 2025 experiment to test quantum-gravity interactions—have narrowed the possibilities but provided no confirmation. In 2025, researchers proposed a new quantum theory of gravity compatible with the Standard Model, advancing efforts toward unification. Ongoing research emphasizes interdisciplinary methods, including holographic principles (like the AdS/CFT correspondence) and effective field theories, to probe quantum gravity indirectly through cosmology, astrophysics, and high-energy experiments. The pursuit of a ToE remains one of physics' grandest challenges, promising not only a deeper understanding of the universe's origins and fate but also potential technological breakthroughs.
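The Planck-scale figures quoted above follow directly from the constants ħ, G, and c. A minimal Python sketch, using CODATA constant values, reproduces them:

```python
import math

# Hedged illustration: deriving the Planck length and Planck energy
# quoted in the text from fundamental constants (SI units, CODATA values).
hbar = 1.054571817e-34  # reduced Planck constant, J s
G = 6.67430e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s
GeV = 1.602176634e-10   # 1 GeV expressed in joules

planck_length = math.sqrt(hbar * G / c**3)        # ~1.6e-35 m
planck_energy = math.sqrt(hbar * c**5 / G) / GeV  # ~1.2e19 GeV

print(f"Planck length: {planck_length:.2e} m")
print(f"Planck energy: {planck_energy:.2e} GeV")
```

Both values match the order-of-magnitude scales cited in the article, underscoring why no accelerator can reach this regime directly.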

Terminology

Origin of the Term

The term "theory of everything" (TOE) was first introduced in scientific literature by physicist John Ellis in his 1986 article published in Nature, where he used it somewhat ironically to describe the ambitious potential of superstring theory as a unified framework extending beyond the Standard Model of particle physics. Ellis, a theorist at CERN, employed the phrase in the title "The Superstring: Theory of Everything, or of Nothing?" to highlight both the promise and the uncertainties of this approach to unifying fundamental forces. Prior to Ellis's formal usage, the concept of a comprehensive unification had been pursued informally under different nomenclature, notably by Albert Einstein in the mid-20th century. In the 1950s, Einstein sought a "unified field theory" to integrate electromagnetism with gravitation within the framework of general relativity, viewing it as a pathway to describing all physical phenomena through a single set of equations. His efforts, detailed in publications like his 1950 Scientific American article "On the Generalized Theory of Gravitation," represented an early precursor to the modern TOE ideal, though it predated quantum considerations and the Standard Model. The term gained widespread popularization through Stephen Hawking's 1988 bestselling book A Brief History of Time, in which he described a TOE as the ultimate goal of physics, capable of explaining the entire universe's structure and evolution. Hawking positioned it as the culmination of scientific progress, bridging quantum mechanics and general relativity to answer fundamental questions about the universe. This exposure in accessible media helped embed the phrase in public discourse, distinguishing it from narrower unification efforts. Importantly, a TOE differs from a "grand unified theory" (GUT), which aims to merge only the strong, weak, and electromagnetic forces while excluding gravity. GUTs, such as the SU(5) model proposed in the 1970s, operate within the quantum field theory framework of the Standard Model but fall short of incorporating gravitational effects, whereas a TOE seeks full integration of all four fundamental interactions.

Definition and Scope

In physics, a theory of everything (TOE) is conceived as a single, consistent theoretical framework that reconciles general relativity with quantum mechanics while unifying all four fundamental forces: gravity, electromagnetism, the strong nuclear force, and the weak nuclear force. This unification posits the forces as manifestations of a single underlying interaction, potentially emerging from a "superforce" at extremely high energies, such as those prevalent in the early universe. The pursuit of such a theory addresses the core incompatibility between quantum field theory, which successfully describes the electromagnetic, strong, and weak forces, and general relativity, which governs gravity on large scales. A TOE must predict all particle interactions across every energy scale, from the Planck scale (~10¹⁹ GeV) to macroscopic regimes, thereby providing a complete description of matter and energy behavior without divergences or inconsistencies. It should also explain the origins and values of fundamental constants, such as the gravitational constant G or the fine-structure constant α, which are currently input parameters in existing models rather than derived quantities. Furthermore, resolving singularities—points of infinite density like those at black hole centers or the Big Bang origin—is essential, as general relativity predicts these breakdowns while quantum mechanics suggests finite resolutions at the Planck length (~10⁻³⁵ m). The scope of a TOE is confined to the fundamental laws of nature governing elementary particles and interactions, excluding emergent phenomena such as thermodynamics, condensed matter properties, or biological processes, which arise from collective behaviors at higher scales. Unlike effective field theories (EFTs), which provide accurate approximations valid only within specific energy ranges by integrating out high-energy details, a TOE aims for a complete, exact unification without scale-dependent cutoffs or free parameters.
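As a concrete example of a dimensionless constant a TOE would ideally derive rather than fit, the fine-structure constant α can be computed from measured quantities. A hedged sketch in Python, using CODATA values:

```python
import math

# The fine-structure constant alpha = e^2 / (4*pi*eps0*hbar*c), a
# dimensionless input parameter of current theories (~1/137).
e = 1.602176634e-19           # elementary charge, C
epsilon0 = 8.8541878128e-12   # vacuum permittivity, F/m
hbar = 1.054571817e-34        # reduced Planck constant, J s
c = 2.99792458e8              # speed of light, m/s

alpha = e**2 / (4 * math.pi * epsilon0 * hbar * c)
print(f"alpha ≈ {alpha:.6f} ≈ 1/{1 / alpha:.1f}")
```

Today α is measured, not explained; a TOE would be expected to predict this number from first principles.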

Historical Background

Ancient and Pre-Modern Concepts

In ancient Greek philosophy, the concept of a unified natural order began to emerge through atomism, proposed by Leucippus and developed by Democritus in the 5th century BCE. This theory posited that all matter consists of indivisible, eternal particles called atoms moving through an infinite void, with differences in atomic shapes, sizes, and arrangements accounting for the diversity of substances and phenomena in the cosmos. Democritus viewed this atomic framework as a comprehensive explanation for the natural world, eliminating the need for divine intervention and establishing a mechanistic basis for natural processes. In contrast, Aristotle (384–322 BCE) offered a holistic system of natural philosophy that unified the sublunary and celestial realms through four elemental substances—earth, water, air, and fire—each characterized by specific qualities (hot, cold, wet, dry) that allowed for their transformations into one another. Above the terrestrial sphere, Aristotle introduced the fifth element, aether, as an incorruptible, eternal substance composing the heavens and enabling uniform circular motion, thus integrating terrestrial changes with celestial perfection under a teleological worldview where all things strive toward their natural places and purposes. During the medieval period, scholastic thinkers synthesized Aristotelian natural philosophy with Christian theology, creating a unified intellectual framework for understanding the created order. Thomas Aquinas (1225–1274) exemplified this integration in works like the Summa Theologica, where he reconciled Aristotle's elements and causality with divine creation, positing that the four elements and the heavens operate within a hierarchical cosmos governed by God's rational design, with natural motions reflecting eternal truths. In the 17th and 18th centuries, mechanistic philosophies advanced toward a more unified view of forces, though still pre-quantum in nature.
René Descartes (1596–1650) proposed a vortex model in his Principia Philosophiae (1644), envisioning the universe as filled with swirling ethereal matter that carries celestial bodies in vortical motions, thereby explaining planetary orbits and gravitational effects through mechanical interactions without action at a distance. Isaac Newton (1643–1727), in his Philosophiæ Naturalis Principia Mathematica (1687), revolutionized this approach by introducing universal gravitation as a single force acting instantaneously between all masses, unifying terrestrial and celestial mechanics under mathematical laws while critiquing vortex theories as insufficient. These developments laid groundwork for later 19th-century efforts, such as the unification of electricity and magnetism.

19th and Early 20th Century Foundations

In the mid-19th century, James Clerk Maxwell achieved a major unification in physics by combining the previously separate phenomena of electricity and magnetism into a coherent theory of electromagnetism. This synthesis demonstrated that electric and magnetic fields are interconnected aspects of a single electromagnetic field, propagating through space as waves at the speed of light, thereby identifying light itself as an electromagnetic phenomenon. Maxwell's seminal work, "A Dynamical Theory of the Electromagnetic Field," published in 1865, formalized these ideas through a set of four partial differential equations, now known as Maxwell's equations, which govern the behavior of electric and magnetic fields in the presence of charges and currents. The equations, in their differential form, are: \nabla \cdot \mathbf{D} = \rho, \quad \nabla \cdot \mathbf{B} = 0, \quad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \quad \nabla \times \mathbf{H} = \mathbf{J} + \frac{\partial \mathbf{D}}{\partial t}, where \mathbf{D} is the electric displacement field, \mathbf{B} the magnetic field, \mathbf{E} the electric field, \mathbf{H} the magnetic field strength, \rho the charge density, and \mathbf{J} the current density. This framework provided a classical foundation for understanding forces beyond gravity and set the stage for later efforts to incorporate additional interactions. Toward the end of the 19th century, the discovery of radioactivity introduced evidence of subatomic processes and hinted at underlying forces within atomic nuclei. In 1896, French physicist Henri Becquerel observed that uranium salts emit invisible rays capable of penetrating materials and exposing photographic plates, even in the absence of light or external excitation, marking the first identification of spontaneous radiation from an element. Building on this, Pierre and Marie Curie isolated two highly radioactive elements from pitchblende ore in 1898: polonium, which they named after Marie's native Poland, and radium, which exhibited over a million times the radioactivity of uranium.
Their joint paper announcing these findings emphasized the rays' chemical and energetic potency, suggesting new atomic-scale interactions distinct from known chemical or electromagnetic processes. In the early 20th century, Albert Einstein's theory of special relativity, introduced in his 1905 paper "On the Electrodynamics of Moving Bodies," further integrated mechanics with electromagnetism by positing that the speed of light is constant in all inertial frames, leading to the relativity of space and time. This reformulation resolved inconsistencies between Newtonian mechanics and electromagnetism, establishing that space and time form a unified four-dimensional spacetime. A key consequence was the mass-energy equivalence principle, expressed as E = mc^2, where E is energy, m is mass, and c is the speed of light, implying that mass can be converted to energy and vice versa. Einstein extended this framework in 1915 with the general theory of relativity, which described gravity not as a force but as the curvature of spacetime caused by mass and energy, encapsulated in the field equations G_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}. Presented in his paper "The Field Equations of Gravitation," this geometric interpretation unified gravitation with the principles of relativity, predicting phenomena like the bending of light by massive bodies. However, general relativity's deterministic, continuous nature proved incompatible with the probabilistic, quantized ideas emerging in physics after 1900, such as Planck's hypothesis of energy quanta, highlighting the need for a deeper unification.
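Two of the unifications described above can be checked numerically: Maxwell's identification of light with electromagnetic waves traveling at 1/\sqrt{\mu_0 \epsilon_0}, and Einstein's mass-energy equivalence. A brief Python sketch:

```python
import math

# (1) Maxwell: electromagnetic waves propagate at 1/sqrt(mu0 * eps0),
#     which matches the measured speed of light.
mu0 = 4 * math.pi * 1e-7       # vacuum permeability, H/m
eps0 = 8.8541878128e-12        # vacuum permittivity, F/m
c_wave = 1 / math.sqrt(mu0 * eps0)
print(f"EM wave speed: {c_wave:.4e} m/s")

# (2) Einstein: E = m c^2 for one gram of mass fully converted to energy.
m = 1e-3                       # 1 gram, in kg
E = m * c_wave**2              # ~9e13 J
print(f"E = mc^2 for 1 g: {E:.3e} J")
```

The first result is what convinced Maxwell that light is an electromagnetic wave; the second illustrates the enormous energy scale hidden in the equivalence.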

Mid-to-Late 20th Century Unification Attempts

In the mid-20th century, significant progress toward unifying fundamental forces began with the development of quantum electrodynamics (QED) in the 1940s, which successfully integrated quantum mechanics with special relativity and electromagnetism. Pioneered independently by Sin-Itiro Tomonaga, Julian Schwinger, and Richard Feynman, QED addressed the infinities arising in perturbative calculations through the technique of renormalization, allowing for precise predictions of phenomena like the Lamb shift and the anomalous magnetic moment of the electron. This framework, formalized in a unified manner by Freeman Dyson, marked the first quantum field theory to achieve full consistency between quantum mechanics and special relativity, earning its creators the 1965 Nobel Prize in Physics. Building on QED's success, the 1960s and 1970s saw the electroweak unification, which combined the electromagnetic and weak forces under the SU(2) × U(1) gauge symmetry group. Sheldon Glashow proposed the initial model in 1961, introducing intermediate vector bosons to mediate weak interactions while preserving gauge invariance. Steven Weinberg and Abdus Salam extended this in 1967 by incorporating spontaneous symmetry breaking via the Higgs mechanism, predicting massive gauge bosons as carriers of the weak force. The theory's validity was experimentally confirmed in 1983 with the discovery of the W and Z bosons at CERN's Super Proton Synchrotron by the UA1 and UA2 collaborations, whose masses aligned closely with predictions, solidifying electroweak unification as a cornerstone of the Standard Model. Parallel efforts in the 1970s led to the formulation of quantum chromodynamics (QCD), describing the strong nuclear force through the exchange of gluons between quarks carrying color charge. Murray Gell-Mann and Harald Fritzsch, along with others, developed QCD as a non-Abelian gauge theory based on SU(3) color symmetry, introducing eight gluons as mediators that themselves possess color charge, enabling self-interactions. This resolved issues like the quark statistics in hadrons and explained asymptotic freedom, where the strong coupling weakens at high energies, as demonstrated by David Gross, David Politzer, and Frank Wilczek in 1973. QCD's predictions, such as deep inelastic scattering scaling, were verified at SLAC, establishing it as the theory of the strong force.
Early attempts to incorporate gravity into unification, such as the Kaluza-Klein theory from the 1920s, proposed extra spatial dimensions to geometrically unify gravity and electromagnetism within a five-dimensional framework. Kaluza's 1921 work derived electromagnetism from Einstein's field equations in five dimensions, with Oskar Klein later compactifying the extra dimension to explain its unobservability. However, these classical approaches, including Albert Einstein's prolonged pursuits from the 1920s through the 1950s—such as non-symmetric metrics and bi-vector fields—failed to account for quantum effects or particle spectra. Their inability to be quantized renormalizably, as infinities persisted without a viable scheme unlike in QED, rendered them incompatible with emerging quantum field theories by the mid-20th century.
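Asymptotic freedom, mentioned above, can be illustrated with the standard one-loop running of the strong coupling. The Λ value below is an assumed illustrative input, not a precision fit:

```python
import math

# Hedged one-loop sketch of QCD asymptotic freedom: the strong coupling
# alpha_s shrinks logarithmically as the energy scale Q grows.
def alpha_s(Q, n_f=5, Lambda=0.21):
    """One-loop running coupling; Q and Lambda in GeV, n_f active flavors."""
    b0 = 33 - 2 * n_f
    return 12 * math.pi / (b0 * math.log(Q**2 / Lambda**2))

for Q in (2.0, 91.2, 1000.0):
    print(f"alpha_s({Q:6.1f} GeV) ≈ {alpha_s(Q):.3f}")
```

At the Z mass (91.2 GeV) this leading-order estimate lands near the measured value of about 0.118, and the monotonic decrease with Q is the asymptotic-freedom behavior Gross, Politzer, and Wilczek demonstrated.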

Prerequisite Theories

Standard Model of Particle Physics

The Standard Model of particle physics is a quantum field theory that unifies the electromagnetic, weak, and strong nuclear forces, describing the interactions of fundamental particles excluding gravity. Developed through the electroweak unification proposed by Glashow, Weinberg, and Salam in the late 1960s and early 1970s, combined with quantum chromodynamics (QCD) for the strong force, it serves as the cornerstone of modern particle physics. The model treats particles as excitations of underlying quantum fields and has been extensively validated by experiments, though it leaves several parameters undetermined. The Standard Model features 17 fundamental particles: six quarks (up, down, charm, strange, top, and bottom), which carry color charge and experience the strong force; six leptons (the electron, muon, tau, and their corresponding neutrinos), which do not carry color; and five bosons—the photon (mediating electromagnetism), the W⁺ and W⁻ bosons and Z boson (mediating the weak force), and the Higgs boson (responsible for mass generation via the Higgs mechanism). The strong force is mediated by eight massless gluons, which are fundamental bosons associated with quark color interactions, though often grouped under the gluon type in particle counts. These particles are organized into three generations, with increasing mass from the first to the third, reflecting a hierarchical structure observed in experiments. The theoretical framework is invariant under the local gauge symmetry group SU(3)c × SU(2)L × U(1)Y, where SU(3)c governs QCD color symmetry, SU(2)L describes left-handed weak isospin, and U(1)Y accounts for hypercharge; spontaneous symmetry breaking via the Higgs field reduces the electroweak part to the observed electromagnetic U(1)em symmetry.
The dynamics are encoded in the Lagrangian density: \mathcal{L}_\text{SM} = -\frac{1}{4} G^a_{\mu\nu} G^{a\mu\nu} - \frac{1}{4} W^i_{\mu\nu} W^{i\mu\nu} - \frac{1}{4} B_{\mu\nu} B^{\mu\nu} + (D_\mu \Phi)^\dagger (D^\mu \Phi) - V(\Phi) + i \sum_f \bar{\psi}_f \gamma^\mu D_\mu \psi_f - \sum_f y_f \bar{\psi}_f \Phi \psi_f + \text{h.c.} Here, G^a_{\mu\nu}, W^i_{\mu\nu}, and B_{\mu\nu} are the field strength tensors for the gauge fields (gluons, weak bosons, and the hypercharge boson, respectively); \Phi is the Higgs doublet; V(\Phi) is the Higgs potential; \psi_f are the fermion fields; D_\mu is the covariant derivative incorporating gauge couplings; and y_f are Yukawa couplings. This form captures gauge interactions, fermion kinetics, Higgs self-interactions, and Yukawa terms for mass generation. Key successes include precise predictions of the weak mixing angle, with the effective \sin^2 \theta_W = 0.23149 \pm 0.00013 matching electroweak measurements from combined Z-pole data at LEP, SLC, and hadron colliders as of 2024. The model incorporates CP violation through a complex phase in the Cabibbo-Kobayashi-Maskawa (CKM) quark mixing matrix, explaining observed matter-antimatter asymmetries in kaon and B-meson decays. The 2012 discovery of the Higgs boson at the LHC, with mass m_H = 125.20 \pm 0.11 GeV as of 2024, confirmed the Higgs mechanism for electroweak symmetry breaking and particle masses, as reported by the ATLAS and CMS collaborations. Despite these triumphs, the Standard Model requires 19 free parameters—such as three gauge couplings, six quark and three charged lepton masses, two Higgs sector parameters, three CKM mixing angles plus one CP-violating phase, and the QCD theta angle—that must be fitted to experimental data. It excludes gravity, limiting its scope as a theory of everything. Furthermore, neutrino masses, evidenced by oscillation experiments starting with Super-Kamiokande's 1998 results, necessitate extensions like right-handed neutrinos or seesaw mechanisms beyond the minimal framework.
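The quoted weak mixing angle constrains the W mass through the tree-level relation m_W = m_Z \cos\theta_W. A hedged numerical sketch (loop corrections shift the measured value to about 80.4 GeV, slightly above this leading-order estimate):

```python
import math

# Tree-level electroweak relation m_W = m_Z * cos(theta_W), using the
# effective sin^2(theta_W) quoted in the text and the measured Z mass.
sin2_theta_w = 0.23149   # effective weak mixing angle (from the text)
m_Z = 91.1876            # Z boson mass, GeV

m_W_tree = m_Z * math.sqrt(1 - sin2_theta_w)
print(f"tree-level m_W ≈ {m_W_tree:.2f} GeV")
```

The closeness of this leading-order number to the measured W mass, and the calculability of the remaining gap via radiative corrections, is one of the precision successes mentioned above.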

General Relativity

General relativity, developed by Albert Einstein in the early 20th century, provides a geometric description of gravity as the curvature of spacetime caused by mass and energy. This theory revolutionized our understanding of the universe by replacing Newtonian gravity with a framework where gravitational effects emerge from the structure of four-dimensional spacetime. At its core, general relativity is formulated through the Einstein field equations, which relate the geometry of spacetime to the distribution of matter and energy within it: G_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu} Here, G_{\mu\nu} is the Einstein tensor representing spacetime curvature, T_{\mu\nu} is the stress-energy tensor describing mass-energy distribution, G is the gravitational constant, and c is the speed of light. These equations, first presented by Einstein in 1915, allow for the prediction of gravitational phenomena on cosmic scales, such as the bending of light by massive objects and the precession of planetary orbits. Key predictions of general relativity include gravitational waves, ripples in spacetime produced by accelerating masses like merging black holes, which were theoretically derived from the linearized field equations. These waves were first directly detected on September 14, 2015, by the LIGO observatories from a binary black hole merger, confirming the theory's dynamic aspects with unprecedented precision. Another cornerstone prediction is the existence of black hole event horizons, regions beyond which nothing can escape due to extreme curvature, arising from solutions like the Schwarzschild metric to the field equations for a non-rotating mass. Additionally, Einstein introduced the cosmological constant \Lambda into the field equations in 1917 to permit a static universe model, modifying them to G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}, where g_{\mu\nu} is the metric tensor; today, \Lambda is interpreted as driving the observed accelerated expansion of the universe.
The geometric interpretation of general relativity posits that mass-energy dictates the curvature of spacetime, while the curvature in turn determines the paths of particles and light via geodesics, the shortest paths in curved geometry given by the line element ds^2 = g_{\mu\nu} \, dx^\mu \, dx^\nu. This interplay is succinctly captured by physicist John Archibald Wheeler: "Matter tells spacetime how to curve; spacetime tells matter how to move." General relativity excels at macroscopic scales, accurately describing phenomena from planetary motion to the large-scale structure of the cosmos. However, it breaks down at the Planck length, approximately 1.6 \times 10^{-35} meters, where quantum effects become comparable to gravitational ones, necessitating a theory of quantum gravity for unification with other fundamental forces.
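The Schwarzschild solution mentioned above implies a horizon radius r_s = 2GM/c^2 for a non-rotating mass. A short sketch for the Sun:

```python
# Schwarzschild radius r_s = 2GM/c^2: the horizon radius a non-rotating
# mass would have if compressed inside it (illustrative, SI units).
G = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8     # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg

r_s = 2 * G * M_sun / c**2
print(f"Schwarzschild radius of the Sun: {r_s / 1000:.2f} km")
```

The Sun would have to be compressed below roughly 3 km to form an event horizon, which conveys how extreme the curvature regimes are where general relativity and quantum mechanics collide.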

Key Incompatibilities

The key incompatibilities between the Standard Model of particle physics, which is a renormalizable quantum field theory, and general relativity, a classical theory of gravitation, arise primarily at high energies and in extreme gravitational regimes, necessitating a unified theory of everything. One fundamental issue is the non-renormalizability of gravity when treated as a quantum field theory. In quantum field theory, interactions are described by coupling constants, and for theories like QED, these allow infinities in perturbative calculations to be absorbed through renormalization with a finite number of parameters. However, gravity's coupling constant, Newton's gravitational constant G, has dimensions of [\text{mass}]^{-2} in natural units, leading to a power-counting analysis that shows the theory requires an infinite number of counterterms at higher loop orders, rendering it non-renormalizable beyond the low-energy effective regime. This was first demonstrated at one loop by 't Hooft and Veltman in pure gravity and gravity coupled to matter fields, where divergent terms appear that cannot be absorbed into the existing action without introducing new parameters. Further confirmation came at two loops in pure Einstein gravity, where Goroff and Sagnotti explicitly computed a non-vanishing divergent contribution proportional to a new quartic curvature invariant, underscoring the perturbative breakdown at high energies around the Planck scale (\sim 10^{19} GeV). Unlike the renormalizable gauge interactions of the Standard Model, this non-renormalizability implies that quantized general relativity loses predictive power at ultraviolet scales, preventing a consistent quantum description of gravitational phenomena. Another profound conflict emerges in black hole physics, exemplified by the black hole information paradox. Hawking radiation, arising from quantum field effects in the curved spacetime near a black hole's event horizon, causes black holes to emit thermal radiation and evaporate over time.
This process appears to violate quantum mechanical unitarity, as the radiation is uncorrelated with the infalling matter that formed the black hole, suggesting that information about the initial state is irretrievably lost behind the event horizon. Hawking formalized this paradox by arguing that the predictability of quantum evolution breaks down in black hole evaporation, where the semiclassical approximation leads to a mixed state for the outgoing radiation rather than a pure state preserving information. The event horizon in general relativity acts as a one-way membrane, incompatible with the reversible unitary evolution of quantum mechanics, thus highlighting a tension between the two frameworks at the interface of quantum fields and strong gravity. Cosmological scenarios reveal additional incompatibilities, particularly through singularities and quantum effects in the early universe. General relativity predicts an initial singularity, where the density and curvature diverge at a finite past time, rendering physical quantities undefined as the universe contracts to zero volume under the Friedmann-Lemaître-Robertson-Walker metric. The Hawking-Penrose singularity theorems establish that such geodesic incompletenesses—points where timelike or null geodesics cannot be extended—are generic outcomes in spacetimes satisfying reasonable energy conditions and causality assumptions, as seen in gravitational collapse to a singularity or black hole formation. This classical singularity signals the breakdown of general relativity at high densities, where quantum effects should dominate but are absent in the theory. Compounding this, cosmic inflation—a rapid exponential expansion in the early universe driven by a scalar inflaton field—relies on quantum fluctuations of that field to seed the observed large-scale structure and cosmic microwave background anisotropies. These primordial quantum fluctuations, stretched to macroscopic scales during inflation, introduce density perturbations that conflict with the smooth, singularity-prone initial conditions of pure general relativity, requiring a quantum gravitational resolution to consistently describe their origin and evolution. Finally, the hierarchy problem underscores a scale mismatch between the electroweak scale and the Planck scale.
The electroweak scale, set by the Higgs vacuum expectation value around 246 GeV (\sim 10^2 GeV), is extraordinarily smaller than the Planck scale (\sim 10^{19} GeV), where quantum gravitational effects become significant—a discrepancy of about 17 orders of magnitude. In the Standard Model, radiative corrections to the Higgs mass from loops involving top quarks or other particles generate quadratic divergences proportional to the cutoff scale, potentially driving the Higgs mass up to the Planck scale unless an unnatural fine-tuning of parameters cancels these contributions to 1 part in 10^{32}. This sensitivity implies that physics at the electroweak scale is unnaturally decoupled from higher-scale physics, motivating mechanisms to stabilize the hierarchy without such tuning.
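Two numbers from this discussion can be made concrete: the Hawking temperature of a solar-mass black hole, and the electroweak-to-Planck hierarchy. A hedged numerical sketch:

```python
import math

# (1) Hawking temperature T = hbar*c^3 / (8*pi*G*M*k_B) for one solar
#     mass: astronomically cold, which is why it has never been observed.
hbar = 1.054571817e-34   # J s
c = 2.99792458e8         # m/s
G = 6.67430e-11          # m^3 kg^-1 s^-2
k_B = 1.380649e-23       # Boltzmann constant, J/K
M_sun = 1.989e30         # kg

T_hawking = hbar * c**3 / (8 * math.pi * G * M_sun * k_B)
print(f"Hawking temperature (1 solar mass): {T_hawking:.2e} K")

# (2) The hierarchy: Planck energy over the electroweak scale (~246 GeV).
E_planck_GeV = math.sqrt(hbar * c**5 / G) / 1.602176634e-10
hierarchy = E_planck_GeV / 246.0
print(f"Planck / electroweak ratio: {hierarchy:.1e}")
```

The ratio of roughly 10^{17} is the "about 17 orders of magnitude" gap driving the fine-tuning concern described above.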

Primary Theoretical Frameworks

String Theory

String theory posits that the fundamental constituents of the universe are not zero-dimensional point particles, as in the Standard Model of particle physics, but rather one-dimensional strings of finite length, typically on the order of the Planck length (10^{-35} meters). These strings can be open, with endpoints, or closed loops, and their vibrational modes determine the properties of the particles they represent, such as mass, charge, and spin. For instance, different excitation patterns of a string correspond to familiar particles like quarks, electrons, or photons, thereby providing a unified description of matter and forces at the quantum level. To reconcile this framework with our observed four-dimensional spacetime (three spatial dimensions plus time), string theory requires a total of 10 dimensions: nine spatial and one temporal. The additional six spatial dimensions must be compactified, meaning they are curled up into tiny, unobservable shapes at scales far below those probed by current experiments. A prominent mechanism for this compactification involves Calabi-Yau manifolds, which are complex, Ricci-flat Kähler manifolds that preserve supersymmetry, allowing the theory to yield low-energy effective theories consistent with four-dimensional physics. This dimensional reduction is crucial for generating the diversity of particle generations and interactions observed in nature. One of string theory's key strengths as a candidate for a theory of everything lies in its natural unification of all fundamental forces, including gravity, which emerges as a vibration of closed strings corresponding to the massless spin-2 graviton. Unlike point-particle quantum field theories, where gravity must be added by hand and leads to inconsistencies at high energies, string theory incorporates gravity seamlessly from the outset. A pivotal development was the Green-Schwarz mechanism in 1984, which demonstrated anomaly cancellation—resolving quantum inconsistencies in the theory—through the specific coupling of the two-form field to the gauge background in ten dimensions.
This mechanism ensured the consistency of supersymmetric string theories and solidified their viability for unification. Perturbative formulations of string theory reveal five consistent superstring theories in ten dimensions: Type I, which includes both open and closed strings with SO(32) gauge symmetry; Type IIA and Type IIB, closed-string theories that differ in their chirality properties; and two heterotic theories, one with SO(32) symmetry and the other with E8×E8 symmetry. These theories are valid up to the string scale, an energy regime around 5 \times 10^{17} GeV, beyond which quantum gravitational effects become significant and the perturbative approximations break down. This high scale underscores the challenge of direct experimental verification but aligns with grand unification expectations.

M-Theory

M-theory is a proposed unifying framework in theoretical physics that extends the five consistent superstring theories in ten dimensions into a single eleven-dimensional theory, conjectured to capture their non-perturbative aspects and dualities. It was first proposed by Edward Witten in 1995 during a string theory conference at the University of Southern California, where he suggested that the apparent inconsistencies among the superstring theories arise from different perturbative limits of a deeper, underlying structure in eleven dimensions. The "M" in M-theory has been interpreted by Witten as standing for "membrane," "mystery," or "magic," reflecting the theory's foundational role with extended objects beyond one-dimensional strings and the unresolved nature of its complete formulation at the time. Central to M-theory are extended objects known as branes, particularly the two-dimensional M2-branes and five-dimensional M5-branes, which serve as fundamental dynamical entities analogous to strings in string theory. These branes propagate in an eleven-dimensional spacetime and can wrap around compactified dimensions to yield lower-dimensional effective theories. The low-energy effective description of M-theory is given by eleven-dimensional supergravity, whose bosonic action is S = \frac{1}{2\kappa^2} \int d^{11}x \sqrt{-g} \left( R - \frac{1}{2} |F_4|^2 \right) + \text{Chern-Simons terms}, where R is the Ricci scalar, F_4 is the field strength of the three-form gauge potential, \kappa is the gravitational coupling constant, and the Chern-Simons terms ensure anomaly cancellation and consistency. This supergravity theory, originally formulated by Cremmer, Julia, and Scherk in 1978, emerges as the low-energy limit where quantum corrections are negligible, providing a classical approximation to M-theory's dynamics. M-theory resolves the dualities observed among the superstring theories, such as T-duality—which inverts the radius of compact dimensions, relating theories compactified on large and small circles—and S-duality, which exchanges strong and weak coupling regimes directly.
For instance, type IIA string theory at strong coupling is dual to M-theory compactified on a circle, while the E8 × E8 heterotic string connects via dualities involving M5-branes wrapped on tori. One key success of this framework is its explanation of black hole entropy through microscopic counting of brane configurations; Strominger and Vafa demonstrated in 1996 that the Bekenstein-Hawking entropy of certain five-dimensional extremal black holes matches the logarithm of the number of states in a system of intersecting D-branes, which lift to M-theory branes upon dimensional enhancement. Despite these advances, M-theory lacks a complete formulation, remaining defined primarily through its limits and dualities rather than a standalone non-perturbative definition. Furthermore, the theory predicts a vast "landscape" of possible vacuum states, estimated at around 10^{500} distinct compactifications arising from fluxes and moduli stabilization in the extra dimensions, posing significant challenges for selecting the unique vacuum corresponding to our universe. This landscape problem underscores ongoing difficulties in deriving low-energy phenomenology predictively from M-theory.

Loop Quantum Gravity

Loop quantum gravity (LQG) is a non-perturbative, background-independent approach to quantizing general relativity, treating spacetime as a dynamical entity without introducing additional dimensions or fundamental particles like gravitons. Developed primarily in the 1980s and 1990s by Abhay Ashtekar, Carlo Rovelli, and Lee Smolin, LQG reformulates the canonical structure of general relativity using Ashtekar variables, which recast the gravitational field in terms of a self-dual SU(2) connection and conjugate densitized triad, akin to gauge theories in particle physics. This formulation facilitates a representation where quantum states are defined on holonomies—path-ordered exponentials of the connection along loops—and fluxes, which are integrals of the conjugate triad fields over surfaces dual to those loops. The resulting Hilbert space is spanned by cylindrical functions on these loops, ensuring diffeomorphism invariance through the action of spatial diffeomorphisms on the loop configurations. In LQG, the quantum geometry emerges from spin network states, which are graphs labeled by SU(2) representations (spins) at edges and intertwiners at vertices, providing a discrete picture of spacetime at the Planck scale. Geometric operators, such as area and volume, acquire discrete spectra; for instance, the area operator acting on a surface pierced by a spin network edge with quantum number j yields eigenvalues A = 8\pi \gamma \ell_P^2 \sqrt{j(j+1)}, where \gamma is the Immirzi parameter—a dimensionless constant introduced to match black hole entropy calculations—and \ell_P = \sqrt{\hbar G / c^3} is the Planck length. This quantization resolves classical singularities: in loop quantum cosmology, an application of LQG to homogeneous spacetimes, the Big Bang singularity is replaced by a Big Bounce, where quantum effects halt contraction and initiate expansion at Planck densities, as derived from holonomy corrections to the Hamiltonian constraint. LQG preserves the diffeomorphism invariance of general relativity at the quantum level, with no need for extra dimensions or supersymmetry, making it a pure quantization of four-dimensional geometry.
A key success is the microscopic derivation of black hole entropy, where the Bekenstein-Hawking formula S = A / (4 \ell_P^2) arises from counting spin network microstates on the horizon, with the Immirzi parameter fixed to \gamma \approx 0.274 to match the macroscopic value. Variants like spin foam models address the dynamics of LQG by summing over histories of spin networks, evolving them via two-complexes labeled by representations, providing a path-integral formulation that connects to the canonical approach. However, challenges persist in recovering the semiclassical limit of general relativity: coherent states approximating classical geometries have been constructed, but full consistency with low-energy observations remains an open problem. Coupling matter fields, such as scalars or fermions from the Standard Model, to this quantum geometry is straightforward at the classical level via minimal coupling but complicates the full quantization, particularly in preserving anomaly-free constraints and achieving a semiclassical limit with propagating matter degrees of freedom.
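For scale, the macroscopic Bekenstein-Hawking value that the microstate counting must reproduce can be evaluated directly from S = A / (4 \ell_P^2). This is a minimal sketch using standard constants and the Schwarzschild radius r_s = 2GM/c² for a non-rotating hole; the solar-mass example is illustrative only.

```python
import math

G = 6.67430e-11         # m^3 kg^-1 s^-2
C = 2.99792458e8        # m s^-1
HBAR = 1.054571817e-34  # J s
M_SUN = 1.989e30        # kg

L_P2 = HBAR * G / C**3  # Planck length squared

def bh_entropy_in_kB(mass_kg):
    """Bekenstein-Hawking entropy S = A / (4 l_P^2), in units of k_B,
    for a non-rotating (Schwarzschild) black hole of the given mass."""
    r_s = 2 * G * mass_kg / C**2   # Schwarzschild radius
    area = 4 * math.pi * r_s**2    # horizon area
    return area / (4 * L_P2)

s_sun = bh_entropy_in_kB(M_SUN)    # ~1e77 k_B for one solar mass
```

The enormous result (~10^77 k_B for a solar-mass hole) is why reproducing it from a discrete state count is regarded as a nontrivial check on any quantum gravity proposal.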

Alternative and Emerging Proposals

Non-Perturbative Quantum Gravity Approaches

Non-perturbative quantum gravity approaches seek to quantize gravity without relying on perturbative expansions, addressing the ultraviolet divergences and non-renormalizability of perturbatively quantized general relativity through lattice-based, fixed-point, or reformulation strategies that preserve background independence. These methods contrast with string theory's extra dimensions and loop quantum gravity's discrete spin networks by emphasizing geometric discreteness, causality constraints, or renormalization group flows directly in four dimensions. Developed primarily in the late 20th and early 21st centuries, they aim to derive classical spacetime from quantum principles while incorporating matter and potentially unifying gravity with the Standard Model. Causal dynamical triangulation (CDT) constructs spacetime via a path integral over Lorentzian triangulations, enforcing causality through time-ordered simplicial complexes that evolve from initial to final hypersurfaces. Introduced by Ambjørn, Jurkiewicz, and Loll, this approach sums over geometries in which each simplex respects a global causal structure, avoiding the unphysical results of earlier Euclidean dynamical triangulations. Numerical simulations reveal a phase transition to a four-dimensional de Sitter-like geometry at large scales, with Hausdorff dimension approximately 4 and spectral dimension around 2 in the ultraviolet, suggesting a well-defined continuum limit without extra dimensions. Asymptotic safety posits that quantum gravity is renormalizable at high energies through a non-Gaussian fixed point of the renormalization group flow, where Newton's constant runs with the energy scale k as G(k) ≈ g_* / k² near the fixed point, ensuring finite scattering amplitudes. Proposed by Weinberg in the late 1970s as a criterion for a predictive quantum theory of gravity, the scenario gained traction from the 1990s onward via the functional renormalization group equation applied to the effective average action. Calculations in the Einstein-Hilbert truncation and beyond, including matter couplings, support the existence of such a fixed point with only a few relevant directions, potentially allowing ultraviolet completion without new physics.
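The fixed-point behavior underlying asymptotic safety can be illustrated with a toy beta function for the dimensionless coupling g(k) = G(k)·k². The form β(g) = 2g − bg² and the value b = 3 are schematic placeholders, not the actual Einstein-Hilbert truncation result; they merely reproduce the qualitative structure of a Gaussian fixed point at g = 0 and a non-Gaussian one at g* = 2/b.

```python
# Toy renormalization-group flow for the dimensionless Newton coupling
# g(k) = G(k) * k^2.  beta(g) = 2g - b*g^2 is a schematic ansatz with a
# Gaussian fixed point at g = 0 and a non-Gaussian (asymptotically safe)
# fixed point at g* = 2/b; b = 3 is an arbitrary illustrative constant.
b = 3.0
g_star = 2.0 / b

def flow(g0, t_max=20.0, dt=1e-3):
    """Euler-integrate dg/dt = beta(g), with t = ln k, up to t = t_max."""
    g = g0
    t = 0.0
    while t < t_max:
        g += dt * (2.0 * g - b * g * g)
        t += dt
    return g

# Couplings starting below or above g* both flow toward it as k grows,
# which is what makes Newton's constant behave as G(k) ~ g*/k^2 in the UV.
g_from_below = flow(0.01)
g_from_above = flow(2.0)
```

Because g approaches the finite value g*, the dimensionful G(k) = g(k)/k² falls off at high energies instead of blowing up, which is the mechanism the paragraph above describes.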
Twistor theory reformulates Minkowski spacetime in terms of a space of twistors—null rays parameterized by complex coordinates Z^α = (ω^A, π_{A'}), where A and A' are unprimed and primed spinor indices—enabling holomorphic representations of massless fields and scattering amplitudes. Developed by Penrose in the 1960s to unify general relativity and quantum theory by prioritizing light rays over spacetime points, it has found modern applications through twistor-string theory, where worldsheet integrals over twistor space yield tree-level amplitudes incorporating gauge and gravitational interactions. This framework facilitates exact computations of amplitudes in gauge and gravity theories, revealing hidden symmetries and supporting non-perturbative insights into entropy and holographic dualities. Group field theory generalizes matrix models to higher dimensions by treating spacetime as a second-quantized field theory over copies of a Lie group, where fields φ(g_i) defined on those group copies generate Feynman diagrams that condense into emergent spatial geometries via mean-field approximations. Linked to loop quantum gravity through holonomy-flux variables, it derives spin foam amplitudes as effective descriptions, with phase transitions producing four-dimensional Regge geometries from quantum fluctuations. This approach, formalized in the mid-2000s, allows incorporation of cosmological dynamics and matter fields, providing a unified framework for emergent spacetime built from group-theoretic constituents.

Recent 21st-Century Developments

In 2010, physicist Erik Verlinde proposed entropic gravity, suggesting that gravity emerges as an entropic force arising from changes in information entropy, rather than being a fundamental interaction. This framework draws on holographic principles, where the entropy associated with information stored on a surface encodes the dynamics of the bulk spacetime, leading to Newton's law of gravitation as a thermodynamic consequence. Verlinde's 2010 paper demonstrated how the holographic principle implies that inertial laws stem from entropic origins, providing a bridge between quantum information theory and gravitational phenomena without requiring new particles or fields. In late 2024, the Alena Tensor was introduced as a novel mathematical construct aimed at reconciling general relativity and quantum mechanics through energy-momentum tensors that model quantum geometry. Defined as a class of energy-momentum tensors ensuring equivalence between curved paths and geodesics, the Alena Tensor facilitates unification by embedding gravitational effects into field-theoretic frameworks, potentially resolving inconsistencies in quantum gravity calculations. Researchers have shown it generates Higgs-like potentials in a manner compatible with observed particle interactions, offering a pathway to incorporate quantum field theory and electrodynamics into a cohesive TOE structure. That same year, the three-dimensional time theory emerged as a speculative proposal, positing that time possesses three fundamental dimensions, with spatial dimensions arising as secondary projections from quantum symmetries. This approach derives gravitational effects directly from quantum principles by treating the three temporal axes as primary, where one governs quantum phenomena, another handles relativistic motion, and the third accounts for cosmic expansion. By reorienting metrics to prioritize temporal structure, the theory predicts emergent space as a holographic byproduct, potentially unifying particle interactions and gravity without invoking extra spatial dimensions. Parallel developments in 2025 introduced the quantum memory matrix, conceptualizing the universe as a holographic system where Planck-scale cells store information from all events, thereby resolving the black hole information paradox.
In this model, discrete quantum imprints on the spacetime fabric maintain unitarity, ensuring that information is preserved rather than lost during black hole evaporation, as per the quantum memory matrix framework. The theory posits that gravitational dynamics arise from density gradients in these memory cells, aligning with holographic bounds and offering potentially testable predictions for observational anomalies. Finally, a 2025 breakthrough reformulated Einstein's field equations into a renormalizable gauge theory fully compatible with the Standard Model, addressing non-renormalizability at the Planck scale through symmetry alignments between gravitational and particle fields. Developed by researchers at Aalto University, this approach treats gravity as an interaction mediated by gauge fields akin to those in the Standard Model, enabling perturbative calculations that avoid infinities and incorporate all known forces. The reformulation preserves classical limits while predicting deviations in high-energy regimes, such as near black holes, potentially paving the way for a unified description. In November 2025, quadratic gravity—a theory originally proposed by Kellogg Stelle in 1977 that incorporates higher-order curvature terms in the gravitational action—experienced a significant revival as a candidate for quantum gravity. This approach introduces a massless graviton, a massive spin-2 particle with negative norm (a ghost), and a massive scalar particle, aiming for renormalizability without strings or extra dimensions. Recent work by physicists including John Donoghue and collaborators has argued that the ghosts do not destabilize the theory or violate quantum unitarity, potentially explaining cosmic inflation and providing ultraviolet completion at high energies. This resurgence positions quadratic gravity as a viable alternative in the quest for a theory of everything.
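Verlinde's entropic-force argument mentioned at the start of this section can be summarized in a few lines. This is a compressed sketch of the heuristic in his 2010 paper, not a full derivation:

```latex
% Entropy change of a holographic screen when a test mass m moves by \Delta x:
\Delta S = 2\pi k_B \,\frac{mc}{\hbar}\,\Delta x .
% Number of bits on a spherical screen of radius R enclosing mass M:
N = \frac{A c^3}{G \hbar}, \qquad A = 4\pi R^2 .
% Equipartition of the enclosed energy over the bits fixes the temperature:
E = \tfrac{1}{2}\, N k_B T = M c^2
\;\Longrightarrow\;
k_B T = \frac{G \hbar M}{2\pi c R^2} .
% The entropic force F = T\,\Delta S/\Delta x then reproduces Newton's law:
F = \frac{2\pi m c}{\hbar}\, k_B T = \frac{G M m}{R^2} .
```

The striking feature is that Newton's inverse-square law drops out of purely thermodynamic and holographic inputs, with no gravitational field postulated at the start.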

Criticisms and Limitations

Philosophical Objections

One prominent philosophical objection to a complete theory of everything (TOE) draws from Kurt Gödel's incompleteness theorems, which demonstrate that in any consistent formal system capable of expressing basic arithmetic, there exist true statements that cannot be proven within the system itself. These theorems, published in 1931, imply that if the laws of physics constitute such a formal system, a TOE could not be both consistent and complete, as it would fail to derive or prove all physical truths from its own axioms, leaving some aspects inherently unresolvable without external assumptions. This limitation challenges the notion of a fully self-contained TOE, suggesting that physics may always require undecidable propositions or meta-theories beyond its foundational framework. Another critique arises from the limits of reductionism, as articulated by physicist Philip W. Anderson in his 1972 essay "More is Different." Anderson argues that while higher-level phenomena in complex systems can be understood in terms of lower-level components, the emergent properties at each scale introduce new fundamental laws that cannot be fully predicted or derived solely from the underlying physics, due to factors like scale, complexity, and broken symmetry. For instance, phenomena such as superconductivity in condensed matter or the collective behavior of living systems defy complete reduction to quantum or particle-level descriptions, implying that a TOE focused only on fundamental particles and forces would miss irreducible complexities at macroscopic scales. This perspective underscores that "more is different," challenging the reductionist dream of a TOE that exhaustively explains all phenomena from a single set of microscopic rules. The multiverse hypothesis emerging from string theory's landscape further complicates the pursuit of a unique TOE, invoking the anthropic principle to explain fine-tuned constants.
In string theory, the vast "landscape" of possible vacua—estimated at 10^{500} or more—generates an ensemble of universes with varying physical laws, where our universe's parameters appear life-permitting only because observers like us could not exist in others. This framework, articulated by Leonard Susskind in 2003, suggests that no single, predictive TOE exists; instead, the effective theory is selected anthropically from a multiverse, rendering the quest for a unique, fundamental description philosophically unsatisfying as it replaces unification with probabilistic selection. Critics argue this dilutes the explanatory power of a TOE, shifting from deterministic laws to an ensemble where predictability is lost. Recent efforts, including AI-driven explorations as of 2025, aim to sift through this landscape but have not yet resolved its implications for predictability. Finally, definitional issues with the concept of "fundamental" laws highlight epistemological barriers, as outlined in Thomas Kuhn's 1962 analysis of scientific revolutions. Kuhn posits that what constitutes fundamental principles evolves through paradigm shifts, where revolutionary changes in scientific worldview redefine the basic categories of nature, making prior "fundamentals" obsolete or contextual. In the context of a TOE, this implies that no theory can be eternally fundamental, as future paradigms may reveal current axioms as approximations or artifacts of historical contingencies, perpetually deferring a truly complete unification. Such shifts, as seen in the transition from Newtonian to relativistic physics, suggest that the very definition of a TOE remains paradigm-dependent, undermining claims of absolute finality. Recent 2025 analyses have further applied Gödel's theorems to argue that physical theories cannot be both consistent and complete, reinforcing these epistemological limits.

Technical and Empirical Challenges

One of the primary technical challenges in developing a theory of everything (TOE) arises from the landscape problem in string theory, where the theory predicts an extraordinarily vast number of possible vacuum states, estimated at approximately 10^{500}, each corresponding to different low-energy effective theories with varying physical constants. This proliferation stems from the freedom in choosing flux configurations on compactified extra dimensions, as detailed in analyses of type IIB string compactifications on Calabi-Yau manifolds. Without a robust dynamical mechanism to select the specific vacuum that matches our observed universe—such as the measured values of the cosmological constant or particle masses—the landscape undermines the predictive power of string theory as a TOE, rendering it difficult to distinguish the correct solution from the multitude of alternatives. As of 2025, machine learning approaches continue to explore this vast space but have yet to provide a unique selection mechanism. Another significant obstacle is the issue of non-computability in physical systems, particularly in chaotic or nonlinear regimes where exact solutions cannot be algorithmically determined even with computable initial conditions. Seminal work by Pour-El and Richards demonstrated this through the wave equation in Minkowski spacetime, showing that while the initial data can be computable, the unique solution at certain points may take non-computable real values, challenging the Church-Turing thesis in physical contexts. This result, from the early 1980s, extends to implications for quantum gravity, where the inherent nonlinearity and chaos in gravitational dynamics at the Planck scale suggest that full simulations or predictions may be fundamentally impossible on classical computers, complicating verification of any TOE candidate. Recent 2025 research has strengthened these concerns by linking undecidability directly to quantum gravity via Gödel's theorems, indicating that no algorithmic TOE for gravity can be complete.
Empirically, the inaccessibility of the Planck energy scale poses a formidable barrier to direct testing of TOE proposals, as this scale—around 10^{19} GeV—marks the regime where quantum gravitational effects become dominant, far beyond the capabilities of current particle accelerators. The Large Hadron Collider (LHC), operating at a center-of-mass energy of up to 14 TeV (1.4 × 10^4 GeV), probes electroweak-scale physics but falls short by approximately 15 orders of magnitude, making direct observation of Planck-scale phenomena infeasible with foreseeable technology. Consequently, empirical validation relies on indirect probes through cosmological observations, such as cosmic microwave background anisotropies or gravitational wave signals, which provide constraints but lack the precision to uniquely identify a TOE. Furthermore, many TOE candidates, including string theory, suffer from a lack of falsifiable predictions at accessible energy scales, contravening the Popperian criterion for scientific theories that requires empirical testability to potentially refute them. Critics note that string theory accommodates the Standard Model and general relativity but generates no unique, testable signatures distinguishable from other models, such as specific particle spectra or coupling constants that could be confirmed or disproved at colliders like the LHC. This ambiguity allows the theory to evade direct falsification, raising concerns about its status as empirical science and hindering progress toward a verifiable TOE.
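The 15-orders-of-magnitude gap quoted above is simple arithmetic; a short check, taking the Planck energy as roughly 1.22 × 10^{19} GeV and the LHC design energy as 14 TeV:

```python
import math

PLANCK_ENERGY_GEV = 1.22e19  # Planck energy scale in GeV
LHC_ENERGY_GEV = 1.4e4       # LHC design centre-of-mass energy (14 TeV)

ratio = PLANCK_ENERGY_GEV / LHC_ENERGY_GEV     # ~9e14
orders_of_magnitude = math.log10(ratio)        # ~15, as stated in the text
```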

Current Status and Outlook

Experimental Constraints

The cosmic microwave background (CMB) radiation, as measured by the Planck satellite's 2018 data release and complemented by the Atacama Cosmology Telescope (ACT) Data Release 6 in March 2025, exhibits power spectra and polarization patterns fully consistent with standard inflationary models under general relativity, revealing no detectable signatures of quantum gravity effects such as modified tensor-to-scalar ratios or non-standard primordial perturbations. These observations tightly constrain the parameter space for extra dimensions in theories like string theory, where large or warped extra dimensions could imprint detectable anomalies in CMB anisotropies through altered inflationary dynamics or Kaluza-Klein modes; the absence of such features limits the compactification scale to below approximately 10^{-3} eV for two extra dimensions, based on analyses incorporating Planck and ACT data. This lack of evidence underscores the challenge for TOE candidates to produce observable quantum gravity imprints at CMB-accessible energy scales, around 10^{13} GeV. Gravitational wave detections by the LIGO, Virgo, and KAGRA observatories—over 200 events as of March 2025, starting from the first in 2015 (GW150914)—have rigorously tested general relativity in the highly dynamical, strong-field regime of binary black hole and neutron star mergers, with signal waveforms, ringdown modes, and propagation speeds matching GR predictions to within percent-level precision and no observed deviations. Such consistency excludes certain signatures anticipated in string theory, including extra polarizations or damping from compact extra dimensions, as well as effects like discrete spacetime structure leading to modified dispersion relations, which would manifest as phase shifts or amplitude anomalies in the detected signals. Recent 2025 analyses of ringdown signals further constrain quadratic gravity models, with null results bounding the scale of deviations to energies above 10^{17} GeV, far beyond current sensitivity but highlighting the adherence of macroscopic gravity to classical expectations.
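The quoted sub-meV compactification bound for two extra dimensions traces back to the large-extra-dimension (ADD) relation M_Pl² ~ M_*^{n+2} R^n in natural units. The sketch below assumes a fundamental scale M_* = 1 TeV purely for illustration (it is an input, not a measurement) and recovers a compactification scale 1/R of order 10^{-4} eV, consistent in order of magnitude with the bound above.

```python
# Order-of-magnitude sketch of the ADD large-extra-dimension relation
# M_Pl^2 ~ M_*^(n+2) * R^n in natural units (hbar = c = 1).
# M_STAR_GEV = 1 TeV is an illustrative assumption, not a measured value.
M_PL_GEV = 1.22e19   # Planck scale in GeV
M_STAR_GEV = 1.0e3   # assumed fundamental gravity scale (1 TeV)
n = 2                # number of large extra dimensions

# Compactification scale 1/R in GeV, then converted to eV:
R_inv_gev = (M_STAR_GEV ** (n + 2) / M_PL_GEV ** 2) ** (1.0 / n)
R_inv_ev = R_inv_gev * 1e9   # ~1e-4 eV for these inputs
```

Lowering the assumed M_* or increasing n changes 1/R rapidly, which is why CMB and tabletop gravity data constrain the n = 2 case most sharply.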
The 2019 Event Horizon Telescope (EHT) imaging of the supermassive black hole in M87, along with the 2022 image of Sagittarius A* (Sgr A*) and 2025 updates to M87 polarization data, produced shadow profiles aligning precisely with the classical event horizon and photon ring predicted by the Kerr solution in general relativity, with observed diameters and asymmetric brightness distributions showing no deviations attributable to quantum corrections. This supports the existence of well-defined classical horizons without observed fuzziness or modifications from quantum gravity effects, such as those proposed in string theory (e.g., horizon evaporation remnants) or loop quantum gravity (e.g., Planck-scale bounces), which might alter the shadow's edge sharpness or size by up to 10-20% at EHT resolutions. The absence of such features constrains quantum corrections to the horizon geometry of these black holes, whose masses are about 6.5 billion solar masses (M87*) and 4 million solar masses (Sgr A*). Neutrino oscillation experiments, including Super-Kamiokande and the joint NOvA-T2K analysis in October 2025, confirm flavor mixing and non-zero masses requiring beyond-Standard-Model physics, such as seesaw mechanisms or sterile neutrinos, yet no TOE-specific particles like gravitons or their Kaluza-Klein excitations have been identified in these or collider searches, with updated neutrino mass limits below 0.45 eV. Similarly, direct-detection efforts by experiments like XENONnT and LZ, along with haloscope searches by ADMX and 2025 galaxy-wide X-ray observations, have yielded null results for axions—hypothesized in some string theory variants as solutions to the strong CP problem and dark matter candidates—constraining their mass to below ~50 μeV in the searched ranges and tightening couplings to photons by factors of 10-100 over prior limits, without evidence for these as TOE mediators. These experimental bounds emphasize the need for Standard Model extensions independent of full TOE unification, while limiting predictions of lightweight supersymmetric or extra-dimensional particles in TOE frameworks.
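The horizon scales against which the EHT constraints are stated follow from r_s = 2GM/c²; a minimal sketch for the two imaged black holes, using the masses quoted above (Schwarzschild radius is used as the characteristic scale, ignoring spin):

```python
G = 6.67430e-11   # m^3 kg^-1 s^-2
C = 2.99792458e8  # m s^-1
M_SUN = 1.989e30  # kg

def schwarzschild_radius_m(mass_in_solar_masses):
    """Schwarzschild radius r_s = 2GM/c^2 in metres, for a mass given
    in solar masses (spin neglected)."""
    return 2 * G * mass_in_solar_masses * M_SUN / C**2

r_m87 = schwarzschild_radius_m(6.5e9)   # M87*: ~1.9e13 m (~130 au)
r_sgra = schwarzschild_radius_m(4.0e6)  # Sgr A*: ~1.2e10 m
```

Planck-length corrections (~10^{-35} m) are thus some 45 orders of magnitude below the horizon scales the EHT resolves, which is why shadow images constrain only large, order-unity modifications of the horizon geometry.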

Future Directions

The Laser Interferometer Space Antenna (LISA), scheduled for launch in 2035, is poised to detect low-frequency gravitational waves, including potential primordial signals from the early universe that could reveal signatures of quantum gravity effects during cosmic inflation. Similarly, the Euclid space telescope, operational since 2023, released its first survey data in 2025, mapping the distribution of dark matter and probing dark energy through weak gravitational lensing and galaxy clustering in observations of over 1.2 million galaxies, offering indirect tests of modifications to general relativity on cosmological scales. On the theoretical front, the AdS/CFT correspondence continues to provide a holographic framework for understanding quantum gravity, with ongoing extensions to de Sitter spacetimes and realistic cosmologies potentially resolving information paradoxes and informing non-perturbative aspects of a unified theory. Complementing this, AI-assisted approaches to model building, such as those leveraging computational models in Stephen Wolfram's framework, are exploring rule-based universes to simulate fundamental physics, with recent 2025 developments emphasizing systematic pattern discovery in complex systems. A successful theory of everything could elucidate the universe's origin by unifying quantum mechanics and general relativity at the Big Bang, address the fine-tuning of constants like the cosmological constant through landscape predictions in string vacua, and inspire advanced technologies such as warp-drive metrics that respect quantum energy constraints. Furthermore, interdisciplinary connections between quantum information theory and condensed matter physics are enabling tabletop tests of emergent gravity, where phenomena like entanglement in quantum many-body systems mimic gravitational dynamics, potentially validating holographic principles experimentally.

References

  1. [1]
    Quantum Gravity - Stanford Encyclopedia of Philosophy
    Dec 26, 2005 · Loop quantum gravity is seemingly less plagued by a lack of predictions, and indeed it is often claimed that the discreteness of area and ...
  2. [2]
    The Theory of Everything | The Current - UC Santa Barbara News
    Dec 5, 2014 · Even though special relativity and quantum mechanics fit together without conflict, general relativity and quantum mechanics are much harder ...Missing: field | Show results with:field
  3. [3]
    Chandra Data Tests 'Theory of Everything' - NASA
    Mar 19, 2020 · Astronomers used Chandra to perform a test of string theory, a possible “theory of everything” that would tie all of known physics together.
  4. [4]
    Loop Quantum Gravity - PMC - PubMed Central
    Loop quantum gravity is a non-perturbative, background-independent quantization of general relativity, a candidate theory of quantum gravity, and a tentative  ...
  5. [5]
    The Theory of Everything - PMC - PubMed Central - NIH
    The Theory of Everything is a term for the ultimate theory of the universe—a set of equations capable of describing all phenomena that have been observed ...
  6. [6]
    The superstring: theory of everything, or of nothing? - Nature
    Oct 16, 1986 · The superstring: theory of everything, or of nothing? John Ellis. Nature volume 323, pages 595–598 (1986)Cite this article.Missing: origin | Show results with:origin
  7. [7]
    On the Generalized Theory of Gravitation - Scientific American
    On the Generalized Theory of Gravitation. An account of the newly published extension of the general theory of relativity against its historical and ...
  8. [8]
    [PDF] God's Thoughts: Practical Steps Towards a Theory of Everything
    Nov 28, 2017 · ... grand unified theory or GUT. And, dreaming big, a GUT and gravity might be unified into a single theory of everything or TOE. Page 10 ...
  9. [9]
    Unification theories and a theory of everything - AccessScience
    Starting with Einstein, physicists sought for decades to devise a unified field theory that would present all four fundamental forces as specialized cases of a ...
  10. [10]
    23.3 The Unification of Forces - Physics | OpenStax
    Mar 26, 2020 · Present quests to show that the four basic forces are different manifestations of a single unified force that follow a long tradition.
  11. [11]
    New quantum theory of gravity brings long-sought 'theory ... - Phys.org
    May 5, 2025 · Without such a theory, physicists cannot reconcile our two most powerful theories, quantum field theory and general relativity. Quantum theory ...
  12. [12]
    Singularities and Black Holes - Stanford Encyclopedia of Philosophy
    Jun 29, 2009 · These theorems indicate that our universe began with an initial singularity, the Big Bang, approximately 14 billion years ago. They also ...<|separator|>
  13. [13]
    A theory of everything will never work at all scales | Stephan Hartmann
    May 21, 2025 · Effective field theories describe how systems behave at specific energy scales, ignoring the finer details of higher energies (or, equivalently, ...
  14. [14]
    Ancient Atomism - Stanford Encyclopedia of Philosophy
    Oct 18, 2022 · Democritus sometimes seems to doubt or deny the possibility of knowledge. The early Greek atomists try to account for the formation of the ...1. Atomism In Classical... · 2.1 Leucippus And Democritus · 2.6 Atomism And Particle...
  15. [15]
    Democritus - Stanford Encyclopedia of Philosophy
    Aug 15, 2004 · Democritus' theory of perception depends on the claim that eidôla or images, thin layers of atoms, are constantly sloughed off from the surfaces ...Theory of Perception · The Soul and the Nature of... · Theory of Knowledge · Ethics
  16. [16]
    Aristotle's Natural Philosophy
    May 26, 2006 · The varieties of responsibilities are grouped by Aristotle under four headings, the so-called four causes. The first two of these are matter ...
  17. [17]
    Thomas Aquinas - Stanford Encyclopedia of Philosophy
    Dec 7, 2022 · Viewed as a philosopher, he is a foundational figure of modern thought. His efforts at a systematic reworking of Aristotelianism reshaped ...
  18. [18]
    Descartes' Physics - Stanford Encyclopedia of Philosophy
    Jul 29, 2005 · Descartes' vortex theory attempts to explain celestial phenomena, especially the orbits of the planets or the motions of comets.A Brief History of Descartes... · Cartesian Cosmology and...
  19. [19]
    Newton's Philosophiae Naturalis Principia Mathematica
    Dec 20, 2007 · Newton probably first encountered it in print when he read Descartes' Principia, where it is comprised by his first two “laws of nature” and is ...
  20. [20]
    [PDF] A dynamical theory of the electromagnetic field
    A Dynamical Theory of the Electromagnetic Field. By J. Clerk Maxwell. F.R.S.. Received October 27,—Read December 8, 1864.
  21. [21]
    Discovery of Radioactivity: Becquerel - Le Moyne
    This Becquerel was an expert on phosphorescent minerals. He is best known for his discovery of radioactivity, first reported just over a century ago.
  22. [22]
    [PDF] on the electrodynamics of - moving bodies
    The theory to be developed is based-like all electro- dynamics on the kinematics of the rigid body, since the assertions of any such theory have to do with the ...
  23. [23]
    The Field Equations of Gravitation - Wikisource, the free online library
    Aug 9, 2025 · Session from November 25, 1915; published December 2, 1915. Albert Einstein735695The Field Equations of Gravitation1915Wikisource. The Field ...
  24. [24]
    The Radiation Theories of Tomonaga, Schwinger, and Feynman
    A unified development of the subject of quantum electrodynamics is outlined, embodying the main features both of the Tomonaga-Schwinger and of the Feynman ...
  25. [25]
    The Z boson | CERN
    Discovered in 1983 by physicists at the Super Proton Synchrotron (SPS) at CERN, the Z boson is a neutral elementary particle. Like its electrically charged ...
  26. [26]
    The history of QCD - CERN Courier
    Sep 27, 2012 · The interaction of the quarks is generated by an octet of massless colour gauge bosons, which we called gluons (Fritzsch and Gell-Mann 1972). We ...
  27. [27]
    Kaluza–Klein unified field theory and apparent four‐dimensional ...
    Sep 1, 1985 · In the 1920s Kaluza and Klein achieved an elegant unified theory of gravitation and electromagnetism by assuming that space‐time is really 5‐ ...Missing: extra | Show results with:extra
  28. [28]
    Einstein's quest for a unified theory - American Physical Society
    Einstein was motivated by an intellectual need to unify the forces of nature. He felt very strongly that all of nature must be described by a single theory.Missing: 1950s | Show results with:1950s
  29. [29]
    On the History of Unified Field Theories. Part II. (ca. 1930–ca. 1965)
    Jun 23, 2014 · Most important centers for research on unified field theory in the 1930s until the early 1950s were those around Albert Einstein in Princeton ...
  30. [30]
    One-loop divergencies in the theory of gravitation - Inspire HEP
    All one-loop divergencies of pure gravity and all those of gravitation interacting with a scalar particle are calculated.
  31. [31]
    The Ultraviolet Behavior of Einstein Gravity - Inspire HEP
    A two-loop calculation showing that the S matrix of Einstein's theory of gravity contains non-renormalizable ultraviolet divergences in four dimensions.
  32. [32]
    Particle creation by black holes | Communications in Mathematical ...
    It is shown that quantum mechanical effects cause black holes to create and emit particles as if they were hot bodies with temperature.
  33. [33]
    Breakdown of predictability in gravitational collapse | Phys. Rev. D
    They probe the edges of space and time, from "Black holes and thermodynamics” to "Wave function of the Universe."
  34. [34]
    The singularities of gravitational collapse and cosmology - Journals
    The singularities of gravitational collapse and cosmology. Stephen William Hawking, Royal Society, 1970. The occurrence of singularities in cosmology. ɪɪɪ ...
  35. [35]
    Fluctuations in the New Inflationary Universe | Phys. Rev. Lett.
    Oct 11, 1982 · The spectrum of density perturbations is calculated in the new-inflationary-universe scenario. The main source is the quantum fluctuations of the Higgs field.Missing: original | Show results with:original
  36. [36]
  37. [37]
    [PDF] String Theory - DAMTP
    strings necessarily includes closed strings, so somehow the open string field theory should already contain gravity and closed strings. Quite how this comes ...
  38. [38]
    SUPERSTRINGS! Supersymmetric Strings - UCSB Physics
    In terms of weak coupling perturbation theory there appear to be only five different consistent superstring theories known as Type I SO(32), Type IIA, Type IIB ...
  39. [39]
    [PDF] Calabi-Yau Compactification 1 Introduction 2 Mathematical ...
    Mar 10, 2004 · dimensions of the theory are compactified: the spacetime manifold on which the strings move is not an arbitrary ten-dimensional space, but ...
  40. [40]
    Anomaly cancellations in supersymmetric D = 10 gauge theory and ...
    Supersymmetric ten-dimensional Yang-Mills theory coupled to N = 1, D = 10 supergravity has gauge and gravitational anomalies that can be partially cancelled.
  41. [41]
    [PDF] SUPERSTRING THEORY
    Mar 11, 2016 · Five Superstring Theories. Type I, Type IIA, Type IIB. SO(32) Heterotic and E8 x E8 Heterotic. Each of these theories requires supersymmetry ...<|separator|>
  42. [42]
    [hep-ph/0310155] Heterotic String Optical Unification - arXiv
    The lower limit to string coupling unification in weakly coupled heterotic strings was shown by Kaplunovsky to be around Lambda_H ~ 5 x 10^{17} GeV. In contrast ...
  43. [43]
    [hep-th/9503124] String Theory Dynamics In Various Dimensions
    Mar 20, 1995 · String Theory Dynamics In Various Dimensions. Authors:Edward Witten. View a PDF of the paper titled String Theory Dynamics In Various Dimensions ...Missing: proposal | Show results with:proposal
  44. [44]
    Mystery physics: What does the M in M-theory mean? | New Scientist
    Apr 15, 2014 · The M stood for magic, mystery or membrane, according to taste. But I thought my colleagues would understand that it was really for membrane.
  45. [45]
    Why Is M-Theory the Leading Candidate for Theory of Everything?
    Dec 18, 2017 · M-theory is often described as the leading candidate for the theory of everything in our universe. But there's no empirical evidence for it.
  46. [46]
    Loop Quantum Gravity | Living Reviews in Relativity
    May 31, 2008 · The application of loop quantum gravity to cosmology is started by Martin Bojowald [72], to be later extensively developed by Ashtekar, Bojowald ...
  47. [47]
    Loop Quantum Gravity and the Meaning of Diffeomorphism Invariance
    Oct 23, 1999 · This series of lectures gives a simple and self-contained introduction to the non-perturbative and background independent loop approach of canonical quantum ...
  48. [48]
    Spin Networks and Quantum Gravity
    ### Extracted and Summarized Content from https://arxiv.org/abs/gr-qc/9505006
  49. [49]
    Discreteness of area and volume in quantum gravity - ScienceDirect
    We argue that the spectra of volume and area determined here can be considered as predictions of the loop-representation formulation of quantum gravity on the ...
  50. [50]
    [PDF] Coupling Matter to Loop Quantum Gravity - publish.UP
    Then, assuming that the gravitational field is in a semiclassical state, a “QFT on curved space-time limit” of this theory is defined.
  51. [51]
    Causal Dynamical Triangulations and the Quest for Quantum Gravity
    Apr 2, 2010 · Quantum Gravity by Causal Dynamical Triangulation has over the last few years emerged as a serious contender for a nonperturbative description of the theory.
  52. [52]
    Functional Renormalization Group Equations, Asymptotic Safety ...
    Aug 9, 2007 · Functional Renormalization Group Equations, Asymptotic Safety, and Quantum Einstein Gravity. Authors: Martin Reuter, Frank Saueressig.
  53. [53]
    [gr-qc/0607032] The group field theory approach to quantum gravity
    Jul 7, 2006 · We give a very concise review of the group field theory formalism for non-perturbative quantum gravity, a higher dimensional generalisation of matrix models.
  54. [54]
    [1001.0785] On the Origin of Gravity and the Laws of Newton - arXiv
    Jan 6, 2010 · The equivalence principle leads us to conclude that it is actually this law of inertia whose origin is entropic. Comments: 29 pages, 6 figures.
  55. [55]
    On the origin of gravity and the laws of Newton
    Apr 7, 2011 · We present a heuristic argument that shows that Newton's law of gravitation naturally arises in a theory in which space emerges through a holographic scenario.
  56. [56]
    [PDF] On the origin of gravity and the laws of Newton
    Erik Verlinde. Institute for Theoretical Physics, University of Amsterdam ... Our aim is to argue that gravity is also an entropic force in this sense.
  57. [57]
    Alena Tensor—a new hope for unification in physics - Phys.org
    Dec 10, 2024 · Alena Tensor reconciles various physical theories, including general relativity, electrodynamics, quantum mechanics and continuum mechanics.
  58. [58]
    Gravitational waves and Higgs-like potential from Alena Tensor
    Alena Tensor is a recently discovered class of energy-momentum tensors that proposes a general equivalence of the curved path and geodesic for analyzed ...
  59. [59]
    Gravitational Waves and Higgs field from Alena Tensor - Preprints.org
    Due to this property, the Alena Tensor seems to be a useful tool for studying unification problems, quantum gravity and many other applications in physics. Many ...
  60. [60]
    New theory proposes time has three dimensions, with space as a ...
    Jun 21, 2025 · A quantum theory of gravity could lead to, or become, a grand theory of the universe—the so-called "theory of everything." The elusive unifying ...
  61. [61]
  62. [62]
    A New Theory Says Time Has Three Dimensions. It 'Really Messes ...
    Jun 6, 2025 · A new theory says time has three dimensions. It 'really messes up' what we know about the cosmos, scientists say.
  63. [63]
    What if the Universe Remembers Everything? New Theory Rewrites ...
    Oct 3, 2025 · A bold new framework proposes that spacetime acts as a quantum memory. For over a hundred years, physics has rested on two foundational theories ...
  64. [64]
    The Quantum Memory Matrix: A Unified Framework for the Black ...
    Mar 30, 2025 · We develop a mathematical framework that includes space-time quantization, definitions of quantum imprints, and interactions that modify quantum ...
  65. [65]
    The radical idea that space-time remembers could upend cosmology
    Jun 16, 2025 · There are new hints that the fabric of space-time may be made of "memory cells" that record the whole history of the universe.
  66. [66]
    New theory of gravity brings long-sought Theory of Everything a ...
    May 5, 2025 · Summary: Researchers have developed a new quantum theory of gravity which describes gravity in a way that's compatible with the Standard Model ...<|separator|>
  67. [67]
    New quantum theory of gravity bridges gravity and the Standard model
    May 6, 2025 · Researchers have developed a new theory that aligns gravity with the Standard Model of particle physics.
  68. [68]
    [PDF] Gödel and Physics - arXiv
    We introduce some early considerations of physical and mathematical impossibility as preludes to Gödel's incompleteness theorems. We consider some informal ...
  69. [69]
    Gödel's Undecidability Theorems and the Search for a Theory of ...
    Feb 19, 2024 · I investigate the question whether Gödel's undecidability theorems play a crucial role in the search for a unified theory of physics.
  70. [70]
    [PDF] More Is Different - TKM (KIT)
    The reductionist hypothesis may still be a topic for controversy among philosophers, but among the great majority ...
  71. [71]
    [hep-th/0302219] The Anthropic Landscape of String Theory - arXiv
    Feb 27, 2003 · I discuss the theoretical and conceptual issues that arise in developing a cosmology based on the diversity of environments implicit in string theory.
  72. [72]
    [PDF] The Structure of Scientific Revolutions
    problems, data, and theory, most often to the particular set of paradigms to which the scientific community is committed at the time they are written ...
  73. [73]
    Thomas Kuhn - Stanford Encyclopedia of Philosophy
    Aug 13, 2004 · Kuhn claimed that science guided by one paradigm would be 'incommensurable' with science developed under a different paradigm, by which is meant ...
  74. [74]
    Is String Theory Even Wrong? | American Scientist
    String theory not only makes no predictions about physical phenomena at experimentally accessible energies, it makes no precise predictions whatsoever.
  75. [75]
    [1807.06211] Planck 2018 results. X. Constraints on inflation - arXiv
    Jul 17, 2018 · We report on the implications for cosmic inflation of the 2018 Release of the Planck CMB anisotropy measurements.
  76. [76]
    CMB fluctuations and string compactification scales - ScienceDirect
    Jan 16, 2012 · Using the observed data, we find constraints on the parameters of this model, such as the size of the extra dimensions and the string scale.
  77. [77]
    Tests of General Relativity with GW150914 | Phys. Rev. Lett.
    May 31, 2016 · The gravitational wave signal observed by the LIGO detectors shows no deviation from what general relativity predicts.
  78. [78]
    Quantum gravitational signatures in next-generation gravitational ...
    Dec 10, 2022 · Planck and BICEP/Keck Array 2018 constraints on primordial gravitational waves and perspectives for future B-mode polarization measurements.
  79. [79]
  80. [80]
    Einstein's Theory Can Explain the Black Hole M87
    May 18, 2021 · According to the tests, the size of the shadow from M87* is in excellent agreement with a black hole predicted by general relativity, but ...
  81. [81]
    Neutrino flavor oscillations without flavor states | Phys. Rev. D
    Nov 18, 2020 · One of the most direct indications that neutrinos provide for the need of extensions of the Standard Model comes from the phenomenon of ...
  82. [82]
  83. [83]
    [PDF] 85. Extra Dimensions - Particle Data Group
    May 31, 2024 · Extra Dimensions. 85.3.4 Flat Extra Dimensions. Models with quantum gravity at the TeV scale, as in the ADD scenario, can have extra (flat).
  84. [84]
    [PDF] Astro2020 Decadal Science White Paper: The state of gravitational ...
    Beyond the sources anticipated based on well- established empirical astrophysics, LISA could also be sensitive to more exotic sources such as primordial ...
  85. [85]
    [PDF] Euclid - ESA Science & Technology
    It will use cosmological probes to investigate the nature of dark energy, dark matter and gravity by tracking their observational signatures on the geometry of ...
  86. [86]
    Can AI Solve Science? - Stephen Wolfram Writings
    Mar 5, 2024 · Stephen Wolfram explores the potential--and limitations--of AI in science. See cases in which AI will be a useful tool, and in others a less ...
  87. [87]
    Background-independent condensed matter models for quantum ...
    Sep 14, 2011 · The possibility that gravity may be emergent suggests that quantum gravity ought to be studied as a problem in statistical physics. The ...