
Physics

Physics is the natural science that studies matter, its fundamental constituents, motion, and behavior through space and time, along with related entities such as energy and force. It aims to uncover the underlying laws and mechanisms that explain how the universe operates, from the smallest particles to the largest cosmic structures. The discipline is broadly divided into classical physics, which includes mechanics, electromagnetism, and thermodynamics, and modern physics, encompassing quantum mechanics, relativity, and particle physics. Key subfields also include astrophysics, which examines celestial phenomena; condensed matter physics, focusing on the properties of solids and liquids; nuclear physics, studying atomic nuclei; and biophysics, applying physical principles to biological systems. These branches interconnect to provide a comprehensive framework for understanding natural phenomena, with physicists employing mathematical models, experiments, and computational simulations to test theories. Physics underpins technological innovation and societal progress, driving developments in the semiconductors that power modern electronics, in energy technologies such as solar cells and turbines, and in medical technologies such as MRI and X-ray imaging. Its principles form the foundation for chemistry, engineering, and other disciplines, enabling solutions to global challenges while expanding human knowledge of the cosmos.

Overview

Definition and Scope

Physics is the natural science that studies matter and its motion through space and time, along with related concepts such as energy and force. The term originates from the Greek word physis, meaning "nature," reflecting its focus on the fundamental workings of the natural world. The scope of physics encompasses the four fundamental forces—gravity, electromagnetism, the strong nuclear force, and the weak nuclear force—that govern interactions among particles and larger structures. It spans an immense range of scales, from the behavior of subatomic particles in quantum realms to the dynamics of cosmic structures like galaxies and the universe itself. Unlike other sciences, physics provides the underlying principles that form the foundation for fields such as chemistry, biology, and engineering, offering the most basic explanations of natural phenomena across disciplines.

Fundamental Quantities and Units

In physics, fundamental quantities represent the basic building blocks for describing natural phenomena, and their measurement relies on standardized units to ensure consistency and reproducibility across experiments and theories. The International System of Units (SI), established and maintained by the General Conference on Weights and Measures (CGPM), provides a coherent framework for these measurements, with seven base units corresponding to the fundamental quantities of length, mass, time, electric current, thermodynamic temperature, amount of substance, and luminous intensity. These base units form the foundation from which all other physical quantities are derived, promoting precision in scientific communication and calculation. The seven SI base units are defined through fixed numerical values of fundamental physical constants, a reformulation adopted in 2019 to link units directly to invariant properties of nature rather than artifacts or specific experimental setups. This approach ensures long-term stability and universality. The definitions are as follows:
Quantity | Unit | Symbol | Definition
Length | Metre | m | Defined by fixing the speed of light in vacuum at exactly 299 792 458 m/s.
Mass | Kilogram | kg | Defined by fixing the Planck constant at exactly 6.626 070 15 × 10^{-34} J s, with the metre and second already defined.
Time | Second | s | Defined by fixing the caesium-133 hyperfine transition frequency Δν_Cs at exactly 9 192 631 770 Hz.
Electric current | Ampere | A | Defined by fixing the elementary charge e at exactly 1.602 176 634 × 10^{-19} C, where 1 C = 1 A s.
Thermodynamic temperature | Kelvin | K | Defined by fixing the Boltzmann constant k at exactly 1.380 649 × 10^{-23} J/K.
Amount of substance | Mole | mol | Defined by fixing the Avogadro constant N_A at exactly 6.022 140 76 × 10^{23} mol^{-1}.
Luminous intensity | Candela | cd | Defined by fixing the luminous efficacy K_cd at exactly 683 lm/W for monochromatic radiation of frequency 540 × 10^{12} Hz.
Derived quantities in physics are expressed as combinations of these base units, yielding coherent derived units with no numerical factors other than unity. For instance, force is measured in newtons (N), defined as the force imparting an acceleration of 1 m/s² to a 1 kg mass, or N = kg m s^{-2}; energy in joules (J), the work done by 1 N over 1 m, or J = kg m² s^{-2}; and power in watts (W), the rate of 1 J per second, or W = kg m² s^{-3}. These units facilitate the quantification of complex interactions, such as mechanical work or electrical power, without introducing arbitrary scales.

Fundamental physical constants play a crucial role in anchoring the SI units and underlying physical laws, providing exact or highly precise values that bridge theory and measurement. The speed of light in vacuum, c = 299 792 458 m s^{-1} (exact), defines the metre and is invariant, serving as a cornerstone for relativity and electromagnetism. Planck's constant, h = 6.626 070 15 × 10^{-34} J s (exact), defines the kilogram and quantifies quantum effects like energy levels in atoms. The gravitational constant, G = 6.67430 × 10^{-11} m³ kg^{-1} s^{-2} (with relative standard uncertainty 2.2 × 10^{-5}), governs Newtonian gravitation and general relativity, though its value is determined experimentally rather than fixed. These constants ensure that physical equations remain dimensionally consistent across scales, from subatomic particles to cosmic structures.

Dimensional analysis examines the dimensions of physical quantities—expressed in terms of base units like [L] for length, [M] for mass, and [T] for time—to verify the homogeneity of equations and derive relationships without full mathematical derivation. For example, the dimensions of speed are [L][T]^{-1}, confirming that speed = distance/time is dimensionally consistent. This method is essential for checking the validity of formulas, estimating orders of magnitude, and ensuring unit conversions preserve physical meaning, such as transforming joules to electronvolts via appropriate factors. Unit conversions within the SI rely on these dimensional relations, using prefixes like kilo- (10^3) or milli- (10^{-3}) to scale measurements efficiently while maintaining coherence.
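Dimensional analysis as described above can be mechanized by tracking each quantity's exponents over the base dimensions. The following is a minimal sketch, not a standard library; the `dims`, `multiply`, and `divide` helpers are illustrative names chosen here, and only the three mechanical dimensions [L], [M], [T] are tracked.

```python
from fractions import Fraction

# Represent a quantity's dimensions as exponents of (L, M, T).
def dims(L=0, M=0, T=0):
    return (Fraction(L), Fraction(M), Fraction(T))

def multiply(a, b):
    # Multiplying quantities adds their dimensional exponents.
    return tuple(x + y for x, y in zip(a, b))

def divide(a, b):
    # Dividing quantities subtracts exponents.
    return tuple(x - y for x, y in zip(a, b))

LENGTH, MASS, TIME = dims(L=1), dims(M=1), dims(T=1)

# Speed = distance / time  ->  [L][T]^-1
SPEED = divide(LENGTH, TIME)
assert SPEED == dims(L=1, T=-1)

# Force = mass * acceleration -> [M][L][T]^-2 (the newton)
ACCELERATION = divide(SPEED, TIME)
FORCE = multiply(MASS, ACCELERATION)
assert FORCE == dims(L=1, M=1, T=-2)

# Energy = force * length -> [M][L]^2[T]^-2 (the joule)
ENERGY = multiply(FORCE, LENGTH)
assert ENERGY == dims(L=2, M=1, T=-2)

print("all dimensional checks passed")
```

Checks of this kind catch dimensionally inhomogeneous formulas automatically, which is exactly the homogeneity test that dimensional analysis performs by hand.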

Role in Science and Society

Physics plays a pivotal role in driving economic growth worldwide through advancements in key technologies. In the United States, industrial physics contributed an estimated $2.3 trillion to the economy in 2016, representing about 12.6% of GDP, by enabling innovations in sectors such as semiconductors. Similarly, according to a 2019 study based on 2011–2016 data, physics-based industries contributed over 12% of economic output in the European Union, supporting high-tech manufacturing and sustainable energy solutions like solar photovoltaics and wind turbines. These contributions underscore physics' interdisciplinary influence, fostering job creation and productivity gains across global supply chains.

The discipline's societal benefits are evident in everyday technologies that enhance health, navigation, and communication. Medical imaging techniques such as X-rays, discovered through fundamental physics research, and MRI, rooted in nuclear magnetic resonance principles, have revolutionized diagnostics and treatment. The Global Positioning System (GPS) relies on corrections from Einstein's theories of relativity to account for relativistic timing effects, ensuring accurate positioning essential for navigation and emergency services. Fiber optic communications, underpinning the modern internet, draw on quantum physics for laser technology and on the physics of light propagation, enabling high-speed data transfer worldwide.

Ethical considerations in physics arise from its dual-use applications, particularly in energy and environmental modeling. Nuclear physics has enabled both clean energy production through fission reactors and devastating weapons, prompting debates on proliferation risks and long-term waste management. Climate modeling, grounded in physical laws of thermodynamics and fluid dynamics, informs policy on global warming but raises ethical questions about equity in mitigation efforts and the potential misuse of geoengineering technologies.

Post-2020, the physics community has intensified efforts to promote diversity, equity, and inclusion, addressing historical underrepresentation of women and people of color. Initiatives like the American Physical Society's Inclusion, Diversity, and Equity Alliance (APS-IDEA) empower departments to implement systemic changes through training and mentoring programs. In Europe, the European Physical Society has developed action plans to reduce the gender gap and promote the advancement of women in the physics community. These post-pandemic and social justice-driven reforms aim to enhance innovation by reflecting diverse perspectives.

Historical Development

Ancient and Medieval Foundations

The foundations of physics trace back to ancient civilizations, where early observations of the natural world laid the groundwork for systematic inquiry. In Mesopotamia, Babylonian astronomers compiled detailed star catalogs as early as 1200 BCE, recording planetary positions and lunar cycles on clay tablets to predict celestial events for agricultural and religious purposes. Similarly, ancient Egyptians developed star charts around 3000 BCE, using them to align pyramids with constellations such as Orion and to create a calendar based on the heliacal rising of Sirius. These efforts represented initial steps toward empirical astronomy, emphasizing observed regularities in the heavens without advanced mathematical models.

Greek natural philosophy emerged in the 6th century BCE with Pre-Socratic thinkers who sought rational explanations for the cosmos, moving beyond mythological accounts. Thales of Miletus, around 600 BCE, proposed water as the fundamental substance underlying all matter, initiating inquiries into the nature of change and substance. Democritus, in the 5th century BCE, advanced atomism, positing that the universe consists of indivisible particles (atoms) moving in a void, which provided an early mechanistic view of reality. Aristotle, writing around 350 BCE, synthesized these ideas into a comprehensive system in his Physics, describing the sublunary world as composed of four elements—earth, water, air, and fire—each with natural motions toward their respective places (e.g., earth downward, fire upward). He incorporated teleology, arguing that natural processes occur for purposeful ends, such as the growth of organisms toward perfection.

During the Hellenistic period, following Alexander the Great's conquests, practical applications advanced physical understanding. Archimedes, in the 3rd century BCE, formulated the principle of buoyancy in his work On Floating Bodies, stating that the upward force on an immersed object equals the weight of the displaced fluid, enabling innovations in hydrostatics and shipbuilding. Hero of Alexandria, in the 1st century CE, described the aeolipile, a steam-powered device that demonstrated rotational motion from heated water jets escaping tangentially, an early precursor to steam-engine concepts, though primarily used as a novelty.

In the medieval Islamic world, scholars built upon Greek texts, integrating them with experimentation. Ibn al-Haytham (Alhazen), in his Book of Optics completed around 1021 CE, pioneered the scientific method by emphasizing controlled experiments to study light propagation, refraction, and vision, refuting emission theories and establishing optics as an empirical science. Ibn Sina (Avicenna), in the early 11th century, contributed to mechanics through his The Book of Healing, analyzing motion as requiring both an impressed force and resistance from the medium, refining Aristotelian dynamics for projectile and falling bodies.

European scholasticism in the later Middle Ages further refined these ideas within university settings. Jean Buridan, a 14th-century philosopher at the University of Paris, developed the impetus theory to explain sustained motion, proposing that a mover imparts a permanent "impetus" quality to a body, which diminishes gradually due to air resistance, thus addressing limitations in Aristotle's requirement for continuous external forces. This qualitative framework bridged ancient and emerging experimental approaches, influencing later thinkers without quantitative precision.

Scientific Revolution and Classical Era

The Scientific Revolution, spanning the 16th to 18th centuries, marked a profound transformation in the study of nature, shifting from speculative philosophy to empirical and mathematical rigor, thereby laying the foundations of classical physics. This era emphasized experimentation and quantitative analysis, challenging Aristotelian views of motion and cosmology with evidence-based models of the physical world. Pioneering scientists employed precise measurements and mathematical reasoning to formulate laws that described planetary motion, terrestrial mechanics, and gravitational forces, establishing a mechanistic universe governed by universal principles.

Galileo Galilei played a pivotal role in this transition through his inclined plane experiments, conducted around 1608 and detailed in his 1638 work Dialogues Concerning Two New Sciences, where he demonstrated that the acceleration of falling bodies is constant and independent of mass, refuting the notion that heavier objects fall faster. By rolling balls down grooved inclines, Galileo slowed the motion to measurable timescales, revealing uniform acceleration and supporting the idea of inertia as a fundamental property. Complementing these terrestrial insights, Johannes Kepler formulated his three laws of planetary motion based on meticulous analysis of Tycho Brahe's observational data: the first law, published in Astronomia Nova in 1609, states that planets orbit the Sun in elliptical paths with the Sun at one focus; the second, also in 1609, describes equal areas swept by the radius vector in equal times; and the third, in Harmonices Mundi in 1619, relates the square of orbital periods to the cube of semi-major axes. These empirical laws provided a kinematic description of celestial mechanics, bridging observation and mathematics without invoking metaphysical causes.

Isaac Newton's Philosophiæ Naturalis Principia Mathematica, published in 1687, synthesized these advancements into a comprehensive framework for classical mechanics, articulating three laws of motion—the first positing inertia, the second relating force to acceleration, and the third describing action-reaction pairs—and introducing the law of universal gravitation, which posits that every particle attracts every other with a force proportional to their masses and inversely proportional to the square of their distance. Newton derived these principles geometrically, explaining both Kepler's planetary orbits and Galileo's falling bodies as manifestations of the same gravitational force, thus unifying terrestrial and celestial mechanics. To support his derivations, particularly for curved trajectories and orbital dynamics, Newton developed the fluxional calculus alongside Gottfried Wilhelm Leibniz, who independently formulated differential and integral calculus in the late 1660s and 1670s; their methods enabled precise handling of rates of change and infinite series, essential for applying mathematics to physical problems like motion under varying forces.

In the 18th century, this framework expanded through refinements in analytical approaches. Leonhard Euler advanced fluid dynamics in works like his 1757 memoir on the motion of fluids, deriving the Euler equations—partial differential equations governing inviscid flow—that generalized Newtonian mechanics to continuous media, incorporating principles of conservation of mass and momentum for applications in hydrodynamics and aerodynamics. Joseph-Louis Lagrange further formalized mechanics in his 1788 Mécanique Analytique, introducing the Lagrangian formulation, which reformulates Newtonian equations using variational principles and generalized coordinates, minimizing reliance on forces and geometry in favor of energy-based integrals, thus providing a more elegant tool for complex systems like rigid bodies and celestial perturbations. These developments solidified classical physics as a predictive science, influencing later fields such as thermodynamics.

The institutionalization of empirical science accelerated during this period, exemplified by the founding of the Royal Society in London on November 28, 1660, by a group of natural philosophers who obtained a royal charter from Charles II in 1662 to promote experimental knowledge through collaborative inquiry and publication. Similar academies, such as the French Académie des Sciences in 1666, fostered systematic experimentation and peer scrutiny, disseminating findings via journals like the Philosophical Transactions (started 1665) and ensuring the reproducibility of results, which became hallmarks of scientific practice.

19th and Early 20th Century Advances

The 19th century marked a pivotal era in physics, characterized by efforts to unify disparate phenomena through experimental and theoretical advancements, laying the groundwork for modern paradigms.

In thermodynamics, Sadi Carnot's 1824 analysis of heat engines introduced the concept of a reversible cycle operating between two thermal reservoirs, establishing the maximum efficiency limit for converting heat into work without invoking the nature of heat itself. This idealized Carnot cycle, consisting of isothermal expansion, adiabatic expansion, isothermal compression, and adiabatic compression, demonstrated that efficiency depends solely on the temperature ratio, influencing subsequent developments in engine design and thermodynamic theory. Building on Carnot's framework, James Prescott Joule's experiments in the 1840s quantified the mechanical equivalent of heat, showing through precise calorimetry that a fixed amount of mechanical work—such as that of falling weights turning a paddle wheel in water—produces a consistent rise in temperature, equivalent to approximately 4.18 joules per calorie. Joule's paddle-wheel apparatus, refined over multiple trials, provided evidence against the caloric theory and supported the idea of heat as a form of energy, with his 1850 paper reporting results from over 30 experiments confirming the equivalence.

The second law of thermodynamics emerged in the 1850s through the independent formulations of Rudolf Clausius and William Thomson (Lord Kelvin). Clausius, in his 1850 paper, introduced the principle that heat cannot spontaneously flow from a colder to a hotter body without external work, framing it in terms of an entropy-like quantity that increases in irreversible processes, thus generalizing Carnot's efficiency limit to all heat engines. Kelvin complemented this in 1851 by stating that it is impossible to convert heat completely into work in a cyclic process without other changes, emphasizing a universal dissipation of mechanical energy toward heat, as seen in his analysis of isolated systems where available work diminishes over time. These statements ruled out perpetual-motion machines and established thermodynamics as a foundational branch of physics.

In electromagnetism, Michael Faraday's 1831 discovery of electromagnetic induction revealed that a changing magnetic field induces an electromotive force in a closed circuit, demonstrated through experiments with coils and iron rings where relative motion between magnets and conductors produced transient currents. Faraday's qualitative observations, detailed in his Experimental Researches in Electricity, unified electricity, magnetism, and motion, showing induction's dependence on flux change rather than contact, and paved the way for generators and transformers. James Clerk Maxwell synthesized these insights in his 1865 paper, formulating a set of equations that describe electromagnetic fields as interdependent entities propagating as waves at the speed of light, thereby predicting electromagnetic waves and unifying light with electricity and magnetism. Maxwell's dynamical theory portrayed fields as stresses in a medium, with displacement currents enabling wave propagation, and his equations—comprising Gauss's laws, Faraday's law, and Ampère's law with Maxwell's addition—provided a comprehensive mathematical framework verified by later experiments like Hertz's.

The kinetic theory of gases advanced through Ludwig Boltzmann's statistical approach in the 1870s, interpreting thermodynamic properties as averages over molecular motions, with his 1877 paper linking the second law to probability by deriving entropy as a measure of molecular disorder, S = k \ln W, where W is the number of microstates. Boltzmann's H-theorem demonstrated how collisions drive systems toward equilibrium, increasing entropy probabilistically; though it faced challenges from irreversibility paradoxes, it resolved macroscopic irreversibility via ensemble statistics.

Hints toward relativity arose from the 1887 Michelson-Morley experiment, which sought to detect Earth's motion through the luminiferous aether by measuring light-speed differences along perpendicular interferometer arms but yielded a null result, with fringe shifts less than 1/40 of the expected aether-wind effect. This failure to observe velocity-dependent light propagation challenged the aether hypothesis, prompting the Lorentz transformations and Einstein's later reinterpretation, though contemporaries attributed it to aether drag or experimental error.

The discovery of radioactivity began with Henri Becquerel's 1896 observation that uranium salts emit penetrating rays independently of exposure to light, fogging photographic plates even in darkness, as reported in Comptes Rendus and confirmed by ionization measurements showing spontaneous emission akin to, but distinct from, X-rays. Building on this, Pierre and Marie Curie isolated polonium in July 1898 from pitchblende residues, exhibiting 400 times uranium's activity, and radium in December 1898, over a million times more active, through fractional crystallization yielding pure chloride salts. Their joint Comptes Rendus announcements named the elements after Marie's native Poland (polonium) and their ray-like quality (radium), establishing radioactivity as atomic disintegration and inspiring the foundations of nuclear physics.

Post-1945 Developments and Contemporary Era

The end of World War II marked a pivotal shift in physics, catalyzed by the Manhattan Project, a massive collaborative effort led by the United States from 1942 to 1946 that successfully developed the first atomic bombs through advances in nuclear fission and chain reactions. This project, involving over 130,000 people and sites like Los Alamos and Hanford, not only demonstrated the practical harnessing of nuclear energy but also sparked a postwar boom in nuclear research, with governments funding accelerators and reactors to explore atomic structure and applications beyond weaponry. Post-1945, this led to rapid discoveries in nuclear reactions and isotopes, transforming fields like medicine through radioisotope production in reactors originally built for weapons material.

By the 1960s and 1970s, particle physics evolved toward a unified framework, culminating in the formulation of the Standard Model, which describes fundamental particles and forces via quantum field theory. The model posits six quarks (up, down, charm, strange, top, bottom) and six leptons (electron, muon, tau, and their neutrinos) as matter constituents, mediated by gauge bosons: photons for electromagnetism, W and Z bosons for the weak force, and gluons for the strong force. Key developments included the quark model proposed by Murray Gell-Mann and George Zweig in 1964, confirmed by deep inelastic scattering experiments at SLAC in 1968, and the electroweak unification by Sheldon Glashow, Abdus Salam, and Steven Weinberg in the late 1960s, with quantum chromodynamics (QCD) established by David Gross, Frank Wilczek, and David Politzer in 1973 for strong interactions. This synthesis, validated by experiments like the discovery of the charm quark at Brookhaven and SLAC in 1974, provided a predictive theory for particle interactions up to high energies.

Major experimental milestones in the late 20th and early 21st centuries solidified the Standard Model while opening new frontiers. In 2012, the ATLAS and CMS collaborations at CERN's Large Hadron Collider (LHC) announced the discovery of the Higgs boson, a scalar particle with mass around 125 GeV, completing the model's particle roster by explaining mass generation through the Higgs mechanism. This breakthrough, based on proton-proton collisions at 8 TeV, confirmed predictions from the 1960s and earned François Englert and Peter Higgs the 2013 Nobel Prize in Physics. Complementing particle advances, the LIGO collaboration reported the first direct detection of gravitational waves on September 14, 2015, from the merger of two black holes 1.3 billion light-years away, verifying Einstein's predictions from 1915. The signal, GW150914, carried energy equivalent to three solar masses and initiated multi-messenger astronomy.

Cosmological observations further reshaped understanding of the universe's evolution. In 1998, independent teams led by Saul Perlmutter, Brian Schmidt, and Adam Riess analyzed Type Ia supernovae, revealing the universe's accelerating expansion and providing evidence for dark energy as a dominant component comprising about 70% of the universe's energy density. This discovery, using 42 high-redshift supernovae, implied a positive cosmological constant or a similar repulsive force, earning the trio the 2011 Nobel Prize in Physics. Refining this picture, the European Space Agency's Planck satellite, operational from 2009 to 2013, mapped cosmic microwave background anisotropies with unprecedented precision, yielding parameters like a Hubble constant of 67.4 km/s/Mpc and confirming the ΛCDM model with dark matter at 26.8% and dark energy at 68.3%. Planck's 2013 data release, from 15.5 months of observations, reduced uncertainties in cosmological parameters and supported cosmic inflation.

Into the 2020s, breakthroughs continued across astronomy, quantum technologies, and precision measurements. NASA's James Webb Space Telescope (JWST), launched in 2021 and operational from 2022, has unveiled early-universe galaxies at z > 10, challenging galaxy-formation models, and has characterized exoplanet atmospheres through transmission spectroscopy. By 2025, JWST's observations, including the discovery of a new moon (S/2025 U 1) around Uranus and studies of the interstellar comet 3I/ATLAS, have provided insights into solar system origins and into molecular clouds like Sagittarius B2. In quantum computing, Google's 2019 Sycamore processor demonstrated quantum supremacy by sampling a random quantum circuit in 200 seconds—a task estimated to take a classical supercomputer 10,000 years—using 53 qubits. Updates through 2025 include the Willow chip, achieving a reported 13,000-fold speedup over classical simulations in physics problems like random circuit sampling, advancing toward error-corrected systems. Meanwhile, Fermilab's Muon g−2 experiment, building on Brookhaven's 2001 result, reported in 2021 and 2023 deviations from Standard Model predictions for the muon's magnetic anomaly (a_μ), with the 2023 value of 0.00116592061(41) in 4.2σ tension. The final 2025 measurement, combining all runs, refined a_μ to 0.001165920705(20) at 127 parts-per-billion precision, maintaining a 4.0σ discrepancy and prompting searches for beyond-Standard-Model contributions.

Core Concepts and Theories

Classical Physics Foundations

Classical physics forms the foundational framework of physics, describing the behavior of macroscopic objects under deterministic laws that govern motion, forces, and interactions at everyday scales. It assumes a continuous structure, in which space and time are infinitely divisible and form a smooth, unbroken continuum without discrete elements. This continuity allows for precise mathematical modeling using differential equations, enabling predictions of physical phenomena from planetary orbits to mechanical systems. Absolute time is a core assumption, flowing uniformly and independently of any physical processes or observers, distinct from relative measures like hours or days.

Central to classical physics is the principle of determinism and causality, positing that the future state of a system is entirely determined by its initial conditions and the governing laws. This idea is vividly illustrated by Laplace's demon, a hypothetical intellect that, if it knew the precise positions and velocities of all particles in the universe at one instant, could compute the entire past and future through the laws of mechanics. Such determinism underscores the predictability inherent in classical theories, where causes invariably lead to specific effects without randomness or uncertainty.

Conservation laws are fundamental to classical physics, stating that certain quantities remain constant in isolated systems. Energy, linear momentum, and angular momentum are conserved, reflecting underlying symmetries in the laws of nature; for instance, translational invariance of space leads to momentum conservation, rotational invariance to angular momentum conservation, and time invariance to energy conservation. These principles, later formalized by Noether's theorem—which links continuous symmetries of physical actions to corresponding conserved quantities—provide powerful tools for analyzing systems without solving the full equations of motion.
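The conservation laws above can be checked directly in a simple case. The following sketch, with arbitrary illustrative masses and velocities, uses the standard closed-form result for a one-dimensional elastic collision and verifies that both linear momentum and kinetic energy are the same before and after.

```python
def elastic_collision(m1, v1, m2, v2):
    """Final velocities of two bodies after a 1D elastic collision."""
    v1f = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    v2f = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return v1f, v2f

m1, v1 = 2.0, 3.0    # kg, m/s (illustrative values)
m2, v2 = 1.0, -1.0

v1f, v2f = elastic_collision(m1, v1, m2, v2)

p_before = m1 * v1 + m2 * v2
p_after = m1 * v1f + m2 * v2f
ke_before = 0.5 * m1 * v1**2 + 0.5 * m2 * v2**2
ke_after = 0.5 * m1 * v1f**2 + 0.5 * m2 * v2f**2

assert abs(p_before - p_after) < 1e-12    # momentum conserved
assert abs(ke_before - ke_after) < 1e-12  # kinetic energy conserved
print(f"p = {p_after:.3f} kg m/s, KE = {ke_after:.3f} J in both states")
```

An inelastic collision would conserve momentum but not kinetic energy, which is why the elastic case needs both closed-form velocity formulas rather than momentum balance alone.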
Despite its successes, classical physics has inherent limitations, breaking down at speeds approaching the speed of light, on very small scales comparable to atomic dimensions, or in the presence of strong gravitational fields. These regimes require extensions beyond classical assumptions, though classical physics remains an excellent approximation for terrestrial and low-speed phenomena, as detailed in specific theories like Newtonian mechanics.

Newtonian Mechanics

Newtonian mechanics, also known as classical mechanics, forms the foundational framework for understanding the motion of macroscopic objects under the influence of forces, as developed by Isaac Newton in his Principia, published in 1687. This system revolutionized physics by providing mathematical laws that describe how bodies interact through forces, enabling predictions of planetary motion, terrestrial phenomena, and engineered systems. At its core are Newton's three laws of motion, which establish the relationship between force, mass, and acceleration, and the law of universal gravitation, which posits that every particle attracts every other with a force proportional to their masses and inversely proportional to the square of the distance between them.

The first law, often called the law of inertia, states that an object at rest remains at rest, and an object in uniform motion continues in a straight line, unless acted upon by an external force. This principle, derived from Galileo's earlier ideas but formalized by Newton, implies that motion requires no force to sustain it in the absence of resistance, a concept central to inertial reference frames. The second law quantifies the effect of force: the acceleration of an object is directly proportional to the net force acting on it and inversely proportional to its mass, expressed as \vec{F} = m \vec{a}, where \vec{F} is force, m is mass, and \vec{a} is acceleration. This vector equation allows for the resolution of motion into components, facilitating analysis in multiple dimensions. The third law asserts that for every action there is an equal and opposite reaction, meaning forces between interacting bodies are mutual and collinear. These laws collectively enable the prediction of dynamic systems by summing forces and solving differential equations.
Newton's law of universal gravitation extends these principles to celestial bodies, stating that the gravitational force F between two masses m_1 and m_2 separated by distance r is F = G \frac{m_1 m_2}{r^2}, where G is the gravitational constant, approximately 6.67430 \times 10^{-11} m³ kg⁻¹ s⁻². Introduced in Book III of the Principia, this law unifies terrestrial and celestial mechanics, explaining phenomena like falling apples and orbiting moons under the same inverse-square rule. Combined with the laws of motion, it reproduces Kepler's elliptical orbits with the Sun at one focus by treating gravity as a central force.

Applications of Newtonian mechanics abound in both everyday and astronomical contexts. Projectile motion, for instance, describes the parabolic trajectory of objects like cannonballs or thrown balls under gravity alone, where horizontal velocity remains constant (per the first law) while vertical motion follows a = -g (second law), with g \approx 9.8 m/s². Orbital mechanics applies these laws to predict satellite paths and planetary revolutions; for circular orbits, the centripetal force equals the gravitational force, yielding v = \sqrt{GM/r} for a central mass M. Simple pendulums illustrate harmonic motion for small angles, where the restoring force from gravity leads to a period T = 2\pi \sqrt{L/g}, independent of mass, useful in timekeeping and gravimetry.

For more complex systems involving constraints or multiple coordinates, Newtonian mechanics is reformulated using Lagrangian and Hamiltonian approaches. The Lagrangian formulation, introduced by Joseph-Louis Lagrange in Mécanique Analytique (1788), defines the Lagrangian L = T - V as kinetic energy T minus potential energy V, with equations of motion \frac{d}{dt} \left( \frac{\partial L}{\partial \dot{q}_i} \right) - \frac{\partial L}{\partial q_i} = 0 for generalized coordinates q_i. This variational method simplifies problems like those with holonomic constraints by avoiding explicit force resolutions.
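The circular-orbit relation v = \sqrt{GM/r} can be evaluated numerically. The following sketch uses the standard published values of G and Earth's mass, and an illustrative orbital altitude of roughly 420 km (comparable to a low Earth orbit).

```python
import math

G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24       # mass of Earth, kg
R_ORBIT = 6.371e6 + 4.2e5  # Earth's mean radius plus ~420 km altitude, m

def orbital_speed(mass_central, radius):
    """Circular orbital speed from balancing gravity and centripetal force."""
    return math.sqrt(G * mass_central / radius)

v = orbital_speed(M_EARTH, R_ORBIT)
period = 2 * math.pi * R_ORBIT / v  # time for one full circular orbit, s

print(f"orbital speed ≈ {v / 1000:.2f} km/s")     # roughly 7.7 km/s
print(f"orbital period ≈ {period / 60:.0f} min")  # roughly 93 min
```

The result, a speed near 7.7 km/s and a period near 93 minutes, matches the familiar timescale of satellites in low Earth orbit, illustrating how the inverse-square law fixes both quantities once M and r are known.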
The Hamiltonian formulation, developed by William Rowan Hamilton in his 1834 paper "On a General Method in Dynamics," uses the Hamiltonian H = T + V expressed in terms of coordinates and momenta, yielding Hamilton's equations \dot{q}_i = \frac{\partial H}{\partial p_i} and \dot{p}_i = -\frac{\partial H}{\partial q_i}. These are particularly advantageous for conservative systems, facilitating phase-space analysis and the identification of conservation laws.

Key examples highlight Newtonian mechanics' explanatory power. Planetary orbits conform to Kepler's laws as solutions to the two-body gravitational problem, with elliptical paths emerging from the inverse-square force balance. Tides result from differential gravitational forces: the Moon's pull is stronger on Earth's near side than on the far side, bulging the oceans and creating two high tides daily as Earth rotates. These principles underpin modern applications from spacecraft trajectories to structural designs, though they approximate reality only for low speeds and weak gravitational fields, with relativistic extensions required for high-precision contexts.
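Hamilton's equations can be integrated directly for the simple pendulum discussed above, whose Hamiltonian is H = p²/(2mL²) + mgL(1 − cos q). This is a minimal sketch under illustrative parameter values; it uses a semi-implicit (symplectic) Euler step, a common choice for Hamiltonian systems because it keeps the energy bounded over long runs.

```python
import math

g, m, L = 9.8, 1.0, 1.0  # gravity (m/s^2), mass (kg), length (m); illustrative
dt = 1e-4                # time step, s

def step(q, p):
    """One semi-implicit Euler step of Hamilton's equations."""
    p = p - dt * m * g * L * math.sin(q)  # dp/dt = -dH/dq
    q = q + dt * p / (m * L * L)          # dq/dt =  dH/dp
    return q, p

# Start at a small angle and time the gap between two successive
# upward zero crossings of q, which is one full period.
q, p = 0.05, 0.0
t, prev_q = 0.0, q
crossing_times = []
while len(crossing_times) < 2:
    q, p = step(q, p)
    t += dt
    if prev_q < 0.0 <= q:  # upward zero crossing
        crossing_times.append(t)
    prev_q = q

period = crossing_times[1] - crossing_times[0]
print(f"simulated period  ≈ {period:.3f} s")
print(f"2*pi*sqrt(L/g)    ≈ {2 * math.pi * math.sqrt(L / g):.3f} s")
```

For this small amplitude the simulated period agrees with the small-angle formula T = 2π\sqrt{L/g} to within a few parts in ten thousand; at larger amplitudes the simulation would show the period lengthening, which the small-angle formula misses.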

Thermodynamics and Statistical Mechanics

Thermodynamics is the branch of physics that deals with heat, work, and temperature, and their relation to energy, entropy, and the physical properties of matter. It provides a framework for understanding energy transfer and transformation in physical systems, particularly through macroscopic phenomena like heat engines and refrigerators. The field emerged in the 19th century from efforts to improve steam engine efficiency, leading to foundational principles that govern isolated systems and reversible processes. The zeroth law establishes the concepts of thermal equilibrium and temperature: if two systems are each in thermal equilibrium with a third, then they are in equilibrium with each other, allowing the definition of a temperature scale. This law, first articulated in its modern form by James Clerk Maxwell in 1872, underpins thermometry and the transitive nature of heat flow. The first law of thermodynamics expresses the conservation of energy in thermodynamic processes: the change in internal energy ΔU of a system equals the heat Q added to the system minus the work W done by the system, formulated as ΔU = Q - W. This principle, introduced by Rudolf Clausius in his 1850 paper "Über die bewegende Kraft der Wärme," resolved the equivalence of heat and mechanical work observed in experiments by James Joule. For an ideal gas, this connects to the equation of state PV = nRT, where P is pressure, V is volume, n is the number of moles, R is the gas constant, and T is temperature; this equation was first synthesized by Émile Clapeyron in 1834 from the empirical laws of Boyle, Charles, and Gay-Lussac. The second law of thermodynamics introduces the concept of entropy S, stating that the entropy of an isolated system never decreases; for any process, the total entropy change ΔS ≥ 0, with equality only for reversible processes. Clausius formulated this in 1854 as "heat cannot spontaneously flow from a colder to a hotter body," while William Thomson (Lord Kelvin) stated in 1851 that it is impossible to convert heat entirely into work in a cyclic process without other effects. The third law asserts that the entropy of a perfect crystal approaches a minimum value (often zero) as temperature approaches absolute zero, implying that absolute zero is unattainable in finite steps.
This was originally proposed by Walther Nernst in 1906 as the heat theorem, based on low-temperature measurements, and later formalized by Max Planck in 1911. Statistical mechanics bridges thermodynamics to microscopic behavior, interpreting macroscopic laws through the statistical properties of large numbers of particles. It posits that thermodynamic variables emerge from averaging over microstates in statistical ensembles. Ludwig Boltzmann's 1877 paper "On the Relation between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations" derived the entropy formula S = k \ln W, where k is Boltzmann's constant and W is the number of microstates corresponding to a macrostate, linking entropy to disorder or multiplicity. The Boltzmann distribution gives the probability of a system occupying a state with energy E as proportional to e^{-E / kT}, where T is temperature; this was developed by Boltzmann in the 1860s–1870s to explain equilibrium distributions in gases. The partition function Z, introduced by J. Willard Gibbs in his 1902 work "Elementary Principles in Statistical Mechanics," sums the Boltzmann factors over all states, Z = \sum e^{-E_i / kT}, enabling computation of thermodynamic quantities like the free energy F = -kT \ln Z from microscopic models. Phase transitions occur when a system undergoes a qualitative change in structure or properties, such as melting or boiling, driven by variations in temperature or pressure. These were classified by Paul Ehrenfest in 1933 based on the continuity of thermodynamic derivatives: first-order transitions involve discontinuous first derivatives of the free energy (e.g., the entropy jump associated with latent heat in melting), while second-order transitions show discontinuous second derivatives (e.g., the specific heat in ferromagnetic transitions). Maxwell's demon paradox, proposed by James Clerk Maxwell in his 1871 book "Theory of Heat," challenges the second law by imagining a hypothetical entity that sorts fast and slow molecules to create a temperature difference without work, seemingly decreasing entropy; resolutions via information theory show that the demon's measurement and memory erasure incur an entropy cost, upholding the law.
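The Boltzmann factor and partition function can be made concrete with a short Python sketch; the two-level system and its 0.02 eV gap are hypothetical illustration values, not from the text:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
eV = 1.602176634e-19  # joules per electronvolt

def partition_function(energies, T):
    """Z = sum over states of exp(-E_i / kT)."""
    return sum(math.exp(-E / (k_B * T)) for E in energies)

def boltzmann_probabilities(energies, T):
    """Occupation probabilities p_i = exp(-E_i / kT) / Z."""
    Z = partition_function(energies, T)
    return [math.exp(-E / (k_B * T)) / Z for E in energies]

# Hypothetical two-level system with a 0.02 eV gap at room temperature
levels = [0.0, 0.02 * eV]
p_ground, p_excited = boltzmann_probabilities(levels, 300.0)
# The lower level is more populated, and the probabilities sum to 1
```

Raising T in this sketch drives the two probabilities toward equality, the statistical-mechanical picture of thermal excitation.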

Electromagnetism and Optics

Electromagnetism encompasses the study of electric and magnetic fields and their interactions with matter, forming a cornerstone of classical physics. Electric charges exert forces on one another according to Coulomb's law, which states that the magnitude of the electrostatic force F between two point charges q_1 and q_2 separated by a distance r is given by F = k \frac{q_1 q_2}{r^2}, where k = \frac{1}{4\pi\epsilon_0} is Coulomb's constant and \epsilon_0 is the vacuum permittivity. This law, experimentally verified by Charles-Augustin de Coulomb in 1785 using a torsion balance, describes both attractive and repulsive interactions depending on the signs of the charges. Moving charges generate magnetic fields, as quantified by Ampère's circuital law, which relates the magnetic field around a closed loop to the electric current passing through the loop: \oint \mathbf{B} \cdot d\mathbf{l} = \mu_0 I, where \mu_0 is the vacuum permeability and I is the total current. Gauss's law provides an integral form for the electric field, stating that the flux of the electric field \mathbf{E} through a closed surface is proportional to the enclosed charge: \oint \mathbf{E} \cdot d\mathbf{A} = \frac{Q_{\text{enc}}}{\epsilon_0}. Similarly, Gauss's law for magnetism asserts that the magnetic flux through any closed surface is zero, implying the absence of magnetic monopoles: \oint \mathbf{B} \cdot d\mathbf{A} = 0. These laws, along with Faraday's law of induction, describe static and quasi-static fields but were incomplete for dynamic situations. James Clerk Maxwell unified these concepts in 1865 by introducing a correction to Ampère's law, adding a "displacement current" term \mu_0 \epsilon_0 \frac{\partial \mathbf{E}}{\partial t} to account for changing electric fields, which resolved inconsistencies in energy conservation and continuity equations.
Maxwell's equations in differential form encapsulate this unification:

\nabla \cdot \mathbf{E} = \frac{\rho}{\epsilon_0}, \quad \nabla \cdot \mathbf{B} = 0, \quad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \quad \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \epsilon_0 \frac{\partial \mathbf{E}}{\partial t}.

These four equations govern all classical electromagnetic phenomena, where \rho is charge density and \mathbf{J} is current density. In vacuum, without sources (\rho = 0, \mathbf{J} = 0), taking the curl of Faraday's law and substituting the corrected Ampère's law yields the wave equation for both \mathbf{E} and \mathbf{B}, with propagation speed c = \frac{1}{\sqrt{\mu_0 \epsilon_0}} \approx 3 \times 10^8 m/s, matching the measured speed of light and revealing light as an electromagnetic wave. Electromagnetic waves are transverse, with \mathbf{E} and \mathbf{B} oscillating perpendicular to the direction of propagation and to each other. Polarization describes the orientation of the electric field vector; linear polarization occurs when \mathbf{E} oscillates in a fixed plane, while circular polarization arises from equal-amplitude components along perpendicular axes with a 90-degree phase difference. Optics applies these principles to light, treating it as an electromagnetic wave propagating through media. Refraction, the bending of light at an interface between media, follows Snell's law: n_1 \sin \theta_1 = n_2 \sin \theta_2, where n is the refractive index and \theta the angle measured from the normal; this law, empirically derived by Willebrord Snell in 1621, arises from the continuity of the wave phase across the boundary. Lenses manipulate light paths via refraction, converging or diverging rays to form images; a symmetric biconvex lens focuses parallel rays to a focal point at distance f = \frac{R}{2(n-1)}, where R is the radius of curvature of each surface, enabling applications like microscopes and telescopes. Wave optics highlights interference and diffraction, phenomena absent in ray approximations.
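Maxwell's result that c = 1/\sqrt{\mu_0 \epsilon_0} can be checked numerically from the two vacuum constants; this quick Python sketch uses standard CODATA values for μ0 and ε0, which are not quoted in the text:

```python
import math

mu_0 = 1.25663706212e-6       # vacuum permeability, N/A^2 (CODATA value)
epsilon_0 = 8.8541878128e-12  # vacuum permittivity, F/m (CODATA value)

# Propagation speed of electromagnetic waves in vacuum
c = 1.0 / math.sqrt(mu_0 * epsilon_0)
# c comes out near 2.998e8 m/s, the measured speed of light
```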
Interference occurs when coherent waves superimpose, producing constructive (in-phase) or destructive (out-of-phase) patterns, as demonstrated by Thomas Young's double-slit experiment in 1801, where the fringe spacing \Delta y = \frac{\lambda L}{d} (\lambda wavelength, L distance to screen, d slit separation) confirms light's wave nature. Diffraction, the bending of waves around obstacles or through apertures, follows the Huygens-Fresnel principle, where each point on a wavefront acts as a secondary source; for a single slit of width a, the first minimum occurs at \sin \theta = \frac{\lambda}{a}, limiting resolution in optical instruments. These effects underscore the dual geometric and wave descriptions essential to classical optics.
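Both formulas lend themselves to quick estimates; in this Python sketch the 633 nm wavelength and the slit dimensions are illustrative assumptions:

```python
import math

def fringe_spacing(wavelength, L, d):
    """Young double-slit fringe spacing: dy = lambda * L / d (small-angle approximation)."""
    return wavelength * L / d

def single_slit_first_minimum(wavelength, a):
    """Angle of the first single-slit diffraction minimum: sin(theta) = lambda / a."""
    return math.asin(wavelength / a)

# 633 nm He-Ne laser light, screen 1 m away, slits 0.25 mm apart -> ~2.5 mm fringes
dy = fringe_spacing(633e-9, 1.0, 0.25e-3)

# The same light through a 0.1 mm single slit -> first minimum near 6.3 mrad
theta = single_slit_first_minimum(633e-9, 0.1e-3)
```

Shrinking the slit width `a` widens the diffraction pattern, the numerical face of the resolution limit mentioned above.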

Modern Physics

Special and General Relativity

Special relativity, formulated by Albert Einstein in 1905, revolutionized the understanding of space, time, and motion by positing two fundamental postulates: the principle of relativity, which states that the laws of physics are identical in all inertial frames, and the constancy of the speed of light in vacuum for all observers, regardless of the motion of the source or observer. These axioms led to the abandonment of absolute space and time, replacing them with a unified four-dimensional spacetime continuum. The theory's mathematical foundation lies in the Lorentz transformations, which relate coordinates between inertial frames moving at constant velocity v relative to each other: x' = \gamma (x - vt), \quad t' = \gamma \left(t - \frac{vx}{c^2}\right), \quad y' = y, \quad z' = z, where \gamma = 1 / \sqrt{1 - v^2/c^2} and c is the speed of light. Key consequences of special relativity include time dilation, where the proper time \tau for an object moving at velocity v is related to the coordinate time t by \tau = t \sqrt{1 - v^2/c^2}, meaning moving clocks tick slower as observed from a stationary frame. Length contraction occurs in the direction of motion, shortening the measured length L of an object to L = L_0 \sqrt{1 - v^2/c^2}, where L_0 is the proper length. Additionally, the equivalence of mass and energy is encapsulated in the relation E = mc^2, demonstrating that energy and mass are interchangeable forms, with profound implications for nuclear processes. General relativity, developed by Einstein and published in its final form in 1915, extends relativity to accelerated frames and gravitational fields by treating gravity not as a force but as the curvature of spacetime caused by mass and energy. The theory rests on the equivalence principle, which asserts that the effects of gravity are indistinguishable from those of acceleration in a local frame, implying that free-falling objects follow geodesics in curved spacetime.
The dynamics of this curvature are governed by the Einstein field equations: G_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}, where G_{\mu\nu} is the Einstein tensor describing spacetime geometry, T_{\mu\nu} is the stress-energy tensor representing matter and energy, G is the gravitational constant, and c is the speed of light; these ten nonlinear partial differential equations encapsulate how matter shapes spacetime and vice versa. Predictions of general relativity include the existence of black holes, first realized in the 1916 Schwarzschild solution for the spacetime around a spherically symmetric, non-rotating mass, which features an event horizon at radius r_s = 2GM/c^2, beyond which light cannot escape. The theory also foresees gravitational waves—ripples in spacetime propagating at the speed of light, generated by accelerating masses such as orbiting binaries—as derived by Einstein in 1916 through linear approximations to the field equations. Empirical confirmation of general relativity came swiftly. In 1915, Einstein calculated that the theory accounts for the anomalous precession of Mercury's perihelion, predicting an advance of 43 arcseconds per century beyond Newtonian mechanics, matching observations precisely. The 1919 eclipse expeditions, led by Arthur Eddington, measured the deflection of starlight passing near the Sun, observing a deflection of 1.75 arcseconds, consistent with the theory's prediction of twice the Newtonian value, providing dramatic validation. These tests, along with the first direct detection of gravitational waves in 2015, underscore the theory's accuracy in describing gravitational phenomena on cosmic scales.
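Two of these relations reduce to one-line calculations; in this Python sketch the 0.9c speed and the solar-mass value are illustrative inputs:

```python
import math

c = 2.99792458e8   # speed of light, m/s
G = 6.67430e-11    # gravitational constant, m^3 kg^-1 s^-2

def lorentz_factor(v):
    """gamma = 1 / sqrt(1 - v^2/c^2): the time-dilation / length-contraction factor."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

def schwarzschild_radius(M):
    """Event-horizon radius r_s = 2GM/c^2 for a non-rotating mass M in kg."""
    return 2.0 * G * M / c ** 2

gamma = lorentz_factor(0.9 * c)            # ~2.29: clocks at 0.9c run ~2.3x slow
r_s_sun = schwarzschild_radius(1.989e30)   # ~2.95 km for one solar mass
```

The tiny Schwarzschild radius of a solar mass shows why only extremely compact objects form event horizons.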

Quantum Mechanics

Quantum mechanics is a fundamental theory in physics that describes the behavior of matter and energy at atomic and subatomic scales, introducing a probabilistic framework where microscopic phenomena exhibit wave-particle duality. This duality posits that particles, such as electrons, can behave both as discrete particles and as waves, challenging classical intuitions of definite trajectories and positions. The theory emerged in the early 20th century to resolve inconsistencies in classical physics, particularly in explaining phenomena like blackbody radiation and atomic spectra. Max Planck introduced the concept of energy quanta in 1900 to derive the correct formula for blackbody radiation, proposing that energy is emitted or absorbed in discrete packets proportional to frequency, given by E = h\nu, where h is Planck's constant. This quantum hypothesis marked the departure from continuous energy in classical theory, laying the groundwork for quantized systems. Building on Planck's ideas, Niels Bohr developed a model of the atom in 1913, where electrons orbit the nucleus in discrete energy levels, transitioning between them by emitting or absorbing photons with energies matching observed spectral lines. In this semi-classical model, angular momentum is quantized as L = n\hbar, with n an integer and \hbar = h/2\pi, explaining the stability of atoms against classical radiation loss. Louis de Broglie extended wave-particle duality to matter in 1924, hypothesizing that particles have associated wavelengths \lambda = h/p, where p is momentum, unifying light's dual nature with massive particles like electrons. This relation predicted wave-like interference for electron beams, later confirmed experimentally, solidifying the duality as a core principle. The full formulation of quantum mechanics arrived in 1925–1926 through matrix mechanics by Werner Heisenberg and wave mechanics by Erwin Schrödinger, later proven equivalent. In wave mechanics, the state of a quantum system is described by a wave function \psi(\mathbf{r}, t), a complex-valued function whose square modulus |\psi|^2 gives the probability density of finding the particle at \mathbf{r} at time t.
The time evolution follows the Schrödinger equation: i\hbar \frac{\partial \psi}{\partial t} = \hat{H} \psi, where \hat{H} is the Hamiltonian operator representing total energy, typically \hat{H} = -\frac{\hbar^2}{2m} \nabla^2 + V(\mathbf{r}) for a particle in potential V. This equation yields solutions exhibiting superposition, where \psi is a linear combination of basis states, allowing systems to exist in multiple states simultaneously until measured. Heisenberg's uncertainty principle, formulated in 1927, quantifies the inherent limits on simultaneous knowledge of conjugate variables, such as position x and momentum p, via \Delta x \Delta p \geq \hbar/2, arising from the non-commuting operators \hat{x} and \hat{p}. These principles underpin the probabilistic nature of quantum mechanics, where outcomes are predicted statistically rather than deterministically. Interpretations of quantum mechanics address the measurement problem and the role of observation. The Copenhagen interpretation, developed by Bohr and Heisenberg in the late 1920s, posits that the wave function provides complete information about the system, but measurement causes an irreversible collapse to a definite outcome, with classical concepts applying only at macroscopic scales. Complementarity, a key aspect, holds that wave and particle aspects are mutually exclusive views revealed by different experiments. In contrast, the many-worlds interpretation, proposed by Hugh Everett in 1957, rejects collapse, asserting that all possible outcomes of a measurement occur in branching parallel universes, with the universal wave function evolving unitarily forever. This deterministic view preserves linearity but implies an ever-expanding multiverse. Applications of quantum mechanics demonstrate its predictive power beyond atomic structure.
Quantum tunneling allows particles to penetrate classically forbidden energy barriers with a probability determined by the wave function's evanescent tail, as explained by George Gamow in 1928 for alpha decay, where alpha particles escape the nucleus despite insufficient energy, matching observed decay rates via the transmission coefficient T \approx e^{-2\kappa L}, with \kappa = \sqrt{2m(V-E)}/\hbar. The quantum harmonic oscillator, solved exactly using the Schrödinger equation, models vibrational modes in molecules and phonons in solids; its energy levels are E_n = \hbar \omega (n + 1/2), n = 0, 1, 2, \dots, introducing a zero-point energy even at absolute zero, crucial for understanding heat capacities and quantum technologies like lasers. These examples highlight quantum mechanics' role in microscopic phenomena, with extensions to relativistic fields explored in quantum field theory.
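Both results translate directly into code; in this Python sketch the electron-barrier parameters and the oscillator frequency are hypothetical illustration values:

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
eV = 1.602176634e-19     # joules per electronvolt
m_e = 9.1093837015e-31   # electron mass, kg

def tunneling_probability(m, V, E, L):
    """Barrier transmission T ~ exp(-2*kappa*L), with kappa = sqrt(2m(V-E))/hbar, for E < V."""
    kappa = math.sqrt(2.0 * m * (V - E)) / hbar
    return math.exp(-2.0 * kappa * L)

def oscillator_energy(omega, n):
    """Harmonic-oscillator levels E_n = hbar*omega*(n + 1/2)."""
    return hbar * omega * (n + 0.5)

# An electron with 0.5 eV of energy facing a 1 eV, 0.5 nm barrier still leaks through
T = tunneling_probability(m_e, 1.0 * eV, 0.5 * eV, 0.5e-9)

# The ground state (n = 0) retains a nonzero zero-point energy hbar*omega/2
E0 = oscillator_energy(1.0e14, 0)
```

Making the barrier thicker or taller in `tunneling_probability` suppresses T exponentially, which is why alpha-decay half-lives span many orders of magnitude.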

Quantum Field Theory and Standard Model

Quantum field theory (QFT) provides the relativistic framework for describing subatomic particles and their interactions, treating quantum fields as the fundamental entities rather than particles themselves. In this paradigm, particles emerge as localized excitations or quanta of these pervasive fields, which obey both quantum mechanics and special relativity. This shift resolves inconsistencies in earlier attempts to quantize relativistic systems, such as the problematic infinities in multi-particle descriptions, by incorporating the infinite number of degrees of freedom inherent to fields spread across spacetime. The formalism of second quantization, pioneered by Paul Dirac in his 1927 analysis of radiation processes, elevates field operators to create and annihilate particles, enabling a consistent treatment of variable particle numbers in relativistic contexts. This approach underpins all modern QFT applications, from quantum electrodynamics to more complex theories. To compute interaction amplitudes perturbatively, Richard Feynman introduced Feynman diagrams in 1949 as intuitive pictorial representations of mathematical expressions, where lines depict particle propagators and vertices indicate interactions, greatly simplifying calculations of scattering processes. The Standard Model of particle physics represents the most successful realization of QFT, unifying the electromagnetic, weak, and strong nuclear forces through a gauge theory based on the SU(3) × SU(2) × U(1) symmetry group. It encompasses 17 fundamental particles: six quarks (up, down, charm, strange, top, bottom), six leptons (electron, muon, tau, and their neutrinos), and five bosons (photon, gluon, W, Z, and Higgs, with the eight gluons treated as color variants of the gluon and the W representing the charged weak bosons W⁺ and W⁻). The electromagnetic and weak interactions are unified in the electroweak theory, developed by Sheldon Glashow in 1961 and refined by Steven Weinberg and Abdus Salam in 1967–1968, where the photon mediates electromagnetism and the massive W and Z bosons handle weak processes like beta decay. The strong interaction, governing quark binding within hadrons, is described by quantum chromodynamics (QCD), a non-Abelian gauge theory where quarks carry "color" charge and interact via gluons.
QCD's predictive power stems from the 1973 discovery of asymptotic freedom by David Gross and Frank Wilczek, found independently by David Politzer, revealing that the strong coupling weakens at high energies, allowing perturbative calculations for short-distance phenomena. Particle masses in the Standard Model arise via the Higgs mechanism, proposed by Peter Higgs, François Englert, and Robert Brout in 1964, through spontaneous breaking of the electroweak symmetry by the Higgs field, which permeates space and interacts with particles to endow them with mass; the Higgs boson itself was later observed in 2012. Among its triumphs, the Standard Model predicted the existence of the W and Z bosons, which were discovered in 1983 by the UA1 and UA2 collaborations at CERN's Super Proton Synchrotron, confirming electroweak unification with masses around 80 and 91 GeV/c², respectively. Similarly, the top quark, the heaviest fermion at approximately 173 GeV/c², was observed in 1995 by the CDF and DØ experiments at Fermilab's Tevatron, completing the quark sector and validating the third generation of matter particles. Despite these successes, the Standard Model excludes gravity, relying instead on general relativity for that force, and originally assumed massless neutrinos, a limitation exposed by experiments starting with Super-Kamiokande's 1998 results demonstrating neutrino oscillation and thus non-zero masses.

Branches and Subfields

Nuclear and Particle Physics

Nuclear and particle physics investigates the structure and behavior of atomic nuclei and the fundamental particles that constitute matter, probing scales from femtometers down to interactions described by quantum chromodynamics and electroweak theory. This field encompasses the stability of nuclei, the forces binding protons and neutrons, and high-energy collisions that reveal subatomic constituents. Key advancements have elucidated binding mechanisms and led to practical technologies harnessing nuclear reactions. The structure of atomic nuclei is modeled using approaches like the liquid drop model and the shell model, which explain observed binding energies and stability patterns. The liquid drop model treats the nucleus as a charged liquid droplet, accounting for bulk properties through contributions from volume, surface tension, Coulomb repulsion, asymmetry between protons and neutrons, and pairing effects. This semi-empirical framework, formalized by Carl Friedrich von Weizsäcker in 1935, predicts the binding energy B(A, Z) for a nucleus with mass number A and atomic number Z as: B(A, Z) = a_v A - a_s A^{2/3} - a_c \frac{Z(Z-1)}{A^{1/3}} - a_a \frac{(A - 2Z)^2}{A} \pm \delta, where a_v, a_s, a_c, a_a are empirical coefficients representing the volume, surface, Coulomb, and asymmetry terms, respectively, and \delta accounts for pairing. The binding energy per nucleon, derived from this model, exhibits a curve peaking around iron-56 at approximately 8.8 MeV per nucleon, explaining why lighter elements fuse and heavier ones fission to release energy. Complementing the liquid drop model, the shell model posits that nucleons occupy quantized energy levels analogous to electrons in atoms, leading to enhanced stability at "magic numbers" of protons or neutrons (2, 8, 20, 28, 50, 82, 126). Developed by Maria Goeppert Mayer and J. Hans D. Jensen in the late 1940s, this model incorporates spin-orbit coupling to match experimental data and predict nuclear spectra. These models together describe the binding-energy curve's deviations, such as shell closures causing local maxima in stability. The primary forces governing nuclear interactions are the strong force and the weak force.
The strong force, mediated by gluons between quarks within protons and neutrons, binds nucleons over short ranges (~1 fm) and is described within quantum chromodynamics as the residual color force. It overcomes electromagnetic repulsion between protons, with a strength about 100 times the electromagnetic force at nuclear distances. The weak force, responsible for processes like neutron-to-proton transformation via n \to p + e^- + \bar{\nu}_e, operates over even shorter ranges (~10^{-18} m) and mediates flavor-changing interactions among quarks and leptons, with strength roughly 10^{-6} times that of the strong force. Particle physics relies on accelerators to probe these interactions at high energies. The Large Hadron Collider (LHC) at CERN, operational since 2008, circulates proton beams at collision energies up to 13–14 TeV in a 27 km ring, enabling collisions that recreate conditions of the early universe. Notable discoveries include pentaquarks, exotic hadrons composed of four quarks and one antiquark, observed by the LHCb collaboration in 2015 through decays of \Lambda_b^0 \to J/\psi K^- p, confirming states like P_c(4450)^+ and P_c(4380)^+ with significances exceeding 9σ and 6σ, respectively. These findings, part of the Standard Model's hadron spectrum, highlight quark confinement dynamics. Applications of nuclear and particle physics span energy production and medicine. Nuclear fission, as in the splitting of uranium-235 induced by neutrons, powers reactors generating about 10% of global electricity with controlled chain reactions, while nuclear fusion—exemplified by deuterium-tritium reactions in experimental tokamaks like ITER—promises abundant, low-waste energy by mimicking stellar processes. In medicine, positron emission tomography (PET) scans utilize beta-plus emitters like fluorine-18 to image metabolic activity, detecting cancers by annihilating positrons with electrons to produce 511 keV gamma rays for tomographic reconstruction. Recent efforts include searches for neutrinoless double beta decay (0νββ), a process that would violate lepton number conservation if observed, probing neutrino masses and their possible Majorana nature; experiments like nEXO, planned for SNOLAB with a 5-tonne detector, aim for half-life sensitivities beyond 10^{28} years.
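The semi-empirical mass formula above can be evaluated directly. In this Python sketch, the coefficient values are typical fitted numbers (published fits vary slightly):

```python
def binding_energy_mev(A, Z):
    """Liquid-drop (semi-empirical mass formula) binding energy in MeV.
    Coefficients a_v, a_s, a_c, a_a, a_p are representative fitted values."""
    a_v, a_s, a_c, a_a, a_p = 15.75, 17.8, 0.711, 23.7, 11.18
    B = (a_v * A                                   # volume term
         - a_s * A ** (2.0 / 3.0)                  # surface term
         - a_c * Z * (Z - 1) / A ** (1.0 / 3.0)    # Coulomb repulsion
         - a_a * (A - 2 * Z) ** 2 / A)             # proton-neutron asymmetry
    if A % 2 == 0:  # pairing: + for even-even, - for odd-odd nuclei, 0 for odd A
        B += a_p / A ** 0.5 if Z % 2 == 0 else -a_p / A ** 0.5
    return B

# Iron-56 (A=56, Z=26) lies near the curve's peak: about 8.8 MeV per nucleon
b_fe56 = binding_energy_mev(56, 26) / 56
```

Scanning A with this function reproduces the qualitative binding-energy curve: energy release from fusing light nuclei and from fissioning heavy ones.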

Atomic, Molecular, and Optical Physics

Atomic, molecular, and optical (AMO) physics investigates the interactions between light and matter at the scale of individual atoms and molecules, revealing quantum behaviors that underpin modern technologies like precision spectroscopy and quantum computing. This field bridges quantum mechanics—where atoms exhibit discrete energy states—and classical optics, enabling the manipulation of photons and electrons with unprecedented control. Key phenomena in AMO physics arise from the quantized nature of atomic and molecular systems, allowing scientists to probe fundamental processes such as electron transitions and coherent light emission. In atomic physics, the structure of atoms is characterized by discrete energy levels, determined by the solutions to the Schrödinger equation in the Coulomb potential of the nucleus, as established in quantum mechanics. These levels dictate the possible electronic transitions observed in atomic spectra, with wavelengths governed by the Rydberg formula for hydrogen-like atoms. Selection rules, derived from conservation of angular momentum and parity in the dipole approximation, restrict which transitions are allowed; for electric dipole transitions, the change in orbital angular momentum Δl = ±1 and total angular momentum ΔJ = 0, ±1 (but not 0 to 0). The Zeeman effect further splits these degenerate energy levels in a magnetic field, lifting degeneracy due to the interaction between the atomic magnetic moment and the external field, resulting in shifted spectral lines proportional to the field strength B via ΔE = μ_B g m_J B, where μ_B is the Bohr magneton. This effect, first observed by Pieter Zeeman in 1896, provided early evidence for electron spin and orbital contributions to atomic magnetism. Molecular physics extends atomic concepts to bound multi-atom systems, where vibrational and rotational modes contribute to energy levels and spectra. Molecular bonds, modeled as harmonic oscillators for small displacements, exhibit quantized vibrational levels E_v = ħω (v + 1/2), with transitions appearing in infrared spectra at frequencies corresponding to stretching or bending modes.
Rotational spectra, arising from the rigid-rotor approximation, show lines spaced by 2B, where B is the rotational constant inversely proportional to the moment of inertia, enabling determination of bond lengths from microwave or far-infrared observations. These rovibrational spectra, combining vibrational progressions with rotational fine structure, provide detailed insights into bond strengths and molecular geometries. Lasers, a cornerstone of optical physics, rely on stimulated emission—a process predicted by Einstein in 1917—where an incoming photon triggers an excited atom or molecule to emit an identical photon, leading to coherent amplification. The first laser, a ruby device built by Theodore Maiman in 1960, produced pulsed red light at 694 nm through stimulated emission in chromium ions. Optical phenomena in AMO systems highlight light-matter coherence, exemplified by Bose-Einstein condensates (BECs), where ultracold atoms below about 170 nK collectively occupy the ground state, behaving as a single quantum wavefunction. The first gaseous BEC, achieved in 1995 with rubidium-87 atoms using laser cooling and evaporative cooling techniques, confirmed Bose and Einstein's 1924–1925 prediction and earned the 2001 Nobel Prize in Physics. Quantum entanglement of photons, generated via atomic cascades or parametric down-conversion in nonlinear crystals, creates correlated photon pairs whose polarizations or momenta violate classical inequalities, foundational for quantum information protocols. In AMO contexts, entanglement between photons and atomic ensembles enables quantum repeaters and quantum networks. Recent advances in AMO physics include attosecond pulses, enabling real-time observation of electron dynamics in atoms and molecules. Generated via high-harmonic generation in intense laser fields, these pulses—lasting ~100 as (10^{-18} s)—allow probing of inner-shell transitions and charge migration, as demonstrated in landmark experiments. The 2023 Nobel Prize in Physics recognized this work for revolutionizing attosecond science. Quantum sensors, leveraging AMO platforms like trapped ions or neutral atoms, achieve sensitivities beyond classical limits for magnetic fields, time, and acceleration; for instance, atomic magnetometers detect biomagnetic signals at fT/√Hz levels, advancing biomedical imaging and fundamental tests.
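Two of the quantities above reduce to one-line calculations. In this Python sketch the 1 T field strength and g = 2, m_J = 1/2 are illustrative assumptions:

```python
h = 6.62607015e-34       # Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s
mu_B = 9.2740100783e-24  # Bohr magneton, J/T
eV = 1.602176634e-19     # joules per electronvolt

def photon_energy_ev(wavelength):
    """Photon energy E = h*c/lambda, expressed in eV."""
    return h * c / wavelength / eV

def zeeman_shift_ev(g, m_J, B):
    """Linear Zeeman shift dE = mu_B * g * m_J * B, expressed in eV."""
    return mu_B * g * m_J * B / eV

E_ruby = photon_energy_ev(694e-9)     # ~1.79 eV: the ruby laser's red photons
dE = zeeman_shift_ev(2.0, 0.5, 1.0)   # ~5.8e-5 eV splitting in a 1 T field
```

The five-orders-of-magnitude gap between the two numbers is why resolving Zeeman splittings demands high-resolution spectroscopy.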

Condensed Matter Physics

Condensed matter physics investigates the physical properties of matter in its condensed phases, primarily solids and liquids, where interactions among large numbers of atoms or molecules give rise to emergent phenomena not predictable from individual particle behaviors. This field encompasses the study of collective excitations, phase transitions, and electronic structures in dense materials, drawing on principles from quantum mechanics and statistical mechanics to explain macroscopic properties like conductivity and magnetism. Unlike isolated systems, condensed matter focuses on many-body interactions that lead to complex behaviors in crystalline and amorphous solids. Crystal structures form the foundational framework in condensed matter physics, describing the periodic arrangements of atoms in solids that determine their mechanical, thermal, and electronic properties. Common structures include face-centered cubic, body-centered cubic, and hexagonal close-packed lattices, which are classified by their space groups and symmetry operations. These arrangements influence phonon modes and electronic band structure, impacting material stability and functionality. Band theory explains the electrical conductivity of solids by modeling electrons as waves in a periodic potential, resulting in energy bands separated by band gaps. In insulators, the valence band is fully occupied and separated from the empty conduction band by a large gap, preventing electron flow; in metals, bands overlap, allowing free conduction; and in semiconductors, a small gap enables thermal excitation of carriers. This theory, developed in the mid-20th century, underpins the understanding of electronic properties in crystalline materials. Semiconductors, characterized by band gaps of roughly 0.1 to 3 eV, enable control of electrical properties through doping and temperature, forming the basis of modern electronics. The invention of the transistor in 1947 by John Bardeen and Walter Brattain at Bell Laboratories, using a germanium crystal, demonstrated amplification and switching, revolutionizing technology. William Shockley later refined this into the junction transistor, and the three shared the 1956 Nobel Prize in Physics.
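The role of the band gap can be illustrated with the Boltzmann-like activation factor exp(-E_g / 2kT) that governs intrinsic carrier excitation. This Python sketch uses standard gap values for silicon and diamond and deliberately omits density-of-states prefactors, showing only the exponential trend:

```python
import math

k_B_eV = 8.617333262e-5  # Boltzmann constant, eV/K

def activation_factor(gap_ev, T):
    """Relative intrinsic carrier excitation across a band gap, ~exp(-E_g / (2 k T)).
    Prefactors are omitted; only the exponential dependence is shown."""
    return math.exp(-gap_ev / (2.0 * k_B_eV * T))

# At 300 K, silicon (1.12 eV gap) thermally excites far more carriers
# than a wide-gap insulator like diamond (~5.5 eV)
f_si = activation_factor(1.12, 300.0)
f_diamond = activation_factor(5.5, 300.0)
# The ratio f_si / f_diamond is astronomically large: gap size alone
# separates semiconductors from insulators at room temperature
```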
Superconductivity, the zero-resistance flow of electric current below a critical temperature, emerges from electron-phonon interactions forming Cooper pairs. The Bardeen-Cooper-Schrieffer (BCS) theory, proposed in 1957, provides a microscopic explanation: attractive interactions via lattice vibrations bind electrons into pairs with opposite momenta and spins, leading to a condensate that carries current without dissipation. This theory accurately predicts properties like the energy gap and isotope effect in conventional superconductors. High-temperature superconductivity in cuprates, discovered in 1986 by J. Georg Bednorz and K. Alex Müller at IBM Zurich, shattered the conventional limit with a transition temperature of 35 K in barium-doped lanthanum copper oxide. Subsequent developments raised critical temperatures above 90 K in YBa₂Cu₃O₇, enabling liquid-nitrogen cooling, though the pairing mechanism remains unconventional, involving strong correlations rather than phonons. Their work earned the 1987 Nobel Prize in Physics and spurred intense research into layered structures. Topological insulators represent a phase of matter where the bulk is insulating due to a band gap, but the surface hosts robust, spin-polarized conducting states protected by time-reversal symmetry and spin-orbit coupling. These edge states arise from the nontrivial global band structure, characterized by invariants like the Z₂ index, making them immune to backscattering and promising for spintronics. Experimental realizations in materials like Bi₂Se₃, confirmed in 2008, have validated theoretical predictions from the 2000s. Graphene, a single layer of carbon atoms in a honeycomb lattice, exhibits extraordinary properties including high electron mobility, mechanical strength, and thermal conductivity due to its Dirac-like band structure. Isolated in 2004 by Andre Geim and Konstantin Novoselov using mechanical exfoliation from graphite, it revealed relativistic electrons behaving as massless Dirac fermions, enabling observation of the quantum Hall effect at room temperature. Their groundbreaking experiments earned the 2010 Nobel Prize in Physics and ignited the field of two-dimensional materials.
In the 2020s, quantum spin liquids—exotic states where spins remain disordered and entangled even near absolute zero without magnetic ordering—have seen experimental progress, with evidence in kagome-lattice materials such as herbertsmithite showing fractionalized excitations akin to spinons. These states, predicted by Philip Anderson in 1973, promise applications in quantum computing due to topological protection. Advances in 2D materials beyond graphene, such as twisted bilayer graphene and transition metal dichalcogenides, have demonstrated moiré patterns inducing superconductivity and correlated insulating phases, expanding possibilities for tunable quantum devices.

Astrophysics and Cosmology

Astrophysics applies the principles of physics to understand the structure, evolution, and properties of celestial objects and phenomena, ranging from stars and galaxies to the universe as a whole. It encompasses the study of stellar interiors, the interstellar medium, galactic dynamics, and high-energy processes like supernovae and black holes. Cosmology, a subfield of astrophysics, focuses on the origin, evolution, large-scale structure, and ultimate fate of the universe, integrating observations with theoretical models to probe fundamental questions about space, time, and matter. Stellar physics provides the foundation for understanding stars as self-gravitating spheres of plasma powered by nuclear fusion. The Hertzsprung-Russell (HR) diagram, independently developed by Ejnar Hertzsprung in 1911 and Henry Norris Russell in 1913, plots stellar luminosity against surface temperature (or spectral type), revealing key evolutionary sequences such as the main sequence, giant branch, and white dwarf region. This diagram illustrates how stars of different masses spend varying portions of their lifetimes in distinct phases, with massive stars evolving rapidly toward supergiants while lower-mass stars like the Sun remain on the main sequence for billions of years. Nuclear fusion processes in stellar cores, elucidated by Hans Bethe in the late 1930s, convert hydrogen into helium through mechanisms like the proton-proton chain in low-mass stars and the CNO cycle in more massive ones, releasing energy via E = mc^2 and maintaining hydrostatic equilibrium. These processes not only power stellar luminosity but also synthesize heavier elements through subsequent stages of fusion up to iron in advanced phases. Cosmology's modern framework is anchored in the Big Bang model, first proposed by Georges Lemaître in 1927 as an expanding universe from a "primeval atom," later supported by Edwin Hubble's 1929 observations of galactic redshifts.
Hubble's law, expressed as v = H_0 d where v is recession velocity, d is distance, and H_0 is the Hubble constant (approximately 70 km/s/Mpc), quantifies the uniform expansion of space itself, implying that the universe has been cooling and diluting since its hot, dense origin about 13.8 billion years ago. A cornerstone of this model is the cosmic microwave background (CMB), the relic radiation from when the universe became transparent around 380,000 years after the Big Bang, discovered serendipitously by Arno Penzias and Robert Wilson in 1965 as a uniform 2.7 K blackbody spectrum across the sky. The CMB's tiny temperature fluctuations, mapped precisely by satellites like COBE and Planck, encode the seeds of large-scale structure formation through gravitational instability. The composition of the universe, inferred from observations, requires non-baryonic components beyond the Standard Model of particle physics. Dark matter, constituting about 27% of the energy density, was first evidenced in the 1970s through flat galactic rotation curves observed by Vera Rubin and Kent Ford, where orbital speeds of stars and gas in spiral galaxies like Andromeda remain constant at large radii rather than declining as expected from visible mass alone. This implies a massive, invisible halo of dark matter providing the necessary gravitational binding, later confirmed by galaxy cluster dynamics and CMB anisotropies. Dark energy, making up roughly 68% of the universe, drives accelerated expansion and was discovered in 1998 through Type Ia supernova observations by teams led by Saul Perlmutter, Brian Schmidt, and Adam Riess, revealing that distant supernovae appear fainter than expected in a decelerating model. The prevailing ΛCDM (Lambda Cold Dark Matter) model integrates these components, with Λ representing dark energy alongside baryonic matter and cold dark matter, successfully predicting the universe's flat geometry and CMB acoustic peaks as validated by Planck data. Recent advancements have provided direct visual insights into extreme astrophysical phenomena. 
In 2019, the Event Horizon Telescope (EHT) collaboration produced the first image of a black hole's shadow in the galaxy M87, resolving a 6.5-billion-solar-mass object at event-horizon scale and confirming general relativity's predictions for photon orbits near the horizon. Complementing this, the James Webb Space Telescope (JWST), operational since 2022, has revolutionized exoplanet atmosphere studies by detecting molecular signatures like water vapor and carbon dioxide in gas giants such as WASP-39b through transmission spectroscopy. By 2025, JWST observations have mapped 3D atmospheric structures in hot Jupiters, revealing temperature gradients and chemical compositions that inform planetary formation and migration theories, with detections extending to sub-Neptune worlds and hazy environments potentially harboring biosignatures.
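The expansion law quoted above lends itself to a direct numerical illustration. The following sketch (using the approximate H_0 = 70 km/s/Mpc from the text) computes a recession velocity and the naive "Hubble time" 1/H_0, which comes out close to the 13.8-billion-year age mentioned above; the unit-conversion constants are standard values:

```python
# Hubble's law: v = H0 * d, with H0 ~ 70 km/s/Mpc as quoted in the text.

H0 = 70.0  # Hubble constant, km/s per megaparsec

def recession_velocity(distance_mpc: float) -> float:
    """Recession velocity in km/s for a galaxy at the given distance (Mpc)."""
    return H0 * distance_mpc

# A galaxy 100 Mpc away recedes at about 7000 km/s.
v = recession_velocity(100.0)
print(f"{v:.0f} km/s")

# Naive age estimate: the "Hubble time" 1/H0, converting Mpc to km.
KM_PER_MPC = 3.0857e19     # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in one gigayear
hubble_time_gyr = KM_PER_MPC / H0 / SECONDS_PER_GYR
print(f"Hubble time ~ {hubble_time_gyr:.1f} Gyr")  # ~14 Gyr, close to 13.8
```

The Hubble time only approximates the true age because the expansion rate has varied over cosmic history, as the ΛCDM model described above quantifies.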

Research Methods and Practices

The Scientific Method in Physics

The scientific method in physics is an iterative, empirical process that guides the development and validation of theories through systematic observation and experimentation, emphasizing testable predictions and rigorous scrutiny. It begins with careful observation of natural phenomena, where physicists identify patterns or anomalies in data from experiments or astronomical surveys, forming the foundation for further inquiry. From these observations, a hypothesis is formulated—a tentative explanation that must be precise and falsifiable, meaning it can be disproven by evidence if incorrect. This aligns with Karl Popper's criterion of falsification, which posits that scientific theories gain credibility not through verification but by surviving attempts to refute them via observable contradictions. The process advances to prediction, where the hypothesis is used to forecast specific outcomes, often derived through mathematical modeling to quantify relationships and simulate scenarios. Mathematical modeling plays a central role here, enabling physicists to represent physical laws as equations or computational models that bridge qualitative ideas to quantitative tests; for instance, differential equations describe dynamic systems like particle motion, allowing simulations to predict behaviors under varied conditions. These predictions are then subjected to experimentation or observation for testing, where discrepancies between expected and actual results can falsify the hypothesis, prompting refinement or rejection. If predictions hold, the hypothesis may evolve into a theory, but it remains provisional, open to future challenges. Peer review and replication are integral to ensuring the reliability of findings in physics, with results submitted to prestigious journals like Physical Review Letters for expert evaluation before publication. During peer review, independent scientists assess the methodology, data analysis, and conclusions for soundness, often recommending replications by other groups to confirm reproducibility, a cornerstone of physical science that mitigates errors and biases. 
This communal verification process, supported by organizations such as the American Physical Society, upholds the field's standards and fosters cumulative progress. A seminal historical example is Albert Einstein's general theory of relativity, proposed in 1915, which hypothesized that massive objects curve spacetime, leading to predictions like the deflection of starlight by the Sun's gravity. This was empirically tested during the 1919 solar eclipse expedition led by Arthur Eddington, whose observations of shifted star positions near the Sun's edge confirmed Einstein's prediction to within experimental error, providing strong evidence for the theory while exemplifying the method's power in linking theory to observation.
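The modeling-and-prediction step described above can be shown in miniature: a hypothesis expressed as a differential equation yields a quantitative prediction that an experiment could then test. This sketch assumes the simplest possible model (constant gravitational acceleration, no air resistance) and integrates it with small explicit-Euler steps; it is purely illustrative, not tied to any experiment in the text:

```python
# Prediction from a model: dv/dt = -g, dy/dt = v, integrated numerically.

g = 9.81   # m/s^2, assumed-constant gravitational acceleration
dt = 1e-4  # integration time step, seconds

def time_of_flight(v0: float) -> float:
    """Time for a projectile launched straight up at v0 (m/s) to return to y = 0."""
    y, v, t = 0.0, v0, 0.0
    while True:
        y += v * dt   # explicit Euler update of position
        v -= g * dt   # and of velocity
        t += dt
        if y <= 0.0:
            return t

# The analytic solution predicts t = 2*v0/g; for v0 = 20 m/s that is ~4.08 s,
# a number an experiment could falsify or confirm.
predicted = time_of_flight(20.0)
print(f"{predicted:.2f} s")
```

Comparing the simulated value against the closed-form 2v_0/g also illustrates how numerical and analytical predictions cross-check each other before confronting data.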

Experimentation and Observation

Laboratory setups in physics experiments often employ specialized instruments to control environmental conditions and measure fundamental phenomena with high precision. Interferometers, such as the Michelson interferometer, split a light beam into two paths that recombine to produce interference patterns, enabling measurements of minute changes in length or optical path down to fractions of a wavelength. Calorimeters absorb incident particles or radiation completely, converting their energy into detectable light or heat to quantify total energy deposition, which is essential for studying interactions in high-energy environments. Vacuum systems maintain ultra-low pressure environments, typically below 10^{-6} pascal, to minimize scattering from air molecules and enable clean observations of particle trajectories or surface phenomena. Particle detectors play a crucial role in capturing and analyzing subatomic interactions, particularly in accelerator-based experiments. Scintillators are materials that emit flashes of light upon interaction with ionizing radiation, allowing the position, energy, and timing of particles to be recorded via photomultiplier tubes. Cloud chambers, pioneered by Charles T. R. Wilson in 1911, create visible vapor trails from the ionization paths of charged particles in a supersaturated gas, providing early insights into particle tracks and decays. The Large Hadron Collider (LHC) at CERN exemplifies advanced collider technology, where protons are accelerated to nearly the speed of light in a 27-kilometer ring and collided at four interaction points, with detectors like ATLAS and CMS identifying resulting particles such as the Higgs boson. Observational methods in astronomy rely on telescopes tailored to specific wavelengths to probe distant cosmic phenomena. Optical telescopes, using mirrors or lenses to collect visible light, resolve fine details of stars, galaxies, and nebulae from ground-based or space platforms. Radio telescopes, often consisting of large parabolic dishes or arrays like the Very Large Array, detect longer-wavelength emissions from cool gas clouds, pulsars, and the cosmic microwave background. 
X-ray telescopes, such as the Chandra X-ray Observatory, must operate in space to avoid atmospheric absorption, capturing high-energy emissions from black holes, supernova remnants, and galaxy clusters. The Hubble Space Telescope, launched on April 24, 1990, has revolutionized observations by providing ultraviolet, optical, and near-infrared images free from atmospheric distortion, revealing details like the age of the universe and exoplanet atmospheres. Complementing Hubble, the James Webb Space Telescope (JWST), launched on December 25, 2021, specializes in infrared wavelengths to peer through dust clouds at early galaxies, star-forming regions, and exoplanet atmospheric compositions. Precision measurements underpin many physical constants and tests of fundamental theories through advanced instrumentation. Atomic clocks, such as those developed at NIST using optical lattices to trap thousands of atoms, achieve timekeeping accuracy to the 19th decimal place, enabling tests of general relativity and of the constancy of fundamental constants across global timing networks. LIGO's laser interferometry detects gravitational waves by measuring tiny spacetime distortions; its 4-kilometer arm cavities achieve a strain sensitivity of approximately 10^{-21}, allowing detection of events like black hole mergers billions of light-years away.
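The strain figure quoted for LIGO translates directly into the absolute arm-length change being measured, since strain is defined as h = ΔL/L. A one-line calculation, using the approximate numbers in the text and a rough proton diameter for comparison, makes the scale vivid:

```python
# Gravitational-wave strain h = dL / L: for 4 km arms and h ~ 1e-21,
# the measured length change is a small fraction of a proton's diameter.

arm_length_m = 4000.0        # LIGO arm length quoted above
strain = 1e-21               # approximate strain sensitivity quoted above
proton_diameter_m = 1.7e-15  # rough proton diameter, for scale

delta_L = strain * arm_length_m
print(f"dL = {delta_L:.1e} m")  # 4.0e-18 m
print(f"  = {delta_L / proton_diameter_m:.4f} proton diameters")
```

That the interferometer resolves displacements roughly a thousandth of a proton's width is why the multi-kilometre arm cavities and laser power recycling described in the literature are necessary.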

Theoretical Modeling and Computation

Theoretical modeling in physics relies heavily on analytical methods to derive and solve equations that describe natural phenomena. Central to these approaches are differential equations, which govern a wide range of physical systems from classical mechanics to quantum fields. For instance, Maxwell's equations, formulated in 1865, unify electricity and magnetism through a set of four coupled partial differential equations that predict electromagnetic wave propagation. In quantum mechanics, the Schrödinger equation, introduced in 1926, provides the foundational partial differential equation for the time evolution of a quantum system's wave function: i \hbar \frac{\partial}{\partial t} \psi(\mathbf{r}, t) = \hat{H} \psi(\mathbf{r}, t), where \hat{H} is the Hamiltonian operator, enabling predictions of atomic and molecular behavior. These equations often admit exact solutions only in simplified cases, such as linear systems or specific geometries, highlighting the need for symmetry considerations to simplify complex problems. Symmetry groups play a crucial role in analytical modeling by exploiting invariances to reduce the dimensionality of equations and reveal conserved quantities. Emmy Noether's 1918 theorem establishes a profound link between continuous symmetries of a system's action and conservation laws, such as energy conservation from time-translation invariance or momentum conservation from spatial translation symmetry. In particle physics, Lie groups like SU(3) for quantum chromodynamics classify symmetries of strong interactions, guiding the formulation of gauge theories. These group-theoretic tools, applied via representation theory, facilitate the classification of particles and interactions, as seen in the Standard Model's structure. Analytical methods thus provide exact insights but frequently require approximations for realistic systems, bridging to computational techniques. Numerical simulations extend analytical methods by approximating solutions to differential equations through discretization, essential for nonlinear or high-dimensional problems. 
The finite element method (FEM), pioneered in the 1940s and formalized for structural analysis in 1960, divides physical domains into finite elements to solve partial differential equations iteratively, widely used in engineering for stress analysis and fluid dynamics. For quantum systems, quantum Monte Carlo methods employ stochastic sampling to estimate expectation values in many-body problems, overcoming the exponential complexity of exact diagonalization. A landmark application is the Monte Carlo treatment of the homogeneous electron gas, which in 1980 accurately computed ground-state energies, validating density-functional benchmarks. These methods enable simulations of quantum phase transitions and correlated materials, where analytical solutions are intractable. High-performance computing (HPC) amplifies numerical simulations by leveraging supercomputers to handle vast computational demands in physics. In lattice quantum chromodynamics (QCD), pioneered by Kenneth Wilson's 1974 formulation, the strong force is modeled on a discrete spacetime lattice, requiring HPC to perform Monte Carlo updates on lattices with billions of sites; modern simulations on systems like Frontier achieve precision in hadron mass calculations essential for particle physics. Similarly, global climate models, such as those from the Community Earth System Model, utilize supercomputers to solve coupled atmospheric and oceanic equations, simulating century-scale projections with resolutions down to kilometers, informing IPCC assessments on climate sensitivity. These HPC applications demonstrate scalability, with petaflop-scale resources enabling ensemble runs that quantify uncertainties in predictions. Machine learning has emerged as a transformative tool in physics for both data analysis and predictive modeling, particularly in the 2020s with deep learning advancements. At the Large Hadron Collider (LHC), machine learning algorithms process petabytes of collision data to identify rare events, such as Higgs boson decays, using convolutional neural networks for jet classification with accuracies exceeding 90%, outperforming traditional methods. 
Physics-informed neural networks (PINNs), introduced in 2019 and refined in subsequent works, embed differential equations directly into loss functions, enabling surrogate models for solving forward problems like turbulent flows or inverse problems, with orders-of-magnitude speedups over traditional solvers. These approaches, combining data-driven learning with physical constraints, accelerate discoveries in high-energy physics and astrophysics, such as in gravitational-wave signal analysis.
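The stochastic-sampling idea behind the Monte Carlo methods discussed above can be demonstrated in miniature: random samples estimate an integral with a statistical error that shrinks as 1/√N. This toy sketch estimates π rather than a quantum expectation value, but the principle of replacing exact integration with sampling is the same; the seed is fixed purely for reproducibility:

```python
import random

def monte_carlo_pi(n_samples: int, seed: int = 42) -> float:
    """Estimate pi by sampling uniform points in the unit square and
    counting the fraction that lands inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

# Statistical error scales as 1/sqrt(N): ~0.005 at N = 100,000.
estimate = monte_carlo_pi(100_000)
print(f"pi ~ {estimate:.3f}")
```

In quantum Monte Carlo the integrand is a many-body wave function rather than a circle indicator, and importance sampling replaces uniform draws, but the same 1/√N convergence governs the computational cost.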

Current Frontiers and Challenges

One of the most pressing frontiers in physics is the unification of general relativity and quantum mechanics into a theory of quantum gravity, which remains elusive despite decades of effort. String theory continues to be a leading candidate, positing that the universe's fundamental constituents are tiny vibrating strings in up to 11 dimensions, potentially incorporating gravity naturally alongside other forces. Recent conferences, such as Quantum Gravity 2025, highlight ongoing refinements in these frameworks, including explorations of black hole entropy and holographic principles, though experimental verification remains challenging due to the Planck-scale energies involved. Complementing this, loop quantum gravity quantizes space-time into discrete spin networks, predicting a granular structure that could resolve singularities in black holes and the Big Bang; advances in 2025 include numerical simulations testing these predictions against data from binary black hole mergers. Laboratory experiments are also probing quantum gravity effects, such as entangling massive objects to detect gravitational self-interaction at quantum levels, with proposals like those using optomechanical systems aiming for results in the coming years. These approaches underscore the challenge of bridging the vast scales between quantum phenomena and cosmic gravity, with no consensus theory yet emerging. Grand unified theories (GUTs) represent another unification frontier, seeking to merge the strong, weak, and electromagnetic forces of the Standard Model into a single gauge symmetry at energies around 10^{16} GeV, while leaving gravity for quantum gravity efforts. Recent developments include non-supersymmetric GUTs incorporating leptoquarks to explain neutrino masses and proton decay predictions, analyzed in models like the Georgi-Glashow SU(5) extension, which evade proton decay constraints through extra scalar fields. Asymptotic GUT scenarios in SO(10) with one extra dimension propose gradual unification in the ultraviolet limit, where couplings converge non-perturbatively, offering testable signatures in flavor physics. 
Experimental priorities for 2025 emphasize searches for GUT-scale relics, such as magnetic monopoles at the LHC and future colliders, though null results to date highlight the need for higher-energy probes. The dark sector poses profound challenges, as dark matter and dark energy constitute approximately 27% and 68% of the universe's energy density, respectively, yet their fundamental nature defies direct detection. Weakly interacting massive particles (WIMPs) remain prime candidates, but experiments like XENONnT have yielded null results in the 2020s, excluding WIMP-nucleon cross-sections above 10^{-47} cm² for masses over 9 GeV/c² in 2025 analyses of low-energy events. These constraints, derived from over 1 tonne-year of exposure, tighten the parameter space and shift focus toward lighter or sub-GeV candidates. Axions, ultralight particles arising from Peccei-Quinn symmetry breaking, are alternatively pursued through haloscope searches like ADMX, with 2025 upgrades probing masses around 10^{-5} eV and setting new limits on axion-photon couplings below 10^{-16} GeV^{-1}. Ongoing null detections for both WIMPs and axions underscore the need for multi-messenger approaches, including gamma-ray observations from Fermi-LAT to map annihilation signals. Quantum technologies, particularly quantum computing, face scalability hurdles despite rapid progress in hardware. Error correction is central, with logical qubits requiring thousands of physical qubits to suppress decoherence; Google's Willow processor in 2024 demonstrated below-surface-code-threshold performance, reducing error rates exponentially with qubit count, a milestone extended in 2025 collaborations. IBM's 2025 roadmap outlines utility-scale systems with 100+ error-corrected qubits by 2029, building on 2023-2025 advances like the Heron chip's 133-qubit capacity and dynamical decoupling techniques that achieve fidelities over 99.9%. 
These developments enable fault-tolerant algorithms for chemistry simulations, but challenges persist in cryogenic scaling and interconnectivity, with industry projections targeting 1 million physical qubits by 2030 to realize practical advantage. Multiverse hypotheses emerge from inflationary cosmology and string theory's landscape of 10^{500} vacua, suggesting our universe is one of many with varying constants, potentially explaining apparent fine-tuning like the small cosmological constant. Current research grapples with the measure problem, where infinite universes complicate probability assignments for observations such as the low vacuum energy density. Anthropic studies link these predictions to structure formation, showing that varying fundamental constants like the Higgs vev restricts galaxy formation, with only a narrow range yielding observer-friendly universes. Some theoretical extensions propose multiverse realizations tying vacuum stability to multiple hidden sectors, testable via indirect signals in collider data. These ideas remain speculative, lacking direct empirical tests, and fuel debates on whether such theories constitute science or metaphysics. The black hole information paradox challenges quantum unitarity, questioning whether information falling into a black hole is irretrievably lost during Hawking evaporation, violating the unitary evolution required by quantum mechanics. Firewall proposals suggest a high-energy barrier at the horizon to preserve unitarity, conflicting with general relativity's smooth geometry and the "no-drama" condition for infalling observers. Recent 2025 analyses explore fast scrambling in black hole interiors, where infalling information rapidly delocalizes, potentially leading to emergent firewalls under covariant constraints. Resolutions via the Page curve, confirmed through replica calculations in AdS/CFT, indicate information leaks gradually after the mid-point of evaporation, supported by 2025 reviews emphasizing entanglement island contributions. The matrix hypothesis posits space-time as an information repository, storing Hawking pairs' correlations to resolve the paradox without exotica, aligning with holographic conjectures. These advances highlight ongoing tensions between quantum mechanics, general relativity, and thermodynamics, with gravitational wave observatories like LIGO providing indirect tests.
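The "below-threshold" error correction mentioned above can be sketched quantitatively. For surface codes, a standard approximation is that the logical error rate scales as p_L ≈ A·(p/p_th)^((d+1)/2) when the physical error rate p lies below the threshold p_th, so each increase in code distance d suppresses errors by a constant factor. The prefactor A and threshold here are illustrative round numbers, not measured device values:

```python
# Illustrative surface-code scaling: logical errors fall exponentially with
# code distance once physical errors are below threshold.
# A = 0.1 and p_th = 1e-2 are assumed round numbers for illustration only.

A = 0.1
P_THRESHOLD = 1e-2

def logical_error_rate(p_physical: float, distance: int) -> float:
    """Approximate logical error rate p_L ~ A * (p/p_th)^((d+1)/2)."""
    return A * (p_physical / P_THRESHOLD) ** ((distance + 1) // 2)

# Below threshold (p = 1e-3), each step up in distance gains a factor of 10:
for d in (3, 5, 7):
    print(d, f"{logical_error_rate(1e-3, d):.1e}")
```

Above threshold the same formula shows the scheme failing (larger codes make things worse), which is why crossing the threshold experimentally was treated as a milestone.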

Applications and Interdisciplinary Connections

Technological Innovations

Technological innovations in physics have profoundly shaped modern industry and communication through breakthroughs in semiconductor electronics, energy production, medical diagnostics, and data transmission. In electronics, the transistor, invented in 1947 by John Bardeen and Walter Brattain at Bell Laboratories with further development by William Shockley, revolutionized amplification and switching by leveraging semiconductor properties to control electrical current, enabling the miniaturization of devices and earning the trio the 1956 Nobel Prize in Physics. Building on this, integrated circuits emerged in the late 1950s: Jack Kilby at Texas Instruments demonstrated the first prototype in 1958 using germanium, while Robert Noyce at Fairchild Semiconductor patented a silicon-based monolithic version in 1959, allowing multiple transistors to be fabricated on a single chip for compact, efficient computing. Light-emitting diodes (LEDs), first demonstrated in visible light by Nick Holonyak Jr. at General Electric in 1962 through gallium arsenide phosphide semiconductors, provided energy-efficient illumination based on electroluminescence, transforming displays, lighting, and optical signaling. In energy technologies, photovoltaic solar cells harness the photovoltaic effect, with practical silicon-based cells first developed at Bell Laboratories in 1954 achieving 6% efficiency; by 2025, commercial cells have exceeded 25% efficiency under standard conditions, as tracked by the National Renewable Energy Laboratory (NREL), enabling widespread renewable power generation. Nuclear fusion research advances toward sustainable power, exemplified by the International Thermonuclear Experimental Reactor (ITER) project, an international collaboration aiming to demonstrate net energy gain from deuterium-tritium fusion by its original 2035 target for full operations, though recent baselines indicate a delay to 2039 due to construction challenges. Medical applications draw on wave and particle physics for non-invasive diagnostics and treatments. Ultrasound imaging relies on piezoelectric transducers generating high-frequency sound waves (typically 2-18 MHz) that reflect off tissue interfaces based on acoustic impedance differences, allowing visualization of organs without ionizing radiation. 
Radiation therapy employs ionizing radiation, such as X-rays or gamma rays from linear accelerators, to deliver precise doses that damage tumor DNA while sparing healthy tissue, guided by dosimetry principles to achieve high tumor control rates, often exceeding 80-90% for early-stage or specific cancers such as prostate or head-and-neck tumors, depending on the case. Quantum dots, nanoscale semiconductor particles exhibiting size-tunable fluorescence due to quantum confinement, enhance contrast in imaging techniques like fluorescence microscopy, enabling targeted labeling of biomolecules for early detection with improved resolution over traditional dyes. Communications technologies leverage electromagnetic wave propagation for high-speed data transfer. Fiber optics, pioneered by Charles Kao in the 1960s through low-loss silica glass fibers exploiting total internal reflection, enable terabit-per-second transmission over thousands of kilometers with minimal signal degradation, forming the backbone of global telecommunications infrastructure. Fifth-generation (5G) and emerging sixth-generation (6G) wireless networks utilize millimeter-wave frequencies (30-300 GHz), where short wavelengths allow massive bandwidths for data rates exceeding 10 Gbps, though challenged by high atmospheric attenuation and requiring beamforming to maintain signal integrity.
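The 2-18 MHz frequency range quoted for ultrasound determines the achievable spatial resolution, since resolution is limited by the wavelength λ = c/f in tissue. A short sketch using the commonly assumed average speed of sound in soft tissue (≈1540 m/s) shows the trade-off: higher frequencies resolve finer detail but penetrate less deeply:

```python
# Ultrasound wavelength in soft tissue: lambda = c / f,
# with c ~ 1540 m/s the standard average speed of sound in soft tissue.

SOUND_SPEED_TISSUE = 1540.0  # m/s

def wavelength_mm(frequency_mhz: float) -> float:
    """Wavelength in millimetres for a given transducer frequency in MHz."""
    return SOUND_SPEED_TISSUE / (frequency_mhz * 1e6) * 1e3

# Across the 2-18 MHz diagnostic range quoted above:
for f in (2.0, 10.0, 18.0):
    print(f"{f:4.1f} MHz -> {wavelength_mm(f):.3f} mm")
```

At 2 MHz the wavelength is about 0.77 mm (deep abdominal imaging), while at 18 MHz it shrinks below 0.1 mm, suiting superficial structures where attenuation matters less.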

Physics in Engineering and Materials Science

Physics principles form the foundation of engineering disciplines, enabling the design and analysis of structures, systems, and materials that withstand real-world loads and conditions. In civil and mechanical engineering, classical mechanics provides essential tools for ensuring structural integrity and efficient performance. Stress-strain analysis, derived from fundamental concepts of force and deformation, allows engineers to predict how materials respond to loads, preventing failures in bridges, buildings, and machine components. In aerodynamics, Bernoulli's principle explains the generation of lift on airfoils by relating fluid speed to pressure differences, guiding the design of wings and fuselages for optimal flight efficiency. Materials science applies physics to develop substances with tailored properties, focusing on behaviors like elasticity and plasticity to enhance durability and functionality. Elasticity describes a material's ability to return to its original shape after deformation within its elastic limit, quantified by the Young's modulus, which measures stiffness under tensile stress. Plasticity, in contrast, involves permanent deformation beyond this limit, crucial for understanding processes like metal forming and for designing ductile materials that absorb energy without fracturing. Nanomaterials, such as carbon nanotubes, exemplify advanced applications; single-walled variants exhibit tensile strengths exceeding 100 GPa due to their atomic-scale structure, far surpassing traditional steels and enabling lightweight composites for aerospace and automotive engineering. In electrical engineering, electromagnetism underpins circuit theory, which models the flow of current and voltage in networks using principles like Kirchhoff's laws derived from conservation of charge and energy. This framework simplifies complex electromagnetic interactions into practical designs for devices ranging from power grids to microchips. Superconductors, materials that exhibit zero electrical resistance below critical temperatures, are integral to magnetic resonance imaging (MRI) systems, where niobium-titanium coils generate stable, high-field magnets up to several teslas for precise imaging. Recent advancements highlight physics-driven innovations in metamaterials and additive manufacturing. 
Metamaterials, engineered composites with properties not found in nature, enable applications like electromagnetic cloaking by manipulating wave propagation to render objects undetectable, as demonstrated in designs using layered shells for spherical cloaks. In the 2020s, 3D printing of metal alloys has progressed through techniques that "grow" metals within hydrogels, yielding structures up to 20 times stronger than conventional prints by controlling microstructure at the atomic level, advancing customizable components for aerospace and biomedical implants.
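The elastic behavior described above follows Hooke's law in the linear regime, σ = Eε, so Young's modulus converts an applied stress directly into a predicted elongation. This sketch uses a typical textbook value for structural steel (E ≈ 200 GPa); real design work would also check the yield stress before trusting the linear formula:

```python
# Hooke's law in the elastic regime: stress = E * strain, so a rod of
# length L under uniaxial stress sigma stretches by dL = (sigma / E) * L.

E_STEEL = 200e9  # Young's modulus of structural steel, Pa (typical value)

def elongation_mm(stress_pa: float, length_m: float,
                  youngs_modulus: float = E_STEEL) -> float:
    """Elastic elongation (mm) of a rod under uniaxial tensile stress."""
    strain = stress_pa / youngs_modulus   # dimensionless
    return strain * length_m * 1e3        # convert metres to millimetres

# A 2 m steel rod under 100 MPa of tension stretches about 1 mm.
print(f"{elongation_mm(100e6, 2.0):.2f} mm")
```

Beyond the elastic limit the same loading causes the permanent plastic deformation discussed above, which this linear model deliberately does not capture.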

Philosophical Implications

Physics raises profound philosophical questions about the nature of reality, particularly through its challenge to classical notions of determinism. In classical mechanics, the universe operates under strict causal determinism, where every event is uniquely determined by prior states and natural laws. Quantum mechanics introduces inherent randomness, as seen in phenomena like radioactive decay or particle position measurements, where outcomes are probabilistic rather than predictable with certainty. This has fueled debates on free will, with some philosophers arguing that quantum randomness provides the necessary leeway for genuine human agency, avoiding the incompatibilist dilemma that determinism precludes free will. Others contend that mere randomness does not equate to control, potentially rendering actions arbitrary rather than willed. The question of realism in physics is acutely tested by quantum mechanics, especially regarding the ontological status of the wave function and its apparent collapse upon measurement. Realist views posit that the wave function describes an objective physical state, but the Copenhagen interpretation suggests it represents epistemic probabilities, with collapse occurring due to observation, implying an active role for the observer in reality's unfolding. Albert Einstein famously critiqued this indeterminacy, stating in a 1926 letter to Max Born that "God does not play dice with the universe," arguing that quantum mechanics must be incomplete and that hidden variables underlie apparent randomness to restore a deterministic, realist physics. This observer effect has sparked epistemological concerns about whether observation influences physical outcomes, though most modern interpretations, such as decoherence theory, downplay a special observer role without resolving the measurement problem. Relativity theory reshapes philosophical understandings of time and causality, proposing a block universe where past, present, and future coexist in a four-dimensional manifold, rendering temporal flow illusory and relative. This eternalist perspective challenges intuitive notions of a privileged "now," suggesting time is a static ordering of events rather than a dynamic process propagating forward. 
Complementing this, the arrow of time—our experience of time's directionality—emerges from the second law of thermodynamics, where entropy increases from low- to high-entropy states, creating an asymmetry that distinguishes past from future without invoking a fundamental temporal direction. These ideas imply that causality may be bidirectional or context-dependent in a block universe, complicating traditional linear causation and raising questions about the reality of temporal becoming. Reductionism, the view that complex phenomena can be fully explained by fundamental physical laws, faces limits when addressing consciousness and emergent properties in complex systems. While physics successfully reduces many macroscopic behaviors to atomic interactions, consciousness—the subjective quality of experience—presents the "hard problem," as neural correlates in the brain do not explain why physical processes yield felt experience rather than mere function. In biology, emergence arises through context-dependent interactions, such as gene expression varying by cellular environment or self-organizing systems producing novel properties irreducible to parts, as in flocking behaviors or developmental patterns. Critics argue that physics alone cannot capture these higher-level emergents without incorporating interdisciplinary mechanisms, highlighting reductionism's explanatory gaps in bridging micro to macro scales.

Education and Professional Practice

Undergraduate physics education typically forms the foundation for professional practice in the field, emphasizing core theoretical and experimental skills. The standard curriculum includes introductory classical physics, intermediate classical mechanics, electromagnetism, quantum mechanics, thermal and statistical physics, and modern physics, often comprising 22-26% of total credits for introductory courses and 8-11% for quantum mechanics across most programs. Laboratory components are integral, with advanced labs accounting for 12-14% of the curriculum in 87% of programs, focusing on hands-on experimentation, data analysis, and technical skills to build practical proficiency. These elements prepare students for diverse career paths while encouraging active learning and undergraduate research to enhance conceptual understanding and retention. Graduate training builds on this base through specialized coursework and research, requiring mastery of a core set of advanced topics. Most programs mandate one-year sequences in electricity and magnetism and in quantum mechanics, a semester in classical mechanics, and another in statistical mechanics and thermodynamics, with 129 of 137 surveyed departments enforcing these. Specializations follow in areas such as condensed matter, particle physics, or astrophysics, often incorporating interdisciplinary breadth like courses in mathematics or computer science, alongside skills in teaching, communication, and research ethics. This structure ensures candidates develop expertise for original research, with programs typically spanning four to six years. Professional careers in physics span academia, industry, and government, each demanding advanced degrees for specialized roles. In academia, a PhD is essential for tenure-track positions, where physicists conduct research and teach, though only about 5% of bachelor's recipients ultimately secure such roles despite 30% initial interest. Industry opportunities, particularly in R&D at tech firms, leverage physics training for innovations in semiconductors, optics, and computing, contributing an estimated 12.6% to the U.S. economy through product development. Government positions at national labs like Los Alamos or U.S. 
Department of Energy facilities involve collaborative research on fundamental questions, with recent data showing approximately 54% of new physics PhDs entering the private sector or industry, 14% in academia, 20% in postdoctoral positions, and the remainder in government or other sectors. Professional bodies play a key role in upholding standards and supporting practitioners. The American Physical Society (APS), with over 50,000 members, advances physics through publications, advocacy, and education initiatives, fostering global collaboration and integrity in the community. The Institute of Physics (IOP) similarly promotes research, education, and application of physics worldwide, offering awards, events, and resources to enhance professional development. Diversity efforts, such as the APS Bridge Program launched in 2015, address underrepresentation by providing post-baccalaureate pathways for underrepresented minority students to enter graduate programs, increasing PhD attainment among Black, Latinx, and Indigenous physicists. Current challenges in the field include a volatile job market and the demand for interdisciplinary skills. The COVID-19 pandemic disrupted post-2020 outcomes, with workforce entry for bachelor's graduates dropping 4% and unemployment peaking at 8%, though recovery has stabilized with about 46% entering jobs within a year, based on data for the classes of 2019 and 2020. Additionally, the rise of data-intensive experiments necessitates training in data science and machine learning, positioning "data physicists" as key figures who integrate statistical tools with physical principles to analyze complex datasets from sources like particle colliders. These skills enhance employability across sectors, bridging traditional physics with emerging computational demands.

References

  1. [1]
    Physics: Background Information - Library Guides
    Oct 29, 2025 · Physics is a branch of physical science that involves the study of matter and its motion and behavior through space and time.
  2. [2]
    What is Physics? - Michigan Technological University
    Physics is the study of the underlying laws and mechanisms explaining how the universe works. Most of what we do in daily life is based on a principle or law ...Missing: authoritative source
  3. [3]
    physics_branches.html - UNLV Physics
    Classical Physics: The branch where Newtonian physics (AKA classical mechanics), classical electromagnetism, and classical thermodynamics (which includes the ...
  4. [4]
    What is Physics? - Tennessee Tech University
    Sub-Fields of Physics · Astrophysics · biophysics icon Biophysics · large molecule simulation Chemical Physics · future car mock-up Engineering Physics · gravity map ...
  5. [5]
    The Role of Applied Physics in Modern Engineering Challenges
    Feb 27, 2025 · Key Applications of Applied Physics in Modern Engineering · Electronics and Semiconductor Technology · Renewable Energy Systems · Medical Imaging ...
  6. [6]
    Physics - About Us - Department Chair - University of South Florida
    Physics is the most fundamental science, enlightening us to the workings of everything from the smallest particles in the universe to the largest galaxies that permeate ...
  7. [7]
    Physics - Etymology, Origin & Meaning
    Originating in the 1580s from Latin physica and Greek ta physika, meaning "natural things," physics initially meant natural science; now it denotes the ...
  8. [8]
    The Four Interactions
    There are four interactions between particles: Strong, weak, gravity, electromagnetism. To clarify things, here are two definitions.
  9. [9]
    Physics Major - Tulane University Catalog
    Physics is the most fundamental science. It is the foundation for our understanding of the world around us, spanning the ultimate depths within subatomic ...
  10. [10]
    Minor in Physics - Quinnipiac University
    Physics is the most fundamental science, in that all other science, engineering and technology disciplines are founded upon the first principles of physics.
  12. [12]
    [PDF] CODATA RECOMMENDED VALUES OF THE FUNDAMENTAL ...
    CODATA Recommended Values of the Fundamental Physical Constants: 2018. NIST. Speed of light in vacuum c = 299 792 458 m s−1 (exact); muon g-factor −2 ...
  14. [14]
    The impact of industrial physics on the U.S. economy
    The fascinating findings of the study show that an estimated 12.6% of the US economy can be ascribed directly to the practice of industrial physics.
  15. [15]
    Physics worth more to EU economy than retail and financial services ...
    Oct 22, 2019 · Report commissioned by the European Physical Society says industries that rely on expertise in physics contribute 12 per cent of EU economic ...
  16. [16]
    [PDF] Applications of Physics in Science, Technology, and Everyday Life
    Oct 30, 2025 · One of the most significant contributions of physics to society lies in healthcare and sustainability. The development of X-rays, MRI, CT scans ...
  17. [17]
    Relativity and the Global Positioning System - Physics Today
    May 1, 2002 · Relativistic coordinate time is deeply embedded in the GPS. Millions of receivers have software that applies relativistic corrections. Orbiting ...
  18. [18]
    The Importance of Investing in Physics
    Feb 2, 2021 · Such research has been crucial to achieving cost reductions in light-emitting diodes, photovoltaic cells, high-voltage semiconductors, and wind ...
  19. [19]
    Sustainability, Ethics and Nuclear Energy: Escaping the Dichotomy
    In this paper we suggest considering sustainability as a moral framework based on social justice, which can be used to evaluate technological choices.
  20. [20]
    Climate change is physics | Communications Earth & Environment
    Jan 25, 2022 · The award of the Nobel prize highlights that climate modelling is physics. This renders the question 'do you believe in global warming' meaningless.
  21. [21]
    A new UN report lays out an ethical framework for climate engineering
    Dec 1, 2023 · A new United Nations report is weighing the ethics of using technological interventions to try to rein in rising global temperatures.
  22. [22]
    Enhancing equity, diversity, and inclusion in physics - Frontiers
    Jan 11, 2024 · Equity, Diversity, and Inclusion (EDI) are important to drive innovation in many different fields, including particle physics.
  23. [23]
    Infusing Equity, Diversity, and Inclusion Throughout Our Physics ...
    Mar 1, 2022 · Increasingly, the physics community is attending to issues of equity, diversity, and inclusion (EDI), both in language and action.
  24. [24]
    [PDF] Lecture 4 Ancient Astronomy
    The origins of astronomy are found in the ancient kingdoms of Assyria and Babylon. • Earliest star catalogs - 1200 BC.
  25. [25]
    Presocratic Philosophy
    Mar 10, 2007 · The Presocratics were 6th and 5th century BCE Greek thinkers who introduced a new way of inquiring into the world and the place of human beings in it.
  26. [26]
    Ancient Greek Philosophy
    The ancient atomists, Leucippus and Democritus (c. 5th century BCE), were concerned with the smallest particles in nature that make up reality—particles that are ...
  27. [27]
    Aristotle's Natural Philosophy
    May 26, 2006 · The natural motions of the four sublunary elements are also caused by specific external causes responsible for these motions, and on the basis ...
  28. [28]
    [PDF] Construction and Operation of Archimedes' Iron Hand
    Similarly, in his work “On Floating Bodies” Archimedes formulated his Law of Buoyancy. This work, his most profound, contains a brilliant exposition on the ...
  29. [29]
    52.27 -- Hero's engine - UCSB Physics
    The apparatus used in this demonstration is similar to one invented by Hero of Alexandria, who lived some time during the first century C.E. You can find ...
  30. [30]
    [PDF] Ibn al-Haytham (Alhazen) - NSUWorks
    Apr 25, 2025 · Ibn al-Haytham wrote Kitab al-Manazir (Book of Optics) from 1011 to 1021. Consisting of seven volumes, this book on optics is his most famous ...
  31. [31]
    Ibn Sina's Natural Philosophy
    Jul 29, 2016 · 1. Medieval Physics. For Avicenna, the proper subject of natural philosophy, in its broadest or most general sense, is body insofar as it is ...
  32. [32]
    John Buridan - Stanford Encyclopedia of Philosophy
    May 13, 2002 · The theory of impetus probably did not originate with Buridan, but his account appears to be unique in that he entertains the ...
  33. [33]
    Scientific Revolutions - Stanford Encyclopedia of Philosophy
    Mar 5, 2009 · The existence and nature of scientific revolutions is a topic that raises a host of fundamental questions about the sciences and how to interpret them.
  34. [34]
    Dialogues Concerning Two New Sciences | Online Library of Liberty
    Dialogues Concerning Two New Sciences by Galileo Galilei. Translated from the Italian and Latin into English by Henry Crew and Alfonso de Salvio. With an ...
  35. [35]
    [PDF] Kepler's Laws of Planetary Motion: 1609-1666 JL Russell
    Feb 12, 2008 · Kepler's laws of planetary motion were largely ignored between the time of their first publication (1609, 1619) and the publication of Newton's.
  36. [36]
    [PDF] Newton's Principia : the mathematical principles of natural philosophy
    NEWTON'S PRINCIPIA. NATURAL PHILOSOPHY, BY SIR ISAAC NEWTON; TRANSLATED INTO ENGLISH BY ANDREW MOTTE.
  37. [37]
    A historical analysis of the independent development of calculus by ...
    This paper undertakes a historical investigation of the separate and independent development of calculus by Isaac Newton and Gottfried Leibniz in the late 17th ...
  38. [38]
    [PDF] FROM NEWTON'S MECHANICS TO EULER'S EQUATIONS
    Euler's three memoirs on fluid dynamics written in 1755 contain of course much more than these equations. They are immediately intelligible to the modern ...
  39. [39]
    Joseph Louis Lagrange, Méchanique analitique, first edition (1788)
    Joseph Louis' book, Méchanique analitique, is the first textbook to treat theoretical mechanics in a purely analytic way. Its mathematical importance stems ...
  40. [40]
    History of the Royal Society
    November 28, 1660 ... Following a lecture by Christopher Wren, twelve men of science establish a 'College for the Promoting of Physico-Mathematical, Experimental ...
  41. [41]
    Reflections on the motive power of fire : Carnot, Sadi, 1796-1832
    Dec 9, 2009 · Reflections on the motive power of fire. Publication date: 1960. Topics: Thermodynamics. Publisher: New York, Dover Publications.
  42. [42]
    [PDF] On the Mechanical Equivalent of Heat
    On the Mechanical Equivalent of Heat. Author(s): James Prescott Joule. Source: Philosophical Transactions of the Royal Society of London, Vol. 140 (1850), pp ...
  43. [43]
    Universal Tendency in Nature to the Dissipation of Mechanical Energy
    When heat is created by any unreversible process (such as friction), there is a dissipation of mechanical energy, and a full restoration of it to its primitive ...
  44. [44]
    Experimental Researches In Electricity. - Project Gutenberg
    Experimental Researches In Electricity. By Michael Faraday, D.C.L., F.R.S., Fullerian Professor of Chemistry in the Royal Institution.
  45. [45]
    [PDF] A Dynamical Theory of the Electromagnetic Field
    A Dynamical Theory of the Electromagnetic Field. J. Clerk Maxwell. Phil. Trans. R. Soc. Lond. 155, 459–512, published 1 January 1865.
  46. [46]
    Translation of Ludwig Boltzmann's Paper “On the Relationship ...
    Translation of the seminal 1877 paper by Ludwig Boltzmann which for the first time established the probabilistic basis of entropy.
  47. [47]
    [PDF] On the Relative Motion of the Earth and the Luminiferous Ether (with ...
    American Journal of Science, Third Series. Michelson and Morley, On the Relative Motion of the Earth and the Luminiferous Ether.
  48. [48]
    Discovery of Radioactivity: Becquerel - Le Moyne
    This Becquerel was an expert on phosphorescent minerals. He is best known for his discovery of radioactivity, first reported just over a century ago.
  49. [49]
    Marie and Pierre Curie and the discovery of polonium and radium
    Dec 1, 1996 · At the end of June 1898, they had a substance that was about 300 times more strongly active than uranium. In the work they published in July ...
  50. [50]
    [PDF] The Manhattan Project - Department of Energy
    The road to the atomic bomb began with the revolutionary discoveries and insights of modern physics. In the early twentieth century, physicists conceived of the ...
  51. [51]
    Manhattan Project - Manhattan Project National Historical Park (U.S. ...
    During this time, nuclear science advanced at an exponential rate. New discoveries were made in rapid succession. The Manhattan Project produced hundreds of ...
  52. [52]
    Beyond the bomb: Atomic research changed medicine, biology
    Feb 27, 2014 · In the post-World War II era, the U.S. government produced radioisotopes in some of the same nuclear reactors that had been built to produce ...
  53. [53]
    The Standard Model | CERN
    The Standard Model includes the electromagnetic, strong and weak forces and all their carrier particles, and explains well how these forces act on all of the ...
  54. [54]
    [PDF] The Standard Model of electroweak interactions
    The Standard Model is a gauge theory based on SU(3)C ⊗SU(2)L ⊗U(1)Y, describing strong, weak, and electromagnetic interactions, and is a successful model in ...
  55. [55]
    [PDF] The Standard Model | DAMTP - University of Cambridge
    The Standard Model is a subject covered in lectures on Particle Physics, with elementary introductions available, and assumes familiarity with quantum field ...
  56. [56]
    CERN experiments observe particle consistent with long-sought ...
    “The discovery of a particle consistent with the Higgs boson opens the way to more detailed studies, requiring larger statistics, which will pin ...
  57. [57]
    The Higgs boson: a landmark discovery - atlas . CERN
    On 4 July 2012, the ATLAS and CMS experiments at CERN announced that they had independently observed a new particle in the mass region of around 125 GeV: a ...
  58. [58]
    Observation of Gravitational Waves from a Binary Black Hole Merger
    Feb 11, 2016 · This is the first direct detection of gravitational waves and the first observation of a binary black hole merger.
  59. [59]
    The 2011 Nobel Prize in Physics - Press release - NobelPrize.org
    Oct 4, 2011 · The acceleration is thought to be driven by dark energy, but what that dark energy is remains an enigma – perhaps the greatest in physics today.
  60. [60]
    Planck 2013 results. I. Overview of products and scientific results
    In March 2013, ESA and the Planck Collaboration released the initial cosmology products based on the first 15.5 months of Planck data, along with a set of ...
  61. [61]
    Planck publications - ESA Cosmos - European Space Agency
    Planck 2013 Results. The scientific findings of the mission are presented in a series of papers based on data from the first 15.5 months of Planck operations.
  62. [62]
    James Webb Space Telescope - NASA Science
    NASA's James Webb Space Telescope has revealed a colorful array of massive stars and glowing cosmic dust in the Sagittarius B2 molecular cloud, the most massive ...
  63. [63]
    Three years of science: Ten cosmic surprises from NASA's Webb ...
    Jul 2, 2025 · In just three years of operations, Webb has brought the distant universe into focus, revealing unexpectedly bright and numerous galaxies.
  64. [64]
    Google Quantum AI Shows 13,000× Speedup Over World's Fastest ...
    achieving quantum supremacy in 2019 and advancing quantum ...
  65. [65]
    [PDF] Measurement of the Positive Muon Anomalous Magnetic Moment to ...
    Aug 10, 2023 · We present a new measurement of the positive muon magnetic anomaly, aµ ≡ (gµ − 2)/2, from the Fermilab Muon g−2 Experiment based on data ...
  66. [66]
    Muon g-2 announces most precise measurement of the magnetic ...
    Jun 3, 2025 · The final result agrees with their published results from 2021 and 2023 but with a much better precision of 127 parts-per-billion, surpassing ...
  67. [67]
    Newton's views on space, time, and motion
    Aug 12, 2004 · Isaac Newton founded classical mechanics on the view that space is distinct from body and that time passes uniformly without regard to whether anything happens ...
  68. [68]
    Chapter 3: Zeno's Paradoxes
    For Aristotle and the Greeks in general, continuity meant infinite divisibility. An interval cannot be partitioned into a set of points by an infinite sequence ...
  69. [69]
    Peter Suber, "Infinite Reflections" - Earlham College
    If time, space, or matter are infinitely divisible, then to experience a finite chunk of any one of them is to experience its infinity of parts. Having said ...
  70. [70]
    Has physics ever been deterministic? - Phys.org
    Dec 6, 2019 · Pierre-Simon Laplace illustrated this argument, later called Laplace's demon, in the early 1800s to illustrate the concept of determinism in ...
  71. [71]
    17 Symmetry and Conservation Laws - Feynman Lectures - Caltech
    In classical physics there are a number of quantities which are conserved—such as momentum, energy, and angular momentum. Conservation theorems about ...
  72. [72]
    Symmetries and conservation laws: Consequences of Noether's ...
    The conservation of momentum is related to the homogeneity of space. Invariance under translation in time means that the law of conservation of energy is valid.
  74. [74]
    Newton's Philosophiae Naturalis Principia Mathematica
    Dec 20, 2007 · The view is commonplace that what Newton did was to put forward his theory of gravity to explain Kepler's already established “laws” of orbital ...
  75. [75]
    Newton's Laws of Motion | Glenn Research Center - NASA
    Jun 27, 2024 · In 1686, he presented his three laws of motion in the “Principia Mathematica Philosophiae Naturalis.” By developing his three laws of motion ...
  76. [76]
    Newton, Principia, 1687 - Hanover College History Department
    Isaac Newton is probably most famous for having discovered the universal laws of gravity. (That is, he showed that gravity explains the behavior of stars and ...
  77. [77]
    Newton's Principia and the Genesis of Universal Gravitation
    His universal law of gravitation provided an underlying set of rules that explained the motions of the planets which Kepler had documented. Philosophiæ naturalis ...
  78. [78]
    [PDF] Kepler's Laws - Central Force Motion - MIT OpenCourseWare
    In this lecture, we will start from Newton's laws and verify that the above three laws can indeed be derived from Newtonian mechanics. Equivalence between the ...
  79. [79]
    Projectile Motion – Introductory Physics for the Health and Life ...
    Projectile motion is the path of an object thrown into the air, influenced only by gravity, and can be analyzed by separating it into horizontal and vertical  ...
  80. [80]
    [PDF] Basics of Kepler and Newton
    Kepler's Laws, as derived by Newton. Collected most accurate observations of planetary motions to date. Found Copernican model still did not agree with data.
  81. [81]
    The Simple Pendulum - Graduate Program in Acoustics
    When displaced to an initial angle and released, the pendulum will swing back and forth with periodic motion. By applying Newton's second law for rotational ...
  82. [82]
    Mécanique analytique : Lagrange, J. L. (Joseph Louis), 1736-1813
    Jan 18, 2010 · Publication date: 1811. Topics: Mechanics, Analytic. Publisher: Paris, Ve Courcier.
  83. [83]
    [PDF] Weaver, Hamilton, Hamiltonian Mechanics, and Causation - arXiv
    Nov 15, 2020 · papers on classical mechanics (Hamilton 1834; Second Essay 1835) provide persuasive ammunition for a causal interpretation of the laws of ...
  84. [84]
    Newtonian Gravitation and the Laws of Kepler
    Kepler's 3rd Law and Newton's 3rd Law imply that the force must be proportional to the product of the masses for the planet and the Sun. Thus, Kepler's laws and ...
  85. [85]
    13.6: Tidal Forces – University Physics Volume 1
    The tidal force can be viewed as the difference between the force at the center of Earth and that at any other location. In (Figure), this difference is shown ...
  86. [86]
    A History of Thermodynamics: The Missing Manual - PMC
    As noted by Gibbs, in 1850, Clausius established the first modern form of thermodynamics, followed by Thomson's 1851 rephrasing of what he called the Second Law ...
  87. [87]
    So-called zeroth law of thermodynamics - ACS Publications
    The "zeroth law of thermodynamics" elucidates the difference between the axiomatic and the epistemological method; it is neither a law nor a statement of ...
  88. [88]
    [PDF] The Universal Gas Constant R
    Jul 7, 2003 · Being French, Clapeyron had attributed the volume-pressure law to the French scientist, Edmé Mariotte (1620–1684), rather than to Robert ...
  89. [89]
    What Is the Real Clausius Statement of the Second Law of ... - NIH
    Sep 24, 2019 · The theorem of the equivalence of transformations is the real Clausius Statement of the second law of thermodynamics.
  90. [90]
    [PDF] Walther Nernst - Studies in chemical thermodynamics - Nobel Prize
    Here I pointed out in my first papers on the subject that my heat theorem, to begin with, does not appear to be applicable to gases, because we cannot ...
  91. [91]
    Boltzmann's Work in Statistical Physics
    Nov 17, 2004 · Boltzmann begins the paper by stating that his goal is to elucidate the relationship between the Second Law and probability calculus. He ...
  92. [92]
    [1612.03062] A Look Back at the Ehrenfest Classification ... - arXiv
    Dec 9, 2016 · Translation and Commentary of Ehrenfest's 1933 paper introducing the notion of phase transitions of different order. Authors:Tilman Sauer.
  93. [93]
    [PDF] maxwell-theory-of-heat.pdf - Strange beautiful grass of green
    Chapter I. Introduction: Meaning of the word Temperature; The Mercurial Thermometer; Heat as a Quantity.
  94. [94]
    This Month in Physics History | American Physical Society
    Charles Augustin Coulomb used a calibrated torsion balance to measure the force between electric charges. Around 600 BC, the Greek philosopher ...
  95. [95]
    5 Application of Gauss' Law - Feynman Lectures - Caltech
    Using Gauss' law, it follows that the magnitude of the field is given by E = ρr/3ε0 (r < R). You can see that this formula gives the proper result for r = ...
  96. [96]
    VIII. A dynamical theory of the electromagnetic field
    Oct 27, 2025 · (1) The most obvious mechanical phenomenon in electrical and magnetical experiments is the mutual action by which bodies in certain states ...
  97. [97]
    Maxwell's equations and the speed of light - Reading Feynman
    Sep 17, 2015 · Maxwell's equations and the speed of light · ∇ · E = σ/ε0: the flux of E through a closed surface is proportional to the charge inside. So that's ...
  98. [98]
    [PDF] Chapter 12 - Polarization - MIT OpenCourseWare
    Polarization is a general feature of transverse waves in three dimensions. The general elec- tromagnetic plane wave has two polarization states, ...
  99. [99]
    Who really discovered Snell's law? - IOPscience
    The principle of refraction – familiar to anyone who has dabbled in optics – is named after the Dutch scientist Willebrord Snell (1591–1626).
  100. [100]
    17.1 Understanding Diffraction and Interference - Physics | OpenStax
    Mar 26, 2020 · By the end of this section, you will be able to do the following: Explain wave behavior of light, including diffraction and interference, ...
  101. [101]
    Principles of Interference | Nikon's MicroscopyU
    The formation of an image in the microscope relies on the complex interplay between two critical optical phenomena: diffraction and interference.
  102. [102]
    [PDF] ON THE ELECTRODYNAMICS OF MOVING BODIES - FaMAF
    This edition of Einstein's On the Electrodynamics of Moving Bodies is based on the English translation of his original 1905 German-language paper. (published as ...
  103. [103]
    [PDF] Albert Einstein - Relativity: The Special and General Theory - Ibiblio
    Theory of Relativity, Inertia of Energy, Theory of the Brownian Movement, and the Quantum-Law of the Emission and Absorption of Light (1905). These were ...
  104. [104]
    [PDF] Einsteinʼs Special Theory of Relativity and the Problems in the ...
    Einstein's special theory of relativity is based on two postulates, stated by. Einstein in the opening section of his 1905 paper. The first is the principle of ...
  105. [105]
    [PDF] Einstein's 1905 Paper on E=mc2
    Herewith we present a very simple treatment of the problem which makes absolutely clear the logical difficulties in Einstein's first published work on E = mc².
  106. [106]
    The Field Equations of Gravitation - Wikisource, the free online library
    Aug 9, 2025 · We obtain the ten general covariant equations of the gravitational field in spaces, in which matter is absent.
  107. [107]
    [PDF] JOHN NORTON - How Einstein found his field equations: 1912-1915
    In the. Entwurf paper, Einstein and Grossmann had come within a hair's breadth of the generally covariant field equations of the final theory. *University of ...
  108. [108]
    [physics/9905030] On the gravitational field of a mass point ... - arXiv
    May 12, 1999 · Translation by S. Antoci and A. Loinger of the fundamental memoir, that contains the ORIGINAL form of the solution of Schwarzschild's problem.
  109. [109]
    [1602.04040] Einstein's Discovery of Gravitational Waves 1916-1918
    Feb 12, 2016 · In his gravitational waves paper, Einstein concluded that gravitational fields propagate at the speed of light. The solution is the Minkowski flat metric plus ...
  110. [110]
    New General Relativistic Contribution to Mercury's Perihelion Advance
    Since the 1970s the perihelion advance has entered the pantheon of high-precision confirmations of general relativity.
  111. [111]
    Eddington Observes Solar Eclipse to Test General Relativity
    One of Eddington's photographs of the May 29, 1919, solar eclipse. The photo was presented in his 1920 paper announcing the successful test of general ...
  112. [112]
    The 1919 eclipse results that verified general relativity and their later ...
    Oct 21, 2021 · The results were announced of two British expeditions led by Eddington, Dyson and Davidson to measure how much background starlight is bent as it passes the ...
  113. [113]
    Max Planck and the birth of the quantum hypothesis - AIP Publishing
    Sep 1, 2016 · One of the most interesting episodes in the history of science was Max Planck's introduction of the quantum hypothesis, at the beginning of the 20th century.
  114. [114]
    [PDF] 1913 On the Constitution of Atoms and Molecules
    This paper is an attempt to show that the application of the above ideas to Rutherford's atom-model affords a basis for a theory of the constitution of ...
  115. [115]
    [quant-ph/9911107] 75 Years of Matter Wave: Louis de Broglie and ...
    Nov 25, 2004 · A physically real wave associated with any moving particle and travelling in a surrounding material medium was introduced by Louis de Broglie in a series of ...
  116. [116]
    [PDF] 1926-Schrodinger.pdf
    It was stated in the beginning of this paper that in the present theory both the laws of motion and the quantum conditions can be deduced from one Hamiltonian ...
  117. [117]
    The Uncertainty Principle - Heisenberg Web Exhibit
    Heisenberg presented his discovery and its consequences in a 14-page letter to Pauli in February 1927. The letter evolved into a published paper in which ...
  118. [118]
    Copenhagen Interpretation of Quantum Mechanics
    May 3, 2002 · The Copenhagen interpretation was the first general attempt to understand the world of atoms as this is represented by quantum mechanics.
  119. [119]
    "Relative State" Formulation of Quantum Mechanics | Rev. Mod. Phys.
    The many-worlds interpretation of quantum mechanics says that a measurement can cause a splitting of reality into separate worlds. See more ...
  120. [120]
    The Quantum Theory of Nuclear Disintegration - Nature
    GAMOW, G. The Quantum Theory of Nuclear Disintegration. Nature 122, 805–806 (1928). https://doi.org/10.1038/122805b0. Issue date: 24 November ...
  121. [121]
    [PDF] Models of the Nucleus Liquid Drop, Fermi Gas, Shell
    Jan 29, 2007 · Well depth remains constant independent of A, at about 40 MeV. B′, the binding energy for the last nucleon, remains constant independent of A.
  122. [122]
    Shell Model of Nucleus - HyperPhysics Concepts
    The shell model of nuclear structure is the existence of magic numbers of neutrons and protons at which the nuclei have exceptional stability.
  123. [123]
    Four Fundamental Interaction
    The weak force governs beta decay and neutrino interactions with nuclei. The strong force, which we generally call the nuclear force, is actually the force ...
  124. [124]
    Fundamental Forces - HyperPhysics Concepts
    The weak interaction in the electron form at left above is responsible for the decay of the neutron and for beta decay in general. Discussion of weak force ...
  125. [125]
    The history of CERN | timeline.web.cern.ch
    The LHC starts up. At 10.28am on 10 September 2008 a beam of protons is successfully steered around the 27-kilometre Large Hadron Collider (LHC) for the first ...
  126. [126]
    CERN's LHCb experiment reports observation of exotic pentaquark ...
    Jul 14, 2015 · The LHCb experiment at CERN's Large Hadron Collider has reported the discovery of a class of particles known as pentaquarks.
  127. [127]
    [1507.03414] Observation of $J/ψp$ resonances consistent ... - arXiv
    Jul 13, 2015 · Observation of J/ψp resonances consistent with pentaquark states in {Λ_b^0\to J/ψK^-p} decays. Authors:LHCb collaboration: R. Aaij, B. Adeva, M.
  128. [128]
    Fission and Fusion: What is the Difference? - Department of Energy
    Fission splits atoms by neutron impact, while fusion combines atoms to form heavier ones. Fusion produces more energy and no radioactive fission products.
  129. [129]
    Nuclear Medicine
    Fused CT-PET scans more clearly show tumors and are therefore often used to diagnose and monitor the growth of cancerous tumors. What is nuclear medicine ...
  130. [130]
    A next-generation liquid xenon observatory for dark matter and ...
    Dec 14, 2022 · These detectors can also study neutrinos through neutrinoless double-beta decay and through a variety of astrophysical sources. A next- ...
  131. [131]
    Colloquium: Topological insulators | Rev. Mod. Phys.
    Nov 8, 2010 · Topological insulators are electronic materials that have a bulk band gap like an ordinary insulator but have protected conducting states on their edge or ...
  132. [132]
    Crystal structure (Chapter 1) - Fundamentals of Condensed Matter ...
    In this chapter, we will examine the structure of crystalline matter in which particles are arranged in a repeating pattern that extends over very long ...
  133. [133]
    1947: Invention of the Point-Contact Transistor | The Silicon Engine
    John Bardeen & Walter Brattain achieve transistor action in a germanium point-contact device in December 1947. Bardeen, Brattain, and Shockley ( ...
  134. [134]
    Theory of Superconductivity | Phys. Rev.
    A theory of superconductivity is presented, based on the fact that the interaction between electrons resulting from virtual exchange of phonons is attractive.
  135. [135]
    Possible high Tc superconductivity in the Ba−La−Cu−O system
    Possible high Tc superconductivity in the Ba−La−Cu−O system. Published: June 1986. Volume 64, pages 189–193 (1986).
  136. [136]
    [PDF] GRAPHENE - Nobel Prize
    Oct 5, 2010 · The breakthrough was done by Geim,. Novoselov and their co-workers; it was their paper from 2004 which ignited the development. For this they ...
  137. [137]
    Recent Advances in Quantum Spin Liquids in Two-Dimensional ...
    This short review highlights critical advances in these materials, emphasizing experimental signatures consistent with a Dirac quantum spin liquid and the ...
  138. [138]
    Twenty years of 2D materials | Nature Physics
    Jan 16, 2024 · Two-dimensional crystals have revolutionized fundamental research across a staggering range of disciplines. We take stock of the progress gained after twenty ...
  142. [142]
    Scientific Method - Stanford Encyclopedia of Philosophy
    Nov 13, 2015 · Falsification is deductive and similar to H-D in that it involves scientists deducing observational consequences from the hypothesis under test.
  143. [143]
    Karl Popper: Philosophy of Science
    Popper's falsificationist methodology holds that scientific theories are characterized by entailing predictions that future observations might reveal to be ...
  144. [144]
    1.2 The Scientific Methods - Physics | OpenStax
    Mar 26, 2020 · Properties other than appearance or location are usually modelled using mathematics, where functions are used to show how these properties ...
  145. [145]
    Editorial Policies and Practices - Physical Review Journals
    Purpose of Peer Review​​ The peer review process is directed by a team of staff editors employed by APS and active scientists (academic editors), who are ...
  146. [146]
    Reviewing Peer Review - PHYSICS - APS.org
    Oct 1, 2021 · Peer review—the evaluation of scientific work by experts in the field—is the main method by which papers are published, grants assigned, and ...
  147. [147]
    NASA - Solar Eclipses of History
    Sep 28, 2009 · 1919 May 29 - Einstein's Eclipse (Test of General Relativity) ... predicted by Einstein in his general theory of relativity" - Totality ...
  148. [148]
    Michelson Interferometer | Experimental Physics I & II "Junior Lab"
    The objective of this experiment is to demonstrate the interference pattern obtained from combining coherent monochromatic light beams using a Michelson ...
  149. [149]
    [PDF] Calorimeters for high energy colliders
    Calorimeters are the only source of information on the momenta of neutral particles such as (unconverted) photons, neutrons, and neutral kaons. They are used ...
  150. [150]
    [PDF] Introduction to Particle Detectors - CERN Indico
    Jul 7, 2017 · Scintillators are materials that produce sparks or scintillations of light when ionizing radiation passes through them. The charged particle ...
  151. [151]
    [PDF] 8.882 LHC Physics - Particle Detectors Overview - MIT
    The Cloud Chamber (C.T.R. Wilson): an air volume saturated with water vapour; the pressure is lowered to generate a super-saturated air volume; charged particles cause ...
  152. [152]
    The Large Hadron Collider
  153. [153]
    Hubble Space Telescope - NASA Science
  154. [154]
    World's Most Accurate and Precise Atomic Clock Pushes New ...
    Jul 1, 2024 · The new JILA clock uses a web of light known as an “optical lattice” to trap and measure tens of thousands of individual atoms simultaneously.
  155. [155]
    [PDF] Listening to Space with LIGO
    This warping of space-time can be detected, if we can construct an instrument that can measure tiny changes in distance (strain) of the order of one part in.
  156. [156]
    An Undulatory Theory of the Mechanics of Atoms and Molecules
    The paper gives an account of the author's work on a new form of quantum theory. §1. The Hamiltonian analogy between mechanics and optics.
  157. [157]
    [physics/0503066] Invariant Variation Problems - arXiv
    Mar 8, 2005 · Authors:Emmy Noether, M. A. Tavel. View a PDF of the paper titled Invariant Variation Problems, by Emmy Noether and M. A. Tavel. View PDF.
  158. [158]
    Cloud-resolving climate model meets world's fastest supercomputer
    Apr 6, 2023 · The world's fastest supercomputer, Frontier at Oak Ridge National Laboratory, has reached 1.1 exaflops, breaking the exascale speed barrier.
  159. [159]
    Machine learning and the physical sciences | Rev. Mod. Phys.
    Dec 6, 2019 · This article reviews in a selective way the recent research on the interface between machine learning and the physical sciences.
  160. [160]
    Quantum gravity beyond frameworks - CERN Courier
    Sep 9, 2025 · Quantum Gravity 2025 offered a wide snapshot of a field still far from closure, yet increasingly shaped by common goals, the convergence of ...
  161. [161]
    Quantum Gravity and Field Theory - MIT Physics
    The interface of quantum physics and gravity is currently leading to exciting new areas of progress, and is expected to remain vibrant in the coming decade.
  162. [162]
    Is gravity quantum? Experiments could finally probe one of physics ...
    Aug 13, 2025 · Physicists are developing laboratory tests to give insight into the true nature of gravity.
  163. [163]
    Non-Renormalizable Grand Unification Utilizing the Leptoquark ...
    Sep 23, 2025 · We analyze a non-supersymmetric grand unified theory whose particle content is that of the Georgi-Glashow model augmented only by scalars ...
  164. [164]
    (PDF) Asymptotic grand unification in SO(10) with one extra dimension
    Aug 6, 2025 · Asymptotic grand unification provides an alternative approach to gradually unify gauge couplings in the UV limit, where they reach a non-trivial ...
  165. [165]
    Physics - Dark Matter Detector Releases Best-Yet Result
    Jul 1, 2025 · First, none of the detected events were caused by WIMPs with mass over 9 GeV/c2. Second, the cross section for a 40-GeV/c2 WIMP striking a xenon ...
  166. [166]
    [PDF] Recent Results from the XENONnT Experiment - TDLI-Indico
    Jun 8, 2025 · Jingqiang Ye (CUHK-Shenzhen), TOPAC 2025. First WIMP search results (SR2): PRL 134, 111802 (2025); arXiv:2502.18005 (2025).
  167. [167]
    The XENON program for dark matter direct detection - ScienceDirect
    Results on the WIMP-nucleon scattering cross section achieved by these successive experiments are shown in Fig. 1b. The unprecedented improvement in sensitivity ...
  168. [168]
    Quantum error correction below the surface code threshold - Nature
    Dec 9, 2024 · Quantum error correction provides a path to reach practical quantum computing by combining multiple physical qubits into a logical qubit, ...
  169. [169]
    IBM roadmap to quantum-centric supercomputers (Updated 2024)
    May 10, 2022 · IBM has since updated the development roadmap as we learn more about the engineering and innovations required to realize error-corrected quantum ...
  170. [170]
    Quantum computing's six most important trends for 2025 - Moody's
    Feb 4, 2025 · On December 9, Google announced that its Willow chip demonstrated below-threshold error correction, lowering error rates as more physical qubits ...
  171. [171]
    The Multiverse Has a Measure Problem | Science and Culture Today
    Jan 24, 2025 · In terms of science, the central problem with the naïve multiverse is that it could explain any observation, so it really explains nothing.
  172. [172]
    Multiverse Predictions for Habitability: Fundamental Physics and ...
    Sep 15, 2025 · In the multiverse hypothesis, a range of universes exist with differing values of our physical constants. Here, we investigate how the ...
  173. [173]
    Fast Scrambling and the Emergence of Black Hole Firewalls
    Sep 2, 2025 · Under reasonable assumptions, black holes have been argued to form firewalls, burning up anything crossing their horizons.
  174. [174]
    [PDF] Resolving the Black Hole Information Paradox - HAL
    May 13, 2025 · This paper provides a comprehensive review of the black hole information paradox, tracing its origins and recent theoretical developments that ...
  175. [175]
    William B. Shockley – Facts - NobelPrize.org
    In 1947 John Bardeen and Walter Brattain produced a semiconductor amplifier, which was further developed by William Shockley. The component was named a ...
  176. [176]
    1959: Practical Monolithic Integrated Circuit Concept Patented
    Robert Noyce builds on Jean Hoerni's planar process to patent a monolithic integrated circuit structure that can be manufactured in high volume.
  177. [177]
    The Birth of the Visible LED: Nick Holonyak Jr. and a Turning Point ...
    Apr 8, 2025 · Nick Holonyak Jr. (1928–2022), the father of the first practical visible-spectrum LED. Today, LEDs are everywhere—flashing on clocks and ...
  178. [178]
    Best Research-Cell Efficiency Chart | Photovoltaic Research - NREL
    Jul 15, 2025 · NREL maintains a chart of the highest confirmed conversion efficiencies for research cells for a range of photovoltaic technologies, plotted ...
  179. [179]
    [PDF] Press Conference - ITER
    Jul 3, 2024 · The Start of Deuterium-Tritium Operation Phase will be about 4 years delayed from the previous baseline, from 2035 to 2039.
  180. [180]
    Physical principles of ultrasound | Radiology Reference Article
    Mar 31, 2020 · The ultrasound beam originates from mechanical oscillations of numerous crystals in a transducer, which is excited by electrical pulses ( ...
  181. [181]
    The physical basis and future of radiation therapy - PMC - NIH
    The main focus of physics in radiation therapy has always been to increase the level of precision and accuracy of dose delivery to the (tumour) target volume.
  182. [182]
    Biomedical Applications of Quantum Dots: Overview, Challenges ...
    May 2, 2022 · Quantum dots (QDs) are semiconductors-based nanomaterials with numerous biomedical applications such as drug delivery, live imaging, and medical diagnosis.
  183. [183]
    The history of optical fibre communications
    Optical fibre communication was invented in the early 1960s at Standard Telecommunication Laboratories, in Harlow in the UK. Charles Kuen Kao was the ...
  184. [184]
    NIST Finds Wireless Performance Consistent Across 5G Millimeter ...
    May 10, 2022 · Wireless systems are moving to the mmWave spectrum at 10-100 gigahertz (GHz), above crowded cellular frequencies as well as early 5G systems ...
  185. [185]
    12.3 Stress, Strain, and Elastic Modulus - UCF Pressbooks
    Stress is force per unit area causing deformation. Strain is the fractional change in length, volume, or geometry. Elastic modulus is the proportionality ...
  186. [186]
    Bernoulli's Principle | SKYbrary Aviation Safety
    Bernoulli's principle states that an increase in the speed of a fluid occurs simultaneously with a decrease in pressure or a decrease in the fluid's potential ...
  187. [187]
    12.6: Elasticity and Plasticity - Physics LibreTexts
    Sep 12, 2022 · Elasticity is the tendency to return to original shape after load removal. Plasticity is when a material deforms irreversibly and does not ...
  188. [188]
    Strength of carbon nanotubes depends on their chemical structures
    Jul 10, 2019 · Single-walled carbon nanotubes theoretically possess ultimate intrinsic tensile strengths in the 100–200 GPa range, among the highest in existing materials.
  189. [189]
    [PDF] 2. Electric Circuit Theory - Dei-Unibo
    Electric circuit theory and Electromagnetic theory are the two fundamental theories upon which all branches of electrical engineering are based.
  190. [190]
    Superconductive magnet design - Questions and Answers ​in MRI
    The conductor used in nearly all modern superconducting MR scanners is niobium-titanium (NbTi), which becomes superconductive below 9.4 K. Each wire is ...
  191. [191]
    Free Will - Stanford Encyclopedia of Philosophy
    Jan 7, 2002 · Underlying the belief that free will is incompatible with determinism is the thought that no one would be morally responsible for any actions in ...
  192. [192]
    Causal Determinism - Stanford Encyclopedia of Philosophy
    Jan 23, 2003 · Causal determinism is, roughly speaking, the idea that every event is necessitated by antecedent events and conditions together with the laws of nature.
  193. [193]
    Quantum Approaches to Consciousness
    Nov 30, 2004 · There are three basic types of corresponding approaches: (1) consciousness is a manifestation of quantum processes in the brain, (2) quantum concepts are used ...
  194. [194]
    Philosophical Issues in Quantum Theory
    Jul 25, 2016 · This article is an overview of the philosophical issues raised by quantum theory, intended as a pointer to the more in-depth treatments of other entries in the ...
  195. [195]
    What Einstein Really Thought about Quantum Mechanics
    Sep 1, 2015 · What Einstein Really Thought about Quantum Mechanics. Einstein's assertion that God does not play dice with the universe has been misinterpreted.
  196. [196]
    Time - Stanford Encyclopedia of Philosophy
    Nov 24, 2020 · B-theorists typically emphasize how special relativity eliminates the past/present/future distinction from physical models of space and time.
  197. [197]
  198. [198]
    Hard Problem of Consciousness | Internet Encyclopedia of Philosophy
    The problem touches on issues in ontology, on the nature and limits of scientific explanation, and on the accuracy and scope of introspection and first-person ...
  199. [199]
    [PDF] Guidelines for Self-Study and External Evaluation of Undergraduate ...
    There are standardized assessment vehicles available for all levels of the undergraduate physics curriculum, some of which heavily emphasize problem-solving ...
  200. [200]
    [PDF] Table of Contents - American Association of Physics Teachers
    The Graduate Physics Curriculum. A. The Existing “Core”. The traditional (historical) graduate physics curriculum consists of a “core” of required courses.
  201. [201]
    Why Do So Many Physics Students Want to Work in Academia?
    Sep 12, 2024 · 30% of new physics bachelor's degree-earners want to be employed by a college or university, and yet only 5% will end up in tenure-track faculty positions.
  202. [202]
    Who's Hiring Physics PhDs - AIP.ORG
    Jan 25, 2024 · ... Physics PhD Recipients Holding Potentially Permanent Positions. All Employment Sectors: Academic. Government. Private sector. Other.
  203. [203]
    American Physical Society
    The premier physics research journal, providing rapid publication of short reports of fundamental research across all fields.
  204. [204]
    Institute of Physics
    The Institute of Physics (IOP) is a professional body for physics.
  205. [205]
    APS Bridge Program | American Physical Society
    The APS Bridge Program creates pathways to physics graduate programs for students from all backgrounds, ensuring everyone has the opportunity to pursue ...
  206. [206]
    Physics Bachelors' Outcomes: Focus on Graduate School and the ...
    Dec 14, 2023 · After five years of little change, the proportion of new physics bachelors in the class of 2020 who immediately entered the workforce dropped by ...
  207. [207]
    Opinion: The Rise of the Data Physicist | American Physical Society
    Oct 13, 2023 · I expect the data physicist to have a strong physics background and extensive training in statistics, data science, and machine learning.