Scientific realism

Scientific realism is a philosophical stance in the philosophy of science asserting that the primary aim of science is to develop theories that provide a literally true description of both observable and unobservable aspects of the world, and that acceptance of well-confirmed scientific theories warrants belief in their approximate truth, including the existence of their postulated entities such as electrons or quarks. This view posits that successful scientific theories are not merely empirically adequate—saving the phenomena—but offer genuine knowledge of a mind-independent reality. Central to scientific realism is the commitment to semantic realism, which holds that scientific statements possess objective truth conditions independent of human cognition, allowing for meaningful discourse about unobservables.

A key argument supporting this position is the no-miracles argument, famously articulated by Hilary Putnam, which contends that the predictive and explanatory success of mature scientific theories would be an inexplicable "miracle" if those theories were not approximately true, as realism provides the best explanation for why science works so effectively. Proponents, including philosophers like Richard Boyd and Stathis Psillos, argue that this success justifies epistemic confidence in the reality of the theoretical entities such theories describe. Opposing scientific realism are anti-realist views, such as Bas van Fraassen's constructive empiricism, which maintains that science aims only for empirical adequacy—accurate prediction of observables—and that belief in unobservables is unnecessary and unwarranted, as acceptance of a theory requires only that it "saves the phenomena" without committing to its full truth. A prominent challenge to realism is the pessimistic meta-induction, which points to the historical record of science, where past successful theories, such as the caloric theory of heat or the phlogiston theory, were later abandoned, suggesting that current theories are similarly likely to be false despite their success.

The debate has evolved through various forms of realism, including entity realism (focusing on the reality of specific entities rather than entire theories) and structural realism (emphasizing the preservation of mathematical structures across theory change), both attempting to address historical challenges while retaining core realist commitments. Proponents have bolstered realist arguments by linking scientific explanation to belief in unobservables, while critics highlight the underdetermination of theory by evidence and the limits of observation as reasons to doubt realist claims. Overall, scientific realism remains a cornerstone of the philosophy of science, balancing the triumphs of scientific progress with rigorous scrutiny of its ontological implications.

Definition and Core Principles

Characteristic Claims

Scientific realism posits a core thesis that successful scientific theories are approximately true and that the entities they describe, such as electrons and quarks, exist mind-independently as real components of the world. This commitment extends to the belief that theoretical terms in well-established sciences genuinely refer to these entities, rather than serving merely as predictive devices. A key element of this position is the ideal-theory thesis, which holds that mature scientific theories in their respective domains offer increasingly accurate representations of objective reality, capturing essential features of the world through successive refinements. For instance, in particle physics, realists maintain that descriptions of subatomic particles provide literal, albeit approximate, insights into the structure of matter, justifying belief in their existence based on the theory's explanatory and predictive power. Complementing this is the convergence thesis, according to which scientific progress involves theories that successively approximate truth, with core elements of successful predecessors retained and built upon in later frameworks. In evolutionary biology, this manifests as an endorsement of mechanisms like natural selection as real processes shaping the history of life, where accumulating evidence from genetics and the fossil record refines but does not discard the foundational realist interpretation of Darwinian theory. Scientific realists further commit to the validity of scientific methodology, particularly inference to the best explanation (IBE), as a reliable means of warranting belief in theoretical entities and structures. Under IBE, the best explanation for observed phenomena—such as the behavior of atomic nuclei—is the literal truth of the underlying theory positing unobservables like protons and neutrons, rather than instrumentalist alternatives that limit claims to observables alone.

Scientific realism occupies a distinct position within the philosophy of science by endorsing the literal truth of scientific theories about both observable and unobservable entities, setting it apart from antirealist alternatives that either deny or downplay commitments to unobservables. Unlike antirealisms, which often treat theoretical claims as non-truth-apt or epistemically optional, scientific realism maintains that the success of theories warrants belief in their approximate truth, including posits like electrons or quarks. A primary contrast lies with instrumentalism, which views scientific theories primarily as tools for prediction and the organization of observations rather than as descriptions of an underlying reality. Instrumentalism, associated with logical positivists such as Carl Hempel, denies that unobservable entities have referential meaning or exist independently, interpreting theoretical terms as mere calculational devices without truth values. For instance, in Niels Bohr's interpretation of quantum mechanics, wave functions are seen as instrumental aids for predicting measurement outcomes, not representations of a real underlying state, rejecting any realist commitment to unobservables. This differs sharply from scientific realism, which insists on the truth-aptness and approximate accuracy of such theoretical claims.

Scientific realism also diverges from constructive empiricism, developed by Bas van Fraassen, which holds that the aim of science is empirical adequacy—saving the observable phenomena—rather than truth about the unobservable world. According to van Fraassen, acceptance of a theory requires belief only in its success regarding observables, remaining agnostic about unobservables like black holes or subatomic particles, even if the theory is empirically successful.
This position, articulated in The Scientific Image (1980), prioritizes epistemic modesty by avoiding metaphysical commitments beyond what direct observation can confirm, contrasting with scientific realism's extension of belief to unobservables as necessary for full theoretical truth. Van Fraassen argues that empirical adequacy is a weaker, more defensible goal than realism's demand for overall truth, as it aligns with the observable focus of scientific practice.

In relation to metaphysical realism—the broader philosophical doctrine that the world exists mind-independently and is structured independently of our conceptions—scientific realism serves as a specific application focused on the ontology implied by successful scientific theories. While metaphysical realism addresses general questions of external reality and truth, scientific realism narrows this to the approximate truth of scientific descriptions, including unobservables, without necessarily endorsing a comprehensive metaphysics beyond science. Thus, scientific realism inherits metaphysical realism's commitment to an independent world but grounds it empirically in scientific success rather than abstract ontology.

The debate positions scientific realism on a spectrum from realism to antirealism in the philosophy of science, where further alternatives include fictionalism and pragmatism. Fictionalism treats theoretical entities as useful fictions that guide practice without asserting their existence, akin to viewing models as non-literal stories (e.g., Arthur Fine's natural ontological attitude). Pragmatism, drawing from figures like Charles Peirce, emphasizes the practical utility and predictive success of theories over their correspondence to hidden realities, adopting theoretical frameworks conventionally for their instrumental value. These positions form a continuum: realists affirm truth about unobservables, while antirealists range from instrumental prediction (instrumentalism) to agnosticism about unobservables (constructive empiricism) to outright fictional or pragmatic dismissal of ontological commitments.

A key boundary distinguishing scientific realism from empiricist antirealisms is its acceptance of the explanatory power of unobservables in accounting for observable phenomena. Realists argue that entities like atoms are indispensable for explanations that go beyond mere description of regularities, positing their reality as the best account of theoretical success (e.g., via inference to the best explanation). In contrast, empiricists like van Fraassen limit belief to observables, viewing unobservables as optional posits that do not require belief, even if they enhance predictions, thereby avoiding what they see as unwarranted metaphysical speculation. This divide underscores scientific realism's bolder epistemic stance on the scope of scientific knowledge.

Historical Development

Early Foundations in Philosophy of Science

The roots of scientific realism can be traced to ancient Greek philosophy, where commitments to unobservable structures underpinned explanations of the natural world. Plato's theory of Forms, articulated in dialogues such as the Phaedo and the Republic, posits eternal, unchanging entities—such as Beauty Itself or the Good—as the true reality, distinct from the sensible particulars that merely participate in them. These Forms represent a realist metaphysics, emphasizing abstract, mind-independent structures that account for the properties and changes observed in the physical realm, serving as a precursor to later scientific commitments to theoretical entities beyond direct perception. Aristotle, in contrast, developed hylomorphism in his Metaphysics, conceiving physical entities as composites of matter (hylē) and form (morphē), where form constitutes the essence and actuality of the thing, enabling a moderate realism that grounds universals in particulars without separating them into a transcendent realm. This framework commits to the reality of formal causes as principles organizing matter, influencing later natural philosophy by prioritizing explanatory depth over mere appearances.

The Scientific Revolution of the 16th and 17th centuries further advanced realist tendencies by positing real entities to explain celestial and mechanical phenomena. Nicolaus Copernicus's heliocentric model, presented in De revolutionibus orbium coelestium (1543), rejected geocentric appearances in favor of a physical system in which the Earth orbits the Sun, asserting that a satisfactory astronomy must describe the actual structure of the cosmos rather than serve as a mere predictive instrument. This shift implied commitment to unobservable mechanisms, such as planetary motions governed by underlying laws, marking an early endorsement of realism over instrumentalism in astronomy. Isaac Newton's Philosophiæ Naturalis Principia Mathematica (1687) extended this by introducing absolute space and time as real, independent substances providing a fixed framework for motion, distinguishable from relative perceptions through arguments like the rotating bucket experiment. Newton's substantival view treated these unobservables as objective entities essential to the laws of mechanics, reinforcing a realist interpretation of scientific theories as descriptions of mind-independent reality.

In the 19th century, debates in chemistry highlighted tensions between realist posits and phenomenological descriptions. John Dalton's atomic theory, outlined in A New System of Chemical Philosophy (1808), proposed that elements consist of indivisible atoms with specific weights, explaining the laws of chemical combination through the real existence of these unobservable particles rather than surface-level observations alone. This realist approach contrasted with phenomenological methods of chemists who initially used formulas to represent combining proportions without committing to the atoms' physical reality, viewing them as convenient calculational tools. Over time, accumulating evidence, including valency rules and related chemical regularities, shifted acceptance toward realism, establishing atoms as genuine constituents of matter.

The rise of 19th-century positivism challenged these realist inclinations but indirectly shaped their defense. Auguste Comte's Cours de philosophie positive (1830–1842) advocated the positive stage of knowledge, restricting inquiry to observable phenomena and their laws while rejecting unobservable causes or essences as metaphysical remnants.
John Stuart Mill, influenced by Comte, emphasized empirical induction and the uniformity of nature in his System of Logic (1843) but critiqued overly restrictive positivism, allowing for hypothetical entities if supported by evidence, thus fostering responses that reconciled empiricism with realist commitments to unobservables. This tension between positivist strictures and the explanatory power of theoretical structures set the stage for later articulations of scientific realism.

20th-Century Emergence and Key Figures

The decline of logical positivism in the mid-20th century was significantly accelerated by Willard Van Orman Quine's 1951 essay "Two Dogmas of Empiricism," which critiqued the analytic-synthetic distinction central to verificationism and argued that it relied on unjustified assumptions, thereby undermining the positivist framework for scientific meaning and confirmation. This critique contributed to a broader post-positivist shift, in which philosophers increasingly questioned strict empiricist reductions of scientific theories to observable data. In this evolving context, Thomas Kuhn's 1962 work The Structure of Scientific Revolutions introduced the concept of paradigms, portraying scientific progress as revolutionary shifts rather than cumulative verification, which challenged positivist ideals of rational theory choice and prompted defenses of realism to preserve the objectivity of scientific knowledge. Similarly, Paul Feyerabend's 1975 book Against Method advocated epistemological anarchism, arguing that fixed methodological rules hinder scientific innovation and that the proliferation of theories is essential, further eroding positivist orthodoxy and necessitating realist responses to affirm the truth-aim of science amid apparent relativism.

A pivotal influence came from Quine's advocacy of naturalized epistemology, which integrated philosophy into empirical science and emphasized ontological commitment to the entities posited by our best scientific theories, as he outlined in works like "Epistemology Naturalized" (1969), thereby providing a naturalistic foundation for realist interpretations of theoretical terms. This approach shifted focus from a priori verification to the holistic acceptance of scientific webs of belief, bolstering realism by tying existence claims directly to theoretical utility and empirical success.

Key figures in the 20th-century emergence of scientific realism included Hilary Putnam, who in the 1970s developed influential defenses such as the "no miracles" argument, positing that the predictive success of science would be miraculous without the approximate truth of its theories, as articulated in essays collected in Mathematics, Matter and Method (1975). Putnam later refined this into "realism with a human face" in his 1990 collection of that name, emphasizing a commonsense, non-metaphysical realism responsive to scientific practice. As a foil, Bas van Fraassen's 1980 book The Scientific Image proposed constructive empiricism, advocating belief only in the observable content of theories—empirical adequacy—rather than their full truth, which sharpened debates by offering an anti-realist alternative grounded in observational limits. Surveys among philosophers have reflected the dominance of realist leanings emerging from these post-positivist developments; for instance, a 2009 survey of professional philosophers found that 75% accepted or leaned toward scientific realism, and the 2020 survey found 72% of respondents doing so.

Arguments in Favor

No Miracles Argument

The No Miracles Argument (NMA) constitutes a central defense of scientific realism, asserting that the empirical success of mature scientific theories—particularly their predictive and explanatory achievements—would be an inexplicable "miracle" unless those theories are approximately true about unobservable entities and structures. Formulated most influentially by Hilary Putnam in the 1970s, the argument maintains that realism provides the only non-miraculous explanation for why scientific theories reliably generate novel predictions and facilitate technological applications. Putnam encapsulated this by stating, "The positive argument for realism is that it is the only philosophy that doesn't make the success of science a miracle."

At its core, the reasoning unfolds as an inference to the best explanation: scientific theories succeed because they correctly describe the causal powers and interactions of real entities posited in their ontologies, rather than by mere coincidence or instrumental utility. If theories' central terms (such as "electron" or "quark") fail to refer to actual entities, or if their descriptions of unobservables are entirely false, then their capacity to yield accurate predictions about observables—often novel ones unforeseen at the theory's inception—defies rational comprehension. This success encompasses not only empirical adequacy but also the progressive refinement of theories, in which later ones retain and extend the explanatory virtues of their predecessors, suggesting an approximation to truth.

Illustrative examples underscore this logic. In chemistry, atomic theory incorporating electrons successfully predicts molecular behaviors and reaction rates, such as the valence electron configurations explaining bonding patterns, which would be fortuitous if electrons were not real particles with the theorized properties. Similarly, Einstein's general theory of relativity posits the curvature of spacetime by mass-energy, leading to precise predictions like the 1919 solar eclipse observation of starlight deflection and the anomalous precession of Mercury's orbit—outcomes that align too closely with reality to attribute to a non-referential formalism.

The NMA directly counters instrumentalist alternatives, which treat theories as mere calculational devices for prediction without commitment to the existence of unobservables, by arguing that such views render success inexplicable: why would non-truth-tracking instruments consistently outperform alternatives unless guided by genuine referential success? Instrumentalism posits no deeper mechanism for this reliability, leaving the alignment between theoretical posits and empirical outcomes as an improbable coincidence. Originating amid mid-20th-century debates on the interpretation of scientific theories, the NMA emerged in Putnam's 1975 paper and was refined in his 1976 lectures, amid challenges from logical empiricism and early anti-realist critiques. Contemporary defenses, such as those emphasizing selective commitment to the working components of successful theories, have bolstered its resilience against historical counterexamples drawn from theory replacements.

Success and Approximate Truth

One key argument for scientific realism posits that the empirical success of scientific theories, particularly their ability to make novel predictions, provides evidence that those theories are approximately true in their core components. This view holds that successful theories capture partial truths about entities and mechanisms, and that scientific progress involves the preservation and refinement of these truths across theory changes. For instance, the Bohr model of the atom, despite its ultimate inaccuracies, successfully predicted spectral lines and laid groundwork for quantum mechanics by correctly positing quantized energy levels, which were retained and extended in later formulations.

The concept of approximate truth addresses critiques of realism, such as Larry Laudan's pessimistic induction, which highlighted historically successful but ultimately false theories like the phlogiston and caloric theories. Realists respond through selective realism, committing only to the successful, truth-tracking parts of past theories while discarding the rest; for example, while phlogiston theory failed overall, its emphasis on material transformations contributed to the conservation principles preserved in modern chemistry. Similarly, the caloric theory's notion of heat as a conserved fluid approximated the conservation of energy, even as the fluid model was abandoned. This selective approach allows realists to maintain that theory succession reveals increasing approximation to truth, rather than wholesale replacement (Kitcher 1993, The Advancement of Science).

Inference to the best explanation (IBE) further bolsters this position, arguing that the most plausible account of why theories succeed empirically—beyond mere utility—is their approximate truth about the world. Realists contend that alternatives, such as viewing success as coincidental or as purely predictive without ontological import, fail to explain the depth and reliability of scientific achievements. Notable examples include Darwin's theory of evolution by natural selection, whose core mechanisms of variation, inheritance, and differential survival have been retained and integrated into modern evolutionary biology despite refinements from genetics. In particle physics, the gauge theories underlying the Standard Model have yielded precise predictions, such as those of quantum electrodynamics, supporting realist belief in their approximate truth regarding fundamental forces and particles. Empirical support for these realist inclinations is evident in philosophical surveys; the 2020 PhilPapers Survey found that 76.5% of philosophers of science accept or lean toward scientific realism, often linking this to the explanatory power of successful theories approximating truth.

Arguments Against

Pessimistic Meta-Induction

The pessimistic meta-induction (PMI) constitutes a major challenge to scientific realism by drawing on the history of science to argue that the success of past theories does not reliably indicate their truth. Formulated most influentially by Larry Laudan in his 1981 paper "A Confutation of Convergent Realism," the argument posits that since numerous historically successful scientific theories were later deemed fundamentally false—often involving non-referring central terms—current theories, despite their predictive and explanatory successes, are likely false in their theoretical claims about unobservables as well. Laudan contends that this pattern refutes the convergent realism defended by figures like Hilary Putnam, which holds that science converges on approximate truth over time, thereby undermining the inference from empirical success to ontological commitment.

Laudan's case relies on a list of over a dozen historical theories that achieved significant empirical success yet failed to describe the world accurately. For instance, the phlogiston theory of combustion, dominant in the 18th century, successfully explained phenomena such as the calcination of metals (attributing it to phlogiston release) and why combustion required air, but phlogiston itself was a fictitious substance discarded with Lavoisier's oxygen theory. Similarly, the caloric theory of heat, prevalent from the mid-18th to mid-19th century, posited heat as an indestructible fluid (caloric) that accounted for phenomena like thermal expansion, specific heat capacities, and latent heats during phase changes; calorimetric experiments, such as those on ice melting, confirmed its predictions, yet caloric was eliminated without trace in the later kinetic-molecular theory of heat. In astronomy, the Ptolemaic geocentric system, incorporating crystalline spheres to carry the planets in epicycles, yielded precise ephemerides for over a millennium, enabling accurate positional and calendrical predictions, but its core ontology of solid, rotating spheres was wholly rejected by the heliocentric and Newtonian frameworks. Another key example is the 19th-century luminiferous ether theory, which successfully explained light propagation as waves in a pervasive medium, aligning with Fresnel's predictions for polarization and refraction, until the null result of the Michelson-Morley experiment and Einstein's special relativity rendered the ether unnecessary and false.

These theory replacements demonstrate, according to Laudan, that empirical success often stems from instrumental virtues rather than truth: past theories worked because their mathematical or observational components approximated observable regularities, without their theoretical posits referring to actual entities. The induction thus erodes the "no miracles" intuition supporting realism, implying that the convergence of science is toward better prediction, not deeper truth about the unobservable world.

Realists have countered the PMI through strategies emphasizing selective retention and theoretical continuity. Stathis Psillos, in his 1996 analysis, argues that Laudan's examples overstate discontinuity: many central terms from discarded theories (e.g., "force" from Aristotelian to Newtonian physics) are retained in successor theories, albeit with altered interpretations, while outright failures typically involve auxiliary hypotheses or idealizations rather than the theories' working posits. For the caloric theory, Psillos notes that while caloric as a substance vanished, the theory's structural insights into heat exchange prefigured the first law of thermodynamics.
Other responses invoke structural realism, positing that the mathematical relations preserved across theory changes—such as Fresnel's equations for light reappearing within Maxwell's electromagnetic theory—indicate realism about structure, if not about entities, thereby mitigating the induction's pessimistic scope. The PMI has also been extended to contemporary science, particularly quantum field theory (QFT), where uncertainties about the ontological status of fields, particles, and infinities (e.g., renormalization issues) invite skepticism regarding the literal truth of its unobservables, mirroring historical patterns of revision. Despite such challenges, the argument continues to fuel debates, with realists refining their positions to accommodate historical evidence without abandoning ontological commitments.

Underdetermination of Theory by Evidence

The underdetermination of theory by evidence poses a significant challenge to scientific realism by suggesting that the available empirical data can be compatible with multiple incompatible theories, thereby undermining the realist's claim that successful theories reveal the approximate truth about unobservable entities and structures. This issue arises from the holist view that scientific theories are tested not in isolation but as part of a broader web of beliefs, where any apparent falsification can be accommodated by adjusting auxiliary hypotheses rather than the core theory itself.

The core problem is articulated in Willard Van Orman Quine's thesis of global underdetermination, which holds that for any theory compatible with all current evidence, there exists an empirically equivalent rival theory that is logically incompatible with it but can be constructed by making suitable adjustments to auxiliary assumptions or background beliefs. Quine argued that this underdetermination extends beyond local adjustments to the entirety of our scientific worldview, as the totality of evidence underdetermines the choice among possible theories. This idea builds on the Duhem-Quine thesis, which emphasizes confirmational holism: empirical tests confirm or disconfirm hypotheses only in conjunction with auxiliary assumptions, allowing theories to be "saved" from refutation through modifications elsewhere in the system (see the schematic formulation below). A classic example is the empirical equivalence between Hendrik Lorentz's ether theory, incorporating length contraction (the Lorentz-FitzGerald contraction), and Albert Einstein's special relativity in the early 20th century. Both theories predicted identical observational outcomes for phenomena like the Michelson-Morley experiment, differing only in their commitments to unobservable entities such as the luminiferous ether, yet no evidence at the time could decisively favor one over the other. This holism in confirmation illustrates how underdetermination arises not from insufficient data but from the interconnected nature of theoretical commitments.

For scientific realism, the implications are profound: if multiple empirically equivalent theories are possible, there is no evidential reason to privilege one—presumed to approximate truth—over its rivals, casting doubt on the unique referential success of scientific theories. Underdetermination comes in degrees, distinguished as transient (or local) and permanent (or global). Transient underdetermination occurs when current evidence favors multiple theories but future evidence may resolve the ambiguity, as in historical cases where one rival eventually prevails. Permanent underdetermination, however, posits that for any theory, rivals can always be devised to match all possible evidence, rendering theory choice irredeemably underdetermined at a fundamental level.

Realists have countered these challenges by appealing to non-empirical virtues of theories, such as simplicity, explanatory unification, and fruitfulness, which guide rational theory selection beyond strict evidential fit without invoking truth directly. For instance, Einstein's special relativity is preferred over Lorentz's theory not solely for empirical reasons but because it offers greater theoretical simplicity by eliminating unnecessary posits like the undetectable ether. Additionally, realists often reject the "bad lot" objection—where anti-realists argue that accepted theories are merely the best of a potentially bad lot of rivals—by contending that such rivals are not genuinely viable or that historical success patterns provide inductive grounds for confidence despite potential underdetermination. These responses aim to preserve the idea that theoretical virtues track approximate truth, even if evidence alone cannot uniquely determine it.
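The holist point can be put schematically (a minimal logical sketch in standard notation, not drawn from Quine's own text): a hypothesis H entails an observable prediction O only in conjunction with auxiliary assumptions A_1, ..., A_n, so a failed prediction refutes only the conjunction and leaves open which conjunct to revise.

$$
(H \wedge A_1 \wedge \cdots \wedge A_n) \vDash O, \qquad \neg O \;\Rightarrow\; \neg(H \wedge A_1 \wedge \cdots \wedge A_n) \;\equiv\; \neg H \vee \neg A_1 \vee \cdots \vee \neg A_n .
$$

Because logic alone does not single out which conjunct to abandon, empirically equivalent rivals can be generated by redistributing the blame between H and the auxiliaries.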

Constructive Empiricism and Observability

Constructive empiricism, as articulated by Bas C. van Fraassen in his 1980 book The Scientific Image, maintains that the primary aim of science is not to uncover true descriptions of unobservable entities but to construct theories that save the phenomena—accurately accounting for observable aspects of the world. This position contrasts sharply with scientific realism by limiting rational acceptance of a theory to belief in its empirical adequacy, defined as the theory's capacity to correctly describe all observable events, past and future, while suspending judgment on claims about unobservables. Van Fraassen argues that this empiricist stance aligns with scientific practice, where theories are valued for their predictive success regarding what humans can directly perceive rather than for metaphysical commitments to hidden structures.

Central to constructive empiricism is the observability criterion, which distinguishes observable from unobservable entities on the basis of human perceptual capabilities. Van Fraassen specifies that an entity is observable if, under suitable circumstances, it can be directly detected by the unaided senses, such as through sight or touch, without relying on instrumentation or inference that introduces theoretical assumptions. For instance, neutrinos are considered unobservable despite their detection in experiments, as they interact too weakly to produce direct sensory impressions and require indirect evidence from particle tracks or energy deposits. This distinction poses a direct challenge to scientific realism, which posits that entities like subatomic particles or theoretical fields exist independently and are truly described, by suggesting that belief in such entities exceeds what empirical adequacy demands.

Van Fraassen bolsters constructive empiricism with further arguments, including his account of theory acceptance and the bad lot objection. On this account, accepting a theory involves believing that it is empirically adequate with respect to observables, without extending that belief to unobservables, reflecting a disciplined epistemic attitude that avoids overcommitment. Complementing this, the bad lot argument critiques realist reliance on inference to the best explanation by noting that the "best" theory among empirically adequate rivals may still be false about unobservables if all contenders in the lot are erroneous in that domain; thus, explanatory superiority provides no warrant for belief in unobservable claims.

In applications to quantum mechanics, constructive empiricism favors interpretations like the Copenhagen view, which emphasizes observable measurement outcomes and wave function collapse without positing unobservable realities, thereby achieving empirical adequacy without realist ontology. In contrast, Bohmian mechanics introduces hidden variables and definite particle trajectories—unobservable entities—to provide a deterministic account, committing realists to truths beyond observables that empiricists deem unnecessary. Van Fraassen's empiricist modal interpretations of quantum mechanics further illustrate this approach by modeling states in terms of observable possibilities, reinforcing the focus on phenomena over hidden mechanisms.

Scientific realists have countered these critiques by emphasizing the indispensable explanatory role of unobservables. Stathis Psillos, for example, argues that entities like electrons are not mere instrumental posits but are essential for explaining observable effects, such as spectral lines in atomic spectra, rendering agnosticism about them untenable for a coherent scientific worldview.
Realists contend that dismissing unobservables undermines the depth of scientific understanding, as theories without such commitments fail to unify diverse phenomena under common causal structures.

Incompatible Properties Problem

The incompatible properties problem poses a significant challenge to scientific realism by highlighting how quantum mechanics ascribes mutually exclusive properties to the same entities, such as electrons exhibiting both wave-like and particle-like behaviors depending on the measurement context, thereby questioning the existence of definite, mind-independent properties for unobservables. This issue arises because non-commuting observables in quantum mechanics, like position and momentum, cannot simultaneously possess definite values, as formalized in the Heisenberg uncertainty principle and the Kochen-Specker theorem, preventing a complete realist description of quantum objects with consistent intrinsic properties. A representative example is wave-particle duality in the double-slit experiment, where electrons produce interference patterns indicative of delocalized waves when unobserved but register as discrete particles upon detection, suggesting no unified ontology for the entity. Similarly, incompatible models across quantum interpretations—such as the many-worlds view, which assigns definite properties across branching universes, versus objective collapse theories, which introduce non-unitary state reductions—imply contradictory attributions to the same quantum system, further eroding the realist ideal of a single, approximate truth about reality.

The implication for scientific realism is that such incompatibilities undermine the commitment to a coherent ontology, potentially requiring realists to accept either instrumentalism, where theories describe observables only, or a fragmented pluralism that dilutes the mind-independent status of theoretical entities. In response, some realists advocate modal interpretations of quantum mechanics, which assign definite actual properties to a maximal subset of compatible observables at each moment, treating incompatible ones as merely possible without definite values, thus preserving a realist framework while accommodating quantum indeterminacy. Contemporary debates extend this problem to quantum gravity, where rival approaches like string theory, positing extra dimensions and vibrating strings, and loop quantum gravity, emphasizing discrete spacetime, yield incompatible descriptions of fundamental reality yet remain empirically indistinguishable at testable scales, intensifying underdetermination and challenging realist confidence in any single ontology.
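For reference, the formal core of the incompatibility claim can be stated in standard textbook form: the position and momentum operators do not commute, and the Heisenberg relation bounds how sharply both can be defined for any single quantum state (with ℏ the reduced Planck constant):

$$
[\hat{x}, \hat{p}] \;=\; \hat{x}\hat{p} - \hat{p}\hat{x} \;=\; i\hbar, \qquad \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}.
$$

It is this impossibility of jointly sharp values, generalized by the Kochen-Specker theorem to broader sets of observables, that underlies the difficulty of ascribing a complete set of definite intrinsic properties to a quantum system.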

Varieties of Scientific Realism

Standard Realism

Standard scientific realism, often referred to as traditional or baseline scientific realism, posits that the mature and successful theories of science provide approximately true descriptions of both observable and unobservable aspects of the world. This view asserts that theoretical terms in scientific theories genuinely refer to entities that exist independently of our conceptual schemes, and that the relations and properties ascribed to these entities in the theories hold approximately as stated. For instance, it treats claims about atoms having definite positions and trajectories in the kinetic theory of gases as literally true in the approximate sense relevant to the theory's domain of application.

Central commitments of standard realism include the literal interpretation of scientific theories across all their components, encompassing both observational predictions and theoretical posits, and the outright rejection of instrumentalism, which confines scientific claims to empirical adequacy without ontological import for unobservables. Proponents like Hilary Putnam and Richard Boyd, who developed this position in the 1970s and 1980s, emphasized that realism best accounts for the instrumental reliability of scientific methods, as approximately true theories enable predictive success and methodological progress. Putnam famously articulated this by stating that "the positive argument for realism is that it is the only philosophy that doesn't make the success of science a miracle," highlighting how realism explains why successful theories yield precise predictions without invoking coincidence.

One key strength of standard realism lies in its ability to underwrite the unification and explanatory power observed in science, such as in particle physics, where entities like quarks are taken as real and contribute to explaining fundamental interactions. This approach posits that the coherence across disparate phenomena arises because theories capture genuine structures and mechanisms in nature, rather than serving as mere calculational devices. Regarding criticisms of overcommitment—such as endorsing details in past theories, like the caloric fluid, that later proved false—defenders invoke the notion of approximate truth, arguing that scientific progress involves successively better approximations without requiring perfect accuracy from the outset. Boyd, in particular, maintained that this allows realism to accommodate historical changes while preserving commitment to the core referential success of theoretical terms.

Structural Realism

Structural realism posits that the success of scientific theories reveals the reality of their structural relations rather than the intrinsic natures of unobservable entities. This view emerged as a response to challenges facing traditional realism, such as the pessimistic meta-induction, by emphasizing the continuity of mathematical structures across successive theories. It divides into two main variants: epistemic structural realism, which holds that our knowledge is limited to structures, and ontic structural realism, which asserts that structures constitute the fundamental ontology of the world.

Epistemic structural realism (ESR), introduced by John Worrall in 1989, maintains that while the intrinsic properties of objects remain unknowable, the mathematical structures posited by successful theories accurately describe relations among phenomena. For instance, in the transition from Fresnel's wave theory of light to Maxwell's electromagnetic theory, the structural equations governing light propagation as transverse waves were preserved, even as the underlying ontology shifted from vibrations of an elastic ether to electromagnetic fields (see the equations below). This approach allows realists to endorse the approximate truth of structural content without committing to the full descriptive accuracy of past theories' objects, thereby mitigating the threat of theory replacement. A related example is the enduring symmetry structure of electromagnetism, where Maxwell's classical equations capture relational invariances—such as gauge symmetries—that persist in quantum electrodynamics (QED), despite the shift to quantized fields.

Ontic structural realism (OSR), developed by James Ladyman and Steven French, advances a bolder metaphysical claim: structures are all that exists ontologically, with objects emerging as derivative or illusory nodes within relational networks. In this framework, quantum mechanics supports OSR by revealing phenomena like entanglement, where individual particle identities dissolve into holistic relations, eliminating the need for intrinsic properties. OSR thus reframes reality as a web of structural possibilities, drawing on the relational character of quantum physics to argue that objects lack independent existence beyond their structural roles.

Structural realism addresses key antirealist challenges through its focus on structural continuity. It counters the pessimistic meta-induction by highlighting how structures endure amid theoretical change, preserving the realist's explanation of empirical success without endorsing discarded entities. In quantum contexts, it handles incompatibilities—such as non-local correlations—via a purely relational ontology that avoids classical assumptions about object individuality. Since 2000, structural realism has gained significant traction in the philosophy of physics, influencing debates on spacetime and field theories, with proponents arguing that it best reconciles realism with the structural turn in contemporary physics.
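Worrall's canonical example can be made concrete. In one common sign convention, Fresnel's amplitude ratios for light reflected at the boundary between two media, with i the angle of incidence and r the angle of refraction, are

$$
\frac{R_{\perp}}{I_{\perp}} \;=\; -\,\frac{\sin(i - r)}{\sin(i + r)}, \qquad \frac{R_{\parallel}}{I_{\parallel}} \;=\; \frac{\tan(i - r)}{\tan(i + r)},
$$

where I and R denote the incident and reflected amplitudes and the subscripts mark polarization perpendicular or parallel to the plane of incidence. Fresnel derived these relations from an elastic-ether ontology, yet they reappear unchanged as consequences of Maxwell's field equations; on the structural realist reading, it is such relations, rather than the ether, that the theory's success tracked.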

Entity Realism

Entity realism, also known as experimental realism, is a selective form of scientific realism that posits the reality of unobservable entities based on their causal roles in experiments, particularly through human manipulation, rather than on the approximate truth of entire scientific theories. Philosopher Ian Hacking introduced this view in his 1983 book Representing and Intervening, arguing that if scientists can intervene with an entity to produce effects or use it to investigate other phenomena, then that entity exists independently of theoretical frameworks. The core slogan capturing this idea is: "If you can spray them, then they are real," referring to the ability to direct entities like electrons in experiments to probe other aspects of nature.

This approach advocates a limited commitment to the reality of specific entities, without endorsing the full theoretical apparatus surrounding them. For instance, entity realists accept the independent existence of electrons as manipulable objects with properties like charge and mass—termed "home truths"—but remain agnostic about deeper theoretical elements, such as the ontological implications of quantum wave functions in quantum mechanics. This selectivity allows scientists to treat entities as real tools in experimentation, even if the overarching theories evolve or face challenges.

Key examples illustrate how manipulation provides evidence for entity reality across scientific domains. In particle physics, electrons become "experimenter's entities" when accelerated and directed through cloud chambers to detect tracks or sprayed onto targets to study weak neutral currents, confirming their causal efficacy without relying on complete theoretical agreement. Similarly, microscopes enable intervention with biological structures, such as observing and manipulating cellular components, while particle accelerators allow physicists to collide and redirect subatomic particles, yielding reliable interactions that affirm their existence. In modern biology, optogenetic techniques demonstrate manipulation of neural entities, where light-sensitive proteins in neurons are activated to control behavior in living organisms, extending Hacking's microscope argument to molecular interventions.

One major advantage of entity realism is its resilience against the pessimistic meta-induction, which argues that past successful theories were later falsified, undermining belief in current theories' truth. By committing only to causally efficacious entities rather than requiring theories to be approximately true, entity realism sidesteps this critique, as manipulated entities like electrons persist across theoretical shifts. It also bridges realism and empiricism by emphasizing observable experimental outcomes and interventions, providing a middle ground that validates scientific practice without full theoretical realism.

Despite these strengths, entity realism faces limitations and ongoing debates, particularly regarding the criteria for what qualifies as genuine manipulation. Critics question whether all purported manipulations truly demonstrate independent reality; for example, manipulations involving quarks or quasi-particles may rely on theoretical constructs rather than direct causal contact, potentially conflating theoretical inference with experimental intervention. These discussions highlight the need for clearer boundaries on "manipulation" to avoid overextending commitments to theoretical posits.
