Explanatory power

Explanatory power refers to the capacity of a hypothesis or theory to account for evidence by increasing its probability relative to background knowledge, thereby reducing the surprise associated with observed phenomena. This concept is central to evaluating scientific explanations, distinguishing them from mere descriptions or predictions by emphasizing how well they unify and make sense of data. In probabilistic terms, explanatory power is often formalized using Bayesian frameworks, where it measures the degree to which an explanans (a hypothesis h) enhances the likelihood of an explanandum (evidence e). A prominent measure, proposed by Schupbach and Sprenger, defines it as E(e, h) = \frac{P(h|e) - P(h|\neg e)}{P(h|e) + P(h|\neg e)}, yielding values from -1 (contradiction) to 1 (full entailment), with 0 indicating neutrality; this measure satisfies a set of intuitive adequacy conditions derived from the probability calculus. Alternative measures, such as the likelihood ratio \frac{P(e|h)}{P(e)} associated with Popper and Good, or normalized differences like \frac{P(e|h) - P(e)}{1 - P(e)} from Crupi and Tentori, similarly quantify how much a hypothesis boosts evidential support. These approaches enable contrastive comparisons, assessing power relative to alternative hypotheses, and play a pivotal role in Inference to the Best Explanation (IBE), where theories are selected for their superior ability to explain rather than just predict. Beyond philosophy of science, explanatory power extends to fields such as statistics and machine learning, where it evaluates models' interpretability and generalizability, though debates persist on balancing it against simplicity and predictive accuracy. For instance, systematic power builds on explanatory power by incorporating predictive and retrodictive elements, promoting theories that are truth-conducive over finite data sets. Its normative role underscores that explanations should not only fit data but also illuminate underlying mechanisms, fostering scientific progress through deeper understanding.

Definition and Key Concepts

Core Definition

Explanatory power refers to the capacity of a hypothesis or theory to effectively account for its subject matter by rendering phenomena intelligible and expected, in contrast to explanatory impotence, where the hypothesis fails to illuminate or connect observations in a meaningful way. This concept is central to the philosophy of science, emphasizing not merely predictive accuracy but the provision of understanding through causal insight or conceptual integration. In probabilistic terms, it measures how a hypothesis increases the probability of the evidence relative to background knowledge. Key components of explanatory power include scope, which measures the breadth of facts and phenomena encompassed by the explanation, and depth, which involves the detail and precision with which underlying causal mechanisms are articulated. Unification entails integrating diverse observations into a coherent, parsimonious framework that reveals underlying patterns. These elements collectively enhance a theory's ability to foster genuine comprehension rather than superficial description. Charles Sanders Peirce advanced the understanding of explanatory power through his development of abduction as a mode of inference, whereby a surprising fact is transformed into an expected outcome via a proposed hypothesis. He formalized this process in his 1903 Harvard lectures with the schema: "The surprising fact, C, is observed; But if A were true, C would be a matter of course; Hence, there is reason to suspect that A is true." This approach underscores explanation's role in resolving anomalies by introducing ideas that align observations with broader expectations. David Deutsch further refined the notion with his "hard to vary" principle, positing that robust explanations are those in which every detail is functionally essential, such that alterations would undermine the explanation's coherence or applicability without improving it. In this view, explanatory power derives from the explanation's resistance to arbitrary modifications, ensuring it tightly constrains possible interpretations of the phenomena.

Evaluation Criteria

The evaluation of explanatory power involves assessing how well a theory accounts for observed phenomena through established standards in the philosophy of science. These criteria emphasize the theory's capacity to integrate empirical evidence, provide mechanistic insights, and withstand rigorous testing, distinguishing robust explanations from superficial accounts. Key criteria include the theory's scope, or its ability to account for a broad range of facts and observations beyond isolated cases, ensuring comprehensive coverage of relevant data. Accuracy in detailing causal relations is another essential standard, requiring the theory to specify precise mechanisms that link causes to effects without inconsistencies. Theories with strong explanatory power also offer predictive implications, enabling forecasts of future observations that can be empirically verified, thereby demonstrating fruitfulness in generating new research avenues. Further evaluation hinges on the theory's reliance on empirical evidence rather than appeals to authority, grounding explanations in testable claims. Effective theories minimize ad hoc assumptions, avoiding arbitrary adjustments that complicate the framework without adding explanatory value, which aligns with principles of parsimony. Falsifiability serves as a critical demarcation criterion, where a theory must permit potential refutation through observation, as emphasized by Karl Popper, who argued that scientific progress depends on bold conjectures open to empirical challenge. Additionally, efficient compression—unifying diverse observations under fewer principles—enhances explanatory strength by revealing underlying patterns. Popper further stressed the importance of rigorous testing to detect and eliminate immunizing stratagems, such as auxiliary hypotheses introduced solely to shield a theory from refutation, which undermine genuine scientific progress. Unlike mere descriptions that catalog correlations, true explanations must delineate causation and operational mechanisms, providing insight into how and why phenomena occur rather than just what happens.

Historical Development

Philosophical Origins

The concept of explanatory power traces its philosophical roots to ancient Greek philosophy, particularly in the work of Aristotle, who developed a framework of four causes—material, formal, efficient, and final—to account for phenomena in nature. Among these, teleological explanations, centered on final causes, were pivotal, as they emphasized purpose and goal-directedness as essential to providing depth and completeness in understanding why things occur. Aristotle argued that final causes reveal the intrinsic ends or purposes that drive natural processes, thereby conferring greater explanatory power than mere descriptions of how events unfold. In the medieval period, Thomas Aquinas synthesized Aristotelian causality with Christian theology, adapting the four causes to demonstrate how natural explanations align with divine providence. Aquinas viewed explanations through final causes as manifestations of God's rational order, where the purposes observed in nature reflect the divine will and intellect. This integration positioned explanatory power not only as a tool for comprehending the world but as a means to uncover the underlying harmony of the divine plan. The Enlightenment brought a critical turn with David Hume's skepticism toward traditional notions of causation, challenging metaphysical explanations in favor of observable empirical patterns. In his analysis, causation consists primarily in constant conjunctions or regularities among events, derived from experience rather than necessary connections or hidden powers. This shift diminished the role of teleological or purpose-based explanations, redirecting philosophical inquiry toward inductive generalizations that explain phenomena through predictable associations alone. By the 19th century, American pragmatist Charles Sanders Peirce reintroduced explanatory considerations through his theory of abduction, a form of inference that selects hypotheses based on their ability to render surprising facts intelligible. Peirce described abduction as the process of forming an explanatory hypothesis to account for observed anomalies, where the hypothesis's explanatory power—its capacity to unify and elucidate data—justifies its adoption over alternatives. This approach marked a pragmatic turn, emphasizing explanatory adequacy as a criterion for hypothesis selection in scientific inquiry.

Developments in Philosophy of Science

In the early 20th century, logical empiricists sought to formalize scientific explanation within a rigorous logical framework, culminating in the deductive-nomological (DN) model proposed by Carl Hempel and Paul Oppenheim. This model conceives of explanations as deductive arguments where the phenomenon to be explained (the explanandum) is logically derived from a set of general laws of nature and particular initial conditions, ensuring that the truth of the premises guarantees the truth of the conclusion. Explanatory power, under the DN model, resides in the ability of theories to subsume diverse phenomena under universal laws, thereby unifying empirical observations through logical necessity rather than mere description. This approach, rooted in the Vienna Circle's emphasis on verifiability and logical structure, marked a shift toward viewing explanation as a cornerstone of scientific methodology, influencing subsequent debates on theory evaluation. By the mid-20th century, Karl Popper's falsificationism reframed explanatory power in terms of empirical testability, arguing that robust scientific explanations must generate falsifiable predictions to distinguish science from pseudoscience. In his seminal work, Popper contended that explanations derive their strength from bold conjectures that risk refutation through observation, integrating explanatory depth with the potential for severe testing. This perspective elevated explanatory power beyond mere verification, emphasizing its role in advancing knowledge through critical scrutiny and the elimination of erroneous theories, while aligning it closely with Popper's demarcation criterion of falsifiability. Falsificationism thus positioned explanation as a dynamic process within scientific progress, where the power to explain observed facts is inextricably linked to the vulnerability to empirical disconfirmation. In the late 20th century, Peter Lipton's development of Inference to the Best Explanation (IBE) further centralized explanatory power in scientific reasoning, portraying it as the mechanism by which explanatory judgment leads to theory selection. Lipton argued that scientists infer the most likely hypothesis by assessing which potential explanation best unifies and illuminates the data, with "best" determined by virtues such as depth, breadth, simplicity, and unification. Under IBE, explanatory power transcends formal deduction or falsification alone, serving as a pragmatic guide for abductive inference in both everyday and scientific contexts, where competing explanations are evaluated holistically. This framework highlighted explanation's inferential primacy, influencing philosophy of science by underscoring how explanatory coherence drives theoretical acceptance amid the underdetermination of theory by evidence. Extending these ideas into the 21st century, David Deutsch's Popperian epistemology refined the notion of explanatory power through the criterion of "hard-to-vary" explanations, which resist arbitrary modifications while retaining their ability to account for phenomena. Deutsch posits that good explanations are those whose components are tightly constrained by the facts they explain, making them conjectural yet resilient to variation without loss of scope or precision. Building on Popper's critical rationalism, this approach emphasizes creativity in conjecture and the explanatory reach of theories in fostering unbounded progress, as seen in fields like physics where explanations must withstand rigorous variation tests. Deutsch's contributions thus evolve explanatory power into a key driver of epistemological progress, where the quality of explanations determines the reach of scientific knowledge.

Formal Measures

Probabilistic Measures

In probabilistic approaches to explanatory power, the concept is quantified using conditional probabilities to assess how well a hypothesis H (explanans) accounts for evidence E (explanandum). A foundational measure defines explanatory power as the difference EP(H, E) = P(E \mid H) - P(E \mid \neg H), which evaluates the extent to which H renders E more probable compared to its absence. This difference highlights the hypothesis's role in elevating the likelihood of the evidence beyond baseline expectations. Schupbach and Sprenger (2011) advance a more refined, symmetric probabilistic measure that integrates both confirmatory strength and informativeness, given by EP(H, E) = \frac{P(H \mid E) - P(H \mid \neg E)}{P(H \mid E) + P(H \mid \neg E)}. This formula, derived from a set of adequacy conditions for explanatory measures, yields values between -1 (indicating H makes E less likely) and 1 (indicating H entails E), with 0 signifying neutrality. It balances the degree to which E confirms H (via P(H \mid E)) against the hypothesis's ability to distinguish E from its negation (via P(H \mid \neg E)), ensuring that explanations are not merely redundant with prior knowledge. Unlike simpler differences, this account avoids overvaluing hypotheses that confirm evidence already deemed highly probable independently. Subsequent work by Crupi and Tentori (2012) provides a second look at these posterior ratio measures, proving representation theorems that characterize them based on intuitive adequacy conditions. They also introduce an alternative class of measures using relative probability distances, which they endorse for better aligning with explanatory intuitions by overcoming limitations in the Schupbach-Sprenger measure, such as sensitivity to certain probabilistic anomalies. These developments further link explanatory power to inductive confirmation in Bayesian epistemology. Within Bayesian frameworks, these probabilistic measures unify explanatory power with hypothesis confirmation through Bayes' theorem, P(H \mid E) = \frac{P(E \mid H) P(H)}{P(E)}, where high explanatory power amplifies the posterior probability P(H \mid E) by maximizing P(E \mid H) relative to alternatives. This integration positions explanatory power as a driver of Bayesian updating, emphasizing hypotheses that not only fit the evidence but also reduce its prior improbability. Such measures find application in demarcating genuine explanations from spurious correlations by penalizing cases where P(E \mid \neg H) remains high, indicating that the evidence would occur with similar probability absent the hypothesis. For example, a hypothesis correlating with E through shared background factors yields low explanatory power, as it fails to provide distinctive probabilistic support, whereas a true explanans lowers surprise in E specifically under H. This probabilistic criterion thus aids in evaluating inferential strength in scientific reasoning without relying on causal assumptions.
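To make the formulas concrete, the following minimal Python sketch (an illustration only; the toy probabilities and function names are assumptions, not code from Schupbach and Sprenger or Crupi and Tentori) computes the Schupbach-Sprenger measure from a prior P(H) and the likelihoods P(E | H) and P(E | ¬H), deriving the posteriors via Bayes' theorem.

```python
# A minimal sketch of the Schupbach-Sprenger measure; the probabilities below
# are illustrative assumptions, and the function names are hypothetical.

def explanatory_power(p_e_given_h: float, p_e_given_not_h: float, p_h: float) -> float:
    """EP(H, E) = (P(H|E) - P(H|~E)) / (P(H|E) + P(H|~E))."""
    # Law of total probability: P(E) = P(E|H)P(H) + P(E|~H)P(~H).
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1.0 - p_h)
    # Bayes' theorem for the two posteriors.
    p_h_given_e = p_e_given_h * p_h / p_e
    p_h_given_not_e = (1.0 - p_e_given_h) * p_h / (1.0 - p_e)
    return (p_h_given_e - p_h_given_not_e) / (p_h_given_e + p_h_given_not_e)

def likelihood_difference(p_e_given_h: float, p_e_given_not_h: float) -> float:
    """The simpler foundational measure P(E|H) - P(E|~H)."""
    return p_e_given_h - p_e_given_not_h

# A hypothesis that makes the evidence far more probable than its negation
# scores near 1; a probabilistically irrelevant one scores exactly 0.
print(explanatory_power(0.9, 0.1, 0.3))   # ~0.89: strong explanans
print(explanatory_power(0.5, 0.5, 0.3))   # 0.0: H is irrelevant to E
print(likelihood_difference(0.9, 0.1))    # 0.8
```

The second call illustrates the neutrality condition noted above: when P(E | H) = P(E | ¬H), the hypothesis provides no distinctive probabilistic support and the measure returns 0.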

Information-Theoretic Measures

In information-theoretic approaches to explanatory power, Ray Solomonoff's theory of universal induction provides a foundational framework by conceptualizing explanations in terms of algorithmic compressibility. Solomonoff proposed that the explanatory power of a hypothesis or theory lies in its ability to generate observed data via the shortest possible computer program, measured by Kolmogorov complexity. Formally, the Kolmogorov complexity K(x) of a binary string x is defined as the length of the shortest program (in bits) on a universal Turing machine that outputs x and halts. This measure quantifies the intrinsic information content of the data, where a theory with high explanatory power corresponds to a low K(x) for the observations it describes, effectively capturing regularities in a minimal descriptive form. Building on this, the compression criterion posits that theories exhibiting strong explanatory power achieve unification of diverse observations by minimizing the total description length required to encode the data. In this view, explanatory power emerges from the efficiency of compression, where a superior theory reduces the number of bits needed to specify both the model and the data under it, thereby revealing underlying patterns without superfluous detail. This approach treats explanation as an exercise in data compression akin to lossless coding, favoring models that concisely account for the phenomena while avoiding ad hoc adjustments. Jorma Rissanen's Minimum Description Length (MDL) principle extends these ideas into a practical statistical tool for model selection, balancing the complexity of the model against its fidelity to the data. The MDL criterion is given by the formula L(M) + L(D \mid M), where L(M) is the length (in bits) needed to describe the model M, and L(D \mid M) is the length required to encode the data D given M. Theories or models with high explanatory power minimize this total description length, ensuring that the explanation is neither overly simplistic (failing to fit the data) nor excessively complex (introducing unnecessary parameters). This principle has been widely adopted in fields requiring inferential rigor, as it provides a quantitative basis for preferring parsimonious yet comprehensive accounts. These information-theoretic measures formalize Occam's razor by grounding simplicity in the reduction of descriptive information, where the "simplest" explanation is the one that compresses the data most effectively without loss. By linking explanatory power to algorithmic brevity, they offer a non-probabilistic justification for prioritizing theories that achieve maximal informativeness per unit of complexity.
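As a rough illustration, the following Python sketch (hypothetical; it adopts the common crude approximation that one real-valued parameter costs about (1/2) log2 n bits, and ignores the subtleties of optimal coding) compares L(M) + L(D | M) for a zero-parameter uniform code and a one-parameter Bernoulli model on a binary data set.

```python
# A toy two-part MDL comparison; the (1/2)*log2(n) parameter cost and the
# data set are simplifying assumptions, not Rissanen's exact formulation.

from math import log2

def dl_uniform(data: list[int]) -> float:
    # Zero-parameter model M0: each binary symbol costs 1 bit; L(M0) = 0.
    return float(len(data))

def dl_bernoulli(data: list[int]) -> float:
    # One-parameter Bernoulli model M1 with maximum-likelihood theta = k/n.
    n, k = len(data), sum(data)
    theta = k / n
    if theta in (0.0, 1.0):
        data_bits = 0.0                    # degenerate: data fully determined
    else:
        # L(D | M1): negative log-likelihood in bits (ideal code length).
        data_bits = -(k * log2(theta) + (n - k) * log2(1 - theta))
    model_bits = 0.5 * log2(n)             # L(M1): crude cost of one parameter
    return model_bits + data_bits          # total: L(M) + L(D | M)

# Highly regular data compress far below the raw n bits, so the Bernoulli
# model has the greater explanatory power in the MDL sense.
data = [1] * 95 + [0] * 5
print(dl_uniform(data))      # 100.0 bits
print(dl_bernoulli(data))    # ~32.0 bits
```

On near-random data (roughly half ones), the Bernoulli model's data cost approaches n bits and its parameter cost makes it lose to the uniform code, mirroring the principle that added model complexity must pay for itself in compression.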

Applications and Examples

Scientific Examples

Darwinian evolution, as articulated in Charles Darwin's theory of natural selection, demonstrates significant explanatory power by accounting for the vast diversity observed in living organisms, the adaptive traits that enable survival in diverse environments, and patterns in the fossil record that show gradual changes over geological time. The theory posits that heritable variations within populations, combined with differential survival and reproduction, drive evolutionary change, transforming what were once seen as anomalous features—such as the geographic distribution of species or vestigial structures—into expected outcomes of descent with modification. This framework unifies disparate biological phenomena under a single causal mechanism, explaining why species exhibit intricate adaptations like the industrial melanism of peppered moths during industrial pollution or the antibiotic resistance of bacteria, without invoking purposeful intervention. The theory of plate tectonics exemplifies explanatory power in Earth sciences by integrating previously disconnected observations of earthquakes, volcanic activity, and mountain building into a coherent model of lithospheric dynamics. Proposed in the mid-20th century, it describes how Earth's outer shell is divided into rigid plates that move atop the asthenosphere, driven by mantle convection, leading to subduction zones where oceanic plates sink and cause seismic events, or divergent boundaries that form mid-ocean ridges and new crust. This theory resolves the puzzle of continental fit—such as the jigsaw-like match between South America and Africa—while explaining the concentration of over 80% of global earthquakes and volcanoes along plate boundaries, such as the Pacific Ring of Fire. By providing a unified mechanism for these surface features, plate tectonics not only retrofits historical data like fossil distributions across continents but also offers predictive insights for seismic hazards. The germ theory of disease showcases explanatory power by supplanting the earlier miasma theory, which attributed infections to "bad air" from decaying matter, with a microbial causation model that elucidates the etiology and transmission of infectious diseases. Developed through experiments by Louis Pasteur and Robert Koch in the late 19th century, it establishes that specific microorganisms—bacteria, viruses, and fungi—invade hosts and provoke illness, explaining phenomena like the contagion of cholera via contaminated water or the spread of tuberculosis through airborne droplets. This causal framework reveals why diseases cluster in patterns tied to hygiene and contact, such as puerperal fever in hospitals, and underpins interventions like vaccination, which targets pathogens directly to confer immunity, as seen in the eradication of smallpox. Quantum mechanics illustrates superior explanatory power over classical physics for atomic and subatomic phenomena, despite its counterintuitive principles like wave-particle duality and superposition, by accurately describing behaviors that classical models fail to predict. Formulated in the early 20th century by pioneers including Max Planck, Niels Bohr, and Erwin Schrödinger, it explains the discrete energy levels of electrons in atoms, accounting for atomic spectra and the stability of matter that would otherwise collapse under classical electrostatic forces. For instance, quantum theory resolves the blackbody radiation problem and the photoelectric effect, where light behaves as discrete quanta (photons), enabling technologies like semiconductors, while classical alternatives, reliant on continuous fields, predict infinite energy emissions or incorrect thresholds. This explanatory reach extends to phenomena like electron tunneling in chemical reactions, unifying diverse atomic-scale observations under probabilistic laws.

Philosophical Examples

In Greek mythology, the myth of Demeter and Persephone serves as a classic illustration of limited explanatory power. According to the ancient Greek narrative, the seasons arise because Persephone, daughter of the goddess Demeter (goddess of the harvest), spends half the year in the underworld with Hades after eating pomegranate seeds, causing Demeter's grief to withhold fertility from the earth during that time, resulting in winter. This story provides a superficial account of seasonal changes but lacks depth, as its elements—such as the specific gods involved or the pomegranate—could be easily varied without altering the core prediction of winter's arrival, rendering it a poor, mutable explanation. In contrast, the heliocentric model, refined through Newtonian mechanics, offers a robust causal account: Earth's axial tilt and elliptical orbit produce predictable seasonal variations through gravitational laws that are hard to vary without disrupting broader astronomical phenomena. This scientific framework unifies diverse observations, such as planetary motions and eclipses, under immutable principles, demonstrating superior explanatory depth. The mind-body problem further exemplifies how explanatory power distinguishes competing philosophical positions. Substance dualism, as articulated by René Descartes, posits the mind as a non-physical substance distinct from the physical body, yet it struggles to explain their interaction, such as how immaterial thoughts causally influence physical actions like raising an arm. This leads to what critics term the "explanatory impotence" of dualism, as it fails to account for the mechanisms of mind-body causation without invoking unspecified or mysterious interfaces, leaving the theory vague and disconnected from empirical science. Physicalism, conversely, achieves greater unification by reducing mental states to brain processes, with neuroscience explaining phenomena like decision-making through neural networks and synaptic activity, thereby integrating the mind into the physical world without positing separate realms. For instance, functional MRI studies correlate specific thoughts with brain regions, providing a causal chain that dualism cannot match in scope or precision. In the free will debate, compatibilism demonstrates stronger explanatory power compared to libertarianism. Compatibilists, such as Harry Frankfurt and Daniel Dennett, argue that free will is compatible with determinism, defining it as the capacity for reasoned action absent external coercion, which integrates seamlessly with scientific understandings of causation and human behavior. This view explains moral responsibility through evolved cognitive processes, such as deliberation and self-control, without requiring indeterministic breaks in the causal chain, thus unifying agency with empirical studies of behavior. Libertarianism, which insists on indeterminism to preserve alternative possibilities, often relies on assumptions about uncaused events in the mind or brain, failing to specify how such randomness enhances control or responsibility, rendering it explanatorily vague and disconnected from neuroscience. Compatibilism's approach, by contrast, accounts for the experience of choice while aligning with deterministic models of decision-making observed in cognitive science. Theodicies addressing the problem of evil highlight varying degrees of explanatory power in reconciling suffering with an omnipotent, omnibenevolent God. Traditional theodicies, like the free will defense proposed by Alvin Plantinga, attempt to explain moral evil as a necessary byproduct of genuine human choice, suggesting that a world with free agents allows for greater goods like love and virtue, though it struggles with the scope of evils unrelated to human agency, such as earthquakes or diseases. John Hick's soul-making theodicy extends this by positing suffering as essential for moral and spiritual growth, enabling souls to develop toward perfection, but critics note its limited depth in addressing gratuitous horrors, like the prolonged agony of innocents, which seem disproportionate to any developmental benefit. Atheistic alternatives, such as J.L. Mackie's logical argument from evil, gain explanatory traction by arguing that the sheer quantity and intensity of suffering—encompassing both moral and natural evils—renders theistic hypotheses inconsistent without additional, unparsimonious assumptions, offering a unified account grounded in naturalism that avoids the need for theological justifications. These evaluations underscore how theodicies' power depends on their ability to comprehensively cover the breadth of evil without resorting to overly flexible or incomplete rationales.

Relations to Other Epistemic Virtues

Comparison with Simplicity

Simpler theories frequently exhibit greater explanatory power by eschewing superfluous entities or assumptions, aligning with Occam's razor, which prescribes preferring the simpler hypothesis among those with equivalent explanatory scope. This overlap stems from the idea that unnecessary complexity can dilute a theory's ability to coherently account for phenomena without introducing ad hoc adjustments. Nevertheless, explanatory power and simplicity diverge in their priorities: while simplicity emphasizes parsimony in terms of entity count or structural minimalism, explanatory power values unification across diverse phenomena and mechanistic depth, permitting more intricate theories to prevail if they uncover underlying causal structures. For instance, a theory's capacity to integrate disparate observations into a cohesive framework may justify added complexity, as unification enhances overall understanding beyond mere reduction in components. This tension manifests in scientific theory choice, such as the transition from Newtonian gravity to general relativity, where the latter's greater mathematical and conceptual complexity yields superior explanatory power in strong gravitational fields, accounting for anomalies like Mercury's orbital precession that Newtonian mechanics fails to address. Within inference to the best explanation (IBE), simplicity functions primarily as a tie-breaker, guiding selection among rival explanations that possess comparable degrees of explanatory power, thereby promoting theoretical economy without overriding substantive explanatory merits.

Comparison with Predictive Power

Explanatory power and predictive power serve complementary roles in scientific theorizing. Explanatory power focuses on retrospectively unifying and causally accounting for existing data, providing insight into underlying mechanisms, whereas predictive power emphasizes prospectively forecasting novel observations to test and refine theories. This distinction highlights how explanatory models prioritize causal hypotheses and theoretical understanding, while predictive models stress empirical accuracy and generalizability for future validation. A potential conflict arises when theories excel in explanatory breadth but lack robust predictive precision. For instance, Freudian psychoanalysis offers extensive post-hoc interpretations of behavior, unifying diverse psychological phenomena under concepts like the unconscious, yet it often fails to generate specific, testable predictions that can be empirically falsified. This imbalance underscores how strong explanatory frameworks may accommodate known evidence too flexibly, reducing their capacity for risky, forward-looking forecasts. In scientific practice, the most effective theories integrate both virtues, balancing causal unification with verifiable predictions. Karl Popper's framework of corroboration exemplifies this by requiring explanations to withstand severe tests through bold, falsifiable predictions, thereby enhancing a theory's overall epistemic standing. Such integration ensures that explanatory depth is not merely retrospective but actively probed against new data, aligning with Popper's falsifiability criterion. Empirical studies in various fields suggest that high explanatory power frequently correlates with strong predictability, as unified causal models often yield reliable forecasts when rigorously tested. However, exceptions persist, such as string theory, which ambitiously unifies general relativity and quantum mechanics through elegant mathematical structures but currently offers limited testable predictions due to its high-energy scales beyond experimental reach. This case illustrates how ambitious explanatory aims can outpace predictive confirmation, prompting ongoing debates about theory assessment in untested domains.

Criticisms and Debates

Key Challenges

One major challenge to relying on explanatory power in theory selection is its inherent subjectivity, as assessments of what constitutes a "good" explanation often reflect anthropocentric intuitions rather than objective features of the world. In constructive empiricism, Bas van Fraassen argues that explanatory virtues like power are pragmatic rather than epistemic, varying with human interests, contexts, and questions posed, thus making them unreliable for justifying belief in unobservable entities beyond empirical adequacy. This subjectivity undermines the use of explanatory power in inference to the best explanation (IBE), where judgments of explanatory superiority can differ across individuals or cultures without a neutral standard to resolve disputes. A related issue is the epistemic circularity embedded in IBE when explanatory power serves as the criterion for theory choice. To evaluate which theory best explains the evidence, one must already possess standards for what counts as explanatory adequacy, yet IBE posits that the selected theory itself defines those standards, creating a loop where the method presupposes its own validity. This circularity arises because explanatory power is not derived independently from the theories under consideration but is instead retrofitted based on the theory's success, rendering the inference non-justified without external grounding. Historical examples illustrate how theories with apparent explanatory power can later be falsified, highlighting the risk of over-relying on this virtue. The phlogiston theory of combustion, prevalent in the 18th century, provided a unified account for burning, rusting, and respiration by positing a fire-like substance released during these processes, which seemed explanatorily powerful at the time compared to alternatives. However, experiments by Antoine Lavoisier demonstrated that combustion involves oxygen gain rather than phlogiston loss, falsifying the theory and showing how explanatory appeal can mislead when contrary evidence accumulates. Finally, incorporating explanatory power into Bayesian frameworks can lead to incoherence, as van Fraassen contends. If explanatory considerations directly influence prior probabilities, agents risk assigning higher credence to theories solely for their explanatory virtues, even when the empirical evidence is identical, resulting in beliefs that violate Bayesian conditionalization and fail to track empirical adequacy consistently. This argument reveals a tension between explanatory power and probabilistic coherence, where prioritizing the former can produce irrational credences incompatible with standard Bayesian norms.

Alternatives and Responses

One prominent alternative to emphasizing explanatory power in scientific inference is instrumentalism, which prioritizes predictive success as the primary criterion for evaluating theories, viewing theories primarily as instruments for forecasting phenomena rather than uncovering underlying truths or mechanisms. In this view, theories are assessed based on their instrumental utility in generating accurate predictions, without commitment to their explanatory depth regarding unobservables. Another key alternative is Bas van Fraassen's constructive empiricism, which advocates for empirical adequacy over comprehensive explanation, holding that the goal of science is to develop theories that "save the phenomena" by accurately describing observables, while remaining agnostic about the truth of claims concerning unobservables. Van Fraassen argues that explanatory virtues, such as depth or unification, are pragmatic features tied to human interests rather than objective indicators of theoretical merit, thereby demoting explanatory power in favor of empirical fit. To address concerns about the subjectivity in assessing explanatory power, philosophers have proposed objective formal measures, particularly probabilistic ones, that quantify explanatory strength in terms of how much a hypothesis increases the probability of the evidence relative to alternatives or background knowledge. For instance, measures like those developed by Schupbach and Sprenger define explanatory power as the normalized difference E(e, h) = \frac{P(h|e) - P(h|\neg e)}{P(h|e) + P(h|\neg e)}, providing a standardized, non-subjective metric that aligns qualitative intuitions with Bayesian confirmation theory. However, recent critiques of such purely probabilistic measures argue that they suffer from issues like temporal shallowness (failing to differentiate predictive from retrodictive power) and handling negative causal relevance inadequately, suggesting the need for hybrid or non-probabilistic approaches to fully capture explanatory virtues. Defenses of inference to the best explanation (IBE), which centrally features explanatory power, have been advanced by Gilbert Harman and Peter Lipton, who contend that explanatory considerations are indispensable for rational belief formation beyond mere predictive accuracy. Harman originally formulated IBE as a fundamental form of non-deductive reasoning where hypotheses are selected for their superior explanatory virtues, while Lipton elaborates that such inferences track truth by privileging "lovely" explanations that cohere with background understanding in a way predictive success alone cannot. These arguments are bolstered by cognitive science, where models of human reasoning, such as Paul Thagard's theory of explanatory coherence, demonstrate that people naturally employ IBE-like processes to integrate explanations, supporting its role as a basic cognitive mechanism. Hybrid approaches integrate explanatory power with other epistemic virtues like unification and coherence within multi-criteria frameworks for theory assessment, recognizing that no single virtue suffices in isolation. For example, frameworks inspired by Thagard's work combine explanatory depth with the degree to which a hypothesis unifies disparate phenomena and maintains internal consistency, allowing for a balanced assessment that mitigates the limitations of explanatory power alone. Such integrations appear in contemporary accounts of theory choice, where IBE is refined to weigh explanatory power alongside unification to better approximate scientific practice.

References

  1. The Logic of Explanatory Power
  2. On the Role of Explanatory and Systematic Power in Scientific ...
  3. Dissecting explanatory power (JSTOR)
  4. C. S. Peirce, Abduction, and the Pursuit of Scientific Theories (JSTOR)
  5. David Deutsch: A new way to explain explanation (TED Talk)
  6. Objectivity, Value Judgment, and Theory Choice
  7. Thinking About Mechanisms (CSULB)
  8. Explanation and Teleology in Aristotle's Cosmology
  9. Causality and Ontological Hierarchy in Thomas Aquinas (PhilPapers)
  10. Realism and Empiricism in Hume's Account of Causality (JSTOR)
  11. A Unified Interpretation of Peirce's Theory of Abduction (PhilArchive)
  12. Scientific Explanation (Stanford Encyclopedia of Philosophy)
  13. Studies in the Logic of Explanation (PhilPapers)
  14. Karl Popper: The Logic of Scientific Discovery (Philotextes)
  15. Karl Popper (Stanford Encyclopedia of Philosophy)
  16. Inference to the Best Explanation
  17. Peter Lipton, Inference to the Best Explanation (PhilPapers)
  18. Inference to the Best Explanation, 2nd Edition, Peter Lipton (Routledge)
  19. Applying Deutsch's concept of good explanations to artificial ... (arXiv)
  20. Comparing Probabilistic Measures of Explanatory Power
  21. The Logic of Explanatory Power (Philosophy of Science)
  22. Explanatory Power (Bayesian Philosophy of Science)
  23. A Formal Theory of Inductive Inference, Part I (Ray Solomonoff)
  24. An Introduction to Kolmogorov Complexity and Its Applications
  25. Modeling by shortest data description (ScienceDirect)
  26. Sharpening Occam's Razor (CWI)
  27. Darwinian natural selection: its enduring explanatory power (PMC)
  28. Darwinism (Stanford Encyclopedia of Philosophy)
  29. Understanding Natural Selection: Essential Concepts and Common ...
  30. Plate Tectonics (National Geographic Education)
  31. Plate tectonics, volcanoes and earthquakes (Science Learning Hub)
  32. Plate Tectonics in a Nutshell
  33. A historical approach to theories of infectious disease transmission
  34. Our history is a battle against the microbes: we lost terribly before ...
  35. Chapter 24 – Insects and the Germ Theory of Disease
  36. How Quantum Theory Helps us Explain (arXiv)
  37. Quantum mechanics: Definitions, axioms, and key concepts of ...
  38. Dualism (Stanford Encyclopedia of Philosophy)
  39. Compatibilism (Stanford Encyclopedia of Philosophy)
  40. Free Will (Stanford Encyclopedia of Philosophy)
  41. The Problem of Evil (Stanford Encyclopedia of Philosophy)
  42. Simplicity, Truth, and Probability
  43. Philip Kitcher, Philosophy of Science, Vol. 48, No. 4 (Dec. ...)
  44. Explanatory Depth (PhilArchive)
  45. On the Reduction of General Relativity to Newtonian Gravitation
  46. To Explain or to Predict? (UC Berkeley Statistics)
  47. Karl Popper: Philosophy of Science
  48. Theory Assessment and Final Theory Claim in String ...
  49. Constructive Empiricism (Stanford Encyclopedia of Philosophy)
  50. Inference to the best explanation and epistemic circularity (PhilPapers)
  51. Inference to the Best Explanation and Epistemic Circularity
  52. The Best Explanation: Criteria for Theory Choice (Paul R. Thagard)
  53. The Best Explanation: Criteria for Theory Choice (JSTOR)
  54. Inference to the best explanation made coherent (PhilPapers)
  55. Scientific Realism and Antirealism
  56. The Instrument of Science: Scientific Anti-Realism Revitalised
  57. Scientific Realism (Stanford Encyclopedia of Philosophy)
  58. The Logic of Explanatory Power (JSTOR)
  59. Inference to the Best Explanation, 2nd edition (review)
  60. Inference to the best explanation is basic (Behavioral and Brain Sciences)
  61. Explanatory coherence
  62. Coherence, Explanation, and Hypothesis Selection
  63. Coherence, Explanation, and Hypothesis Selection (PDF)
  64. The Unity of Science (Stanford Encyclopedia of Philosophy)
    Aug 9, 2007 · The topic of unity in the sciences can be explored through questions such as the following: Is unity a feature of reality or of our modes of cognition?1. Historical Development In... · 3. Epistemological Unities · 3.3 Epistemic Roles: From...<|separator|>