
Classical logic

Classical logic is a formal system of deductive reasoning that forms the foundation of much of Western philosophy, mathematics, and science, characterized by its bivalent semantics—where every proposition is either true or false—and adherence to principles such as the law of excluded middle (every statement is either true or its negation is true) and the law of non-contradiction (no statement can be both true and false). It employs propositional connectives like negation (¬), conjunction (∧), disjunction (∨), implication (→), and equivalence (↔), along with quantifiers (∀ for universal, ∃ for existential) in its predicate form, to construct valid arguments through rules such as modus ponens (from A and A → B, infer B). This system assumes that truth is absolute and independent of verification, enabling proofs by contradiction and the principle of explosion (from a contradiction, anything follows).

The origins of classical logic trace back to ancient Greece, particularly to Aristotle (384–322 BCE), who developed syllogistic logic as a method for categorical reasoning using propositions of the form "All S are P," "No S are P," "Some S are P," and "Some S are not P," organized into valid syllogisms across four figures with 15 valid moods. Aristotle's work, detailed in his Prior Analytics, emphasized deductive necessity and the square of opposition relating these proposition types as contraries, contradictories, or subcontraries, laying the groundwork for formal deduction despite blending material and logical elements. Subsequent developments by the Stoics, including Chrysippus (c. 279–206 BCE), extended this to propositional logic with truth-functional connectives and a two-valued system, introducing rules like modus tollens (from A → B and ¬B, infer ¬A) and hypothetical syllogisms (transitivity of implication). In the modern era, classical logic was rigorously formalized in the 19th century by Gottlob Frege (1848–1925), who introduced predicate logic with quantifiers to handle relations and variables, enabling the expression of complex mathematical statements beyond syllogisms. This evolution distinguished classical logic from alternatives like intuitionistic or constructive logics, which reject the law of excluded middle for unverified propositions and prioritize constructive proofs.

Key semantic tools, such as truth tables for evaluating propositional formulas under all possible truth assignments, confirm validity: a formula is a tautology if true in every interpretation, underpinning applications in mathematics, philosophy, and computer science. Despite critiques of its non-constructive character, exemplified in the disputes between constructivist and classical mathematicians in the late 19th and early 20th centuries, classical logic remains the default framework for rigorous argumentation due to its completeness and, in propositional form, its decidability.

Overview and Fundamentals

Definition and Scope

Classical logic is the standard system of deductive logic in mathematics and philosophy that assumes bivalence, whereby every proposition has exactly one of two truth values: true or false. It adheres to fundamental principles such as the law of excluded middle, which states that for any proposition A, either A or its negation \neg A is true (A \lor \neg A), and the law of non-contradiction, which asserts that no proposition can be both true and false (\neg (A \land \neg A)). These principles ensure a sharp distinction between truth and falsity, forming the bedrock of classical reasoning.

The scope of classical logic primarily encompasses propositional logic, which deals with the truth-functional combinations of atomic propositions using connectives like negation (\neg), conjunction (\land), disjunction (\lor), and implication (\to), and predicate logic, which extends this framework to include quantifiers (\forall, \exists) over individuals in a domain, along with functions and relations. It excludes higher-order logics, which quantify over sets or properties, as well as non-standard variants such as many-valued or intuitionistic logics that deviate from bivalence or other classical principles. This delineation positions classical logic as the foundational paradigm for deductive inference in most mathematical and philosophical contexts.

Central to classical logic are core assumptions that govern its inference rules and semantics, including material implication, where A \to B is equivalent to \neg A \lor B, treating implication as a truth-functional operation rather than a relation of relevance or necessity. Double negation elimination allows \neg \neg A to imply A, reinforcing the equivalence of a proposition and its double negation in bivalent systems. Additionally, the explosion principle, or ex falso quodlibet, stipulates that from a contradiction (A \land \neg A), any proposition follows, underscoring the system's commitment to consistency as paramount. Classical logic emerged as the dominant framework in Western thought following Aristotle's foundational work on syllogistic reasoning, evolving through medieval scholasticism and into modern formalizations that solidified its status as the orthodox system for logical analysis.
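These truth-functional assumptions can be checked mechanically by enumerating assignments. The following is a minimal sketch in Python (the helper names are illustrative assumptions, not taken from any source) verifying material implication, double negation elimination, and excluded middle:

```python
from itertools import product

# Minimal sketch: brute-force verification of the classical equivalences named
# above over every truth assignment. Helper names are illustrative assumptions.

IMP_TABLE = {(True, True): True, (True, False): False,
             (False, True): True, (False, False): True}

def implies(a: bool, b: bool) -> bool:
    """Material implication, given directly by its truth table."""
    return IMP_TABLE[(a, b)]

def agree_everywhere(f, g, arity: int) -> bool:
    """Two truth functions agree on all 2**arity assignments."""
    return all(f(*vals) == g(*vals) for vals in product([True, False], repeat=arity))

# Material implication: A -> B is equivalent to (not A) or B.
assert agree_everywhere(implies, lambda a, b: (not a) or b, 2)

# Double negation elimination: not (not A) is equivalent to A.
assert agree_everywhere(lambda a: not (not a), lambda a: a, 1)

# Law of excluded middle: A or (not A) holds under every assignment.
assert all(a or (not a) for a in (True, False))

print("Classical equivalences hold under all assignments.")
```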

Key Characteristics

Classical logic is defined by its adherence to truth-functionality, a property whereby the truth value of any compound proposition is entirely determined by the truth values of its constituent atomic propositions. This means that logical connectives such as negation, conjunction, and disjunction operate as truth functions, mapping combinations of truth values (true or false) to a single output for the entire compound. For instance, the truth of a conjunction A \land B holds if and only if both A and B are true, independent of any external context or meaning beyond these values.

Central to classical logic is the principle of bivalence, which asserts that every proposition is either true or false, with no intermediate or indeterminate values possible. This binary assignment underpins the semantics of the system, ensuring that interpretations assign exactly one truth value to each proposition, thereby supporting key theorems like the law of excluded middle (A \lor \neg A) and double negation elimination. Bivalence distinguishes classical logic from many-valued or fuzzy logics, where propositions might hold degrees of truth.

Another defining feature is monotonicity, the property that permits the addition of new premises to a set without invalidating previously derived conclusions. Formally, if a set of premises \Gamma entails a formula \phi, then any superset \Gamma' \supseteq \Gamma also entails \phi. This ensures that the deductive power of classical logic accumulates reliably as knowledge expands, reflecting its suitability for rigorous, non-revisable inference but contrasting with defeasible reasoning in non-monotonic systems.

Classical logic also exhibits distinct decidability properties depending on its scope. Propositional classical logic is decidable, as truth-table methods can systematically evaluate all possible assignments of truth values to atomic propositions (2^n assignments for n atoms), determining validity or satisfiability in finitely many steps. In contrast, predicate logic is only semi-decidable: while valid formulas can be proven within the system (by the completeness theorem), there is no general algorithm to determine invalidity for all cases, a result established by Church, Gödel, and Turing in the 1930s.

Finally, classical logic lacks paraconsistency, embracing instead the principle of ex falso quodlibet (from falsehood, anything follows), which states that a contradiction in the premises entails every possible conclusion. This explosive rule implies that inconsistent theories become trivial, as any formula \psi derives from premises including both \theta and \neg \theta. Paraconsistent logics reject this to tolerate inconsistencies without collapse, but classical systems prioritize consistency as a foundational requirement for non-trivial deduction.
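Semantic consequence, monotonicity, and explosion can all be illustrated with a brute-force checker over the 2^n assignments. This is a minimal sketch, assuming a simple encoding of formulas as Python callables over assignment dictionaries; the helper name entails is illustrative, not standard:

```python
from itertools import product

# Minimal sketch: formulas encoded as Python callables over an assignment dict.
# entails(gamma, phi, atoms) checks Gamma |= phi by enumerating all 2**n assignments.

def entails(gamma, phi, atoms):
    """No assignment makes every premise in gamma true while making phi false."""
    for values in product([True, False], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(g(v) for g in gamma) and not phi(v):
            return False
    return True

p = lambda v: v["p"]
q = lambda v: v["q"]
not_p = lambda v: not v["p"]
p_or_q = lambda v: v["p"] or v["q"]

# Basic consequence: {p} |= p or q.
assert entails([p], p_or_q, ["p", "q"])

# Monotonicity: adding a premise never destroys an established consequence.
assert entails([p, q], p_or_q, ["p", "q"])

# Ex falso quodlibet: {p, not p} entails anything, vacuously, since no
# assignment satisfies both premises.
assert entails([p, not_p], q, ["p", "q"])

print("Entailment, monotonicity, and explosion illustrated over all assignments.")
```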

Historical Development

Ancient Foundations

The roots of classical logic can be traced to pre-Socratic philosophers in ancient Greece, particularly the Eleatic school, where early concerns about truth and consistency emerged. Parmenides of Elea (c. 515–450 BCE) emphasized the principle of non-contradiction, arguing that reality is unchanging and one, rejecting the possibility of opposites or change as illusory, which laid groundwork for logical coherence in philosophical argument. His student Zeno of Elea (c. 490–430 BCE) further highlighted logical inconsistencies through paradoxes, such as Achilles and the Tortoise, which demonstrated contradictions in common assumptions about motion and plurality, thereby challenging opponents to resolve apparent logical flaws in their views.

Aristotle (384–322 BCE) systematized these ideas into the foundational framework of classical logic in his collection of works known as the Organon. In the Categories, he outlined ten fundamental classifications of being, providing a basis for analyzing predicates and substances in logical arguments. The Prior Analytics developed syllogistic logic, a deductive system where conclusions follow necessarily from two premises through valid moods, such as the Barbara syllogism ("All men are mortal; Socrates is a man; therefore, Socrates is mortal"). Complementing this, the Posterior Analytics addressed scientific demonstration, explaining how syllogisms could yield certain knowledge from first principles in demonstrative sciences. Central to Aristotle's system were the laws of thought, articulated primarily in Metaphysics Book Gamma: the law of identity (A is A), the law of non-contradiction (it is impossible for the same thing to belong and not belong to the same thing in the same respect), and the law of excluded middle (of two contradictories, one must be true and the other false).

Following Aristotle, Hellenistic philosophers, especially the Stoics, extended logical foundations toward propositional logic. Chrysippus of Soli (c. 279–206 BCE), the third head of the Stoic school, advanced a theory of axiōmata (assertibles) as the basic units of logic, focusing on their truth values rather than terms. Stoic logic incorporated connectives such as conjunction ("and," true only if both parts are true) and the conditional ("if...then," true unless the antecedent is true and the consequent false), forming the basis for arguments like the five indemonstrables, including modus ponens. These developments complemented Aristotelian syllogistics by emphasizing compound propositions and their inferential relations, influencing later logical traditions.

Medieval to Modern Evolution

During the early medieval period, classical logic was preserved primarily through the efforts of Boethius, who translated and commented on several works from Aristotle's Organon, including the Categories, On Interpretation, and the Prior Analytics, thereby transmitting key elements of Aristotelian logic to Latin Europe. These translations formed the foundation of logical study in monastic and cathedral schools until the 12th century. In the Islamic world, Avicenna (Ibn Sina) integrated Aristotelian logic into his comprehensive philosophical system, extending it with innovations in modal and temporal reasoning while emphasizing its role in scientific demonstration, as detailed in his extensive treatments of the Organon. Similarly, Averroes (Ibn Rushd) provided detailed commentaries on Aristotle's logical corpus, defending its purity against Neoplatonic influences and influencing both Islamic and later Latin scholastic traditions.

Scholastic logic flourished in medieval universities such as Paris and Oxford from the 13th century onward, where Aristotle's works, bolstered by newly translated commentaries, became central to the arts curriculum, leading to developments like the theory of supposition for analyzing terms in context. However, following the initial preservation by Boethius, the study of logic had experienced a relative decline in Western Europe during the early Middle Ages, largely due to the dominance of theological priorities and the loss of Greek texts, limiting engagement to abbreviated commentaries until the 12th-century renaissance of translations.

In the Renaissance and early modern periods, Petrus Ramus critiqued the complexity of Aristotelian scholastic logic, advocating a simplified, topic-based system focused on natural method and rhetorical utility, which gained popularity in Protestant universities but was later seen as diminishing deductive rigor. Gottfried Wilhelm Leibniz envisioned a characteristica universalis, a universal symbolic language for logic that would enable mechanical resolution of disputes through calculation, laying conceptual groundwork for later formal systems despite remaining unrealized in his lifetime.

The 19th century marked a revival through algebraic approaches, with George Boole developing the first explicit algebraic treatment of logic in his 1847 work The Mathematical Analysis of Logic and 1854's An Investigation of the Laws of Thought, representing propositions as binary variables and operations, thus bridging logic and mathematics. Augustus De Morgan contributed foundational work in this tradition, formulating De Morgan's laws—which state that the negation of a conjunction is the disjunction of the negations, and vice versa—essential for manipulating logical expressions in Boolean algebra.

In the 20th century, classical logic underwent rigorous formalization, beginning with Gottlob Frege's 1879 Begriffsschrift, which introduced modern predicate logic notation and quantifiers, providing a precise symbolic framework for expressing mathematical inferences. Alfred North Whitehead and Bertrand Russell advanced this in their multi-volume Principia Mathematica (1910–1913), attempting to derive all mathematics from logical axioms using a ramified type theory to avoid paradoxes, though the project highlighted limitations in reducing arithmetic to pure logic. Kurt Gödel's 1930 completeness theorem established that every valid first-order formula is provable within the system, confirming the adequacy of classical semantics for capturing deductive validity and solidifying the foundations of modern mathematical logic. This progression from medieval preservation amid theological constraints to 19th- and 20th-century symbolic innovations transformed classical logic from a largely rhetorical discipline into a formal science and a cornerstone of mathematics and philosophy.

Logical Connectives and Syntax

Propositional Connectives

In classical propositional logic, atomic propositions form the foundational elements of the language. These are simple, indivisible units, typically denoted by lowercase letters such as p, q, or r, each of which can be assigned a truth value of true (T) or false (F) independently of the others. The connectives, or logical operators, combine atomic propositions to build compound expressions. Classical propositional logic employs five primary connectives: negation (\neg), conjunction (\land), disjunction (\lor), implication (\to), and biconditional (\leftrightarrow). Their semantics are precisely defined via truth tables, which enumerate all possible truth-value assignments to the atomic components and determine the resulting truth value of the compound formula.

For negation, a unary connective, \neg p is true if p is false, and false otherwise:

\begin{array}{c|c} p & \neg p \\ \hline \mathrm{T} & \mathrm{F} \\ \mathrm{F} & \mathrm{T} \\ \end{array}

Conjunction p \land q is true only when both p and q are true; disjunction p \lor q is true if at least one is true (inclusive or); implication p \to q is false solely when p is true and q is false (material conditional); and biconditional p \leftrightarrow q is true when both have the same truth value. The full truth tables for the binary connectives are as follows:

\begin{array}{c|c|c|c|c|c} p & q & p \land q & p \lor q & p \to q & p \leftrightarrow q \\ \hline \mathrm{T} & \mathrm{T} & \mathrm{T} & \mathrm{T} & \mathrm{T} & \mathrm{T} \\ \mathrm{T} & \mathrm{F} & \mathrm{F} & \mathrm{T} & \mathrm{F} & \mathrm{F} \\ \mathrm{F} & \mathrm{T} & \mathrm{F} & \mathrm{T} & \mathrm{T} & \mathrm{F} \\ \mathrm{F} & \mathrm{F} & \mathrm{F} & \mathrm{F} & \mathrm{T} & \mathrm{T} \\ \end{array}

These definitions ensure that the connectives capture the intuitive notions of "not," "and," "or," "if...then," and "if and only if," respectively.

Well-formed formulas (wffs), also called propositional formulas, are constructed recursively from atomic propositions using the connectives, with parentheses to ensure unambiguous grouping. The recursive definition proceeds as follows: (1) every atomic proposition is a wff; (2) if A is a wff, then \neg A is a wff; (3) if A and B are wffs, then (A \land B), (A \lor B), (A \to B), and (A \leftrightarrow B) are wffs; (4) nothing else qualifies as a wff. For instance, (p \land q) \to r and \neg (p \lor \neg q) are wffs, while incomplete expressions like p \land are not. This syntax guarantees that every wff has a unique parse, facilitating systematic evaluation and proof.

Several equivalence laws hold among wffs, preserving truth values across all interpretations and enabling formula transformations. De Morgan's laws establish that \neg (p \land q) \leftrightarrow \neg p \lor \neg q and \neg (p \lor q) \leftrightarrow \neg p \land \neg q, allowing negation to be distributed over conjunction and disjunction. Distributive laws mirror arithmetic, with p \land (q \lor r) \leftrightarrow (p \land q) \lor (p \land r) and p \lor (q \land r) \leftrightarrow (p \lor q) \land (p \lor r). These and related tautological equivalences, such as the law of double negation \neg \neg p \leftrightarrow p, form the basis for algebraic manipulations in propositional logic (a mechanized sketch of this kind of rewriting appears below, after the axiom list).

An axiomatization provides a minimal set of starting formulas and rules sufficient to derive all valid wffs, ensuring the system's completeness. A standard Hilbert-style axiomatization for classical propositional logic uses three axiom schemas—all instances of which are wffs—and the rule of modus ponens (from A and A \to B, infer B):
  1. A \to (B \to A)
  2. (A \to (B \to C)) \to ((A \to B) \to (A \to C))
  3. (\neg A \to \neg B) \to (B \to A)
This set, originating from Hilbert and Bernays, captures the full expressive power of the connectives and is semantically complete, meaning every tautology is provable. Alternative minimal bases exist, built from different choices of primitive connectives and axiom schemas, but Hilbert's remains influential for its simplicity and generality.
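As noted above, the equivalence laws license purely mechanical formula transformations. The following sketch applies De Morgan's laws and double negation elimination as rewrite rules to push negations inward (negation normal form); the tuple encoding and the function name nnf are illustrative assumptions, not a standard API:

```python
# Minimal sketch: formulas as nested tuples, e.g. ("not", ("and", "p", "q")).
# nnf() pushes negation inward using De Morgan's laws and double negation,
# producing an equivalent formula in negation normal form.

def nnf(formula):
    """Rewrite a formula built from 'not', 'and', 'or' into negation normal form."""
    if isinstance(formula, str):                      # atomic proposition
        return formula
    op = formula[0]
    if op in ("and", "or"):
        return (op, nnf(formula[1]), nnf(formula[2]))
    if op == "not":
        inner = formula[1]
        if isinstance(inner, str):                    # literal: leave as ¬p
            return formula
        if inner[0] == "not":                         # ¬¬A  =>  A
            return nnf(inner[1])
        if inner[0] == "and":                         # ¬(A ∧ B)  =>  ¬A ∨ ¬B
            return ("or", nnf(("not", inner[1])), nnf(("not", inner[2])))
        if inner[0] == "or":                          # ¬(A ∨ B)  =>  ¬A ∧ ¬B
            return ("and", nnf(("not", inner[1])), nnf(("not", inner[2])))
    raise ValueError(f"unsupported connective: {formula!r}")

# Example: ¬(p ∨ ¬q) rewrites to ¬p ∧ q.
print(nnf(("not", ("or", "p", ("not", "q")))))   # ('and', ('not', 'p'), 'q')
```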

Predicate Extensions

Predicate extensions in classical logic extend the syntax of propositional logic by incorporating terms, predicates, and quantifiers, enabling the expression of statements about objects and their relations in predicate logic. This augmentation allows for greater expressive power, permitting generalizations over domains of individuals rather than merely combining truth values.

Terms form the basic building blocks for referring to objects within the language. They consist of individual constants, such as a or b, which denote specific objects; individual variables, such as x or y, which act as placeholders; and function symbols applied to terms, such as f(x), where f is a function symbol of appropriate arity. Constants and variables are the simplest terms, while function symbols compose more complex ones; a term is closed (ground) if it contains no variables and open otherwise.

Predicates extend atomic formulas beyond propositional atoms by relating terms to properties or relations. A predicate symbol P of arity n combines with n terms to form an atomic formula, such as the unary P(x) for a property of x, or the binary R(x, y) for a relation between x and y. The identity predicate = is a standard binary predicate, with t_1 = t_2 asserting equality between terms t_1 and t_2. Zero-arity predicates correspond to propositional atoms, bridging to the propositional base.

Quantifiers introduce binding over variables, with the universal quantifier \forall and the existential quantifier \exists scoping over formulas. The universal quantifier \forall x \, \phi binds the variable x in the formula \phi, expressing that \phi holds for all values of x, while \exists x \, \phi asserts the existence of some x making \phi true. The scope of a quantifier is the subformula it governs, and binding occurs only within that scope, preventing conflicts with external variables.

Well-formed formulas (wffs) in predicate logic are defined recursively, starting from atomic formulas and incorporating the propositional connectives together with quantified subformulas. For instance, \forall x (P(x) \to Q(x)) is a wff where the implication combines predicates under universal quantification. Every first-order formula can be transformed into prenex normal form, where all quantifiers prefix a quantifier-free matrix, as in \forall x \exists y \, R(x, y).

Variables in formulas are either free or bound: a variable is free if it appears without being in the scope of a quantifier binding it, and bound if captured by a \forall or \exists. For example, in \forall x (A(x, y) \to B(x)), x is bound while y is free. Substitution replaces free occurrences of a variable v in a formula \theta with a term t, denoted \theta(v \mid t), but must avoid capturing free variables of t under quantifiers binding them in \theta. To handle variable renaming without altering meaning, \alpha-conversion renames bound variables consistently, such as transforming \forall x P(x) to \forall y P(y) by replacing all bound occurrences of x with y, provided y is not free in the original scope. This preserves logical equivalence and facilitates safe substitutions.
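Free and bound variables and \alpha-conversion lend themselves to a small mechanization. This is a minimal sketch under an assumed tuple encoding of formulas; free_vars and alpha_rename are illustrative names, and the naive renaming assumes the new variable is fresh:

```python
# Minimal sketch: first-order formulas as nested tuples.
# Atomic formulas: ("pred", name, [terms]); terms are variable names (strings).
# Connective: ("->", A, B); quantifier: ("forall", var, body).

def free_vars(formula):
    """Return the set of variables occurring free in the formula."""
    op = formula[0]
    if op == "pred":
        return set(formula[2])
    if op == "->":
        return free_vars(formula[1]) | free_vars(formula[2])
    if op == "forall":
        return free_vars(formula[2]) - {formula[1]}
    raise ValueError(op)

def alpha_rename(formula, old, new):
    """Rename a bound variable old to new (alpha-conversion); assumes new is fresh."""
    op = formula[0]
    if op == "pred":
        return ("pred", formula[1], [new if t == old else t for t in formula[2]])
    if op == "->":
        return ("->", alpha_rename(formula[1], old, new), alpha_rename(formula[2], old, new))
    if op == "forall":
        var = new if formula[1] == old else formula[1]
        return ("forall", var, alpha_rename(formula[2], old, new))
    raise ValueError(op)

# Example from the text: in forall x (A(x, y) -> B(x)), x is bound and y is free.
phi = ("forall", "x", ("->", ("pred", "A", ["x", "y"]), ("pred", "B", ["x"])))
print(free_vars(phi))                # {'y'}
print(alpha_rename(phi, "x", "z"))   # forall z (A(z, y) -> B(z))
```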

Semantics

Truth-Value Semantics

In truth-value semantics for classical propositional logic, an interpretation consists of an assignment of truth values—true (T) or false (F)—to each atomic proposition in the language. These assignments serve as the basic models for evaluating the truth of compound formulas built from atomic propositions using logical connectives such as negation (¬), conjunction (∧), disjunction (∨), implication (→), and equivalence (↔). The semantics is bivalent, meaning every proposition receives exactly one of the two truth values under any given interpretation, reflecting the core principle of classical logic that propositions are either true or false without intermediate values.

The valuation function extends these truth assignments recursively to all compound formulas, determining their truth values based on the semantics of the connectives. For instance, the truth value of a conjunction, v(φ ∧ ψ), is T if and only if both v(φ) = T and v(ψ) = T; otherwise, it is F. This can be formalized numerically by assigning T = 1 and F = 0, where v(φ ∧ ψ) = min(v(φ), v(ψ)), v(φ ∨ ψ) = max(v(φ), v(ψ)), v(¬φ) = 1 − v(φ), v(φ → ψ) = 1 if v(φ) ≤ v(ψ) (i.e., F only when φ is T and ψ is F), and v(φ ↔ ψ) = 1 if v(φ) = v(ψ). These recursive definitions ensure that the truth value of any compound formula depends solely on the truth values of its atomic components, making the semantics fully truth-functional.

A formula is a tautology if it is true under every possible interpretation, meaning its valuation is T regardless of the truth assignments to its atomic propositions. The law of excluded middle, p ∨ ¬p, exemplifies this: for any atomic p, if p is T then ¬p is F and the disjunction is T; if p is F then ¬p is T and the disjunction is again T. Tautologies capture the validities of classical propositional logic and form the semantic basis for theorems in proof systems.

Logical consequence is defined semantically such that a set of formulas Γ entails a formula φ (denoted Γ ⊨ φ) if every interpretation that satisfies all formulas in Γ also satisfies φ. In other words, there is no interpretation under which all members of Γ are T but φ is F. A formula φ is valid (a tautology) if the empty set entails it, i.e., ∅ ⊨ φ. Models of Γ are precisely the interpretations satisfying Γ.

The compactness theorem for propositional logic states that a set of formulas Γ is satisfiable if and only if every finite subset of Γ is satisfiable. This means that if Γ has no satisfying interpretation, then some finite portion of Γ already lacks one, allowing satisfiability to be analyzed via finite means despite potentially infinite Γ. The theorem follows from the finite nature of truth assignments over finitely many atoms in any finite subset, together with the extension to the whole set via a chain of consistent extensions.

Truth tables provide a concrete method to compute valuations exhaustively for formulas with finitely many atomic propositions. Consider the biconditional (p → q) ↔ (¬p ∨ q), which equates the implication to the material conditional. The full truth table is:
\begin{array}{c|c|c|c|c|c} p & q & \neg p & p \to q & \neg p \lor q & (p \to q) \leftrightarrow (\neg p \lor q) \\ \hline \mathrm{T} & \mathrm{T} & \mathrm{F} & \mathrm{T} & \mathrm{T} & \mathrm{T} \\ \mathrm{T} & \mathrm{F} & \mathrm{F} & \mathrm{F} & \mathrm{F} & \mathrm{T} \\ \mathrm{F} & \mathrm{T} & \mathrm{T} & \mathrm{T} & \mathrm{T} & \mathrm{T} \\ \mathrm{F} & \mathrm{F} & \mathrm{T} & \mathrm{T} & \mathrm{T} & \mathrm{T} \\ \end{array}
This table shows the formula is a tautology, as the final column is always T, confirming the semantic equivalence of the two expressions under all interpretations.
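The numeric valuation described above (T = 1, F = 0, with min, max, and 1 − v) can be implemented directly; the sketch below regenerates the table for (p → q) ↔ (¬p ∨ q). Function names are illustrative assumptions:

```python
from itertools import product

# Minimal sketch of the numeric valuation: the connectives follow the clauses
# v(¬φ) = 1 - v(φ), v(φ∧ψ) = min, v(φ∨ψ) = max, v(φ→ψ) = 1 iff v(φ) <= v(ψ).

def v_not(a):      return 1 - a
def v_and(a, b):   return min(a, b)
def v_or(a, b):    return max(a, b)
def v_imp(a, b):   return 1 if a <= b else 0
def v_iff(a, b):   return 1 if a == b else 0

print("p q | (p -> q) <-> (~p v q)")
for p, q in product([1, 0], repeat=2):
    value = v_iff(v_imp(p, q), v_or(v_not(p), q))
    print(p, q, "|", value)   # the final column is 1 in every row: a tautology
```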

Model-Theoretic Semantics

Model-theoretic semantics interprets formulas of classical predicate logic (first-order logic) using mathematical structures, extending the truth-value assignments of propositional logic to languages with quantifiers, predicates, functions, and constants by considering varying domains and interpretations. This approach, formalized by Alfred Tarski, defines truth and satisfaction relative to a structure, enabling the study of logical consequence as truth preservation across all models.

A structure \mathcal{M} for a first-order language L is a pair (D, I), where D is a non-empty set known as the domain of discourse, and I is an interpretation function. The function I maps each constant symbol c \in L to an element I(c) \in D, each n-ary function symbol f \in L to a function I(f): D^n \to D, and each n-ary predicate symbol P \in L to a relation I(P) \subseteq D^n. For example, if P is a unary predicate, then I(P) \subseteq D. This setup allows formulas to be evaluated relative to specific interpretations, contrasting with the fixed truth tables used for propositional connectives.

Satisfaction in a structure \mathcal{M} = (D, I) is defined recursively for a formula \phi under a variable assignment \nu: \text{Vars} \to D, denoted \mathcal{M} \models \phi[\nu]. For atomic formulas, such as P(t_1, \dots, t_n) where the t_i are terms, \mathcal{M} \models P(t_1, \dots, t_n)[\nu] holds if (d_1, \dots, d_n) \in I(P), with each d_i being the denotation of t_i under \nu and I. The definition extends to connectives: \mathcal{M} \models \neg \phi[\nu] if not \mathcal{M} \models \phi[\nu]; \mathcal{M} \models \phi \land \psi[\nu] if both \mathcal{M} \models \phi[\nu] and \mathcal{M} \models \psi[\nu]; and similarly for \lor, \to, and \leftrightarrow. For quantifiers, \mathcal{M} \models \forall x \, \phi[\nu] if for every d \in D, \mathcal{M} \models \phi[\nu[x/d]], where \nu[x/d] is \nu modified to assign d to x; dually, \mathcal{M} \models \exists x \, \phi[\nu] if there exists d \in D such that \mathcal{M} \models \phi[\nu[x/d]]. A sentence (closed formula) \phi is true in \mathcal{M} if \mathcal{M} \models \phi[\nu] for any \nu, since it lacks free variables.

Herbrand structures provide a specific class of models useful for analyzing satisfiability, particularly in connection with skolemization. The Herbrand universe for a language with function symbols is the set of all ground terms (terms without variables) generated from the constants and functions. A Herbrand structure has this set as its domain, with interpretations of predicates extended homomorphically from the term algebra, allowing a reduction of first-order satisfiability to propositional reasoning via skolemization, in which existential quantifiers are replaced by Skolem functions in a way that preserves satisfiability.

Two formulas \phi and \psi are logically equivalent if they have exactly the same models, meaning for every \mathcal{M} and \nu, \mathcal{M} \models \phi[\nu] if and only if \mathcal{M} \models \psi[\nu]. The Löwenheim-Skolem theorem establishes that if a theory (a set of sentences) has an infinite model, then it has models of every infinite cardinality at least as large as the cardinality of the language; in particular, every countable theory with an infinite model has a countable model. For example, the sentence \exists x \forall y \, R(x,y) is satisfiable: consider a structure with D = \{a, b\}, where I(R) = \{(a,a), (a,b)\}; assigning x = a satisfies the universal quantifier over all y \in D.
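For finite domains, the recursive satisfaction clauses can be evaluated exhaustively. The sketch below checks the example sentence \exists x \forall y \, R(x,y) in the structure with D = \{a, b\} and I(R) = \{(a,a), (a,b)\}; the encoding and function names are illustrative assumptions:

```python
# Minimal sketch: evaluating  exists x forall y R(x, y)  in the finite structure
# with D = {a, b} and I(R) = {(a, a), (a, b)} by recursing over the quantifier
# clauses directly.

D = {"a", "b"}
R = {("a", "a"), ("a", "b")}          # interpretation of the binary predicate R

def forall_y_R(x):
    """M |= forall y R(x, y) under the assignment sending x to the given element."""
    return all((x, y) in R for y in D)

def exists_x_forall_y_R():
    """M |= exists x forall y R(x, y): some domain element serves as a witness."""
    return any(forall_y_R(x) for x in D)

print(exists_x_forall_y_R())          # True; x = a is the witness
```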

Proof Theory

Deductive Systems

Deductive systems in classical logic provide formal methods for constructing proofs from premises using inference rules that mirror natural reasoning patterns. These systems ensure that derivations are valid relative to the syntax of the language, capturing the structural properties of logical connectives without direct reference to semantics. Two primary systems are natural deduction and the sequent calculus, both developed by Gerhard Gentzen in the 1930s to facilitate the analysis of proofs and demonstrate key metatheoretic results.

Natural deduction emphasizes the introduction and elimination of logical connectives through rules that correspond to their intuitive meanings. For conjunction (∧), the introduction rule (∧I) allows inferring φ ∧ ψ from premises φ and ψ, while the elimination rules (∧E) permit deriving φ (or ψ) from φ ∧ ψ. For implication (→), the introduction rule (→I) discharges an assumption φ to conclude φ → ψ if ψ follows from φ, and the elimination rule (→E), or modus ponens, derives ψ from φ and φ → ψ. Negation (¬) follows a similar pattern: ¬φ is introduced (¬I) by assuming φ and deriving a contradiction, and eliminated (¬E) by deriving an arbitrary formula from φ and ¬φ. These rules apply to propositional formulas, with extensions to quantifiers in predicate logic. A simple derivation in natural deduction illustrates these rules, such as proving p → p from no premises:
  1. Assume p (assumption).
  2. From 1, infer p (reiteration or assumption discharge).
  3. Discharge the assumption to infer p → p (→I).
This highlights the system's ability to generate theorems without external assumptions. To incorporate classical principles, natural deduction augments the intuitionistic rules with a rule for indirect proof (reductio ad absurdum), allowing the assertion of φ after deriving a contradiction from ¬φ. Equivalently, classical logic can be obtained by adding Peirce's law, ((φ → ψ) → φ) → φ, as an axiom or rule, which enables proofs of excluded middle and double negation elimination.

Sequent calculus, Gentzen's alternative system, represents proofs using sequents of the form Γ ⊢ Δ, where Γ is a list of premises (the antecedent) and Δ a list of conclusions (the succedent), with ⊢ denoting derivability. This bilateral structure allows rules to operate symmetrically on antecedents and succedents. Structural rules include weakening, which adds a formula to either side without affecting validity (e.g., from Γ ⊢ Δ infer φ, Γ ⊢ Δ), and contraction, which merges duplicates (e.g., from φ, φ, Γ ⊢ Δ infer φ, Γ ⊢ Δ). Logical rules govern the connectives; for implication on the left (→L), from Γ ⊢ Δ, φ and ψ, Σ ⊢ Θ infer φ → ψ, Γ, Σ ⊢ Δ, Θ. On the right (→R), from φ, Γ ⊢ ψ, Δ infer Γ ⊢ φ → ψ, Δ. The classical variant (LK) permits multiple formulas on the right, reflecting disjunctive commitments inherent in classical reasoning.

Hilbert systems offer another approach, relying on a small stock of axiom schemas and modus ponens as the sole inference rule. Common schemas include φ → (ψ → φ) and (φ → (ψ → χ)) → ((φ → ψ) → (φ → χ)) for implication, supplemented by schemas for other connectives such as ¬φ → (φ → ψ). This axiomatic style prioritizes a minimal rule set, contrasting with the rule-heavy natural deduction and sequent calculus.
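To make the Hilbert-style approach concrete, the following sketch checks a five-step derivation of p → p using instances of the first two axiom schemas given earlier plus modus ponens. The tuple encoding and helper names are illustrative assumptions, not a standard library API:

```python
# Minimal sketch: formulas as nested tuples ("->", A, B) or atom strings.
# A proof is a list of steps; each step must be an instance of axiom schema 1
# or 2 (from the axiomatization above) or follow by modus ponens.

def imp(a, b):
    return ("->", a, b)

def matches(schema, formula, subst):
    """Does the formula instantiate the schema? Metavariables are strings A, B, C."""
    if isinstance(schema, str):                       # metavariable
        if schema in subst:
            return subst[schema] == formula
        subst[schema] = formula
        return True
    return (not isinstance(formula, str)
            and formula[0] == "->"
            and matches(schema[1], formula[1], subst)
            and matches(schema[2], formula[2], subst))

AX1 = imp("A", imp("B", "A"))
AX2 = imp(imp("A", imp("B", "C")), imp(imp("A", "B"), imp("A", "C")))

def is_axiom(f):
    return matches(AX1, f, {}) or matches(AX2, f, {})

def check_proof(steps):
    """Each step must be an axiom instance or modus ponens from earlier steps."""
    for i, s in enumerate(steps):
        by_mp = any(steps[j] == ("->", steps[k], s)
                    for j in range(i) for k in range(i))
        assert is_axiom(s) or by_mp, f"step {i + 1} not justified"
    return True

p = "p"
proof = [
    imp(p, imp(imp(p, p), p)),                                           # axiom 1
    imp(imp(p, imp(imp(p, p), p)), imp(imp(p, imp(p, p)), imp(p, p))),   # axiom 2
    imp(imp(p, imp(p, p)), imp(p, p)),                                   # MP 1, 2
    imp(p, imp(p, p)),                                                   # axiom 1
    imp(p, p),                                                           # MP 4, 3
]
print(check_proof(proof))    # True: p -> p is a theorem
```

This mirrors, in miniature, how a proof checker validates Hilbert-style derivations: pattern-match each line against the axiom schemas or against an earlier implication whose antecedent has already been derived.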

Soundness and Completeness

In classical first-order logic, the soundness theorem asserts that if a formula φ is syntactically derivable from a set of sentences Γ (denoted Γ ⊢ φ), then φ is a semantic consequence of Γ (denoted Γ ⊨ φ). This ensures that the deductive system does not prove any semantically invalid statements, preserving truth from premises to conclusions. The proof proceeds by induction on the length of the derivation: the base cases verify that axioms and logical truths are semantically valid, while the inductive step shows that each inference rule preserves semantic entailment, such as modus ponens mapping true antecedents and implications to true consequents.

The completeness theorem provides the converse: if Γ ⊨ φ, then Γ ⊢ φ, meaning every semantically valid statement is syntactically provable. This equivalence between provability and validity was established by Kurt Gödel in his 1930 paper, linking the syntactic and semantic perspectives in classical logic. The proof relies on Lindenbaum's lemma, which states that every consistent set of sentences can be extended to a maximal consistent set—a set that is consistent but such that adding any further sentence renders it inconsistent. If Γ does not prove φ, then Γ ∪ {¬φ} is consistent and extends to such a maximal consistent set Σ; from Σ one constructs a model whose elements are equivalence classes of terms under provable equivalence in Σ and whose interpretations satisfy exactly the sentences in Σ, showing that Γ does not semantically entail φ. Contrapositively, Γ ⊨ φ implies Γ ⊢ φ.

For countable languages, Leon Henkin's 1949 construction offers an explicit method to build models from consistent theories, extending Gödel's approach by adding witnesses for existential quantifiers while preserving consistency. This involves expanding the language with new constants that witness each existential formula of the theory and ensuring the extended theory remains consistent, ultimately yielding a model over a countable domain.

Key corollaries follow from these theorems. In propositional logic, a formula is valid if and only if it is provable, and validity can be checked algorithmically via truth tables enumerating the finitely many assignments, so the propositional calculus is decidable. For first-order predicate logic, compactness holds—a set of sentences has a model if and only if every finite subset does—because any inconsistency is witnessed by a finite derivation, so an unsatisfiable set must already have an inconsistent finite subset. Completeness also yields the characterization that Γ ⊨ φ if and only if φ belongs to every maximal consistent extension of Γ, since an extension omitting φ would yield a countermodel via the maximal consistent set construction.
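A propositional analogue of the Lindenbaum extension can be sketched computationally, using satisfiability as a stand-in for consistency (a substitution justified by soundness and completeness): each candidate formula or its negation is added while keeping the set satisfiable. The helper names below are illustrative assumptions:

```python
from itertools import product

# Propositional Lindenbaum-style sketch: extend a satisfiable set so that it
# decides each candidate formula, preserving satisfiability at every step.

def satisfiable(formulas, atoms):
    """Some assignment to the atoms makes every formula in the set true."""
    return any(all(f(dict(zip(atoms, vals))) for f in formulas)
               for vals in product([True, False], repeat=len(atoms)))

def lindenbaum_extend(base, candidates, atoms):
    """Add each candidate, or else its negation, whichever keeps the set satisfiable."""
    extended = list(base)
    for c in candidates:
        if satisfiable(extended + [c], atoms):
            extended.append(c)
        else:
            extended.append(lambda v, c=c: not c(v))   # the negation always works here
    return extended

atoms = ["p", "q"]
p = lambda v: v["p"]
q = lambda v: v["q"]
base = [lambda v: v["p"] or v["q"], lambda v: not v["q"]]   # start from {p ∨ q, ¬q}
maximal = lindenbaum_extend(base, [q, p], atoms)
print(satisfiable(maximal, atoms))    # True: q is rejected (¬q added), p is added
```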

Applications and Extensions

In Mathematics and Philosophy

Classical logic serves as the foundational framework for much of modern mathematics, particularly in the formalization of set theory and arithmetic. Zermelo-Fraenkel set theory with the axiom of choice (ZFC), the standard axiomatic system for sets, is formulated within classical predicate logic, enabling the rigorous definition and proof of mathematical structures from basic axioms. Similarly, Peano arithmetic, which axiomatizes the natural numbers and their operations, relies on classical logic to express its axioms, including the induction schema, providing a deductive basis for number theory. These systems demonstrate how classical logic's bivalent truth values and rules of inference support the derivation of theorems across mathematical domains.

In mathematical proofs, classical logic facilitates deduction from axioms to conclusions, as seen in elementary geometry. For instance, the Pythagorean theorem—stating that in a right-angled triangle, the square of the hypotenuse equals the sum of the squares of the other two sides—can be proved deductively by applying classical logical principles to axioms about parallels and congruence. This approach underscores classical logic's role in ensuring the validity and universality of mathematical results without reliance on empirical verification.

Within philosophy, classical logic is instrumental in analyzing and formalizing arguments, particularly in metaphysics and the philosophy of religion. Ontological proofs for the existence of God, such as Anselm's, are reconstructed using classical logical deduction to argue from the concept of a greatest conceivable being to its necessary existence. Furthermore, classical logic underpins debates between realism and anti-realism in the philosophy of mathematics, where realists defend the objective truth of mathematical statements independent of human cognition, justified by classical bivalence, while anti-realists challenge verification-transcendent truths, often proposing alternative logics while acknowledging classical logic's dominance in realist frameworks.

Classical logic also connects to computability theory through its formalization of recursive functions and Turing machines. Recursive functions, defined via primitive recursion and minimization in classical logical terms, characterize the computable functions, with Turing machines providing an equivalent model that simulates logical deduction processes. This equivalence highlights how classical logic's precise syntax and semantics underpin the theoretical foundations of computation.

A key limitation of classical logic in these applications is its undecidability in predicate form, as established by Church and Turing. In 1936, Church proved that there is no general algorithm to determine the validity of all sentences in first-order logic, a result independently confirmed by Turing's analysis of the halting problem, implying that not all mathematical truths can be mechanically decided within classical systems. This undecidability extends to foundational theories like ZFC and Peano arithmetic, as revealed by Gödel's incompleteness theorems of 1931, which show that any consistent formal system capable of expressing basic arithmetic contains true but unprovable statements.
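The primitive recursion mentioned above can be illustrated with a small sketch that builds addition and multiplication from the successor function, following the schema f(n, 0) = g(n) and f(n, m+1) = h(n, m, f(n, m)); the code is illustrative only:

```python
# Minimal sketch of primitive recursion: addition and multiplication built
# from zero and successor, following f(n, 0) = g(n), f(n, m+1) = h(n, m, f(n, m)).

def successor(n: int) -> int:
    return n + 1

def add(n: int, m: int) -> int:
    """add(n, 0) = n;  add(n, m+1) = successor(add(n, m))."""
    if m == 0:
        return n
    return successor(add(n, m - 1))

def mul(n: int, m: int) -> int:
    """mul(n, 0) = 0;  mul(n, m+1) = add(n, mul(n, m))."""
    if m == 0:
        return 0
    return add(n, mul(n, m - 1))

print(add(3, 4), mul(3, 4))   # 7 12
```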

Relation to Non-Classical Logics

Classical logic's adherence to the principle of bivalence—every proposition is either true or false—and to the law of excluded middle distinguishes it from intuitionistic logic, which requires constructive proofs and rejects the law of excluded middle, \vdash A \lor \neg A, unless a proof of A or of \neg A is available. In classical logic, the double negation elimination rule, \neg \neg A \to A, is valid, allowing the inference from \neg \neg A to A, but this fails in intuitionistic logic, where double negations do not necessarily imply the original proposition without additional constructive evidence. Unlike classical logic, which operates without modal operators, modal logics extend it by incorporating necessity (\square) and possibility (\Diamond) operators to express statements about possible worlds or epistemic states, with systems like S5 providing a classical modal extension in which the accessibility relation between worlds is an equivalence relation. Classical logic's commitment to bivalence contrasts with fuzzy logics, such as Łukasiewicz logic, which allow truth values in the continuum [0,1] to handle vagueness, rejecting the strict true/false dichotomy in favor of degrees of truth for propositions involving vague predicates like "tall" or "heap." In contrast to classical logic's ex falso quodlibet principle, where a contradiction implies any statement (explosion), paraconsistent logics tolerate inconsistencies without deriving arbitrary conclusions, enabling reasoning in inconsistent but non-trivial theories, such as those arising in databases or dialetheic philosophies. Embeddings such as Glivenko's theorem demonstrate a formal relationship between the systems: a propositional formula is classically provable if and only if its double negation is intuitionistically provable, allowing classical tautologies to be translated into intuitionistic logic via the double negation embedding.
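The contrast with many-valued semantics can be made concrete. A minimal sketch of Łukasiewicz-style connectives on degrees of truth in [0, 1] (illustrative, with the standard definitions assumed) shows that excluded middle need not evaluate to full truth:

```python
# Minimal sketch of Łukasiewicz-style many-valued connectives on degrees in [0, 1].
# Negation: 1 - a; conjunction/disjunction: min/max; implication: min(1, 1 - a + b).

def l_not(a):       return 1.0 - a
def l_and(a, b):    return min(a, b)
def l_or(a, b):     return max(a, b)
def l_imp(a, b):    return min(1.0, 1.0 - a + b)

a = 0.5                                    # e.g. "x is tall" judged half true
print(l_or(a, l_not(a)))                   # 0.5: excluded middle is not fully true
print(l_imp(a, a))                         # 1.0: self-implication keeps full degree

# Restricting degrees to {0, 1} recovers the classical truth tables, where
# a ∨ ¬a always evaluates to 1, illustrating bivalence as the limiting case.
```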
