
Gentzen's consistency proof

Gentzen's consistency proof is a seminal result in mathematical logic, established by the German mathematician Gerhard Gentzen in 1936, demonstrating the consistency of Peano arithmetic (PA), the standard formal system for natural number theory, by proving that no formal derivation within PA can yield a contradiction such as 0 = 1. The proof relies on Gentzen's development of the sequent calculus, a formal deduction system in which proofs are built from sequents (expressions of the form Γ ⇒ Δ, indicating that the formulas in Γ jointly imply at least one of those in Δ), and employs the cut-elimination theorem to normalize proofs into cut-free forms that depend only on atomic axioms and structural rules.

The proof unfolds in two main stages. First, Gentzen shows that every sequent provable in arithmetic admits a derivation that can be reduced, step by step, to a finite combinatorial form interpretable in a weaker subsystem such as primitive recursive arithmetic (PRA). Second, to handle the full strength of PA's induction schema, Gentzen introduces transfinite induction along well-ordered ordinal notations up to ε₀ (the least fixed point of α ↦ ω^α), assigning ordinal ranks to proof trees to ensure termination of the normalization process without circularity. This not only establishes consistency but also determines the proof-theoretic ordinal of PA as ε₀, marking the boundary of what can be proved finitistically about PA's derivations.

Historically, Gentzen's work emerged in response to Gödel's incompleteness theorems, which revealed that no finitary consistency proof for sufficiently strong systems like PA is possible within the system itself, thereby challenging David Hilbert's program to secure the foundations of mathematics through finite methods. Although Gentzen's use of transfinite induction was criticized by some as exceeding strict finitism, the result is widely regarded as the first rigorous relative consistency proof for PA, and it inspired subsequent developments in proof theory, including ordinal analyses of stronger systems and automated proof search techniques. The proof's influence extends to computer science, where sequent calculi underpin logical frameworks for verification and automated reasoning.

Background and Historical Context

Hilbert's Program

David Hilbert initiated his formalist program in a series of lectures during the 1920s, most notably in his 1921 talks and the 1925 address "Über das Unendliche," where he advocated proving the consistency of mathematical systems using only finite, concrete methods, in order to safeguard against paradoxes arising from infinite processes. This approach was further formalized in the 1928 textbook Grundzüge der theoretischen Logik, co-authored with Wilhelm Ackermann, which provided a rigorous presentation of first-order logic as the basis for finitary metamathematics. Hilbert's goal was to establish mathematics on secure foundations by demonstrating that formal axiomatic systems, such as those for arithmetic and analysis, are free from contradiction, through proofs conducted entirely within finitary means and avoiding any reliance on idealized infinite structures. Central to the program were the principles of finitism, which restricted reasoning to finite objects and intuitive manipulations thereof, and contentual reasoning, an informal metamathematical analysis that justified the use of formal systems without presupposing their ideal elements. Hilbert distinguished between "real" elements (finite, contentually graspable objects like symbols and finite sequences) and "ideal" elements, such as sets or transfinite numbers, which were to be treated as useful fictions analogous to ideal points in projective geometry, tolerated only after the overall system's consistency had been proved via finitary methods. This allowed classical mathematics to retain its power while grounding it in concrete, paradox-free operations. The program emerged as a response to foundational crises in the early twentieth century, including Bertrand Russell's 1901 paradox exposing contradictions in naive set theory and L.E.J. Brouwer's intuitionism, which rejected the unrestricted law of excluded middle and impredicative definitions in favor of constructive proofs. Hilbert sought to vindicate classical mathematics by providing metamathematical proofs that would demonstrate the reliability of ideal methods as safe abbreviations for finitary ones, thereby resolving these debates without abandoning established mathematical practice. A cornerstone of the program was securing the consistency of first-order arithmetic, for which Hilbert proposed the epsilon substitution method as an early finitary technique for eliminating quantifiers from proofs and normalizing terms, though it proved insufficient for full success. Gödel's incompleteness theorems later challenged the program's feasibility by showing that no finitary consistency proof formalizable within arithmetic itself is possible, yet they did not entirely refute Hilbert's vision, as subsequent work explored refined finitary or near-finitary approaches.

Gödel's Incompleteness Theorems

Kurt Gödel's incompleteness theorems, published in 1931, established profound limitations on the power of formal axiomatic systems in mathematics, particularly those capable of expressing elementary arithmetic. These results demonstrated that such systems, if consistent, are inherently incomplete and cannot verify their own consistency from within. Gödel's work appeared in the paper "Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme I," which targeted Whitehead and Russell's Principia Mathematica and related systems but extended to broader formal theories. The first incompleteness theorem asserts that in any consistent formal system sufficient to formalize basic arithmetic, there exist statements that are true but neither provable nor disprovable within the system. A canonical example is the Gödel sentence G, a self-referential statement constructed to assert its own unprovability: if the system proves G, then G is false (leading to inconsistency), but if it does not prove G, then G is true yet unprovable. This theorem applies to any consistent system at least as strong as Robinson arithmetic Q, a weak axiomatization of arithmetic without full induction that still encodes recursive functions adequately. The second incompleteness theorem builds on the first, stating that if such a system S is consistent, it cannot prove its own consistency; formally, the statement \operatorname{Con}(S), expressing the consistency of S, is unprovable in S. Gödel derived this by showing that S proves that \operatorname{Con}(S) implies the Gödel sentence, reducing the second theorem to the first. Central to Gödel's proofs is the arithmetization of syntax, or Gödel numbering, which assigns unique natural numbers to formulas, proofs, and other syntactic objects, thereby encoding metamathematical notions as arithmetical statements within the system itself. The diagonalization lemma then guarantees the existence of self-referential sentences, allowing statements like the Gödel sentence to be formalized arithmetically. Additionally, Gödel defined a provability predicate \operatorname{Bew}(x), which expresses that the number x represents a provable sentence, enabling the precise formulation of unprovability claims. These theorems exposed a critical vulnerability in Hilbert's program, which sought finitary proofs of consistency for strong mathematical systems from within those systems, rendering absolute consistency proofs unattainable in this manner.
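The following compilable LaTeX fragment restates these ingredients in a compact, standard textbook form (the notation is a modern convention, not Gödel's original):

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% Goedel sentence via the diagonalization lemma: G asserts its own
% unprovability, where \ulcorner G \urcorner is the Goedel number of G.
\[
S \vdash G \leftrightarrow \neg\operatorname{Bew}(\ulcorner G \urcorner)
\]
% Second theorem, schematically: S proves Con(S) -> G, so if S also
% proved Con(S) it would prove G, contradicting the first theorem.
\[
\operatorname{Con}(S) := \neg\operatorname{Bew}(\ulcorner 0 = 1 \urcorner),
\qquad
S \vdash \operatorname{Con}(S) \rightarrow G,
\qquad\text{hence}\quad
S \nvdash \operatorname{Con}(S).
\]
\end{document}
```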

Gentzen's Methodological Innovations

Sequent Calculus

Sequent calculus is a formal system for logical deduction developed by Gerhard Gentzen, representing logical inferences through sequents of the form \Gamma \Rightarrow \Delta, where \Gamma and \Delta are finite multisets of formulas, denoting that the conjunction of the formulas in \Gamma implies the disjunction of those in \Delta. This framework allows more flexible and intuitive handling of multiple premises and conclusions than single-conclusion systems. The system operates via two main categories of rules: structural rules, which manage the form of sequents without altering their logical meaning, and logical rules, which handle the connectives and quantifiers. Structural rules include weakening, which adds a formula to \Gamma or \Delta (e.g., from \Gamma \Rightarrow \Delta to \Gamma, A \Rightarrow \Delta); contraction, which removes duplicates (e.g., from \Gamma, A, A \Rightarrow \Delta to \Gamma, A \Rightarrow \Delta); exchange, which reorders formulas in \Gamma or \Delta; and cut, which links two sequents sharing a formula A (e.g., from \Gamma \Rightarrow A and A, \Delta \Rightarrow \Theta to \Gamma, \Delta \Rightarrow \Theta). Logical rules specify how to introduce operators on either the antecedent (left) or succedent (right) side of the sequent. For conjunction (\land), the left rule (\land L) permits inferring \Gamma, A \land B \Rightarrow \Delta from \Gamma, A, B \Rightarrow \Delta, while the right rule (\land R) derives \Gamma \Rightarrow \Delta, A \land B from both \Gamma \Rightarrow \Delta, A and \Gamma \Rightarrow \Delta, B. For the universal quantifier (\forall), the left rule (\forall L) allows \Gamma, \forall x \, A(x) \Rightarrow \Delta to be inferred from \Gamma, A(t) \Rightarrow \Delta for any term t, and the right rule (\forall R) yields \Gamma \Rightarrow \Delta, \forall x \, A(x) from \Gamma \Rightarrow \Delta, A(a), where a is a fresh eigenvariable. Gentzen introduced the sequent calculus in his doctoral dissertation "Untersuchungen über das logische Schließen," published in Mathematische Zeitschrift in 1934–1935. Unlike Hilbert-style systems, which rely on a fixed set of axioms and inference rules such as modus ponens, sequent calculus better reflects natural reasoning patterns by separating structural manipulations from logical inferences, resulting in more concise proofs and enabling normalization techniques such as cut-elimination. Proofs in sequent calculus form an inverted tree structure, with initial sequents like A \Rightarrow A at the leaves and the target sequent at the root, where each internal node applies an inference rule to its subproofs.
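As a concrete illustration, the following compilable LaTeX renders a small cut-free LK derivation of A ∧ B ⇒ B ∧ A (a standard textbook example, not taken from Gentzen's paper), read from the leaf axioms down to the root:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Leaves are axioms B => B and A => A; weakening (wk) pads the
% antecedents, (/\R) builds B /\ A, and (/\L) packages the antecedent.
\[
\dfrac{
  \dfrac{\dfrac{B \Rightarrow B}{A, B \Rightarrow B}\;(\mathrm{wk})
         \qquad
         \dfrac{A \Rightarrow A}{A, B \Rightarrow A}\;(\mathrm{wk})}
        {A, B \Rightarrow B \land A}\;(\land \mathrm{R})}
  {A \land B \Rightarrow B \land A}\;(\land \mathrm{L})
\]
\end{document}
```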

The Cut-Elimination Theorem

The cut-elimination theorem, also known as Gentzen's Hauptsatz, states that in the sequent calculus, every sequent that has a derivation possibly involving cut inferences also possesses a derivation without any cuts, thereby preserving the provability of the end sequent. This result establishes that the cut rule is redundant for derivability, as its elimination yields a normal-form proof consisting solely of initial sequents, structural rules, and logical rules. Gentzen proved this theorem in his seminal 1935 paper by employing an iterative reduction process on proofs containing cuts. The proof proceeds by induction on the cut-rank, defined as the maximum logical complexity (rank) of the formulas involved in cut inferences within the derivation. For a given cut, the elimination involves transforming the proof through a series of local reductions that replace the cut with cuts on simpler formulas derived from the principal formulas on either side, with Gentzen's "mix" rule (a generalized cut) used to handle the interaction between the subderivations above and below the cut. These transformations systematically lower the cut-rank until no cuts remain, ensuring that the final proof derives the same end sequent. A crucial aspect of the proof is the strict decrease of a well-founded measure, either the cut-rank or a measure of proof length, at each step, which guarantees termination of the process without infinite regress. Cut-free proofs enjoy the subformula property and support further results such as the midsequent theorem, by which a cut-free proof of a prenex sequent can be rearranged so that all propositional inferences precede all quantifier inferences, separated by a quantifier-free "midsequent"; such analytic proofs directly expose the witnessing terms of a derivation. The cut-elimination theorem stands in a close relationship to Herbrand's theorem on the existence of quantifier-free expansions for provable first-order formulas, as both transform proofs into canonical forms that reveal semantic content. The theorem applies equally to classical logic (in the LK system) and intuitionistic logic (in the LJ system), demonstrating the robustness of cut-elimination across these foundational frameworks. However, the process is not size-preserving: in the worst case, the cut-free proof can grow hyperexponentially (as a tower of exponentials) relative to the size of the original proof, highlighting the computational cost of normalization despite its theoretical elegance.
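One such local transformation can be shown schematically; the following compilable LaTeX gives a principal-cut reduction for ∧ in a common modern formulation (Gentzen's own argument used the mix rule instead). The cut on A ∧ B is traded for cuts on the strictly simpler A and B, lowering the cut-rank; contractions then remove the duplicated contexts:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Before: the cut formula A /\ B is principal in both premises.
\[
\dfrac{\dfrac{\Gamma \Rightarrow \Delta, A \qquad \Gamma \Rightarrow \Delta, B}
             {\Gamma \Rightarrow \Delta, A \land B}\;(\land \mathrm{R})
       \qquad
       \dfrac{A, B, \Sigma \Rightarrow \Pi}
             {A \land B, \Sigma \Rightarrow \Pi}\;(\land \mathrm{L})}
      {\Gamma, \Sigma \Rightarrow \Delta, \Pi}\;(\mathrm{cut})
\]
% After: two cuts on the smaller formulas A and B; contraction then
% collapses the repeated contexts back to Gamma, Sigma => Delta, Pi.
\[
\dfrac{\Gamma \Rightarrow \Delta, A
       \qquad
       \dfrac{\Gamma \Rightarrow \Delta, B \qquad A, B, \Sigma \Rightarrow \Pi}
             {A, \Gamma, \Sigma \Rightarrow \Delta, \Pi}\;(\mathrm{cut})}
      {\Gamma, \Gamma, \Sigma \Rightarrow \Delta, \Delta, \Pi}\;(\mathrm{cut})
\]
\end{document}
```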

The Consistency Proof

Overview of the Approach

Gentzen's consistency proof for Peano arithmetic (PA) establishes that PA is consistent relative to a weaker subsystem by demonstrating that no proof of a contradiction, such as 0 = 1, can exist within the system. The core strategy involves embedding PA into a sequent calculus, normalizing proofs through cut-elimination to remove complex inferences, and assigning ordinals less than ε₀ to the resulting proof trees based on their structure. If a contradiction were derivable, the normalization process would induce a strictly descending sequence of such ordinals, which is impossible because the ordinals below ε₀ are well-ordered. This ordinal assignment captures the proof-theoretic strength of PA without invoking transfinite methods beyond the specified bound. The proof is conducted within primitive recursive arithmetic (PRA) augmented with quantifier-free transfinite induction (QI) up to ε₀, a system whose logical apparatus is considerably weaker than that of PA itself. PRA provides the finitistic base for primitive recursive functions and basic arithmetic, while the QI schema allows induction over ordinal notations for quantifier-free formulas, enabling the ordinal descent argument without presupposing PA's full induction axiom. This relative consistency result aligns with the spirit of Hilbert's program by reducing PA's consistency to principles arguably justifiable on finitistic grounds, though the use of transfinite induction sparked debate regarding its acceptability. The key steps begin with translating PA's axioms and rules into the sequent calculus, where proofs are represented as tree-like derivations. Cut-elimination then transforms any proof into a normal form without cut inferences, preserving the derivability of the end sequent while simplifying the structure for ordinal valuation. Ordinals are assigned recursively to subproofs, reflecting the complexity of logical operations, and each reduction step decreases the ordinal measure. The impossibility of infinite descent follows from the well-foundedness of the ordinal segment below ε₀, thereby ruling out contradictory derivations. This approach was detailed in Gentzen's 1936 paper, published after revisions prompted by Paul Bernays' objections to the original argument. The ordinal ε₀ is defined as the least fixed point of the function α ↦ ω^α, the supremum of the ordinals expressible in Cantor normal form using finitely many terms with base ω. Gentzen developed a notation system for these ordinals, encoding proof complexities up to this limit by finite sequences of natural numbers.
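In outline, the argument has the following shape (a schematic restatement in modern notation, not Gentzen's own): writing o(π) for the ordinal assigned to a derivation π, each reduction step applied to a putative derivation of 0 = 1 strictly lowers the assigned ordinal, so such a derivation would generate an impossible infinite descent:

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% If pi_0 derived 0 = 1, the reduction procedure would produce
\[
\pi_0 \rightsquigarrow \pi_1 \rightsquigarrow \pi_2 \rightsquigarrow \cdots
\qquad\text{with}\qquad
\varepsilon_0 > o(\pi_0) > o(\pi_1) > o(\pi_2) > \cdots,
\]
% contradicting the well-foundedness of the ordinals below epsilon_0;
% quantifier-free transfinite induction up to epsilon_0 formalizes this.
\end{document}
```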

Transfinite Induction up to ε₀

In Gentzen's consistency proof for Peano arithmetic, the key methodological tool is the principle of quantifier-free transfinite induction up to the ordinal ε₀, which extends primitive recursive arithmetic (PRA) with the well-ordering principle required for the argument without invoking full second-order principles. The induction schema states: for any quantifier-free formula A(α) in the language of PRA extended with ordinal notations, if A(0) holds, if A(β) implies A(β + 1) for every β, and if, for every limit ordinal λ < ε₀, A(μ) for all μ < λ implies A(λ), then A(α) holds for all α < ε₀. The schema is restricted to quantifier-free instances to avoid the expressive power of full quantified induction, keeping the system finitistically interpretable while remaining sufficient for the proof. The underlying ordinal arithmetic is defined via Cantor normal form, in which ordinals less than ε₀ are built from finitely many additions, multiplications, and exponentiations with base ω. Specifically, addition is defined recursively by α + 0 = α and α + (β + 1) = (α + β) + 1, with limit cases given by suprema; multiplication by α · 0 = 0 and α · (β + 1) = (α · β) + α; and exponentiation by ω^0 = 1 and ω^(β + 1) = ω^β · ω, with ω^λ for limit λ being the supremum of ω^μ for μ < λ. The ordinal ε₀ is the least fixed point of the exponentiation function, satisfying ω^{ε₀} = ε₀, and is the supremum of the sequence 1, ω, ω^ω, ω^{ω^ω}, and so on. This structure ensures that the segment below ε₀ is well-ordered, with no infinite descending sequences, which is precisely the content of the transfinite induction principle used in the proof. The role of this principle is to establish the well-foundedness of the ordinal assignments to proofs and sequents, thereby showing that the reduction procedure terminates and that no contradiction like 0 = 1 can be derived. Gentzen's argument, formalizable as PRA augmented with this quantifier-free transfinite induction up to ε₀ (denoted PRA + TI_{<ε₀}^{QF}), proves the consistency of Peano arithmetic while avoiding circularity, since it does not require the full strength of PA's induction schema. Technically, ordinals are assigned to sequents based on the heights of proof trees and the "critical" inferences involving cuts, where, roughly, the ordinal of a derivation combines the maximum cut-rank (the complexity of cut formulas) with ω-exponentiation over the heights, ensuring strict descent under reductions. For instance, initial sequents receive small ordinals, such as 0 or 1 for atomic axioms, while more complex derivations receive larger ordinals approaching ε₀.
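To make the Cantor normal form arithmetic concrete, here is a small self-contained Python sketch (an illustrative encoding of our own devising, not Gentzen's notation system) representing ordinals below ε₀ as lists of (exponent, coefficient) pairs, with comparison and the non-commutative ordinal addition described above:

```python
# An ordinal below epsilon_0 in Cantor normal form is a finite list of
# (exponent, coefficient) pairs with strictly decreasing exponents:
#   omega^e1 * c1 + ... + omega^ek * ck,   e1 > ... > ek,  ci >= 1,
# where the exponents are themselves such lists and 0 is the empty list.

def cmp_ord(a, b):
    """Compare two CNF ordinals; returns -1, 0, or 1."""
    for (ea, ca), (eb, cb) in zip(a, b):
        c = cmp_ord(ea, eb)
        if c != 0:
            return c
        if ca != cb:
            return -1 if ca < cb else 1
    # All compared terms equal: the longer list has extra smaller terms.
    return (len(a) > len(b)) - (len(a) < len(b))

def add_ord(a, b):
    """Ordinal addition: terms of a below the leading term of b vanish."""
    if not b:
        return list(a)
    e_b = b[0][0]
    keep = [t for t in a if cmp_ord(t[0], e_b) > 0]
    same = [t for t in a if cmp_ord(t[0], e_b) == 0]
    if same:  # equal leading exponents: coefficients add
        return keep + [(e_b, same[0][1] + b[0][1])] + list(b[1:])
    return keep + list(b)

ZERO = []                      # 0
ONE = [([], 1)]                # omega^0
OMEGA = [(ONE, 1)]             # omega
OMEGA_OMEGA = [(OMEGA, 1)]     # omega^omega

assert cmp_ord(OMEGA_OMEGA, OMEGA) > 0 and cmp_ord(OMEGA, ONE) > 0
assert add_ord(ONE, OMEGA) == OMEGA             # 1 + omega = omega
assert cmp_ord(add_ord(OMEGA, ONE), OMEGA) > 0  # omega + 1 > omega
```

The asserted identities 1 + ω = ω and ω + 1 > ω illustrate why ordinal addition, unlike addition of natural numbers, is not commutative.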

The Reduction Procedure

In Gentzen's consistency proof, the reduction procedure begins with a hypothetical proof of the contradiction 0 = 1 in the sequent calculus formulation of Peano arithmetic (PA). This proof is first brought toward normal form via cut-elimination techniques and then subjected to further normalization steps involving inversion reductions and contraction reductions. Inversion reductions handle inferences involving logical connectives or quantifiers by propagating simplifications downward through the proof tree, while contraction reductions eliminate redundant occurrences of the same formula within sequents. Each such reduction transforms the proof into a new derivation whose assigned ordinal measure strictly decreases at critical steps, ensuring progress toward a normal form. The ordinal assignment is defined recursively on the structure of the proof tree: the ordinal of a derivation is built from the ordinals of its immediate subderivations, using Cantor normal form representations below \varepsilon_0. This assignment exploits the well-ordering of finite rooted trees under a suitable rank relation, where the rank of a tree is determined by the ranks of its subtrees plus an additive finite component. For proofs in PA, all such ordinals remain below \varepsilon_0, the least fixed point of the function \alpha \mapsto \omega^\alpha. Quantifiers are handled through eigenvariable conditions, ensuring that fresh variables introduced by right universal or left existential rules do not appear free elsewhere in the derivation, thereby maintaining the subformula property and preventing circularities in the ordinal computation. Applying the reduction procedure iteratively to the assumed proof of 0 = 1 generates a sequence of proofs with associated ordinals \beta_0 \geq \beta_1 \geq \cdots, all below \varepsilon_0, where strict decreases (\beta_n < \beta_{n-1}) occur at critical steps, such as those resolving non-atomic inferences. Such a sequence cannot descend strictly forever, by the well-foundedness of the ordinals below \varepsilon_0. The termination argument proceeds by transfinite induction up to \varepsilon_0: a proof of a contradiction in PA would yield an infinite strictly descending sequence of ordinals, contradicting the existence of a least element in every nonempty set of ordinals below \varepsilon_0. Thus, no proof of 0 = 1 can exist, establishing PA's consistency.
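The well-foundedness underlying the termination claim can be illustrated computationally. The following self-contained Python sketch (our own illustration of ordinal descent below ε₀, not Gentzen's actual proof reductions) encodes an ordinal as a tuple of CNF exponents and repeatedly applies a fundamental-sequence-style step that strictly decreases it; every run reaches 0 in finitely many steps:

```python
# Encode an ordinal below epsilon_0 as a tuple of exponents in
# non-increasing order: () is 0, (e1, ..., ek) is omega^e1 + ... + omega^ek.
# So ((),) is 1 = omega^0, and (((),),) is omega = omega^(omega^0).

def step(o, n):
    """Return an ordinal strictly smaller than o (o != 0): drop a final
    +1, or unfold the last term omega^e into n copies of a smaller power."""
    *rest, e = o
    if e == ():                      # last term is omega^0 = 1: subtract 1
        return tuple(rest)
    return tuple(rest) + (step(e, n),) * n   # omega^e -> omega^e' * n, e' < e

def descend(o):
    """Apply descent steps with a growing parameter n until reaching 0.
    Termination for every input reflects the well-ordering below epsilon_0."""
    n, count = 2, 0
    while o != ():
        o, n, count = step(o, n), n + 1, count + 1
    return count

ONE = ((),)
TWO = ((), ())
OMEGA = (ONE,)
print(descend(OMEGA))     # omega: reaches 0 after finitely many steps
print(descend((TWO,)))    # omega^2: longer descent, but still terminates
```

Each call to step strictly lowers the encoded ordinal, mirroring (in miniature) how each critical reduction step lowers the ordinal assigned to a derivation.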

Implications and Comparisons

Relation to Hilbert's Program and Gödel's Theorems

Gentzen's consistency proof, published in 1936, was explicitly motivated by Kurt Gödel's 1931 incompleteness theorems, which demonstrated that Peano arithmetic (PA) cannot prove its own consistency using only methods formalizable within the system itself. The work aligned with David Hilbert's program, which sought to secure the foundations of mathematics through finitary consistency proofs for ideal systems like PA, treating infinitary elements as idealizations to be justified by concrete finitary means. Gentzen carried out his proof largely within primitive recursive arithmetic (PRA), a weak finitary subsystem, but incorporated transfinite induction up to the ordinal ε₀, a principle Hilbert might have classified as ideal rather than strictly finitary. In a 1938 lecture, Gödel reinterpreted Gentzen's approach as compatible with an extended version of Hilbert's program, emphasizing relativized consistency results over absolute finitary proofs. Gentzen's method responded directly to Gödel's second incompleteness theorem by establishing consistency in an external system: PRA augmented with quantifier-free transfinite induction up to ε₀ (PRA + QI(ε₀)). This system proves Con(PA), thereby bypassing the theorem's barrier against self-verification and yielding a relative consistency result: if PRA + QI(ε₀) is consistent, then so is PA. Gödel's theorems preclude an absolute finitary proof of Con(PA), but Gentzen's relativization advances Hilbert's ideals by showing that PA's consistency follows from principles closer to finitary intuition, albeit with a controlled transfinite extension. A key debate arose from Paul Bernays' 1936 objection to Gentzen's initial 1935 manuscript, which implicitly relied on non-constructive elements akin to the fan theorem for handling potentially infinite proof trees, prompting its withdrawal. Gentzen resolved this in the published version by employing finitary representations of ordinals to justify the termination of the reduction process without invoking infinite proofs directly. In a 1938 clarification, Gentzen further emphasized that these ordinal notations are finitary objects, arguing that the proof aligns with Hilbert's finitary standards despite the appeal to transfinite induction.

Other Consistency Proofs of Arithmetic

Following Gerhard Gentzen's pioneering consistency proof for Peano arithmetic (PA) in 1936, he developed two further results. In 1938, Gentzen presented a revised proof for elementary number theory using ordinal notations in Cantor normal form, finite terms representing ordinals below \epsilon_0, within a sequent calculus, establishing consistency relative to a base system extended by transfinite induction along these notations. In his 1943 paper, Gentzen showed that transfinite induction up to \epsilon_0 is not derivable within PA itself, while induction up to any ordinal below \epsilon_0 is, demonstrating that the ordinal bound of his consistency proof is optimal. In 1940, Wilhelm Ackermann provided an independent consistency proof for first-order arithmetic, employing transfinite induction up to \epsilon_0 within Hilbert's epsilon substitution method, achieving comparable ordinal strength to Gentzen's methods through a distinct syntactic framework. Kurt Gödel's Dialectica interpretation, introduced in 1958, yields a relative consistency proof for Heyting arithmetic (HA) with respect to a quantifier-free system of functionals of finite type, and by the double-negation translation this extends to the consistency of PA relative to the same functional base theory, avoiding explicit ordinals in favor of higher-type functionals. In 1959, I. N. Khlodovskii constructed a consistency proof for arithmetic using Herbrand-style expansions of proofs, analyzing derivations via combinatorial bounds of strength comparable to \epsilon_0-induction but without invoking explicit ordinals. Gaisi Takeuti's 1975 work generalized reduction techniques for consistency proofs, extending Gentzen-style methods to subsystems of second-order arithmetic and providing a framework for proving the well-foundedness of ordinal notations up to larger ordinals such as the Feferman-Schütte ordinal \Gamma_0, applicable to fragments of analysis and beyond. These post-Gentzen proofs all establish the consistency of PA (or fragments thereof) relative to base theories involving transfinite induction, higher-type functionals, or combinatorial principles, varying in their ordinal bounds (typically \epsilon_0, with smaller ordinals such as \omega^\omega for weak fragments) and formal frameworks (sequent calculus, epsilon calculus, or functional interpretations); none, however, achieves absolute finitary consistency in Hilbert's strict sense, as all transcend primitive recursive methods. In subsequent decades, genuinely new consistency proofs for PA proper became sparse, with research shifting toward ordinal analyses of stronger theories, though Takeuti's generalization marked a key extension of the approach.

Legacy and Further Developments

Ordinal Analysis

Ordinal analysis is a branch of proof theory that assigns well-ordered ordinals to formal theories as a measure of their proof-theoretic strength, particularly for establishing relative consistency via transfinite induction. For a theory T, the proof-theoretic ordinal |T| is defined as the least ordinal \alpha such that primitive recursive arithmetic (PRA) extended by transfinite induction up to \alpha for primitive recursive predicates proves the consistency of T, denoted \operatorname{Con}(T). In the case of Peano arithmetic (PA), Gentzen established that |PA| = \epsilon_0, where \epsilon_0 is the least fixed point of the function \alpha \mapsto \omega^\alpha, satisfying \omega^{\epsilon_0} = \epsilon_0. Gentzen's 1936 consistency proof marked the inception of ordinal analysis, providing the first explicit computation of a proof-theoretic ordinal through a combination of proof reduction and an ordinal assignment attaching ordinals to derivations. This approach demonstrated that the consistency of PA follows from transfinite induction along well-orderings up to \epsilon_0, with the assignment ensuring that the reduction steps keep ordinals below this bound. Gentzen's method highlighted how ordinal notations, such as those in Cantor normal form, can capture the complexity of proofs without relying on impredicative principles. The core method of ordinal analysis involves embedding proofs into ordinal-indexed structures, typically via infinitary proof systems or cut-elimination procedures, to show that the consistency of T is provable by transfinite induction up to |T|. Proofs are assigned ordinals based on their derivation length and complexity, and cut-elimination reduces them while maintaining well-foundedness below |T|, thereby yielding \operatorname{Con}(T) in a weaker base theory. The technique extends to stronger systems; for instance, the proof-theoretic ordinal of the subsystem of second-order arithmetic \Pi^1_1\text{-}CA_0 (comprehension for \Pi^1_1 formulas) is Buchholz's ordinal \psi_0(\Omega_\omega), which lies far beyond \epsilon_0 and requires collapsing functions over notations derived from large ordinals. Historically, ordinal analysis evolved through refinements in the 1950s and 1960s, with Kurt Schütte developing infinitary proof systems and Feferman and Schütte identifying \Gamma_0 as the limit of predicative analysis in 1968. In the 1970s and 1980s, Wolfram Pohlers advanced the field by formalizing ordinal analyses using infinitary derivations and introducing the method of local predicativity, which systematically bounds the strength of impredicative systems like iterated inductive definitions. Pohlers' work culminated in comprehensive treatments that clarified the role of ordinal notation systems and hierarchies in consistency proofs. Feferman, while contributing to predicative bounds, critiqued aspects of the methodology, emphasizing that ordinal analyses must align with intuitive notions of predicativity to avoid overstepping foundational limits, since the choice of base systems and induction principles affects the resulting ordinals. A key feature of proof-theoretic ordinals is their invariance across reasonable notation systems, ensured by canonical representations like Cantor normal form, which provide a standard way to compare well-orderings. This allows proof theorists to gauge the relative strengths of theories, such as PA being weaker than \Pi^1_1\text{-}CA_0 since \epsilon_0 < \psi_0(\Omega_\omega), without constructing full models, offering a finitistic alternative to semantic consistency proofs.
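In symbols, under one common modern formulation (details such as the exact base theory vary by author):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Proof-theoretic ordinal of a theory T, with PRA as the fixed base:
\[
|T| \;=\; \min \bigl\{ \alpha \;:\;
  \mathrm{PRA} + \mathrm{TI}(\alpha,\ \text{primitive recursive predicates})
  \vdash \operatorname{Con}(T) \bigr\}
\]
% Benchmark values discussed in this section:
\[
|\mathrm{PA}| = \varepsilon_0,
\qquad
|\Pi^1_1\text{-}\mathrm{CA}_0| = \psi_0(\Omega_\omega).
\]
\end{document}
```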

Applications and Extensions

Gentzen's consistency proof, through its establishment of cut-elimination and transfinite induction up to ε₀, has found significant applications in demonstrating the independence of certain theorems from PA. A prominent example is Goodstein's theorem, formulated by Reuben Goodstein in 1944, which asserts that every Goodstein sequence, defined by iteratively rewriting a number in hereditary base representation, bumping the base, and subtracting 1, eventually reaches zero. Despite the sequences' hyper-exponential growth, their termination requires induction along ε₀, rendering the theorem unprovable in PA, as shown by Kirby and Paris in 1982 using model-theoretic techniques that produce models of PA in which such sequences do not terminate. Building on Gentzen's methods, Gödel developed the Dialectica interpretation in 1958 as a functional interpretation that translates arithmetic into a quantifier-free system of higher-type functionals, yielding a proof of PA's consistency relative to this system. This interpretation, which extends ideas from Gentzen's era such as the negative translation and cut-elimination, has been widely applied to classify theorems by the strength of the subsystems of arithmetic needed for their proofs, often revealing uniform bounds extractable from non-constructive proofs. In the 1980s, Wilfried Buchholz advanced these ideas through hydra games, which generalize the Kirby-Paris hydras to model termination phenomena associated with higher ordinals, and through collapsing functions that embed large ordinals into smaller notation systems for proof-theoretic analysis. These tools facilitated ordinal analyses beyond ε₀, measuring the consistency strength of stronger systems. Extending this work in the 1990s and 2000s, Michael Rathjen conducted ordinal analyses of impredicative systems, such as extensions of Kripke-Platek set theory, developing sophisticated notation systems that collapse Mahlo cardinals into recursive notations, thereby determining precise proof-theoretic ordinals for these theories. On the computational side, Ulrich Kohlenbach's proof mining program, begun in the 1990s, employs functional interpretations, variants of Gödel's Dialectica approach, to extract effective uniform bounds from classical proofs in arithmetic and analysis, even when the original proofs lack constructive content; applied to fixed-point theorems in metric spaces, for instance, it yields explicit rates of convergence, enhancing applications in nonlinear analysis and optimization. Subsequent developments have included refined ordinal notations for Kripke-Platek set theory, enabling analyses of admissible ordinals and their role in recursion theory. Furthermore, connections to termination proofs in programming languages have emerged, and formal verifications of Gentzen-style consistency arguments in proof assistants during the 2010s have confirmed the details through machine-checked cut-elimination and ordinal induction steps.
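The Goodstein dynamics described above are easy to simulate for small seeds. The following self-contained Python sketch (standard definitions; the function names are ours) computes hereditary base-b representations, bumps the base, and subtracts 1:

```python
def hereditary(n, b):
    """Hereditary base-b representation of n as a list of (exponent, coeff)
    pairs, with each exponent itself represented hereditarily."""
    terms, e = [], 0
    while n > 0:
        n, r = divmod(n, b)
        if r:
            terms.append((hereditary(e, b), r))
        e += 1
    return terms[::-1]

def rebase(terms, b):
    """Evaluate a hereditary representation with the base bumped to b."""
    return sum(c * b ** rebase(e, b) for e, c in terms)

def goodstein(n, steps):
    """First values of the Goodstein sequence starting at n."""
    seq, b = [n], 2
    for _ in range(steps):
        if n == 0:
            break
        n = rebase(hereditary(n, b), b + 1) - 1  # bump base, subtract 1
        b += 1
        seq.append(n)
    return seq

print(goodstein(3, 8))   # [3, 3, 3, 2, 1, 0]: small seeds die quickly
print(goodstein(4, 5))   # [4, 26, 41, 60, 83, 109]: already growing fast
```

Seed 3 terminates almost immediately, while the sequence for seed 4 grows for astronomically many steps before reaching zero, which is why termination for all seeds escapes the induction available in PA.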
