
Church–Turing thesis

The Church–Turing thesis is a central hypothesis in computability theory asserting that every effectively calculable function on the natural numbers can be computed by a Turing machine, and conversely, every function computable by a Turing machine is effectively calculable. Formulated independently in 1936 by Alonzo Church through his concept of λ-definability and by Alan Turing via his abstract model of computation known as the Turing machine, the thesis equates the intuitive notion of an algorithmic process with the formal capabilities of these equivalent systems. The thesis emerged in the context of Hilbert's Entscheidungsproblem, the challenge of determining whether there exists an algorithm to decide the provability of any given statement of first-order logic. Church and Turing each demonstrated that no such general algorithm exists by showing the problem is undecidable within their respective frameworks, while establishing that λ-definable functions and Turing-computable functions coincide with the general recursive functions studied by Kleene. Although the thesis cannot be rigorously proved due to its reliance on the informal concept of "effective calculability," it has gained near-universal acceptance in mathematics and computer science, as all proposed models of computation have proven equivalent to the Turing machine in expressive power, with no known counterexamples to its claims. The implications of the Church–Turing thesis extend beyond pure logic to define the boundaries of computation, influencing fields such as complexity theory, artificial intelligence, and the philosophy of mind by suggesting that certain problems are inherently unsolvable by any algorithmic means. Variations, such as the physical Church–Turing thesis, extend the idea to claim that no physical device can perform computations beyond those of a Turing machine, though this remains a subject of ongoing debate in quantum computing and hypercomputation research.

Core Formulation

Original Statements

The Church–Turing thesis originated amid efforts to address the Entscheidungsproblem, a central challenge in David Hilbert's program for the formalization of mathematics, which sought a general algorithm to decide whether any given mathematical statement is provable within a formal system. In 1936, Alonzo Church and Alan Turing independently developed formal models of computation to tackle this problem, each proposing that their models precisely captured the intuitive notion of effective calculability, thereby demonstrating the Entscheidungsproblem's unsolvability. Church's formulation appeared in his short paper "A Note on the Entscheidungsproblem," where he identified effective calculability with functions definable in his λ-calculus system, a formalism for expressing higher-order functions through abstraction and application. Specifically, Church stated that the λ-definable functions of positive integers (those constructible via λ-terms) are exactly the effectively calculable functions, meaning every effectively calculable function on positive integers coincides with one obtainable through repeated substitution and application in λ-calculus. This assertion served as the basis for proving the undecidability of the Entscheidungsproblem for the functional calculus of first-order logic. In the same year, Alan Turing published "On Computable Numbers, with an Application to the Entscheidungsproblem," introducing an abstract device now known as the Turing machine to simulate the discrete steps of a human computer's work on paper. Turing described computable numbers as "the real numbers whose expressions as a decimal are calculable by finite means," arguing that any such computation could be mechanized by a Turing machine operating on an infinite tape divided into squares, with a read-write head moving left or right according to a finite table of instructions. He posited that this model encompasses all effectively calculable processes, as human computation relies on finite memory and step-by-step symbol manipulation, thereby extending the thesis to claim that every effectively calculable function is Turing-computable. The phrase "effectively calculable" denoted functions computable via a finite, deterministic procedure, akin to the general recursive functions that Kurt Gödel had introduced in 1934 to broaden the scope beyond the primitive recursive functions, which were insufficient for capturing all intuitively computable functions, such as the Ackermann function (identified in 1928). Church's and Turing's models were soon proven equivalent to general recursiveness, solidifying the thesis's core claim.
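
The machine model Turing describes—a tape of squares, a read-write head, and a finite table of instructions—is compact enough to sketch directly. The following Python fragment is a minimal illustration (the simulator interface and the particular transition table are choices made for this example, not taken from Turing's paper); it runs a simple two-state, two-symbol machine that halts after six steps, leaving four 1's on the tape.

```python
def run_turing_machine(table, start_state, halt_state, max_steps=10_000):
    """Simulate a single-tape Turing machine.
    `table` maps (state, symbol) -> (write_symbol, move, next_state),
    where move is 'L' or 'R'. The tape is a dict defaulting to 0 (blank)."""
    tape, head, state = {}, 0, start_state
    for _ in range(max_steps):
        if state == halt_state:
            break
        symbol = tape.get(head, 0)
        write, move, state = table[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return tape, state

# An illustrative two-state, two-symbol machine: it halts after six steps
# with four 1's written on the tape.
table = {
    ("A", 0): (1, "R", "B"),
    ("A", 1): (1, "L", "B"),
    ("B", 0): (1, "L", "A"),
    ("B", 1): (1, "R", "HALT"),
}

if __name__ == "__main__":
    tape, final_state = run_turing_machine(table, "A", "HALT")
    print(final_state, sum(tape.values()))   # HALT 4
```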

Equivalence to Other Models

The Church–Turing thesis gains support from the fact that multiple independently developed models of computation from the 1930s all capture the same class of computable functions, demonstrating their mutual equivalence in expressive power. A function on the natural numbers is considered computable if there exists a Turing machine—or an equivalent model such as the λ-calculus or the μ-recursive functions—that, starting from the input encoded appropriately, halts after finitely many steps and produces the output. Alonzo Church's lambda calculus, developed in the early 1930s and applied to computability in 1936, defines computability through functional abstraction and application, where functions are expressed as λ-terms that can be reduced via β-reduction. Church showed that the effectively calculable functions in this system coincide with the general recursive functions. Stephen C. Kleene's μ-recursive functions, formalized in 1936, build on primitive recursive functions extended by a minimization operator (μ) to handle unbounded search. Kleene proved that these μ-recursive functions are precisely the partial functions computable by Turing machines, establishing a direct equivalence between his model and Turing's. Emil L. Post's 1936 model of finite combinatory processes, based on production systems that generate and transform finite strings according to rules, was independently shown to simulate Turing machines and thus compute the same class of functions. Post's earlier tag systems, a restricted form of such production systems in which rules append fixed strings and delete symbols from the front, were later demonstrated by Marvin Minsky to be Turing-complete, as any Turing machine can be simulated by a 2-tag system. These equivalences, achieved through explicit simulations (e.g., encoding machine tapes as λ-terms or recursive function calls), underscore the thesis by showing that diverse formalizations of "effective computation" converge on the same boundaries.
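
To make the minimization operator concrete, the following Python sketch (an informal illustration in which Python's unbounded while-loop stands in for unbounded search; the function names are invented for the example) expresses integer division as μy[(y+1)·n > m], showing how μ-recursion adds partial functions—here, division by zero diverges—on top of the total primitive recursive base.

```python
def mu(predicate):
    """Minimization operator: return the least y >= 0 with predicate(y) true.
    If no such y exists, this loop never terminates (a partial function)."""
    y = 0
    while not predicate(y):
        y += 1
    return y

def quotient(m, n):
    """Integer division m // n via unbounded search: mu y [ (y + 1) * n > m ].
    Diverges when n == 0, mirroring the partiality of mu-recursive definitions."""
    return mu(lambda y: (y + 1) * n > m)

if __name__ == "__main__":
    print(quotient(17, 5))   # 3
    print(quotient(12, 4))   # 3
```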

Historical Development

1930s Origins

The Church–Turing thesis emerged in the 1930s as a response to foundational challenges in mathematical logic, particularly within the framework of Hilbert's program, which sought to establish the consistency of mathematical systems through finitary methods and resolve key decision problems. Gödel's incompleteness theorems of 1931 played a pivotal role in this context by revealing inherent limitations in formal axiomatic systems. Specifically, Gödel showed that in any consistent formal system capable of expressing basic arithmetic, there are true statements that are neither provable nor disprovable within the system, thereby undermining the completeness aspirations of Hilbert's program and prompting a precise delineation of what could be mechanically computed or proved. Alonzo Church contributed significantly to formalizing computability during this period through his development of the lambda calculus. In 1932, Church published a foundational paper introducing the lambda calculus as a system of postulates for logic, in which functions are treated as first-class objects via abstraction and application, providing a novel framework for expressing mathematical operations. He extended this work in 1933 with a second paper, refining the postulates to address issues in logical foundations and demonstrating the system's potential for defining recursive processes without relying on traditional set-theoretic assumptions. These developments were motivated by the need to capture "effective calculability" in the face of Hilbert's Entscheidungsproblem, which asked whether there exists an algorithm to determine the provability of any mathematical statement in a formal system. In 1936, Church applied the lambda calculus directly to the Entscheidungsproblem, proving its unsolvability by exhibiting an unsolvable problem of elementary number theory and showing that no λ-definable (hence no effectively calculable) function can decide provability, thus establishing a negative solution independent of Gödel's earlier results. Concurrently, Alan Turing, working separately at Cambridge, introduced the concept of Turing machines in his 1936 paper: abstract devices that model computation through a finite set of states, a tape, and a read-write head, capable of simulating any mechanical calculation. Turing used this model to demonstrate that the Entscheidungsproblem has no general solution, as there exist functions that no Turing machine can compute, such as the halting function. Turing's paper explicitly referenced Church's recent work on effective calculability, noting its alignment with his own notion of computable numbers—real numbers whose digits can be produced by a finite mechanical process. The convergence of these ideas was facilitated by close collaboration among Church, Turing, and Church's student Stephen Kleene at Princeton around 1936–1937. Turing arrived at Princeton as a graduate student in 1936, where he interacted extensively with Church and Kleene, exchanging ideas on recursive functions and λ-definability; Kleene, building on Church's lambda calculus, had earlier formalized general recursive functions in 1935–1936, further linking these models. Through correspondence and discussion, they recognized the equivalence of λ-definability, general recursive functions, and Turing computability, laying the groundwork for the thesis that these formalisms capture all effective methods of computation.

Mid-20th Century Refinements

In the years following the Second World War, Alan Turing refined the Church–Turing thesis in his 1948 National Physical Laboratory report "Intelligent Machinery," where he bridged the gap between abstract logical computing machines (LCMs) and practical computing machines (PCMs). Turing argued that PCMs, such as those using electronic components for discrete operations, possess computational power equivalent to LCMs for all effective procedures, provided they incorporate mechanisms for unbounded storage and sequential control. This linkage emphasized the thesis's applicability to real-world computing machinery, positing that any intuitively effective procedure could in principle be realized by such machines. Alonzo Church provided a key clarification in his 1956 textbook Introduction to Mathematical Logic, explicitly framing the thesis as an empirical working hypothesis subject to potential revision based on future developments in computation theory or practice. Church stressed that while the equivalence of λ-definability, recursive functions, and Turing computability offered strong inductive support, the thesis remained a claim grounded in the informal consensus about effective methods rather than a provable theorem. This perspective underscored the thesis's role as a foundational yet provisional principle in computability theory. Stephen Kleene, in his Introduction to Metamathematics (1952), compiled extensive evidence from recursive function theory and Turing's analysis of computation to bolster the thesis, portraying it as a maximally plausible working hypothesis rather than an indemonstrable primitive, and introduced the term "Church's thesis." These exchanges solidified the thesis's acceptance within the logic community while acknowledging its non-formal nature. The emergence of electronic computers in the late 1940s and early 1950s offered early empirical corroboration, as machines like ENIAC (completed in 1945) demonstrated practical implementations of Turing-complete architectures. ENIAC's ability to perform conditional branching, looping, and arbitrary reconfiguration via plugboard wiring exemplified a physical approximation of a universal Turing machine, albeit with finite memory constraints; it could simulate any algorithmic process given adequate scaling of its resources.

Evidence and Validation

Empirical Computability

The empirical support for the Church–Turing thesis arises from the consistent observation that every effectively calculable function investigated in mathematics and the sciences has been found to be computable by a Turing machine, with no exceptions identified despite decades of scrutiny. This body of evidence includes analyses of diverse computational paradigms, such as recursive functions, λ-definability, register machines, Post systems, and Markov algorithms, all of which align with Turing computability. A comprehensive survey of this alignment is provided in Kleene's examination of effective calculability, emphasizing the uniformity across formal systems. Modern digital computers exemplify the thesis in practice, functioning as universal Turing machines (up to memory limits) capable of simulating any other Turing machine given sufficient resources and appropriate programming. This universality enables them to execute a vast array of computations, from numerical simulations to symbolic manipulations, mirroring the theoretical model's expressive power. For instance, the AKS primality testing algorithm, developed in 2002, determines whether a number is prime in deterministic polynomial time using algebraic identities over finite fields, a process fully realizable on a Turing machine. Likewise, classical sorting algorithms, such as merge sort, which recursively divides and recombines lists to achieve ordered output in O(n log n) time, operate entirely within the bounds of Turing computability. Ongoing research in physics reinforces this empirical foundation, as no mechanisms enabling hypercomputation—computations beyond Turing limits—have been observed or theoretically supported within established physical laws, such as those of quantum mechanics. Proposals for hypercomputation, including those involving relativistic effects or infinite-precision analog processes, remain speculative and unsupported by empirical data from experimental physics. This absence of counterexamples across theoretical and practical domains underscores the thesis's robustness as a characterization of effective calculability.
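
As an illustration of the kind of everyday algorithm mentioned above, the following Python sketch gives a standard textbook merge sort (not drawn from any source cited here): it recursively splits the list, sorts the halves, and merges them with O(n log n) comparisons, all within ordinary Turing-computable operations.

```python
def merge_sort(items):
    """Sort a list by recursive splitting and merging (O(n log n) comparisons)."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

if __name__ == "__main__":
    print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```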

Theoretical Justifications

One key theoretical justification for the Church–Turing thesis arises from Kleene's normal form theorem, which establishes a precise correspondence between the class of partial recursive functions and those computable by Turing machines. The theorem states that every partial recursive function can be represented in a canonical form using a primitive recursive predicate T (Kleene's T-predicate) and a primitive recursive extraction function U, such that every partial recursive function has an index e with \phi_e(x) = U(\mu y \, T(e, x, y)), with \mu denoting the minimization operator. This normal form demonstrates that the computational power of recursive definitions aligns exactly with that of Turing machines, providing a formal bridge between Church's λ-definability and Turing's machine model. Rice's theorem further supports the thesis by revealing fundamental limits inherent to the class of computable functions: every non-trivial semantic property of recursively enumerable sets is undecidable. Formulated in 1953, the theorem asserts that for any non-empty proper subclass C of the recursively enumerable sets, the index set \{e \mid W_e \in C\} (where W_e is the domain of the e-th partial computable function) is not recursive; the property depends only on the set enumerated, not on the specific syntax of the program. This result underscores that only the trivial properties (those holding of all recursively enumerable sets or of none) are decidable, reinforcing the thesis's delineation of computational boundaries. The speed-up theorem, proved by Manuel Blum in 1967, provides additional evidence by showing that the model accommodates inherent inefficiencies in computation, aligning with the thesis's claim that no stronger effective process exists. The theorem shows that for any complexity measure C satisfying Blum's axioms, there exists a total computable function f such that for every program P computing f, there is another program Q computing f with C_Q(n) < C_P(n) for all sufficiently large n, where C_P(n) denotes the complexity of P on input n. This demonstrates that optimal programs do not exist for certain functions within the model, suggesting that the model's resource bounds reflect true limits rather than artifacts of the formalism. Complementing this, the busy beaver function, introduced by Tibor Radó in 1962, defines \Sigma(n) as the maximum number of 1's produced on the tape by any halting n-state, 2-symbol Turing machine starting from a blank tape. Since \Sigma(n) grows faster than any total computable function, it is uncomputable, yet its definition relies solely on the halting behavior of Turing machines, thereby illustrating that the model identifies a precise frontier beyond which no effective procedure can proceed. Finally, Turing's own argument from human cognition bolsters the thesis by modeling mental computation as a deterministic, finite-step process akin to a machine's operation. In his analysis, Turing posited that a "computer" (in the 1936 sense, a human carrying out a calculation) works through a sequence of bounded operations, each involving scanning a finite configuration and applying a rule in finite time, without unbounded discretion or infinite precision. This finite-state, step-by-step mechanism mirrors the Turing machine's operation, implying that if humanly effective methods are captured by the model, then the thesis holds for all mechanical computation.
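
The normal form can be made concrete with a toy model. In the Python sketch below (an informal illustration, not Kleene's construction; the counter-machine instruction set is invented for the example), T(e, x, y) asks whether program e halts on input x within y steps and U reads off the output, so the unbounded search \mu y \, T(e, x, y) recovers \phi_e(x) exactly when the program halts.

```python
# Toy "programs": lists of counter-machine instructions ("INC", r), ("DEC", r),
# ("JZ", r, target) and ("HALT",). Input goes into register 0 and the output
# is read back from register 0. This instruction set is assumed purely for
# illustration.

def run_bounded(program, x, max_steps):
    """Run program on input x for at most max_steps steps.
    Returns (halted, value_of_register_0)."""
    regs = {0: x}
    pc = 0
    for _ in range(max_steps):
        if pc >= len(program) or program[pc][0] == "HALT":
            return True, regs.get(0, 0)
        op = program[pc]
        if op[0] == "INC":
            regs[op[1]] = regs.get(op[1], 0) + 1
            pc += 1
        elif op[0] == "DEC":
            regs[op[1]] = max(0, regs.get(op[1], 0) - 1)
            pc += 1
        elif op[0] == "JZ":  # jump to target if the register holds zero
            pc = op[2] if regs.get(op[1], 0) == 0 else pc + 1
    return False, None

def T(e, x, y):
    """Analogue of Kleene's T-predicate: does program e halt on x within y steps?"""
    return run_bounded(e, x, y)[0]

def U(e, x, y):
    """Extraction function: the output of program e on x, given a halting bound y."""
    return run_bounded(e, x, y)[1]

def phi(e, x):
    """phi_e(x) = U(mu y. T(e, x, y)); diverges exactly when e never halts on x."""
    y = 0
    while not T(e, x, y):
        y += 1
    return U(e, x, y)

if __name__ == "__main__":
    add_three = [("INC", 0), ("INC", 0), ("INC", 0), ("HALT",)]
    print(phi(add_three, 4))  # 7
```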

Variations and Extensions

Strong and Weak Forms

The weak form of the Church–Turing thesis, often referred to as the classical formulation, asserts that a function from natural numbers to natural numbers is effectively calculable by a human using only paper and pencil if and only if it is computable by a Turing machine. This version captures the intuitive notion of algorithmic computation in mathematics and logic, equating "effective calculability" with the formal machine model introduced by Turing in 1936 and, independently, with Church's λ-definability from the same year. In practice, the weak thesis is frequently employed as a definitional tool in computability theory, where Turing computability serves as the standard for identifying effectively computable functions, as noted by Kleene in his 1952 introduction of the term "Church's thesis." The strong form of the thesis advances a bolder claim, positing that every physically realizable computation—carried out by any discrete, deterministic mechanical device adhering to the laws of physics—is computationally equivalent to a Turing machine, meaning it cannot exceed the power of Turing computation. This extension, sometimes termed the physical Church–Turing thesis, was formalized by Robin Gandy in 1980 through four principles governing the behavior of mechanical systems, including finite speed of propagation, discrete states, and local action, which he argued align physical computation with Turing computability. Proponents view it as a natural generalization of the weak form to real-world devices, implying that no physical computer can solve problems beyond those solvable by a Turing machine. Critiques of the strong form highlight its susceptibility to challenges from non-standard physical interpretations, where certain systems might produce non-Turing-computable outputs. A seminal example is the work of Pour-El and Richards (1981), who exhibited computable initial data for the wave equation whose unique solution is non-computable, thereby questioning whether all physical processes are bounded by Turing equivalence. Such results suggest that the strong thesis may falter if physical laws permit behaviors not captured by Gandy's principles, like infinite precision or supertasks, though the weak form remains unchallenged in its mathematical domain.

Physical and Relativistic Variants

The physical Church–Turing thesis posits that any computation feasible under the laws of physics can be performed by a Turing machine, thereby extending the original thesis to physical realizability. This variant emphasizes that the fundamental principles of physics impose computational limits equivalent to those of Turing machines, preventing the construction of devices capable of hypercomputation. In 1985, David Deutsch formalized a quantum mechanical version of this principle, arguing that quantum theory supports the Church–Turing hypothesis by allowing universal quantum computers that simulate any physical process yet remain within Turing-equivalent bounds. A relativistic variant of the thesis incorporates general relativity, asserting that the finite speed of light precludes super-Turing computations in physically realistic spacetimes. According to this view, no device can perform an infinite number of operations in finite time due to causal constraints and the light-speed limit, thus upholding Turing-level computability even in relativistic settings. John Earman and John D. Norton explored this in 1993, analyzing supertasks in Malament-Hogarth spacetimes—regions, for example near rotating black holes, where infinite computations might seem possible from one observer's perspective—but concluded that such scenarios fail to yield verifiable super-Turing results without violating physical realism. Quantum Turing machines provide further evidence for the physical thesis, as they demonstrate computational power equivalent to classical Turing machines in terms of what is computable, despite incorporating superposition and entanglement. Ethan Bernstein and Umesh Vazirani established in 1997 that quantum Turing machines can be simulated by classical probabilistic ones (with at most exponential overhead), confirming that quantum computation does not exceed Turing limits in terms of solvability, though it offers speedups for specific problem classes such as integer factoring. Counterarguments to the physical thesis invoke hypothetical constructs like relativistic or wormhole-based computers, which purportedly enable hypercomputation via relativistic time dilation. For instance, in Malament-Hogarth spacetimes, a computer travelling a worldline of infinite proper time near a rotating black hole could perform infinitely many steps while an infalling observer receives the output within finite proper time, potentially solving the halting problem. However, these models remain unrealized and face significant physical obstacles, including unbounded energy requirements and information-loss paradoxes, rendering them speculative rather than viable challenges to the thesis.

Implications and Applications

Philosophical Aspects

The Church–Turing thesis carries significant implications for the philosophy of mind, particularly in debates over mechanism and whether human cognition is inherently Turing-computable. John Searle's Chinese Room argument illustrates this by positing a person following syntactic rules to manipulate symbols without understanding their meaning, thereby challenging the notion that computational processes alone can produce genuine semantics or understanding. This critique leverages the thesis to argue that, since all effective computations are Turing-equivalent, no program can instantiate true mental states, though extensions to interactive or self-reflective models may evade this limitation by incorporating environmental dynamics beyond the closed Turing-machine model. In the philosophy of mathematics, the thesis intersects with tensions between Platonism, which posits abstract mathematical objects independent of human construction, and formalism, which views mathematics as rule-based symbol manipulation. It aligns more closely with intuitionistic perspectives by stipulating that effective proofs must be mechanically verifiable and constructive, rejecting non-constructive proofs as non-effective. Leivant's analysis shows that, within intuitionistic frameworks like Heyting arithmetic, adopting the thesis as the claim that all total number-theoretic functions are given by finite, effective procedures yields results such as the categoricity of intuitionistic arithmetic and constructive analogs of classical theorems. Epistemologically, the thesis is best understood as an empirical hypothesis rather than a synthetic a priori truth, lacking formal provability due to its grounding in the informal notion of "effectiveness." Roman Murawski outlines supporting arguments, including the equivalence of multiple formalisms (Turing machines, recursive functions) and the absence of counterexamples, positioning it as a well-confirmed hypothesis akin to scientific theories, though inherently unprovable because it bridges a precise formal concept with a vague intuitive one. Reformulations that emphasize practical executability by idealized agents reinforce this reading, bridging theoretical limits with empirical practice. The thesis also fuels debates on whether human creativity and intuition surpass Turing limits, as argued by Roger Penrose in his Gödelian case against computationalism. Penrose contends that mathematicians can discern the truth of Gödel sentences non-algorithmically, implying non-computable mental processes rooted in quantum effects. Standard computationalist refutations, however, highlight flaws in this reasoning, such as the enthymematic assumption that human insight escapes all formal systems and the failure to account for human fallibility in practice, thereby upholding the thesis's scope for theories of cognition.

Role in Proofs and Theory

The Church–Turing thesis plays a central role in computability theory by providing an informal justification for using Turing machines as the standard model of computation when proving undecidability results. In particular, it allows researchers to argue that if a problem is undecidable for Turing machines, then it is undecidable by any effective procedure, as the thesis posits that all effectively computable functions are Turing-computable. This assumption underpins Alan Turing's 1936 proof of the halting problem's undecidability, where he demonstrated that no Turing machine can determine whether an arbitrary Turing machine halts on a given input, thereby extending the result to all algorithmic procedures via the thesis. In computational complexity theory, the thesis justifies the use of Turing machines (or equivalent models) to define complexity classes like P and NP, on the assumption that these machines capture all feasible computations. For instance, the P versus NP problem, which asks whether every problem whose solutions can be verified in polynomial time by a deterministic Turing machine can also be solved in polynomial time by a deterministic Turing machine, relies on the thesis to treat Turing machines as representative of any physically realizable computational model. This ensures that a resolution of P versus NP would hold across computational paradigms, as a deviation would contradict the thesis's claim that Turing machines suffice for effective computation. The thesis also informs software verification by asserting that properties checkable by any effective procedure are checkable by a Turing machine, though often undecidably so in general. In formal methods, this justifies casting verification tasks in Turing-machine terms, such as model-checking algorithms that explore finite state spaces, while acknowledging limits like the undecidability of halting for arbitrary programs. For example, formal proof assistants have been used to develop and verify models of Turing machines and related computability concepts, leveraging the thesis to confirm that such verifications cover all algorithmic possibilities. Furthermore, the thesis supports the definition of recursive functions and Church numerals in programming languages, grounding their semantics in the equivalence between the λ-calculus and Turing machines. Church numerals, which encode natural numbers as lambda terms (e.g., the numeral for 2 as \lambda f.\lambda x.\, f(f(x))), enable the expression of arithmetic in pure functional languages such as Haskell, demonstrating that these languages compute exactly the recursive functions identified by the thesis. This connection ensures that language features based on general recursion or primitive recursion capture all effectively computable operations without exceeding Turing limits.
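
To make the Church-numeral encoding concrete, the following Python sketch translates the λ-terms into Python lambdas (an illustrative rendering, not a feature of any particular language implementation): the numeral n is the function that applies its argument n times, and successor, addition, and multiplication are built from that representation.

```python
# Church numerals: the numeral n is \f.\x. f applied n times to x.
zero  = lambda f: lambda x: x
one   = lambda f: lambda x: f(x)
two   = lambda f: lambda x: f(f(x))          # the encoding quoted above

succ  = lambda n: lambda f: lambda x: f(n(f)(x))
plus  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
times = lambda m: lambda n: lambda f: m(n(f))

def to_int(church):
    """Decode a Church numeral by applying 'add one' to 0."""
    return church(lambda k: k + 1)(0)

if __name__ == "__main__":
    three = succ(two)
    print(to_int(plus(two)(three)))    # 5
    print(to_int(times(two)(three)))   # 6
```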

Beyond Computability

Hypercomputation Concepts

Hypercomputation refers to theoretical models of computation that surpass the limitations implied by the Church–Turing thesis, which posits that any effectively calculable function can be computed by a Turing machine. These models explore hypothetical extensions beyond standard Turing computability, often by relaxing assumptions about time, space, or physical realizability, thereby challenging the universality of Turing machines. While they provide mathematical insights into degrees of unsolvability, their relevance to the thesis lies in questioning whether physical or idealized processes could enable computations of non-Turing-computable functions, such as those requiring oracles or infinite resources. One foundational concept in hypercomputation is the oracle machine, introduced by Turing in his 1939 work on ordinal logics as a theoretical extension of the Turing machine. An oracle machine operates like a standard Turing machine but includes access to an external "oracle," a hypothetical device that instantaneously answers queries about membership in a given set, such as the halting set. Turing proposed this in the context of ordinal logics to analyze the limits of formal systems, where the oracle allows the machine to solve problems undecidable by ordinary Turing machines, like the halting problem. For instance, a machine with a halting oracle can decide the halting problem for all ordinary Turing machines, effectively computing functions beyond the recursive ones; a small illustrative sketch of such relative computability appears at the end of this subsection. This model illustrates relative computability, in which degrees of unsolvability are stratified, but it remains purely abstract, as no physical realization of the oracle is specified. Building on transfinite ordinals, infinite-time Turing machines (ITTMs) extend computation into transfinite time, allowing infinitely many steps over ordinal time scales. Developed by Joel David Hamkins and Andy Lewis, ITTMs modify the standard Turing machine to run through limit ordinal stages, where at each limit stage every tape cell takes the limit superior of its earlier values, the head returns to the starting cell, and the machine enters a designated limit state. These machines can compute non-recursive well-orderings and real numbers, and can evaluate functions such as the busy beaver function, which grows faster than any total computable function and is non-computable in finite time. For example, the reals writable by ITTMs include non-arithmetical sets, demonstrating that allowing transfinite running time yields a strictly broader class of computable objects. However, the halting times of ITTMs are bounded by a countable ordinal (the supremum of the writable ordinals), limiting their power relative to still higher ordinals. This framework relates to the Church–Turing thesis by probing whether "effective computability" could encompass supertasks in idealized infinite processes. Analog hypercomputers represent another approach, leveraging continuous physical processes rather than discrete steps to potentially exceed Turing limits. A prominent example involves Malament-Hogarth (MH) spacetimes in general relativity, where an observer can experience infinite proper time along a worldline while a distant observer sees only a finite interval. Proposed by Mark Hogarth, these spacetimes arise in solutions to Einstein's field equations, such as those involving rotating black holes, allowing a computer along the infinite-time worldline to perform supertasks, like attacking the halting problem by running a computation for "eternal" time. For instance, in an MH spacetime, a computer could run and check every step of a given program across infinitely many stages, signalling a solution that reaches the observer in finite external time. This suggests general relativity might permit hypercomputation without violating local physical laws, though it would require precise control over gravitational fields. Despite their theoretical intrigue, hypercomputation models face significant critiques regarding physical realizability.

Most proposals, including oracle machines and ITTMs, demand infinite resources or idealized conditions that contradict known physics, such as infinite precision or infinite time, rendering them unphysical. Analog models like those in MH spacetimes are in tension with quantum mechanics, as quantum effects introduce discreteness and uncertainty that prevent the required infinite computations without errors accumulating to undermine reliability. Martin Davis has argued that such schemes violate fundamental physical principles, akin to perpetual motion machines, emphasizing that no proposed hypercomputer evades the Church–Turing limits under realistic constraints. These critiques reinforce the thesis by highlighting that effective computation remains bounded by Turing machines in any plausible physical universe.
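
The oracle-machine idea mentioned above can be illustrated with a deliberately artificial Python sketch. The oracle itself is not implementable; the stub below, the helper names, and the convention that a program defines run(x) are assumptions made only so the example executes. The point is that, relative to a halting oracle, a question that is undecidable outright—"does program e on input x ever output a given value?"—becomes decidable.

```python
from typing import Callable

# Hypothetical oracle type: answers "does program e halt on input x?".
# No such total function is computable; the stub below stands in for queries
# whose answers happen to be known, purely for illustration.
HaltingOracle = Callable[[str, int], bool]

def outputs_value(e: str, x: int, target: int, halts: HaltingOracle) -> bool:
    """Relative decision procedure: with a halting oracle available,
    'does program e on input x produce target?' becomes decidable,
    because e is only run when the oracle guarantees termination."""
    if not halts(e, x):
        return False                      # never halts, so never outputs target
    env: dict = {}
    exec(e, env)                          # e is assumed to define run(x) -> int
    return env["run"](x) == target

if __name__ == "__main__":
    program = "def run(x):\n    return x * x\n"
    stub_oracle = lambda e, x: True       # known to halt for this toy program
    print(outputs_value(program, 7, 49, stub_oracle))   # True
```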

Non-computable Examples

The halting problem, which asks whether a given Turing machine will halt on a given input, was proven undecidable by Turing in 1936, demonstrating a fundamental limit on what can be computed algorithmically. The proof proceeds by diagonalization: assuming an algorithm exists to decide halting leads to a contradiction via a self-referential machine that loops forever if the supposed decider says it would halt, and halts otherwise. This result directly supports the Church–Turing thesis by showing that no effective procedure can solve the problem for all cases, as the thesis posits that Turing machines capture all effectively computable functions. The busy beaver function, denoted Σ(n), measures the maximum number of 1's that a halting n-state Turing machine can produce on an initially blank tape. Introduced by Tibor Radó in 1962, it grows faster than any total computable function, rendering it non-computable. Radó showed that computing Σ(n) would require solving the halting problem for all n-state machines, which is impossible; thus, while small values like Σ(1) = 1 and Σ(2) = 4 are known, no algorithm can determine Σ(n) for all n. Kolmogorov complexity, K(x), defines the complexity of a string x as the length of the shortest program that outputs x on a fixed universal Turing machine. Andrey Kolmogorov formalized this in 1965 as one of three approaches to quantifying information, emphasizing algorithmic descriptions over combinatorial and probabilistic measures. It is uncomputable because determining K(x) involves deciding whether programs halt and produce x, tying directly to the undecidability of the halting problem; moreover, although K(x) can be approximated from above by dovetailing over programs, no computable procedure yields nontrivial lower bounds on it for more than finitely many x. The Post correspondence problem (PCP) asks whether, for given lists of strings over an alphabet, there exists a sequence of indices such that the concatenations from each list match. Proposed by Emil Post in 1946 as a variant of an unsolvable problem, PCP is undecidable in general. The standard proof reduces the halting problem to PCP by encoding Turing machine computations into string matches, showing that a solver for PCP would yield a halting decider, which contradicts Turing's result. For instance, consider the pairs (1, 111), (10111, 10), (10, 0) over the alphabet {0,1}; a match exists via the index sequence 2, 1, 1, 3, yielding 101111110 on both top and bottom, but no general algorithm can decide whether such a match exists for arbitrary instances.
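
Verifying a proposed index sequence is straightforward, which is precisely why the difficulty lies in deciding whether any such sequence exists; the short Python sketch below (the helper name is invented for the example) checks the instance quoted above by concatenating the top and bottom strings along the given indices.

```python
def pcp_check(pairs, indices):
    """Verify a candidate PCP solution: concatenate tops and bottoms along
    the (1-based) index sequence and compare the two strings."""
    top = "".join(pairs[i - 1][0] for i in indices)
    bottom = "".join(pairs[i - 1][1] for i in indices)
    return top == bottom, top, bottom

if __name__ == "__main__":
    pairs = [("1", "111"), ("10111", "10"), ("10", "0")]
    ok, top, bottom = pcp_check(pairs, [2, 1, 1, 3])
    print(ok, top, bottom)   # True 101111110 101111110
    # Deciding whether *some* index sequence works for arbitrary pairs is
    # undecidable, even though verifying a given sequence is trivial.
```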
