
Digital infinity

Digital infinity, also referred to as discrete infinity, is a core principle describing the human capacity to produce an unbounded array of hierarchically structured linguistic expressions from a finite set of elements through recursive computational processes. This property enables languages to generate infinitely many sentences while adhering to finite rules and vocabulary, distinguishing human language from non-recursive animal signaling systems. The concept was formalized by Noam Chomsky as a defining feature of the human language faculty, positing that it arises from a primitive recursive operation called Merge, which combines syntactic objects to form new sets, allowing for endless hierarchical complexity. In Chomsky's biolinguistic framework, digital infinity represents an evolutionary innovation, potentially emerging from a single genetic mutation that introduced recursion into human cognition around 50,000–100,000 years ago, coinciding with the "great leap forward" in symbolic behavior. This system interfaces discrete phonological and syntactic components to convey meaning, ensuring that linguistic outputs are both infinite in variety and finitely interpretable. Digital infinity underscores the computational nature of language, analogous to the generation of the natural numbers from basic arithmetic operations, and has profound implications for understanding language acquisition, language evolution, and cognition. While Chomsky attributes it to an innate language faculty, alternative evolutionary models emphasize cultural and social factors, such as symbolic play and collective intentionality, in its development. Ongoing research explores its neural basis and parallels in non-linguistic domains such as mathematics and music.

Definition and Fundamentals

Core Definition

Digital infinity, also referred to as discrete infinity or the infinite use of finite means, denotes the capacity of human language to produce an unbounded number of distinct expressions from a finite inventory of elements, including phonemes, morphemes, and syntactic rules. This property enables speakers to generate sentences of arbitrary length and complexity without limit, such as progressing from six-word to seven-word structures indefinitely, while maintaining discreteness: no intermediate or fractional expressions are possible. The mechanism underlying digital infinity relies on recursion and combinatorial operations within the grammar, which permit the repeated embedding and combination of linguistic units into hierarchical structures. For instance, relative clauses can be nested recursively, as in "The cat that chased the mouse that ate the cheese ran away," and such embedding can extend endlessly for greater elaboration. A finite vocabulary and rule set thus yield unbounded variety, distinguishing human language's generative power from more limited systems. In generative linguistics, digital infinity stands as a core feature that sets human language apart from animal communication, which typically involves finite signal repertoires lacking such unbounded productivity. This concept underpins Chomsky's theory of universal grammar, which posits an innate computational system for language acquisition and use.
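The finite-means, unbounded-output property can be sketched in a few lines of code. The noun/verb inventory and the embedding schema below are illustrative assumptions, not drawn from any linguistic source:

```python
# Illustrative sketch: a fixed four-item noun/verb inventory plus one
# embedding schema yields a distinct, well-formed sentence for every depth.
NOUNS = ["mouse", "cheese", "cook", "king"]
VERBS = ["chased", "ate", "hired", "praised"]

def nested_sentence(depth: int) -> str:
    """Embed `depth` relative clauses after 'The cat'; any depth is possible."""
    clause = ""
    for i in range(depth):
        clause += f" that {VERBS[i % 4]} the {NOUNS[i % 4]}"
    return "The cat" + clause + " ran away."
```

With depth 2 this produces "The cat that chased the mouse that ate the cheese ran away.", and every larger depth yields a new, longer sentence, so the finite inventory generates an unbounded, discrete set of outputs.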

Discrete vs. Analog Infinity

Discrete infinity refers to the capacity of linguistic systems to generate an unbounded number of expressions from a finite inventory of discrete, countable units, such as phonemes, through combinatorial rules and recursion, without degradation in structure or meaning. This property, often phrased as the "infinite use of finite means," enables languages to produce novel sentences indefinitely while maintaining precise distinctions. In spoken language, phonemes serve as these basic units, characterized by binary oppositions, for instance the distinction between voiced and unvoiced sounds such as /b/ versus /p/, which allow for systematic recombination into words and larger structures. In contrast, analog infinity arises in continuous systems, such as gestural communication, where signals vary smoothly across a continuous range of values, like fluid hand movements in signing. These systems permit theoretically endless variations but lack the discrete boundaries that ensure fidelity, leading to practical limits in expressiveness. Analog representations are susceptible to blending and degradation, which limit combinatorial possibilities and constrain their utility for truly unbounded, degradation-free expressivity. The key distinction lies in how these systems handle unbounded generation: discrete infinity in language avoids information loss by operating on stable, symbolic units that can be recursively embedded without error amplification, whereas analog infinity confronts inherent degradation. This structural difference underscores why discrete mechanisms uniquely support the open-ended productivity observed in human language.
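The fidelity contrast can be simulated numerically. The sketch below is a toy model built on stated assumptions: noise is a small uniform perturbation per copying step, and "discrete" transmission re-quantizes to the nearest of two categories after each step.

```python
import random

CATEGORIES = (0.0, 1.0)  # two discrete "phoneme" values (illustrative)

def copy_analog(value: float, steps: int, noise: float = 0.05) -> float:
    """Repeated analog copying: small errors accumulate as a random walk."""
    for _ in range(steps):
        value += random.uniform(-noise, noise)
    return value

def copy_discrete(value: float, steps: int, noise: float = 0.05) -> float:
    """Repeated discrete copying: each step snaps back to the nearest
    category, so sub-threshold noise is absorbed rather than accumulated."""
    for _ in range(steps):
        value += random.uniform(-noise, noise)
        value = min(CATEGORIES, key=lambda c: abs(c - value))
    return value
```

After a thousand copies the discrete channel still returns exactly 1.0 (the noise never crosses the category midpoint), while the analog value drifts; this error-absorbing property is what makes unbounded, degradation-free recombination possible.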

Historical Origins

Pre-20th Century Insights

In his 1632 work Dialogo sopra i due massimi sistemi del mondo, Galileo Galilei observed that the tongue, despite relying on a finite number of sounds, possesses the capacity to generate an infinite variety of words and meanings, likening this process to the artistic creation of diverse effects from limited materials. This insight highlighted language as a uniquely human faculty capable of boundless expression through constrained elements, distinguishing it from the repetitive patterns observed in natural phenomena. René Descartes, in the 1637 Discourse on the Method of Rightly Conducting One's Reason and of Seeking Truth in the Sciences (Part V), further elaborated on the innate linguistic capacity of the human mind, emphasizing its creative power to invent and use signs or words for flexible communication across novel situations. Descartes contrasted this with animals and machines, which might mimic speech, as parrots utter words, but lack the rational soul required for true understanding or innovative application of language, thereby underscoring the mind's inherent ability to transcend mechanical imitation. Antoine Arnauld and Claude Lancelot's 1660 Grammaire générale et raisonnée (also known as the Port-Royal Grammar) built on these ideas by positing universal rational structures underlying all languages, which enable the endless combination of finite elements to express thoughts. The authors argued that these logical frameworks reflect the mind's innate rationality, allowing for the systematic generation of expressions that mirror human cognition's depth and versatility. Wilhelm von Humboldt, in his 1836 work Über die Verschiedenheit des menschlichen Sprachbaues und ihren Einfluss auf die geistige Entwicklung des Menschengeschlechts, explicitly articulated that language makes "infinite use of finite means," emphasizing its generative power to produce an unbounded set of expressions from limited resources.
Collectively, these 17th- and 19th-century perspectives framed language as a defining marker of human rationality, setting it apart from the mechanical repetition evident in animal behavior or natural processes, and laying early groundwork for recognizing the principle of the infinite use of finite means in linguistic expression.

Mid-20th Century Formulations

In the early 20th century, Ferdinand de Saussure laid foundational groundwork for conceptualizing language as a system of discrete signs, where each sign consists of a signifier (a sound-image) and a signified (a concept) united arbitrarily, without any necessary natural connection. This arbitrariness stems from convention, with the value of each sign determined purely by its differential relations to other signs within the system, rather than by intrinsic properties. Saussure emphasized the discrete nature of these units, such as phonemes and morphemes, identified through auditory and articulatory analysis, forming a self-contained whole governed by synchronic relations. This structure enables combinatorial potential, as finite discrete elements combine linearly (syntagmatically) and associatively to generate diverse expressions, highlighting the capacity for variety from limited components. Building on structuralist principles in the 1940s and 1950s, Roman Jakobson advanced the formalization of discrete units in phonology through his distinctive features theory, decomposing phonemes into bundles of binary oppositions such as vocalic/non-vocalic or nasal/non-nasal. These features, categorized into prosodic and inherent types, operate on a yes-no basis, allowing listeners to distinguish sounds efficiently with minimal decisions; for instance, five binary choices suffice to identify 15 consonants. By reducing phonemic inventories to a set of about 12 inherent features across languages, Jakobson's approach formalized phonemes as discrete, simultaneous bundles that both constrain and enable systematic sequencing within syllables and words. This framework underscores the generative capacity of discrete elements, permitting combinations of sounds from a finite inventory while eliminating redundancies in phonological patterning. Early developments in formal logic, particularly Kurt Gödel's incompleteness theorems of 1931, indirectly illuminated the challenges of discrete formal systems by demonstrating limits in axiomatic formalization, influencing the adoption of recursive methods from mathematical logic in linguistic modeling.
Gödel's work, which showed that sufficiently powerful formal systems cannot prove their own consistency without external assumptions, highlighted recursive structures in arithmetic and set theory, such as the countable infinity of the natural numbers (ℵ₀), that parallel the unbounded yet discrete nature of linguistic expressions. These insights from mathematical logic bridged mathematics and linguistics, emphasizing recursion as a mechanism for generating infinite sequences from finite rules, a concept that resonated in evolving linguistic theories. Following World War II, linguistics underwent a pivotal shift toward rigorous formal models in the 1950s, solidifying discrete infinity as a hallmark of human language through integrations of mathematical and logical methods. This era saw the consolidation of binary and relational frameworks from earlier structuralism into systematic approaches, treating language as a countable set of expressions generated by discrete operations, distinct from continuous analog processes. Such formulations marked a transition from descriptive to axiomatic methods, establishing discrete computation as essential for explaining language's productivity.

Role in Linguistics

Chomsky's Framework

Noam Chomsky's foundational contributions to linguistics in the 1950s and 1960s, particularly in his 1957 book Syntactic Structures, introduced the concept of digital infinity, also termed discrete infinity, as a core property of human language, whereby a finite set of rules can generate an infinite array of sentences. This generative capacity served as key evidence for an innate universal grammar (UG), a biologically endowed system that enables speakers to produce and comprehend novel utterances beyond any finite experiential input. Chomsky argued that language is not a product of rote learning but of discrete, rule-based computations that recursively build syntactic structures, distinguishing human linguistic ability from communication systems limited to finite repertoires. Central to Chomsky's framework is the poverty-of-the-stimulus argument, which posits that children acquire complex grammatical knowledge despite exposure to only a limited and often degenerate sample of linguistic data from their environment. This "poverty" implies that learners must rely on innate mechanisms to infer recursive rules capable of yielding digital infinity, since no amount of finite input could otherwise account for the rapid and uniform mastery of grammar across diverse cultures. In works like Aspects of the Theory of Syntax (1965), Chomsky elaborated that such acquisition challenges empiricist models, reinforcing the need for a language faculty that discretely processes hierarchical structures to handle unbounded sentential complexity. A pivotal element in Chomsky's evolving theory is the Merge operation, introduced in his 1995 Minimalist Program, which serves as the fundamental recursive mechanism for combining discrete lexical elements into hierarchical syntactic objects. Merge operates by iteratively selecting and uniting two syntactic units, such as words or phrases, without bounds, thereby generating the infinite expressive potential of language from a finite lexicon and rule set.
This operation underscores digital infinity's role in the minimalist program, as it minimally encodes the brain's capacity for unbounded computation while adhering to principles of economy and efficiency in linguistic design. Chomsky's framework emerged amid the cognitive revolution of the 1950s, directly challenging behaviorist theories that viewed language as mere imitation shaped by external reinforcement, as exemplified by B. F. Skinner's Verbal Behavior (1957). In his influential 1959 review of Skinner's work, Chomsky critiqued such stimulus-response models for failing to explain the creative, open-ended productivity of speech, instead advocating for internal mental processes that harness digital infinity through innate structures. This critique positioned linguistics as a cognitive science, emphasizing recursion and discreteness as hallmarks of human language rather than of associative learning.
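The Merge operation described in this section can be sketched as bare set formation, Merge(α, β) = {α, β}. This is a deliberately simplified, label-free rendering (an assumption of the sketch; minimalist analyses add labeling and feature checking):

```python
def merge(alpha, beta):
    """Bare Merge: combine two syntactic objects into a new unordered set."""
    return frozenset([alpha, beta])

def depth(obj) -> int:
    """Hierarchical depth of a syntactic object (0 for a lexical item)."""
    if isinstance(obj, frozenset):
        return 1 + max(depth(part) for part in obj)
    return 0

# Iterating Merge over a finite lexicon builds hierarchical structure:
np = merge("the", "cat")   # {the, cat}
vp = merge("saw", np)      # {saw, {the, cat}}
```

Because the output of one Merge can feed the next, repeated application yields structures of any depth, which is exactly the finite-to-infinite step the text describes.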

Generative Mechanisms

In generative grammar, recursion serves as a core mechanism for achieving digital infinity by enabling the self-embedding of syntactic categories, allowing a finite set of rules to produce sentences of unbounded length and complexity. Phrase structure rules, such as S → NP VP (where S is a sentence, NP a noun phrase, and VP a verb phrase) and VP → V S (with V as a verb), permit recursive application, generating nested constructions like relative clauses within clauses without imposing depth limits. This recursive property ensures that, from a limited grammar, an infinite array of hierarchical structures can emerge, as demonstrated in analyses of English syntax where embeddings like "The man who saw the dog that chased the cat ran away" can extend indefinitely. Transformational grammar extends this capacity through discrete operations applied to underlying phrase structures, introducing systematic variations in sentence production while maintaining semantic integrity. Rules for movement, such as subject-auxiliary inversion in questions (e.g., transforming "You can go" to "Can you go?"), and deletion, like the removal of redundant elements in coordination, operate on base structures to yield diverse surface forms from a finite base. These transformations, being rule-governed and finite in number, amplify the grammar's generative power by creating novel syntactic arrangements without requiring an infinite lexicon or rule set. During the 1960s and 1970s, generative semantics refined these ideas by positing deep structures as abstract, semantically rich digital representations that undergo finite transformational processes to produce observable surface structures. Proponents argued that semantic relations are encoded directly in these deep forms, with transformations serving as computational steps to derive phonetic realizations, thus realizing digital infinity through layered, discrete manipulations rather than through purely syntactic derivations alone.
A practical illustration of these mechanisms is how a finite vocabulary, estimated at around 15,000 to 20,000 word families for an average adult native English speaker, combined with recursive rules and transformations, generates entirely novel sentences, such as hypothetical future scenarios like "If a system that learns from data which evolves over time were to predict outcomes beyond current models, society might adapt in unforeseen ways." This capacity underscores Chomsky's universal grammar as the theoretical enabler, providing the innate framework for such finite-to-infinite mapping across languages.
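The phrase structure rules cited in this section (S → NP VP, VP → V S) can be run directly as a toy generator. The mini-lexicon and the non-recursive VP expansions that let generation terminate are illustrative assumptions added for the sketch:

```python
import random

# Toy recursive grammar; "VP": ["V", "S"] is the self-embedding rule from the text.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the cat"], ["the dog"], ["the child"]],
    "VP": [["V", "S"], ["slept"], ["ran away"]],
    "V":  [["thinks"], ["says"]],
}

def generate(symbol: str = "S") -> str:
    """Expand `symbol` with a randomly chosen rule; recursion via VP -> V S
    means there is no fixed bound on sentence length."""
    if symbol not in GRAMMAR:            # terminal word(s)
        return symbol
    expansion = random.choice(GRAMMAR[symbol])
    return " ".join(generate(part) for part in expansion)
```

Sampling repeatedly produces sentences like "the dog thinks the cat slept"; because the embedding rule is chosen with nonzero probability at every S, arbitrarily deep embeddings occur, all from a grammar of four rules.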

Computational Connections

Turing Machines and Digital Computation

In 1936, Alan Turing introduced the Turing machine as an abstract model of computation, consisting of an infinite tape divided into cells, a read/write head that moves left or right, a finite set of states, and a table of transition rules dictating actions based on the current state and the scanned symbol from a finite alphabet. This device, operating with finite control mechanisms, can compute any computable number by generating potentially infinite sequences on the tape, such as the decimal expansions of computable real numbers like π or e. The model's power lies in its ability to produce unbounded outputs from bounded resources (discrete symbols and rules), directly paralleling digital infinity, where finite means yield infinite variety without reliance on continuous processes. Turing's framework underscored the discrete, step-by-step nature of effective computation, contrasting with analog methods by emphasizing symbolic manipulation in a discrete-state machine. In his 1950 paper "Computing Machinery and Intelligence," Turing applied these principles to cognition, proposing that computers could simulate intelligent behavior, including language use, through the imitation game, in which a machine converses indistinguishably from a human. He envisioned digital computers with theoretically unlimited storage, capable of executing any discrete-state process if programmed appropriately, thus enabling unbounded performance of cognitive tasks like generating sentences. This vision reinforced the analogy to linguistic systems, where finite grammatical rules discretely combine symbols to form expressions, devoid of analog gradations. The core parallel between Turing computation and digital infinity is the use of finite, discrete elements (states, symbols, and transition rules) to achieve generative unboundedness, a concept that influenced mid-20th-century cognitive science. In the 1960s, Noam Chomsky drew on Turing's computational models to argue for the digital nature of the mind, positing that human grammar operates as a generative procedure akin to a Turing machine, linking finite symbolic computation to the infinite generation of sentences.
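Turing's construction can be miniaturized to make the finite-means point concrete. The two-rule machine below (an illustrative encoding, not Turing's own notation) prints the unending binary pattern 0101… on an unbounded tape:

```python
from collections import defaultdict

# Transition table: (state, scanned symbol) -> (write, head move, next state)
RULES = {
    ("A", "_"): ("0", +1, "B"),
    ("B", "_"): ("1", +1, "A"),
}

def run(steps: int) -> str:
    """Run the machine for `steps` steps and return the tape contents so far."""
    tape = defaultdict(lambda: "_")   # blank, unbounded tape
    state, head = "A", 0
    for _ in range(steps):
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in range(head))
```

run(6) yields "010101", and the output can be extended without limit while the rule table never grows: the same finite-to-infinite relationship the text attributes to grammar.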

Computational Theory of Mind

The computational theory of mind (CTM) posits that cognitive processes operate through discrete, rule-based manipulations of symbolic representations, akin to digital computation, enabling the generation of unbounded mental structures from finite resources. Language provides compelling evidence for this framework, as humans instinctively acquire and employ generative grammars that produce an unbounded array of novel expressions from a limited set of rules and vocabulary, demonstrating the mind's capacity for discrete symbolic processing. This view aligns with the concept of digital infinity, where finite computational mechanisms yield potentially endless outputs, much like recursive algorithms in programming. In the 1960s, philosopher Hilary Putnam advanced functionalism as a cornerstone of CTM, proposing that mental states are defined not by their physical realization but by their functional roles in computational systems: inputs, outputs, and relations to other states. This allows a finite neural architecture to support infinite cognitive possibilities, as mental content emerges from the patterned execution of formal operations, independent of the underlying hardware. Putnam's approach emphasized that the mind computes over abstract states, bridging philosophy and early cognitive science by treating psychological predicates as machine-table entries in a theoretical framework. Central to CTM is the syntax-semantics interface, where purely syntactic rules govern the combination of symbols, thereby conferring semantic content and enabling generative capacity in thought and communication. This mechanism underpins how formal languages in artificial intelligence approximate aspects of human language; modern large language models, for example, do so through statistical pattern learning, though they do not employ explicit recursive syntactic rules and have been criticized by Chomsky as lacking the generative capacity of digital infinity.
The 1980s and 1990s saw intense debates in cognitive science that reinforced CTM's prominence, particularly against connectionist alternatives that modeled cognition via distributed, analog-inspired neural networks lacking explicit symbolic rules. Proponents like Jerry Fodor argued that only discrete, syntax-driven computation could account for the systematicity and productivity of thought, such as the ability to understand novel sentences, contrasting with connectionism's sub-symbolic processing, which struggled to explain these discrete phenomena without supplementary mechanisms. These discussions, exemplified in critiques of parallel distributed processing models, ultimately bolstered CTM by highlighting its explanatory power for rule-based cognition in language. As of 2025, debates continue with the rise of large language models (LLMs): some argue they demonstrate emergent linguistic abilities that challenge traditional CTM, while Chomsky maintains they fail to exhibit the true understanding and creativity inherent in digital infinity.

Philosophical and Cognitive Implications

Infinite Use of Finite Means

The concept of the infinite use of finite means originates in Wilhelm von Humboldt's 1836 formulation, in which he portrayed language as an endlessly productive system drawing on a limited repertoire of elements and rules to generate boundless expressions. Noam Chomsky revived and elaborated this idea in the 1960s, highlighting its role in linguistic creativity as the capacity to produce novel, non-imitative utterances that transcend rote repetition or direct environmental stimuli. This principle underpins key aspects of human cognition by allowing discrete symbolic systems to support abstract reasoning, the exploration of hypothetical scenarios, and the perpetual innovation of ideas unbound by immediate context. It fosters cultural evolution through the iterative sharing and refinement of concepts, enabling societies to build complex knowledge structures that accumulate indefinitely. A prime illustration appears in literature and poetry, where a finite lexicon yields infinitely diverse narratives and metaphors, markedly differing from the constrained, finite signaling systems found in animal communication, which lack such generative novelty. This framework echoes rationalist conceptions of reason as an innate, productive force for human advancement, later integrated into contemporary philosophy of language via John Searle's speech act theory, which demonstrates how limited linguistic forms enable an infinite range of intentional communicative performances.

Criticisms and Debates

Usage-based linguistics presents a major empirical challenge to the notion of digital infinity in language acquisition, positing that grammar emerges from children's exposure to finite, item-based patterns in communicative interactions rather than from an innate capacity for generating infinite structures. Michael Tomasello's work in the 1990s and 2000s, particularly his usage-based theory, emphasizes that children construct grammar through intention-reading and pattern-finding from concrete utterances, without relying on abstract recursive rules, thus questioning the universality of discrete infinity as a linguistic endowment. Philosophically, John Searle's Chinese room argument critiques the sufficiency of digital syntax for achieving genuine semantic understanding or infinite generative capacity. In his thought experiment, Searle illustrates that a system manipulating symbols according to formal rules, as in digital computation, can simulate linguistic output without comprehending meaning, thereby undermining claims that syntactic discreteness alone enables the infinite use of finite means in cognition. The debate over recursion in the Pirahã language has further contested the universality of digital infinity: linguist Daniel Everett argued in the 2000s that this Amazonian language lacks recursive embedding, suggesting that cultural and environmental factors shape grammar without innate infinite mechanisms. Everett's claims, based on extensive fieldwork, imply that recursion is not a linguistic universal but a learned feature, directly challenging Chomsky's framework of discrete infinity. However, subsequent analyses, including those applying statistical models to Pirahã data, have contested Everett's findings, arguing that subtle recursive structures may exist despite surface appearances. Connectionism offers a theoretical alternative to digital infinity by modeling language through distributed, gradient-based neural networks that learn patterns via statistical associations, eschewing strict discreteness in favor of emergent, analog-like representations.
Post-2010 advancements in deep learning have bolstered this approach, with neural language models demonstrating effective handling of complex syntax without explicit recursive rules, gaining prominence in AI-driven research.
