
Montague grammar

Montague grammar is a formal theory of semantics and its interface with syntax, developed by the American philosopher and logician Richard Montague (1930–1971), primarily in his final three papers published between 1970 and 1973. It posits that natural languages can be analyzed using the same rigorous mathematical principles as formal languages, employing model-theoretic semantics to assign truth-conditional meanings to sentences based on their syntactic structure. Central to the framework is the principle of compositionality, which states that the meaning of a complex expression is determined by the meanings of its parts and the rules used to combine them, formalized as a homomorphism between syntactic and semantic algebras.

Montague's work emerged in the late 1960s at UCLA, amid debates in generative linguistics sparked by Noam Chomsky's syntactic theories and critiques of earlier semantic approaches such as those of Jerrold Katz and Jerry Fodor. Influenced by the logical traditions of Frege, Tarski, and Carnap, Montague extended intensional logic and possible-worlds semantics—concepts he had developed earlier for philosophical purposes—to handle phenomena such as quantification, tense, and propositional attitudes. His approach challenged the prevailing view that natural languages were too ambiguous or context-dependent for formal treatment, demonstrating instead that they could be mapped onto higher-order typed logics.

The theory's syntax draws on categorial grammar, pioneered by Kazimierz Ajdukiewicz and Yehoshua Bar-Hillel, in which expressions are categorized by their semantic types (e.g., noun phrases as functions from properties to truth values, treating them as generalized quantifiers rather than referring terms). Semantics is provided via interpretation in model structures that include possible worlds and times, enabling the treatment of intensional contexts like belief reports. Montague's seminal papers—"English as a Formal Language" (1970), "Universal Grammar" (1970), and "The Proper Treatment of Quantification in Ordinary English" (PTQ, 1973)—illustrate these ideas through increasingly detailed fragments of English, with PTQ using lambda abstraction to bind variables in quantified phrases and relative clauses.

Montague grammar laid the foundation for contemporary formal semantics, influencing fields such as philosophy of language, computational linguistics, and cognitive science by the 1980s. It was adapted to various syntactic frameworks, including generalized phrase structure grammar (GPSG) and head-driven phrase structure grammar (HPSG), and inspired textbooks such as David Dowty, Robert Wall, and Stanley Peters' Introduction to Montague Semantics (1981). Despite initial resistance from Chomskyan linguists, who questioned its psychological reality, the framework's emphasis on truth conditions and entailments has become a cornerstone of semantic theory, with ongoing developments in dynamic semantics and alternative logics building on its core insights.

Introduction

Definition and Scope

Montague grammar is a formal approach to the semantics of natural language, developed by the American logician Richard Montague in the early 1970s, which treats natural languages as equivalent to the formal languages of logic in terms of their syntactic and semantic structure. The theory integrates syntax and semantics through a precise mathematical framework, assigning meanings to linguistic expressions in a way that parallels the interpretation of logical formulas. At its core, Montague grammar emphasizes truth-conditional semantics, in which the meaning of a sentence is defined by the conditions under which it is true relative to a model of the world, thereby providing a systematic account of how linguistic forms convey factual content about reality.

The scope of Montague grammar is primarily concerned with linking syntactic structures directly to their logical interpretations via the principle of compositionality, which states that the meaning of a complex expression is a function of the meanings of its constituent parts and the rules governing their syntactic combination. This is achieved through model-theoretic semantics, in which expressions are interpreted in set-theoretic models that include domains of entities, possible worlds, and assignments of values to variables, enabling the theory to handle phenomena such as reference, predication, and inference in natural language. Montague's foundational thesis, articulated in his 1970 paper "Universal Grammar," asserts that "there is in my opinion no important theoretical difference between natural languages and the artificial languages of logicians; indeed I consider it possible to comprehend the syntax and semantics of both kinds of languages with a single natural and mathematically precise theory."

Key components of Montague grammar include the use of higher-order intensional logic to represent meanings at various levels of complexity, allowing linguistic categories such as noun phrases and verbs to be treated as functions over entities and propositions. Lambda abstraction serves as a central mechanism for constructing these functional meanings compositionally, enabling a uniform semantic treatment of quantifiers and modifiers without ad hoc rules. Additionally, intensional operators extend the framework to account for modal expressions, propositional attitudes (such as belief and knowledge), and tense by interpreting meanings as functions from possible worlds and times to truth values, thus capturing the context-dependent aspects of semantics.
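
To make the truth-conditional, model-relative notion of meaning concrete, the following is a minimal Haskell sketch; the toy domain and the particular denotations assigned to "John" and "walks" are assumptions chosen for illustration, not part of Montague's own system.

```haskell
-- A minimal truth-conditional sketch: entities, predicates, and sentences.
-- The domain and the particular denotations are hypothetical toy assumptions.

data Entity = John | Mary | Rex deriving (Eq, Show)

-- An intransitive verb denotes a function from entities to truth values (type e -> t).
walks :: Entity -> Bool
walks John = True
walks _    = False

-- A proper name (on its simplest analysis) denotes an entity (type e).
john :: Entity
john = John

-- The meaning of "John walks" is obtained compositionally by applying
-- the verb denotation to the subject denotation.
johnWalks :: Bool
johnWalks = walks john   -- True in this toy model

main :: IO ()
main = print johnWalks
```

Here the meaning of the sentence is just a truth value computed in a model; the sections below refine this picture with generalized quantifiers and intensions.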

Historical Development

Richard Montague, a logician and philosopher, developed the foundational ideas of what would become known as Montague grammar during the late 1960s and early 1970s, focusing on applying formal logical methods to natural language semantics. His work was heavily influenced by the model-theoretic semantics established by logicians such as Alfred Tarski and Rudolf Carnap, with Montague having studied under Tarski at the University of California, Berkeley. These influences led Montague to extend intensional logic and possible-worlds semantics to analyze the syntax-semantics interface in ordinary English, addressing longstanding challenges in formalizing meaning.

Montague's key publications laid the groundwork for the theory. In 1970, he published "Universal Grammar" in Theoria, outlining a unified syntactic and semantic framework for formal and natural languages. That same year, "English as a Formal Language" appeared in Linguaggi nella Società e nella Tecnica, the proceedings of a conference on languages in society and technology held in Milan, where he rejected principled distinctions between formal and natural languages and proposed translation rules from English fragments to logical expressions. His most influential piece, "The Proper Treatment of Quantification in Ordinary English" (PTQ), was published posthumously in 1973, providing a detailed analysis of quantifiers and intensionality in English.

The term "Montague grammar" was coined shortly after Montague's death by the linguist Barbara H. Partee, who used it in her 1973 and 1975 papers to refer to his integrated syntactic-semantic system. The framework arose amid the dominance of Noam Chomsky's generative syntax in the 1960s, offering a precise semantic complement to Chomskyan theory during the "linguistics wars" between generative and interpretive semantics. Montague's sudden death on March 7, 1971, at the age of 40, prevented further elaboration, leaving his unpublished manuscripts and ideas to be compiled and disseminated by colleagues such as Richmond Thomason. Early adoption in the 1970s was driven by Partee and others, who adapted Montague's methods for linguistic applications, fostering interdisciplinary collaboration between logicians and linguists that established formal semantics as a core subfield by the decade's end.

Core Principles

Compositionality

Compositionality serves as the foundational principle of Montague grammar, positing that the meaning of a complex expression is determined solely by the meanings of its constituent parts and the syntactic rules governing their combination. This adaptation of Frege's principle, traditionally associated with his 1892 essay Über Sinn und Bedeutung, ensures that semantic interpretations are systematically derived from syntactic structures without introducing extraneous factors. In Montague's framework, as detailed in his 1970 paper "Universal Grammar," the principle underpins the theory of reference, treating semantics as a direct extension of formal logical systems.

The role of compositionality in Montague grammar is to establish a homomorphism between the syntactic algebra—comprising expressions and rules for their combination—and the semantic algebra, in which meanings are functions applied to those expressions. This mapping guarantees that each syntactic operation corresponds to a semantic operation, preserving the structural integrity of meaning construction across the grammar. Formally, the semantic value of any phrase is obtained by applying a designated semantic rule to the semantic values of its immediate constituents, in accordance with the syntactic rule that forms the phrase; this recursive process aligns syntax and semantics algebraically, as formalized in Montague's collected papers of 1974.

The implications of this strict adherence to compositionality are profound, enabling a recursive definition of meaning that accounts for the infinite productivity of natural language despite finite grammatical resources. By mirroring syntactic structure in semantic composition, Montague grammar handles arbitrarily complex expressions through modular, predictable derivations, contrasting sharply with holistic approaches that derive sentence meanings independently of constituent meanings. This modularity facilitates scalability in semantic analysis, allowing the grammar to generate novel interpretations systematically.

Historically, Montague maintained an unwavering commitment to compositionality in his seminal works, such as "English as a Formal Language" (1970), extending it even to idioms and non-literal expressions through the mechanism of meaning postulates. These postulates are semantic axioms that constrain the interpretations of lexical items or fixed phrases, ensuring their integration into the compositional framework without violating the homomorphism; for instance, postulates relating predicates such as "seek" and "try to find" enforce necessary truths while preserving overall recursivity. This approach, while maintaining strict compositionality, accommodates lexical irregularities by adjusting model-theoretic constraints rather than altering syntactic rules.
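
The homomorphism requirement can be made concrete with a small Haskell sketch: a toy syntactic algebra of derivation trees is paired with an interpretation function that assigns exactly one semantic operation to each syntactic rule, so the meaning of a node depends only on the meanings of its daughters. The datatypes, lexicon, and single combination rule below are illustrative assumptions rather than Montague's own fragment.

```haskell
-- A toy syntax-semantics homomorphism. The lexicon and rules are illustrative
-- assumptions, not Montague's PTQ fragment.

data Entity = John | Mary deriving (Eq, Show)

-- Semantic values of the types used in this fragment: entities, predicates, truth values.
data Sem
  = SemE Entity
  | SemET (Entity -> Bool)
  | SemT Bool

-- Syntactic algebra: lexical items plus one combination rule (subject + predicate).
data Syn
  = Lex String
  | CombineSV Syn Syn   -- sentence := NP + VP

-- Lexical interpretation (the valuation for basic expressions).
lexSem :: String -> Sem
lexSem "John"  = SemE John
lexSem "Mary"  = SemE Mary
lexSem "walks" = SemET (== John)   -- assumed: only John walks in this model
lexSem w       = error ("no denotation for " ++ w)

-- The homomorphism: each syntactic constructor is matched by one semantic operation.
interpret :: Syn -> Sem
interpret (Lex w) = lexSem w
interpret (CombineSV np vp) =
  case (interpret np, interpret vp) of
    (SemE x, SemET p) -> SemT (p x)          -- function application
    _                 -> error "type mismatch"

main :: IO ()
main =
  case interpret (CombineSV (Lex "Mary") (Lex "walks")) of
    SemT b -> print b   -- False: Mary does not walk in this toy model
    _      -> error "not a sentence"
```

Because `interpret` is defined by recursion on the syntactic constructors alone, it behaves as a homomorphism in the sense described above: adding a new syntactic rule requires adding exactly one corresponding semantic clause.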

Intensional Logic

Intensional logic forms the semantic foundation of Montague grammar, extending classical extensional logic to account for phenomena where meaning depends not only on reference in the actual world but also on how expressions denote across hypothetical scenarios. In this framework, intensions represent the meanings of expressions as functions from points of evaluation—typically pairs of possible worlds and moments of time—to their extensions, the actual denotations at a given point. This approach allows the grammar to model intensional aspects of meaning, such as modality, tense, and propositional attitudes, by treating semantic values as abstract entities sensitive to alternative possibilities rather than as fixed objects or truth values.

A core feature of Montague's intensional logic is its adoption of possible-worlds semantics, drawing inspiration from Kripke's work on modal logic, where the universe of discourse includes a set of possible worlds I alongside entities A and times J. Semantic types are enriched to include intensional categories such as \langle s, t \rangle for propositions, enabling operators such as necessity (\Box) to shift evaluation points across worlds, and attitude verbs such as "believes" to embed subordinate clauses under alternative perspectives. In Montague's system, sentences denote intensions of type \langle s, t \rangle, interpreted as propositions—functions from worlds and times to truth values, effectively sets of world-time pairs at which the sentence holds true. Noun phrases, in turn, denote generalized quantifiers: higher-order entities that take properties (themselves intensions of type \langle s, \langle e, t \rangle \rangle) as arguments, yielding truth values when applied to predicates. This structure ensures that quantification operates over intensional objects, preserving compositionality while accommodating scope interactions in embedded contexts.

The intensional approach addresses failures of extensional substitutivity, where substituting co-extensional terms can change truth value, as in the sentence "John seeks a unicorn." Here, "unicorn" lacks an extension (no such creature exists in the actual world) but has an intension (a property instantiated in some possible worlds), allowing the sentence a de re reading on which John seeks some actual unicorn (false) and a de dicto reading on which his search is directed at the unicorn concept without commitment to any actual unicorn (potentially true). Montague formalizes this via scope alternatives in the logical translation, with the verb "seeks" able to take an intensional object: de dicto as \textit{seek}'(j, \lambda P . \exists u [\textit{unicorn}'(u) \land P(u)]) and de re as \exists u [\textit{unicorn}'(u) \land \textit{seek}'(j, u)], where the primed constants are the intensional-logic translations of the corresponding English expressions.

Philosophically, Montague's intensional logic formalizes Gottlob Frege's distinction between sense (Sinn, the mode of presentation, corresponding to the intension) and reference (Bedeutung, the extension), adapting it to a model-theoretic setting in which senses are systematically related to references across possible worlds. This builds on Frege's 1892 analysis by embedding it within a typed, possible-worlds framework, providing a precise tool for natural language semantics that resolves puzzles arising from opaque contexts.
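
The de dicto/de re contrast can be rendered in a toy computational sketch; the worlds, the entities, and the two hypothetical "seek" relations below are stipulated assumptions, and Montague's actual analysis uses the full intensional type system rather than this simplification.

```haskell
-- Toy possible-worlds sketch of the de dicto / de re contrast.
-- Worlds, entities, and the two "seek" relations are illustrative assumptions.

data World  = W0 | W1 deriving (Eq, Show)          -- W0: actual world, no unicorns
data Entity = John | Pegasus deriving (Eq, Show)

type Prop     = World -> Bool            -- type <s,t>
type Property = World -> Entity -> Bool  -- the intension of a one-place predicate

unicorn :: Property
unicorn W0 _       = False       -- no unicorns in the actual world
unicorn W1 Pegasus = True        -- a unicorn exists in some alternative world
unicorn W1 _       = False

entities :: [Entity]
entities = [John, Pegasus]

-- De dicto "seek": relates a seeker to the *property* sought, regardless of
-- whether anything actually has that property.
seekDeDicto :: Entity -> Property -> Prop
seekDeDicto John _ = const True    -- stipulated: John seeks whatever property is supplied
seekDeDicto _    _ = const False

-- De re "seek": relates a seeker to a particular entity.
seekDeRe :: Entity -> Entity -> Prop
seekDeRe _ _ = const False         -- stipulated: John seeks no actual individual

-- "John seeks a unicorn", de dicto: true at W0 even though "unicorn" has an
-- empty extension there, because only the intension of "unicorn" is involved.
deDicto :: Prop
deDicto = seekDeDicto John unicorn

-- De re: there is some actual unicorn that John seeks; false at W0.
deRe :: Prop
deRe w = or [ unicorn w x && seekDeRe John x w | x <- entities ]

main :: IO ()
main = print (deDicto W0, deRe W0)   -- (True, False)
```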

Formal Framework

Syntactic Categories and Types

Montague grammar employs a categorial syntax inspired by categorial grammar, in which syntactic expressions are classified according to semantic types that specify their argument structures and return values. This approach ensures a direct correspondence between syntax and semantics, with categories defined recursively. The basic semantic types are e for entities (individuals) and t for truth values, from which functional types are constructed: if \alpha and \beta are types, then \langle \alpha, \beta \rangle is the type of functions mapping elements of type \alpha to elements of type \beta.

Key syntactic categories in Montague's system include sentences (S), which denote truth values of type t; common nouns (CN), which denote properties of type \langle e, t \rangle; verb phrases (VP), also of type \langle e, t \rangle; and noun phrases (NP), typically of type \langle \langle e, t \rangle, t \rangle when functioning as generalized quantifiers. Intransitive verbs (IV) are assigned type \langle e, t \rangle, while transitive verbs (TV) take type \langle e, \langle e, t \rangle \rangle, reflecting their ability to combine with an object NP to form a VP. These categories are defined using slash notation from categorial grammar: for instance, a category A/B denotes an expression that combines with a B to its right to yield an A.

In Montague's framework, as detailed in his analysis of English fragments, syntactic categories map directly to logical expressions via translation rules, enabling a compositional semantics without intermediate levels of representation. The system treats the grammar as a homomorphism from syntactic expressions to logical forms, where each category's type corresponds precisely to its denotation in a model. To handle the dual role of NPs as both referential terms (type e) and quantifiers (type \langle \langle e, t \rangle, t \rangle), later developments in the Montague grammatical tradition incorporate type-raising, which lifts an NP of type e to a higher-order function over properties. For example, a proper name like "John" can be raised from denoting an entity to a quantifier that applies to predicates, allowing uniform treatment in constructions like coordination. Unlike Chomskyan generative grammar, which emphasizes deep structures and transformational rules to derive surface forms, Montague grammar is surface-oriented and relies on categorial rules and type assignments for direct syntactic-semantic alignment, avoiding the need for movement operations.
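
The category-to-type mapping and type raising can be sketched as follows; the domain, the lexical denotations, and helper names such as `raise` and `npAnd` are illustrative assumptions rather than part of Montague's fragment.

```haskell
-- Category-to-type mapping in a Montague-style fragment (toy illustration).

data Entity = John | Mary | Sue deriving (Eq, Show, Enum, Bounded)

type T  = Bool                -- type t: truth values
type E  = Entity              -- type e: entities
type IV = E -> T              -- intransitive verbs / common nouns: <e,t>
type TV = E -> E -> T         -- transitive verbs: <e,<e,t>> (object argument first here)
type NP = IV -> T             -- noun phrases as generalized quantifiers: <<e,t>,t>

domain :: [Entity]
domain = [minBound .. maxBound]

student, sleeps :: IV
student = (/= John)           -- assumed: Mary and Sue are students
sleeps  = (== Mary)           -- assumed: only Mary sleeps

-- Type raising: lift a type-e proper name to a generalized quantifier,
-- so it has the same type as quantified NPs.
raise :: E -> NP
raise x = \p -> p x

everyStudent :: NP
everyStudent p = and [ p x | x <- domain, student x ]

-- Coordination of two NPs is possible because both have type <<e,t>,t>.
npAnd :: NP -> NP -> NP
npAnd q1 q2 = \p -> q1 p && q2 p

main :: IO ()
main = do
  print (raise John sleeps)                       -- "John sleeps": False
  print (everyStudent sleeps)                     -- "Every student sleeps": False
  print (npAnd (raise Mary) everyStudent sleeps)  -- "Mary and every student sleep": False
```

Because the raised proper name and the quantified NP share the type \langle \langle e, t \rangle, t \rangle, they can be coordinated directly, which is the motivation for type raising noted above.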

Semantic Rules and Lambda Calculus

In Montague grammar, semantic rules provide a compositional mechanism for assigning meanings to syntactic structures, drawing heavily on the typed lambda calculus to represent expressions as lambda terms. These terms denote functions over semantic types, where syntactic categories correspond to functional types such as t for truth values and e for entities, with complex types of the form A \to B. The interpretation of a phrase is derived by applying lambda abstraction and beta-reduction, ensuring that meanings are computed bottom-up from lexical items to full sentences. This integration allows natural language phenomena such as quantification to be treated through higher-order functions, as formalized in Montague's intensional logic IL.

The core semantic operation is function application, stated as: if \alpha has type A \to B and \beta has type A, then \alpha(\beta) has type B, with the meaning \llbracket \alpha(\beta) \rrbracket = \llbracket \alpha \rrbracket (\llbracket \beta \rrbracket). This rule corresponds to syntactic combination in categorial grammar, where a functor of category A/B applies to an argument of category B to yield category A. Variable abstraction complements this by forming lambda terms: for a variable x of type A free in \alpha of type B, \lambda x . \alpha has type A \to B, denoting the function that maps each object of type A to the denotation of \alpha with that object substituted for x. Abstraction is used in particular for relative clauses, where a restricting clause is abstracted over a variable to create a higher-order property. These rules pair syntactic formations with their lambda-based translations in Montague's PTQ fragment.

Quantification is handled by treating noun phrases (NPs) as higher-order functions of type (e \to t) \to t, applying to predicates of type e \to t. The rule for NPs forms quantified expressions as \lambda P . \widehat{Q}(P), where Q is the translation of the common noun and \widehat{Q} lifts it to a generalized quantifier via the determiner meaning, for example \lambda P . \forall x [Q(x) \to P(x)] for "every". In the full PTQ system, subject-predicate merger is handled via function application, determiner-noun combinations form NPs, relative clause attachment is enabled through lambda abstraction (e.g., "such that" constructions as \lambda x . S' where S' contains x free), and quantifier scope is dealt with via in situ interpretation or quantifying-in. These rules ensure that quantifiers scope over predicates compositionally without direct reference to models.

The translation procedure maps syntactic trees to lambda expressions of IL, systematically applying the rules above. Starting from lexical translations (e.g., verbs as predicates with free variables), it builds terms via application and abstraction, followed by beta-reduction to simplify (e.g., (\lambda x . \phi(x))(a) \to \phi(a)). For complex structures, quantifying-in introduces wide-scope quantifiers by substituting the NP's term into an abstracted sentence frame, yielding logical forms that capture scope ambiguities through different orders of derivation. This maintains the type discipline of the typed lambda calculus, ensuring well-formedness and compositionality in semantic derivation.
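
A minimal sketch of the translation-and-reduction step is given below: IL-style terms are represented as a small datatype with application, abstraction, a universal quantifier, and implication, together with a simple beta-reducer. The term language is a simplification (no types, tenses, or intension operators), and the substitution function assumes bound-variable names are chosen so that no capture occurs in the example.

```haskell
-- Simplified sketch of IL-style terms with application, abstraction,
-- and beta-reduction. Assumed: no types, tenses, or intension operators.

data Term
  = Var String
  | Const String
  | Lam String Term
  | App Term Term
  | Forall String Term
  | Implies Term Term

instance Show Term where
  show (Var x)       = x
  show (Const c)     = c
  show (Lam x b)     = "(λ" ++ x ++ "." ++ show b ++ ")"
  show (App f a)     = show f ++ "(" ++ show a ++ ")"
  show (Forall x b)  = "∀" ++ x ++ "[" ++ show b ++ "]"
  show (Implies p q) = "(" ++ show p ++ " → " ++ show q ++ ")"

-- Substitution, simplified: we assume bound variable names avoid capture.
subst :: String -> Term -> Term -> Term
subst x t (Var y)       = if x == y then t else Var y
subst _ _ (Const c)     = Const c
subst x t (Lam y b)     = if x == y then Lam y b else Lam y (subst x t b)
subst x t (App f a)     = App (subst x t f) (subst x t a)
subst x t (Forall y b)  = if x == y then Forall y b else Forall y (subst x t b)
subst x t (Implies p q) = Implies (subst x t p) (subst x t q)

-- Beta-reduction: (λx.φ)(a) reduces to φ[x := a].
reduce :: Term -> Term
reduce (App f a) =
  case reduce f of
    Lam x b -> reduce (subst x (reduce a) b)
    f'      -> App f' (reduce a)
reduce (Lam x b)     = Lam x (reduce b)
reduce (Forall x b)  = Forall x (reduce b)
reduce (Implies p q) = Implies (reduce p) (reduce q)
reduce t             = t

-- Translation of "every horse": λP.∀x[horse(x) → P(x)]
everyHorse :: Term
everyHorse = Lam "P" (Forall "x" (Implies (App (Const "horse") (Var "x"))
                                          (App (Var "P") (Var "x"))))

main :: IO ()
main = print (reduce (App everyHorse (Const "run")))
  -- prints ∀x[(horse(x) → run(x))]
```

Running the sketch reduces the application of the translation of "every horse" to the predicate run, yielding the familiar universal logical form.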

Model-Theoretic Interpretation

In Montague grammar, the model-theoretic interpretation is grounded in a formal model structure that evaluates the meanings of linguistic expressions relative to possible worlds and times. A model can be presented as a triple \langle D, D', V \rangle, where D is a non-empty set of individuals (entities), D' is a set of indices representing possible worlds and moments of time, and V is a valuation function that assigns denotations to the basic expressions of the language. This setup, introduced in Montague's foundational work on formal semantics, provides an intensional framework that captures context-dependent meanings beyond purely extensional ones.

Expressions in the grammar denote objects within domains determined by their syntactic types in the model. For instance, common nouns and intransitive verbs denote predicates as functions of type e \to t, mapping entities in D to truth values \{0,1\} (false or true) at a given index in D'; transitive verbs denote functions of type e \to (e \to t). Sentences, as expressions of type t, denote truth values relative to a specific world-time index, enabling the semantics to account for modal and temporal variation. This denotational approach ensures that meanings are model-relative, with proper names rigidly denoting the same individual across indices unless specified otherwise.

The semantics is truth-conditional: the meaning of a sentence is its intension, a function from indices in D' to truth values, representing a proposition that holds true or false across possible worlds. Propositions thus serve as the semantic values of sentences, providing a unified way to handle propositional attitudes such as belief or knowledge by evaluating truth at different indices. To refine these interpretations and encode lexical relations, meaning postulates impose constraints on the valuation V, such as the postulate \forall p \forall x \, (\mathrm{know}(x, p) \to \mathrm{believe}(x, p)), which ensures that knowing entails believing across models. Evaluation proceeds recursively through the syntactic derivation, starting from lexical items and applying semantic rules bottom-up to derive the denotations of complex expressions compositionally. This homomorphism preserves meaning under syntactic combination, yielding the truth conditions for entire sentences in a given model.
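
A toy rendering of such a model is sketched below; the domain, the two indices, the valuation, and the simplification of attitude contents to atomic proposition labels (rather than full intensions) are all assumptions for illustration.

```haskell
-- A toy intensional model: a domain of entities, a set of world-time indices,
-- and a valuation for basic predicates. All particular facts are assumptions.

data Entity = Ann | Bob deriving (Eq, Show, Enum, Bounded)
data Index  = I0 | I1 deriving (Eq, Show, Enum, Bounded)   -- world-time points
data PropId = ItRains | GrassIsGreen deriving (Eq, Show, Enum, Bounded)

domain :: [Entity]
domain = [minBound .. maxBound]

indices :: [Index]
indices = [minBound .. maxBound]

-- Valuation: one-place predicates are indexed by world-time points.
walk :: Index -> Entity -> Bool
walk I0 Ann = True
walk _  _   = False

-- Attitude predicates relate an entity to a proposition label at an index.
know, believe :: Index -> Entity -> PropId -> Bool
know    I0 Ann ItRains = True
know    _  _   _       = False
believe i  x   p       = know i x p || (x == Bob && p == GrassIsGreen)

-- The intension of "Ann walks": a function from indices to truth values.
annWalks :: Index -> Bool
annWalks i = walk i Ann

-- Meaning postulate check: for all p, x, i, know(x,p) implies believe(x,p).
knowledgeEntailsBelief :: Bool
knowledgeEntailsBelief =
  and [ not (know i x p) || believe i x p
      | i <- indices, x <- domain, p <- [minBound .. maxBound] ]

main :: IO ()
main = do
  print (map annWalks indices)      -- [True,False]: true at I0, false at I1
  print knowledgeEntailsBelief      -- True: the postulate holds in this model
```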

Applications and Examples

Quantification in Natural Language

One of the central challenges Montague addressed in his framework is the handling of scope ambiguities and superiority effects in quantified sentences, such as "Every linguist chased some bartender," where the relative scopes of "every" and "some" yield distinct meanings (e.g., every linguist chased a possibly different bartender, or there exists one bartender chased by every linguist).[15] These phenomena arise because quantifiers interact in ways that surface constituent structure alone cannot capture without further machinery, prompting Montague to treat noun phrases (NPs) as denoting generalized quantifiers—higher-order functions that relate properties or sets.[15]

In this approach, determiners such as "every" and "some" are interpreted as binary relations between properties: "every" denotes the function \lambda P \lambda Q . \forall x (P(x) \to Q(x)), which holds if every entity satisfying property P also satisfies Q; similarly, "some" denotes \lambda P \lambda Q . \exists x (P(x) \wedge Q(x)), holding if there is at least one entity satisfying both.[15] NPs combine the determiner with a common noun to form such a quantifier, which then applies to the property denoted by the verb phrase (VP). This higher-type semantics allows quantifiers to take scope compositionally without requiring movement rules, resolving ambiguities through alternative syntactic analyses that permute the order of application.

The PTQ fragment illustrates this via specific translation rules, including in situ interpretation, where quantifiers bind variables introduced by lambda abstraction in the scope position, and a "quantifying-in" mechanism that embeds the quantifier's restrictor and nuclear scope directly into the sentence's translation.[15] For instance, the sentence "Every horse runs" receives a syntactic bracketing [[ \text{every}, \text{horse} ], \text{runs} ], translated stepwise: the NP "every horse" denotes \lambda Q . \forall x (\text{HORSE}(x) \to Q(x)), which applies to the VP property \lambda y . \text{RUNS}(y), yielding the overall interpretation \forall x (\text{HORSE}(x) \to \text{RUNS}(x)).[15] Scope interactions, as in "Every linguist chased some bartender," are managed by syntactic rules that enforce linear precedence (capturing superiority effects, where the subject quantifier typically outscopes the object unless alternative derivations are permitted), producing multiple possible readings without ad hoc adjustments.

Indefinite NPs, such as "a horse," are treated as existential quantifiers akin to "some," denoting \lambda Q . \exists x (\text{HORSE}(x) \wedge Q(x)), but PTQ handles their variable scope—wide or narrow—through syntactic variations and anaphoric bindings using indexed pronouns (e.g., "him_i") to track de re or de dicto readings in complex sentences.[15] For "A horse runs," the basic derivation mirrors the universal case but yields \exists x (\text{HORSE}(x) \wedge \text{RUNS}(x)), with wider scope in embedded contexts achieved via rules that lift the indefinite higher in the derivation. This mechanism ensures that indefinites contribute existentially without presupposing uniqueness, aligning with their distributive behavior in natural language.
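
The two readings can be computed explicitly in a small sketch; the toy model below stipulates who chased whom so that the readings come apart.

```haskell
-- Generalized-quantifier determiners and the two scope readings of
-- "Every linguist chased some bartender" (toy model with assumed facts).

data Entity = L1 | L2 | B1 | B2 deriving (Eq, Show, Enum, Bounded)

domain :: [Entity]
domain = [minBound .. maxBound]

linguist, bartender :: Entity -> Bool
linguist  x = x == L1 || x == L2
bartender x = x == B1 || x == B2

-- Assumed facts: L1 chased B1, L2 chased B2, and nothing else happened.
chased :: Entity -> Entity -> Bool
chased L1 B1 = True
chased L2 B2 = True
chased _  _  = False

-- Determiners as relations between properties.
every, some :: (Entity -> Bool) -> (Entity -> Bool) -> Bool
every p q = and [ q x | x <- domain, p x ]
some  p q = or  [ q x | x <- domain, p x ]

-- Subject-wide-scope reading: for every linguist there is some (possibly
-- different) bartender that they chased.
subjectWide :: Bool
subjectWide = every linguist (\l -> some bartender (\b -> chased l b))

-- Object-wide-scope reading: there is one bartender whom every linguist chased.
objectWide :: Bool
objectWide = some bartender (\b -> every linguist (\l -> chased l b))

main :: IO ()
main = print (subjectWide, objectWide)   -- (True, False): the readings differ
```

In this model each linguist chased a different bartender, so the subject-wide reading is true while the object-wide reading is false, showing that the two logical forms are genuinely distinct.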

Fragment of English Analysis

A fragment of English in Montague grammar is a simplified, precisely specified sublanguage that captures a portion of English syntax and semantics, as developed in works like English as a Formal Language (EFL) and Universal Grammar (UG), where lexical items and syntactic rules are directly mapped to semantic interpretations using typed lambda expressions. These fragments, such as the one in EFL, include a finite lexicon categorized by syntactic types and a set of inductive rules for combining expressions, ensuring compositionality in deriving meanings for basic declarative structures.

The lexicon assigns each word a syntactic category and a corresponding semantic type, often via lambda abstractions that denote functions over individuals or properties. For instance, intransitive verbs like "walks" are assigned the type e \to t (from entities to truth values) with the denotation \lambda x . WALKS(x), where WALKS is a predicate true of exactly the walking entities. Proper names, such as "John", receive an initial referential type e but are semantically raised to generalized quantifiers of type (e \to t) \to t via \lambda P . P(j), treating them uniformly with quantified noun phrases. A full derivation for the sentence "John walks" proceeds via the syntactic rule combining a noun phrase (NP) with a verb phrase (VP) to form a sentence (S), with semantics given by function application: the raised denotation of "John", \lambda P . P(j), applies to the denotation of "walks", \lambda x . WALKS(x), yielding WALKS(j), a proposition true if John walks.

Such fragments cover basic clauses through rules like NP + VP → S, as in simple declaratives, and extend to complex noun phrases via adjective modification and relative clauses. For adjectives, a construction like "white horse" combines via intersection: the noun "horse" denotes \lambda x . HORSE(x) (type e \to t), modified by the adjective "white" as \lambda Q . \lambda x . WHITE(x) \land Q(x) (type (e \to t) \to (e \to t)), resulting in \lambda x . HORSE(x) \land WHITE(x). Relative clauses are handled through lambda abstraction, turning an open sentence into a modifier; for example, "horse that John rides" abstracts over the object position in "John rides y" to yield \lambda x . HORSE(x) \land RIDES(j, x), which intersects with the head noun.

While these fragments effectively analyze declarative sentences and certain NPs, including those with adjectives and restrictive relative clauses, they are limited in scope, excluding phenomena like interrogatives, anaphora, and non-declarative moods. Quantifier meanings, as elaborated in PTQ, can be integrated into the treatment of NPs but are not central to the basic fragment structure described here.
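
The derivations above can be rendered directly as a small executable sketch; the toy denotations (which entities are horses, which are white, and the RIDES facts) are assumptions chosen for illustration.

```haskell
-- A tiny Montague-style fragment: raised proper names, intersective adjectives,
-- and relative clauses via abstraction. All particular facts are assumptions.

data Entity = John | Horse1 | Horse2 deriving (Eq, Show, Enum, Bounded)

type Pred = Entity -> Bool              -- type <e,t>
type NP   = Pred -> Bool                -- generalized quantifier <<e,t>,t>

domain :: [Entity]
domain = [minBound .. maxBound]

-- Lexicon (toy denotations)
walks, horse, white :: Pred
walks = (== John)
horse = (/= John)
white = (== Horse1)

rides :: Entity -> Entity -> Bool       -- rides subject object
rides John Horse2 = True
rides _    _      = False

-- Proper name raised to a generalized quantifier: λP.P(j)
john :: NP
john p = p John

-- "John walks": apply the raised name to the verb-phrase property.
johnWalks :: Bool
johnWalks = john walks                  -- True

-- Adjective modification as intersection: "white horse" = λx. WHITE(x) ∧ HORSE(x)
whiteHorse :: Pred
whiteHorse x = white x && horse x

-- Relative clause by abstraction over the object of "John rides y":
-- "horse that John rides" = λx. HORSE(x) ∧ RIDES(j, x)
horseThatJohnRides :: Pred
horseThatJohnRides x = horse x && rides John x

main :: IO ()
main = do
  print johnWalks
  print (filter whiteHorse domain)          -- [Horse1]
  print (filter horseThatJohnRides domain)  -- [Horse2]
```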

Criticisms and Legacy

Key Limitations

One prominent limitation of Montague grammar lies in its static semantics, which inadequately addresses pragmatics by ignoring dynamic speaker context and indexicals such as "I" and "now". These elements are treated as fixed parameters akin to times and worlds, rather than as context-dependent features that shift with utterance circumstances. Furthermore, the framework provides no robust treatment of presupposition or implicature, since these phenomena involve non-truth-conditional content that cannot be fully captured through compositional truth conditions alone; early extensions attempted to incorporate them via additional logical dimensions, but these highlighted the need for non-compositional adjustments.

In terms of intensionality, Montague grammar overgenerates equivalences in propositional attitudes, treating logically equivalent expressions—such as tautologies like "John is ill or he is not ill"—as interchangeable, which fails to account for hyperintensional distinctions between synonymous but non-substitutable contents. This leads to difficulties with inference in attitude contexts, where beliefs or knowledge are not preserved under substitution, because the possible-worlds approach equates intensions that hyperintensional semantics differentiates more finely.

The rigid categorial system of Montague grammar creates a mismatch between syntax and semantics, struggling to accommodate flexible phenomena such as free word order or constructions that violate strict category-driven rules. This inflexibility arises because syntactic categories are tightly linked to semantic types, limiting the theory's ability to handle such variation without rule extensions.

Empirically, Montague grammar is confined to the truth-conditional core of sentence meaning, offering limited coverage of discourse-level phenomena such as anaphora resolution, where pronouns refer back across sentences in ways that require updating a shared discourse representation rather than relying on sentence-internal composition alone. This restriction prevents straightforward analysis of inter-sentential dependencies, as the model treats each sentence in isolation without mechanisms for cumulative context. Philosophically, the framework commits to an ontology of possible worlds as primitive entities, raising concerns about ontological parsimony, since it posits unobservable realms to explain modality and intensionality, potentially inflating metaphysical commitments beyond empirical necessity.

Influence on Modern Semantics

Montague grammar laid the groundwork for contemporary formal semantics by establishing a model-theoretic approach that integrates syntax and semantics through typed intensional logic and compositionality, influencing subsequent frameworks such as type-logical grammars and other variants of categorial grammar. These developments treat linguistic expressions as typed lambda terms, enabling precise handling of scope ambiguities and quantifier interactions in natural language. A key example is the widespread adoption of this style of analysis in textbooks such as Heim and Kratzer's Semantics in Generative Grammar (1998), which adapts Montague's compositional semantics to generative syntax and has become a standard reference for training linguists in compositional meaning construction.

In computational linguistics, Montague grammar provided foundational principles for semantic parsing in natural language processing, where sentences are mapped to logical forms for inference and execution. Early implementations in Prolog leveraged its lambda calculus-based semantics to build parsers for fragments of English, facilitating database querying and question-answering systems. This approach influenced modern semantic parsers, such as those using lambda dependency-based compositional semantics, which compose meanings incrementally to handle complex queries while maintaining truth-conditional accuracy.

Significant extensions of Montague grammar include dynamic semantics, developed by Kamp (1981) through Discourse Representation Theory and by Heim (1983) via File Change Semantics, which shift the focus from static truth conditions to context updates for anaphora and presupposition resolution. Combinatory categorial grammar, advanced by Steedman (2000), builds on Montague's categorial foundations by incorporating combinators for flexible word order and prosody, enhancing coverage of non-canonical structures without sacrificing compositionality.

Philosophically, Montague grammar shaped debates on meaning and reference by demonstrating that natural language could be formalized with the rigor of logical systems, influencing situation semantics as proposed by Barwise and Perry (1983), which refines intensionality through partial situations rather than full possible worlds. Its integration into cognitive science underscores compositionality as a core mechanism for mental representation, linking linguistic structure to broader theories of human reasoning and concept formation. In recent work, Montague-inspired semantics informs neuro-symbolic systems, in which symbolic lambda expressions from semantic parsing are combined with neural networks for more robust compositional generalization in tasks such as question answering; such architectures enforce logical constraints by parsing natural language into executable forms for reliable inference.

References

  1. "Montague Grammar – An Overview." ScienceDirect Topics.
  2. Biography summary of Richard Montague (focus on semantics).
  3. "Formal Semantics: Origins, Issues, Early Impact." New Prairie Press, 2011 (PDF).
  4. "Grammar" (on PTQ). Semantics Archive (PDF).
  5. Partee, Barbara H. "Montague Grammar." Ms., February 2001 (PDF).
  6. "Montague Semantics." Stanford Encyclopedia of Philosophy, 2011.
  7. Montague, Richard (1970). "Universal Grammar." Wiley Online Library.
  8. Montague, Richard. "The Proper Treatment of Quantification in Ordinary English" (PDF).
  9. "A Brief History of the Syntax-Semantics Interface in Western Formal Linguistics" (PDF).
  10. "Compositionality." Stanford Encyclopedia of Philosophy, 2004.
  11. "From Logic to Montague Grammar." Semantics Archive, 2014 (PDF).
  12. "Lecture 11. Syntactic Categories and Semantic Types." 2012 (PDF).
  14. "Lecture 2. Lambda Abstraction, NP Semantics, and a Fragment of English." 2005 (PDF).
  15. Montague, Richard (1974). Formal Philosophy: Selected Papers of Richard Montague. Edited by Richmond Thomason.
  16. Montague, Richard (1970). "English as a Formal Language." In Visentini et al. (eds.), Linguaggi nella Società e nella Tecnica. Milano (PDF).
  17. "Montague Fragment" summary, https://people.umass.edu/partee/MGU_2005/MGU052.pdf
  18. "Lecture 10. Relative Clauses." 2005 (PDF).
  19. Karttunen, Lauri, and Stanley Peters (1979) (PDF).
  20. Cresswell, M. J. "Hyperintensional Logic." Studia Logica.
  21. "Closed Categories and Categorial Grammar." Project Euclid.
  22. "Discourse Representation Theory." 2005 (PDF).
  23. "Actualism, Ontological Commitment, and Possible World Semantics" (PDF).
  24. "Computational Coverage of Type Logical Grammar: The Montague ..." 2016 (PDF).
  25. "Compositionality: Categorial Variations on a Theme" (PDF).
  26. Heim, Irene, and Angelika Kratzer. Semantics in Generative Grammar (PDF).
  27. "Learning Compositional Semantics for Open ..." ACL Anthology (PDF).
  28. "Prolog and Natural Language Semantics: Notes for AI3/4" (PDF).
  29. "The Role of PROLOG (PROgramming and LOGic) in Natural Language Processing." DTIC (PDF).
  30. "RGGU087 – Kamp and Heim 2, with homework" (PDF).
  31. Heim, Irene. "File Change Semantics and the Familiarity Theory of Definiteness" (PDF).
  32. Steedman, Mark, and Jason Baldridge. "Categorial Grammar" (PDF).
  33. "Shifting Situations and Shaken Attitudes." John Perry (PDF).
  34. "Lexical Semantics and Compositionality." In Invitation to Cognitive Science, second edition (PDF).
  36. "Is Neuro-Symbolic AI Meeting its Promises in Natural Language ..." (PDF).