
Phrase structure grammar

Phrase structure grammar is a foundational framework in linguistics for modeling the syntactic structure of sentences through a system of recursive rewrite rules that generate hierarchical constituent structures, typically represented as labeled tree diagrams. Introduced by Noam Chomsky in his 1957 work Syntactic Structures, it builds on earlier traditions of immediate constituent analysis to provide a formal mechanism for deriving well-formed sentences from an initial symbol, such as "Sentence," via rules like S → NP VP (noun phrase followed by verb phrase) and NP → Det N (determiner followed by noun). These rules systematically expand nonterminal symbols into sequences of terminals (words) and other nonterminals, capturing the phrase-level organization essential to syntax, as seen in derivations for simple English sentences like "The cat sleeps," where the structure reflects nested phrases such as NP (the cat) and VP (sleeps). While phrase structure grammars effectively describe basic declarative sentences, they face limitations in accounting for more complex phenomena like passives, questions, and ambiguities, which Chomsky argued require supplementary transformational rules to achieve descriptive adequacy in a full grammar. Over time, the core idea of phrase structure has influenced subsequent theories, including X-bar theory for universal phrase-building principles and constraint-based approaches like head-driven phrase structure grammar (HPSG), which integrate lexical features and head-complement relations to model cross-linguistic variation more precisely. In modern syntactic analysis, phrase structure rules remain central, supported by constituency tests such as substitution (e.g., replacing a verb phrase with "do so") and coordination, which empirically validate the hierarchical groupings they predict.

Fundamentals

Definition

Phrase structure grammar (PSG), also known as phrase structure analysis, is a formal linguistic framework for describing the syntactic structure of natural languages by generating sentences through hierarchical groupings of words into phrases. Introduced by Noam Chomsky, it models syntax as a generative system in which sentences are derived from an initial symbol by applying a set of rewriting rules that expand non-terminal symbols into strings of terminal and non-terminal symbols, ultimately yielding well-formed sentences composed of lexical items. Formally, a phrase structure grammar is defined by a finite vocabulary V of symbols (including a terminal vocabulary for words and a non-terminal vocabulary for syntactic categories), a finite set \Sigma of initial strings (typically a single start symbol such as S for "sentence"), and a finite set F of production rules of the form X \to Y, where X and Y are strings over V, interpreted as instructions to rewrite X as Y in a derivation. This generative mechanism operates through successive substitutions, starting from an element of \Sigma and applying rules until only terminal symbols remain, thereby producing the language's sentences while capturing the constituency relation that groups words into larger units like noun phrases or verb phrases. In the Chomsky hierarchy of formal grammars, the phrase structure rules used in linguistic practice correspond to Type-2 (context-free) grammars, whose rules have a single non-terminal on the left-hand side; these are distinguished from Type-3 regular grammars (limited to right-linear rules for simpler patterns) and Type-1 context-sensitive grammars (which permit context-dependent expansions for greater expressive power). For illustration, consider a simple phrase structure grammar with rules such as S \to NP\ VP, NP \to Det\ N, VP \to V, Det \to the, N \to cat, and V \to sleeps; applying these successively from S generates the sentence "The cat sleeps" by first expanding to noun and verb phrases, then to determiner-noun and verb, and finally to lexical terminals.
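The toy grammar above can be run directly in a parsing toolkit. The sketch below encodes the same rules in NLTK's CFG notation and recovers the tree for "the cat sleeps"; the rules and sentence come from the example above, while the particular API calls are simply one common way to execute them, not part of the formal definition.

```python
import nltk

# The illustrative grammar from the definition above, in NLTK's CFG notation.
grammar = nltk.CFG.fromstring("""
    S -> NP VP
    NP -> Det N
    VP -> V
    Det -> 'the'
    N -> 'cat'
    V -> 'sleeps'
""")

# A chart parser recovers the phrase structure licensed by the rewrite rules.
parser = nltk.ChartParser(grammar)
for tree in parser.parse("the cat sleeps".split()):
    print(tree)   # (S (NP (Det the) (N cat)) (VP (V sleeps)))
```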

Constituency Relation

In phrase structure grammar, the constituency relation refers to the syntactic organization in which a group of words functions as a single coherent unit, known as a constituent, that behaves as a whole within a larger structure. These units, such as noun phrases (NPs) or verb phrases (VPs), capture the hierarchical grouping of words that share syntactic properties and roles. The constituency relation is inherently hierarchical, allowing smaller constituents to embed within larger ones, thereby forming layered structures that reflect the recursive nature of syntax. For instance, a noun phrase may contain a determiner and a nominal, which itself includes an adjective and a noun, creating nested levels of organization. This embedding contrasts with dependency relations, which focus on direct head-dependent links between words without emphasizing phrase-level grouping. A clear example of the constituency relation appears in the phrase the big dog, where the entire sequence forms a constituent that can function as the subject of a sentence. Within it, big dog serves as a sub-constituent (a nominal), illustrating how modifiers and heads combine into larger units. Linguists identify constituents through specific tests that probe syntactic behavior. The substitution test replaces a potential constituent with a single word such as it or another pro-form, as in substituting it for the big dog in a sentence while preserving grammaticality. The movement test assesses whether a string can relocate as a unit, as in cleft constructions like It was the big dog that barked. Finally, the coordination test checks whether the string can join with another similar unit via a conjunction, for example, the big dog and the small cat forming coordinated noun phrases. These tests collectively confirm constituency by demonstrating unified syntactic operations.

Historical Development

Early Foundations

The foundations of phrase structure grammar trace back to ancient linguistic traditions, particularly the work of the Indian grammarian Pāṇini, who around the 4th century BCE composed the Aṣṭādhyāyī, a comprehensive rule-based system for generating Sanskrit word forms through approximately 4,000 succinct rules covering phonology, morphology, and syntax. This text employed ordered rules and metarules to describe linguistic structures systematically, anticipating formal generative approaches by specifying how elements combine to form valid expressions, though it focused primarily on morphological derivations rather than hierarchical phrase structures as understood today. Mathematical precursors emerged in the early twentieth century, when Axel Thue's 1914 paper introduced semi-Thue systems, which formalized string rewriting through directed substitutions and provided a rigorous basis for rule-based transformations of symbol sequences. Building on this, the American logician Emil Post developed production systems in the 1920s, as detailed in his 1921 paper "Introduction to a General Theory of Elementary Propositions," where he outlined canonical systems using production rules to generate sets of propositions from initial forms, establishing key concepts like derivability and normal forms that influenced later grammatical mechanisms. In the realm of structural linguistics, Leonard Bloomfield advanced constituency concepts through immediate constituent analysis, introduced in his 1933 monograph Language, which proposed dividing sentences into binary layers of immediate constituents based on distributional environments, offering a practical method for identifying syntactic units without relying on meaning. This approach emphasized observable linguistic patterns and hierarchical segmentation, laying groundwork for modern syntactic constituency. During the 1940s and 1950s, Bloomfield's student Zellig Harris extended distributional methods in works such as his 1954 paper "Distributional Structure," developing techniques to classify linguistic elements by their co-occurrence patterns and leading to segment-and-classify procedures for analyzing sentence structure through equivalence classes and transformations. These innovations in descriptivism provided empirical tools for syntactic description, directly informing subsequent formalizations of phrase structure.

Chomsky's Formulation

Noam Chomsky first systematically introduced phrase structure grammars as part of a broader exploration of formal models in his 1956 paper "Three Models for the Description of Language." In this work, he outlined three progressively more powerful frameworks for describing language: finite-state (Markovian) models, phrase structure grammars, and transformational grammars. Chomsky argued that finite-state models were inadequate for capturing the complexities of natural languages, such as long-distance dependencies and recursive structures, and positioned phrase structure grammars as a stronger alternative capable of generating hierarchical constituency relations through rewriting rules. However, he deemed even phrase structure grammars insufficient on their own to account fully for syntactic phenomena, necessitating the addition of transformations to handle relations between underlying and surface structures. Chomsky refined and elevated the role of phrase structure grammars in his seminal 1957 book Syntactic Structures, where he developed them into context-free phrase structure grammars serving as the foundational component of a generative approach to syntax. This formulation emphasized the power of context-free rules to produce underlying structures representing the syntactic organization of sentences, which could then be mapped to surface forms via obligatory and optional transformations. By integrating phrase structure rules with a transformational apparatus, Chomsky shifted linguistic theory toward a mechanistic, rule-based model that prioritized explanatory adequacy over mere descriptive coverage, marking a departure from structuralist traditions. This integration allowed for the systematic generation of all and only the grammatical sentences of a language, while excluding ungrammatical ones, thus establishing phrase structure as essential to the competence underlying human language use. A pivotal advancement occurred in Chomsky's 1965 Aspects of the Theory of Syntax, which formalized phrase structure rules as the base component within the Standard Theory of transformational grammar. Here, phrase structure rules were tasked with deriving deep structures from a finite set of lexical items and categories, providing the structural skeleton upon which transformations operate to yield surface structures interpretable by phonological and semantic systems. Chomsky emphasized that this base mechanism captures the innate linguistic knowledge enabling speakers to produce and understand novel sentences, integrating it with principles of universality and acquisition. This framework solidified phrase structure's centrality in syntactic theory, influencing subsequent developments in generative grammar. By the 1970s, Chomsky's evolving framework led to a refinement of phrase structure grammars through the introduction of X-bar theory, which imposed stricter constraints on rule formulation to reflect cross-categorial uniformity in phrase building. This shift enhanced the elegance and restrictiveness of the model without altering its generative foundations.

Formal Components

Phrase Structure Rules

Phrase structure rules form the foundational mechanism in phrase structure grammar for generating sentences by specifying how syntactic categories expand into sequences of constituents. These rules operate as rewrite rules, allowing a single non-terminal symbol to be replaced by a string of terminal and non-terminal symbols, thereby building hierarchical phrase structures from an initial symbol, typically S for sentence. The standard format of a phrase structure rule is X \to Y_1 Y_2 \dots Y_n, where X is a non-terminal on the left side, and the right side consists of one or more symbols Y_i, which can be either terminals or non-terminals. Non-terminals represent phrasal categories such as NP (noun phrase) or VP (verb phrase), which must be further expanded by additional rules, while terminals are lexical items, such as individual words like "cat" or "runs," that appear directly in the final sentence string without further rewriting. Recursion is a key property enabled by these rules, permitting the embedding of phrases within similar phrases to generate potentially infinite sentence structures. For instance, a rule like \text{NP} \to \text{Det} (\text{Adj})^* \text{N} allows determiners and adjectives to modify nouns, and recursive application of NP rules can embed one noun phrase inside another, as in relative clauses. Constraints on phrase structure rules ensure their practicality and generative power: the set of rules must be finite, yet capable of producing an infinite array of sentences through recursion, and basic formulations avoid devices such as left-recursion that can complicate parsing in unrestricted forms. A simple set of phrase structure rules for generating sentences includes:
S \to \text{NP VP}
\text{NP} \to \text{Det N}
\text{VP} \to \text{V NP}
These rules can produce sentences like "The cat chased the mouse" by successively rewriting non-terminals until only terminal symbols remain.
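As a rough illustration of how the rewriting process works, the sketch below applies the three rules above, plus hypothetical lexical rules for "the," "cat," "mouse," and "chased" that the text leaves implicit, always expanding the leftmost non-terminal and backtracking when the terminal string fails to match the target sentence. The dictionary encoding and helper names are illustrative assumptions, not part of any standard formalism.

```python
# Hypothetical encoding of the rules above plus lexical rules left implicit in the text.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"]],
    "N":   [["cat"], ["mouse"]],
    "V":   [["chased"]],
}
TERMINALS = {"the", "cat", "mouse", "chased"}

def derive(form, target, steps):
    """Rewrite the leftmost non-terminal at each step; backtrack until the
    resulting terminal string matches the target sentence."""
    if all(sym in TERMINALS for sym in form):
        return steps if form == target else None
    i = next(j for j, sym in enumerate(form) if sym not in TERMINALS)
    for rhs in RULES[form[i]]:
        new_form = form[:i] + rhs + form[i + 1:]
        result = derive(new_form, target, steps + [new_form])
        if result is not None:
            return result
    return None

for step in derive(["S"], "the cat chased the mouse".split(), [["S"]]):
    print(" ".join(step))   # S, NP VP, Det N VP, the N VP, the cat VP, ...
```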

Derivations and Tree Structures

In phrase structure grammar, sentences are generated through a derivation that starts with the initial symbol S and applies a sequence of rewriting rules to produce a terminal string of words. This stepwise replacement expands nonterminal symbols into combinations of nonterminals and terminals until only lexical items remain, ensuring the output conforms to the grammar's structural constraints. For instance, a simple derivation for the sentence "the cat chases the mouse" proceeds as follows: S → NP VP → Det N VP → the N VP → the cat VP → the cat V NP → the cat chases NP → the cat chases Det N → the cat chases the N → the cat chases the mouse. Phrase structure trees provide a graphical representation of these derivations, depicting the hierarchy of constituents as a diagram with nodes labeled by syntactic categories such as S (sentence), NP (noun phrase), and VP (verb phrase). Branches from a node illustrate immediate dominance relations, where a parent category directly expands into daughter constituents, while the leaves of the tree correspond to the words in the terminal string. These trees visualize constituency relations by showing how phrases embed within larger units, with dominance indicating broader inclusion and precedence reflecting linear order. Syntactic ambiguity arises when a single sentence admits multiple valid derivations and corresponding tree structures, leading to distinct interpretations. A classic example is "I saw the man with the telescope," which yields two phrase structure trees: one attaching the prepositional phrase "with the telescope" to the verb phrase (indicating the speaker used a telescope to see the man), and another attaching it to the noun phrase "the man" (indicating the man possessed the telescope). In the first tree, the PP attaches within the verb phrase (for example, via VP → V NP PP), where it modifies the verb; in the second, the PP attaches within the object noun phrase (NP → NP PP), where it modifies the noun. To illustrate a complete phrase structure tree, consider Chomsky's example "Colorless green ideas sleep furiously," which demonstrates grammaticality independent of semantic coherence. The derivation begins with S → NP VP, expands NP → Adj Adj N (yielding "colorless green ideas") and VP → V Adv (yielding "sleep furiously"), terminating at the lexical string. The resulting tree has S as the root, branching to NP (with successive Adj daughters "colorless" and "green" modifying N "ideas") and VP (with V "sleep" and Adv "furiously" as sisters), highlighting how the grammar assigns structure despite the sentence's nonsensical meaning.
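The prepositional-phrase ambiguity described above can be reproduced mechanically. The sketch below, a minimal illustration using NLTK with a toy grammar chosen here to license both attachments (the rules are assumptions, not taken from the cited sources), returns one tree with the PP inside the verb phrase and one with it inside the object noun phrase.

```python
import nltk

# Toy grammar assumed for illustration: VP -> VP PP licenses verb attachment,
# NP -> NP PP licenses noun attachment of the prepositional phrase.
grammar = nltk.CFG.fromstring("""
    S -> NP VP
    NP -> Det N | NP PP | 'I'
    VP -> V NP | VP PP
    PP -> P NP
    Det -> 'the'
    N -> 'man' | 'telescope'
    V -> 'saw'
    P -> 'with'
""")

parser = nltk.ChartParser(grammar)
trees = list(parser.parse("I saw the man with the telescope".split()))
print(len(trees))        # 2 -- one tree per attachment site
for tree in trees:
    tree.pretty_print()
```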

Variants and Extensions

Context-Free Grammars

Context-free grammars (CFGs), also known as type-2 grammars in the Chomsky hierarchy, are phrase structure grammars where each production rule is of the form A \to \alpha, with A a single nonterminal symbol and \alpha any finite string of terminals and/or nonterminals. This structure ensures that the application of a rule depends only on the nonterminal being rewritten, without regard to its surrounding context, allowing for recursive and hierarchical descriptions of syntactic structures. Formalized by Noam Chomsky in his seminal 1957 work, CFGs provide a generative mechanism suitable for modeling the constituent structure of natural language sentences. A key standardization for CFGs is Chomsky normal form, which restricts productions to either A \to BC (where B and C are nonterminals) or A \to a (where a is a terminal), excluding the empty string except possibly for the start symbol. Every CFG can be transformed into an equivalent grammar in this form without altering the generated language, facilitating efficient algorithmic processing such as parsing. This normal form simplifies proofs of language properties and enables cubic-time recognition algorithms by limiting rule complexity. In the Chomsky hierarchy, introduced in 1956, type-2 grammars generate context-free languages, which include all regular languages but extend to more complex patterns like balanced parentheses or nested constructions common in syntax. These languages capture the majority of natural language phenomena, such as recursive embedding, but are insufficient for certain dependencies, including the cross-serial dependencies observed in languages like Swiss German, where multiple interleaved structures cannot be described without context sensitivity. Parsing with CFGs typically involves bottom-up dynamic programming, as exemplified by the CYK algorithm, which determines membership in the language generated by a CFG in Chomsky normal form. The algorithm fills a triangular table in which each cell [i, j] records the nonterminals deriving the substring spanning positions i to j, checking possible derivations in O(n^3) time for input length n, making it practical for syntactic analysis. For illustration, consider a CFG fragment modeling basic English sentences with recursive noun phrases:
\begin{align*} &S \to \text{NP VP} \\ &\text{VP} \to \text{V NP} \\ &\text{NP} \to \text{Det N} \mid \text{NP PP} \\ &\text{PP} \to \text{P NP} \\ &\text{Det} \to \text{the} \mid \text{a} \\ &\text{N} \to \text{dog} \mid \text{cat} \\ &\text{V} \to \text{chases} \mid \text{sees} \\ &\text{P} \to \text{with} \mid \text{in} \end{align*}
This grammar generates sentences like "the dog chases a cat" and recursively extends noun phrases, as in "the dog with a cat chases the cat," demonstrating how CFGs handle unbounded recursion in constituents like NPs to model hierarchical syntax.
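A compact recognizer illustrates the table-filling idea behind CYK. The sketch below assumes the fragment above, which already happens to satisfy the A → BC / A → a shape of Chomsky normal form; the dictionary encoding and function name are illustrative choices, not taken from any cited implementation. It checks whether S can be derived over the whole input.

```python
# The CFG fragment above, encoded as Chomsky-normal-form rules (illustrative names).
UNARY = {            # A -> a
    "Det": {"the", "a"},
    "N":   {"dog", "cat"},
    "V":   {"chases", "sees"},
    "P":   {"with", "in"},
}
BINARY = {           # A -> B C
    "S":  {("NP", "VP")},
    "VP": {("V", "NP")},
    "NP": {("Det", "N"), ("NP", "PP")},
    "PP": {("P", "NP")},
}

def cyk_recognize(words):
    n = len(words)
    # table[i][j] holds the nonterminals that derive words[i..j] inclusive.
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][i] = {A for A, terms in UNARY.items() if w in terms}
    for span in range(2, n + 1):                  # length of the substring
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):                 # split point
                for A, rhss in BINARY.items():
                    for B, C in rhss:
                        if B in table[i][k] and C in table[k + 1][j]:
                            table[i][j].add(A)
    return "S" in table[0][n - 1]

print(cyk_recognize("the dog chases a cat".split()))               # True
print(cyk_recognize("the dog with a cat chases the cat".split()))  # True
```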

Advanced Models

Head-driven phrase structure grammar (HPSG) represents a significant advancement in phrase structure frameworks by emphasizing the role of lexical information in constraining syntactic combination through unification-based mechanisms. Developed by Carl Pollard and Ivan A. Sag, HPSG integrates detailed lexical rules with phrase structure schemata, allowing for a highly declarative approach in which linguistic signs are encoded as feature structures that unify during derivation. This model addresses limitations in earlier phrase structure grammars by treating signs—objects combining phonological, syntactic, and semantic information—as the basic units, enabling flexible handling of phenomena like long-distance dependencies without resorting to transformations. For instance, in HPSG a verb's subcategorization frame specifies the feature structures of its complements, ensuring that phrase structures emerge from lexical constraints rather than rigid rewrite rules. Lexical-functional grammar (LFG), proposed by Joan Bresnan, extends traditional phrase structure by positing parallel representational levels: constituent structure (c-structure), which captures phrase trees similar to those in context-free grammars, and functional structure (f-structure), which encodes grammatical relations like subject and object independently of linear order. This dual-level architecture overcomes issues in single-structure models by allowing mismatches between surface phrase organization and abstract functional roles, as formalized in the original system where c-structure trees map via annotated equations to attribute-value matrices in f-structure. For example, in languages with free word order, LFG maintains consistent f-structures while varying c-structures, providing a more modular account of syntax that integrates well with morphology and semantics. The framework's constraint-based nature facilitates computational implementation and cross-linguistic analysis without heavy reliance on movement operations. Tree-adjoining grammar (TAG), introduced by Aravind Joshi, enhances phrase structure expressiveness by incorporating mild context-sensitivity through operations on elementary trees, which carry distinguished sites for substitution and adjoining. Unlike purely context-free models, TAG allows the attachment of auxiliary trees at specific nodes (adjoining), enabling the modeling of dependencies such as cross-serial dependencies or agreement without unbounded crossing branches, while remaining parsable in polynomial time. This formalism preserves constituency relations but extends them to handle linguistic phenomena requiring limited context, as demonstrated in analyses of relative clauses where an auxiliary tree adjoins to embed the relative pronoun's gap. TAG's tree-local nature ensures that phrase structures remain hierarchical and interpretable, bridging generative and computational traditions. X-bar theory, formalized by Ray Jackendoff, provides a universal schema for phrase structure by positing binary-branching hierarchies in which a head (X⁰) combines with its complements to form an intermediate projection (X′), which in turn combines with an optional specifier to form the maximal projection (X″ or XP), ensuring endocentricity across categories like noun phrases and verb phrases. This templatic approach constrains possible phrase structures to a limited set, addressing variability in basic rewrite rules by generalizing that heads project through bar-levels to maximal projections via iterative merging. For example, in a verb phrase, the head verb combines with its object to form V′, which then takes a specifier to yield V″, promoting uniformity in how arguments are integrated into phrases.
Jackendoff's schema influenced subsequent models by emphasizing headedness and projection principles, facilitating cross-categorial generalizations. In the evolution toward Noam Chomsky's Minimalist Program during the 1990s, phrase structure grammar progressed to bare phrase structure, eliminating bar-levels and fixed schemata in favor of set-theoretic merges driven by selectional features, as a way to derive hierarchical structures from general computational principles without language-specific stipulations. This shift, detailed in Chomsky's work, reinterprets X-bar-like projections as emergent from head-complement and specifier relations in a binary merge operation, reducing the apparatus to bare essentials while maintaining constituency. For instance, a phrase forms simply as {head, complement} or {specifier, {head, complement}}, with labeling determined dynamically by feature checking rather than by predefined categories. This minimalist refinement aims to optimize phrase structure derivation within the broader economy constraints of the language faculty.
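As a very loose illustration of the set-theoretic view, the toy sketch below (a simplification under assumptions of this article's editors, not Chomsky's formal definition) builds {head, complement} and {specifier, {head, complement}} objects and projects the label of the head at each step.

```python
# Toy sketch of bare-phrase-structure Merge: syntactic objects are nested pairs,
# and the label of the result is projected from the head (a simplification).
def lex(word, category):
    """A lexical item: the minimal syntactic object."""
    return {"label": category, "members": word}

def merge(head, other):
    """Combine two syntactic objects; the head projects its label."""
    return {"label": head["label"], "members": (head, other)}

# {V, N} -- the verb merges with its complement...
v_comp = merge(lex("sees", "V"), lex("Mary", "N"))
# ...and the resulting V projection merges with a specifier: {N, {V, N}}.
clause = merge(v_comp, lex("John", "N"))
print(clause["label"])    # 'V' -- the head determines the label at each step
```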

Applications and Impact

Syntactic Theory

Phrase structure grammar (PSG) serves as a foundational component in transformational generative grammar, particularly in Chomsky's early formulations, where it generates the deep structure of sentences as a base for further syntactic operations. In this framework, PSG rules produce hierarchical representations that capture the underlying syntactic relations, which are then modified by transformational rules to yield surface structures, enabling the theory to account for the systematic differences between underlying and observed sentence forms prior to the 1980s shift toward principles and parameters. The integration of PSG into generative theory underscores its role in modeling the innate aspects of universal grammar (UG), positing that humans possess biologically endowed principles of phrase structure that facilitate language acquisition across diverse linguistic environments. Chomsky's theory suggests that these innate phrase structure principles, such as hierarchical organization and constituency relations, form part of UG, allowing children to converge on the grammar of their native language despite limited input. This view implies a universal blueprint for syntactic structure, where variations among languages arise from parameter settings within a fixed set of innate constraints. In terms of empirical coverage, PSG effectively models key syntactic phenomena, including recursion, which permits the unbounded embedding of phrases within larger structures, and clausal embedding more generally, which allows complex integration while respecting constraints like islands that limit extraction across certain boundaries. These capabilities demonstrate PSG's strength in capturing the bounded creativity of syntax, where derivations serve as analytical tools to reveal underlying relations without violating structural limits. Cross-linguistically, PSG accommodates variations in phrase order, as seen in head-initial languages like English, where heads precede their complements, contrasted with head-final languages like Japanese, where heads follow complements, yet both maintain universal hierarchical principles in their tree representations. This parametric variation highlights how PSG, within UG, explains typological differences while preserving core syntactic universals. Furthermore, PSG influences semantic interpretation by delineating phrase boundaries that guide how constituents map to meaning, with deep structures providing the primary input for semantic rules in generative models. This linkage ensures that phrase-level constituency directly informs compositional semantics, bridging syntax and interpretation through hierarchical organization.

Computational Uses

Phrase structure grammars form the foundation for several key parsing algorithms in natural language processing, enabling the automatic analysis of sentence structure. Top-down parsing, such as recursive descent, begins with the start symbol and recursively expands non-terminals according to the rules, predicting the structure from the top of the tree downward. This approach is straightforward to implement for predictive grammars but can suffer from excessive backtracking in ambiguous cases. Bottom-up parsing, exemplified by shift-reduce methods, starts from the input words and builds upward by shifting tokens onto a stack and reducing them when a production matches, efficiently handling left-recursive rules. Chart parsing, a dynamic programming technique, generalizes both strategies by maintaining a chart of partial parses to avoid redundant computations, making it suitable for any context-free grammar and achieving O(n^3) time in the worst case. In computational linguistics, phrase structure grammars underpin statistical parsing through probabilistic context-free grammars (PCFGs), which assign probabilities to rules derived from treebanks to disambiguate competing analyses. PCFGs enable inside-outside algorithms for estimating rule probabilities and Viterbi parsing to find the most likely tree, improving accuracy in syntactic analysis. Early PCFG models achieved parsing accuracies around 80-85% on Wall Street Journal sections, establishing them as a benchmark for statistical parsing in natural language processing. Phrase structure trees play a crucial role in machine translation by facilitating syntactic alignment and generation. In syntax-based models, parse trees guide the alignment of source and target phrases, preserving hierarchical structure to handle reordering and long-distance dependencies more effectively than flat phrase-based systems. For target-language generation, tree-to-string transduction rules map source trees to target strings, enhancing fluency in systems that extend phrase-based statistical machine translation with grammatical constraints. Modern libraries integrate these techniques for practical use. The Natural Language Toolkit (NLTK) supports recursive descent, shift-reduce, and chart parsers for context-free grammars, allowing users to define custom grammars and generate parse trees from sentences. The Stanford Parser employs PCFGs and lexicalized variants to produce dependency and constituency parses, achieving high performance on English with F1 scores around 91-92% on standard benchmarks (as of its 2020 release). More recently, phrase structure grammars have been incorporated into neural architectures, such as self-attentive encoders and large language models, achieving state-of-the-art F1 scores over 96% on benchmarks like the Penn Treebank as of 2025. Despite these advances, phrase structure grammars face limitations in handling ambiguity, where a single sentence can yield exponentially many parses, and efficiency challenges with large grammars, often requiring O(n^3) time. Solutions include probabilistic pruning, such as beam search in PCFG parsing, which discards low-probability partial parses to maintain tractability on real-world inputs.
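For example, NLTK ships both plain chart parsers and a Viterbi parser for PCFGs; the sketch below uses toy rule probabilities invented here for illustration (not estimated from a treebank) to select the most probable tree for the ambiguous sentence discussed earlier.

```python
import nltk

# A toy PCFG: structure follows the earlier ambiguity example; the probabilities
# are invented for illustration rather than estimated from a treebank.
pcfg = nltk.PCFG.fromstring("""
    S -> NP VP [1.0]
    VP -> V NP [0.7] | VP PP [0.3]
    NP -> Det N [0.6] | NP PP [0.2] | 'I' [0.2]
    PP -> P NP [1.0]
    Det -> 'the' [1.0]
    N -> 'man' [0.5] | 'telescope' [0.5]
    V -> 'saw' [1.0]
    P -> 'with' [1.0]
""")

# The Viterbi parser runs dynamic programming over the chart and returns
# the single most probable tree, resolving the PP-attachment ambiguity.
parser = nltk.ViterbiParser(pcfg)
for tree in parser.parse("I saw the man with the telescope".split()):
    print(tree.prob())
    tree.pretty_print()
```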
