
Dependency grammar

Dependency grammar is a theoretical framework in linguistics that analyzes the syntactic structure of sentences as a set of binary, asymmetrical relations—known as dependencies—between individual words, typically represented as directed trees with the verb serving as the central node. Unlike constituency grammars, which emphasize hierarchical phrase structures, dependency grammar focuses exclusively on word-to-word connections, positing that one word (the head) governs another (the dependent) without intermediate non-terminal nodes. This approach highlights functional relations such as subject, object, and modifiers, often incorporating the notion of valency, which specifies the number and type of complements a verb requires to form a complete syntactic unit.

The tradition traces its roots to ancient grammatical theories, including Pāṇini's Sanskrit grammar around 350 BCE, which implicitly recognized semantic and syntactic dependencies, and the Stoic logicians' verb-centered analyses in antiquity. Medieval developments, influenced by figures like Boethius and Arabic grammarians such as Ibn al-Sarrāğ, further emphasized head-dependent relations in syntax. Modern dependency grammar emerged in the mid-20th century with Lucien Tesnière's Éléments de syntaxe structurale (1959), which formalized dependency trees (or stemmas) as a tool for structural analysis, prioritizing the verb's governing role over linear word order. Subsequent advancements by scholars like Igor Mel'čuk in Meaning-Text Theory (1988) integrated dependencies across multiple strata from semantics to surface form, while Richard Hudson's Word Grammar (1984) developed a monostratal, cognitive-oriented variant.

Key principles include projectivity, where dependency arcs do not cross in surface word order, and dependency directionality, which varies typologically across languages (e.g., head-initial or head-final patterns). Dependency grammars address phenomena like non-projective structures in free-word-order languages through extensions such as pseudo-projective parsing. In practice, the framework has proven influential in natural language processing, particularly for statistical parsing algorithms like those developed by Joakim Nivre (2003), and in cross-linguistic typology via projects such as Universal Dependencies, which annotates treebanks for 186 languages as of November 2025 to facilitate comparative syntax and machine translation. These applications underscore dependency grammar's emphasis on efficiency in capturing universal syntactic patterns while accommodating language-specific variations.

Core Concepts

Definition and Principles

Dependency grammar (DG) is a framework for modeling the syntax of natural languages through directed binary relations between words, known as dependencies, where each word except one designated root has precisely one head that governs it. This approach emphasizes the hierarchical organization of sentences around lexical heads, contrasting with constituency-based models that group words into phrases. Its roots trace back to ancient grammatical traditions, such as Pāṇini's analysis of Sanskrit.

At the core of DG are several foundational principles that define its structure. The head-dependent asymmetry posits that dependencies are asymmetric, with the head word governing its dependent(s), often reflecting semantic or syntactic subordination, as originally articulated by Lucien Tesnière in his seminal work on structural syntax. The single-head constraint ensures that no word has more than one head, preventing multiple incoming dependencies and maintaining a unique path from any word to the root. Additionally, the root node serves as the sentence's primary head—typically the main verb—with no incoming arc, anchoring the entire structure.

These principles manifest in the analysis of simple sentences. For instance, in the English sentence "The cat sleeps," the verb "sleeps" functions as the root, the noun "cat" depends on "sleeps" as its subject, and the determiner "The" depends on "cat" as its modifier. This yields a dependency tree where arcs point from heads to dependents, illustrating the word-to-word connections without intermediate phrasal nodes. A key property of DG is that every non-root word has exactly one incoming arc, ensuring the structure is connected, acyclic, and thus forms a tree. Many formulations also enforce projectivity, where dependency arcs do not cross, though extensions allow non-projective dependencies for freer word orders.
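The "The cat sleeps" analysis and the single-head and root constraints can be made concrete with a minimal Python sketch. The encoding (word indices, labels, and the helper function) is illustrative and not tied to any particular toolkit.

```python
# Minimal sketch: encode the dependency analysis of "The cat sleeps"
# as (dependent_index, head_index, label) triples, with 0 as a notional root,
# and check the single-head and single-root constraints described above.

WORDS = ["The", "cat", "sleeps"]          # 1-indexed in the arcs below
ARCS = [
    (1, 2, "det"),    # "The"    depends on "cat"
    (2, 3, "nsubj"),  # "cat"    depends on "sleeps"
    (3, 0, "root"),   # "sleeps" is the root (head index 0)
]

def check_single_head(words, arcs):
    """Every word must have exactly one head; exactly one word may have head 0."""
    heads = {}
    for dep, head, _ in arcs:
        if dep in heads:
            raise ValueError(f"word {dep} has more than one head")
        heads[dep] = head
    missing = [i for i in range(1, len(words) + 1) if i not in heads]
    roots = [dep for dep, head in heads.items() if head == 0]
    return not missing and len(roots) == 1

print(check_single_head(WORDS, ARCS))  # True
```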

Dependencies and Heads

In dependency grammar, a dependency relation links a head word to one or more dependent words, where the head governs the dependents by determining key properties of the syntactic unit they form. The head is the central element that carries the primary syntactic and semantic role, while dependents provide additional specification or modification. This head-dependent relation forms the core of the grammar's structure, distinguishing it from phrase-based approaches by focusing solely on word-to-word connections without intermediate phrasal nodes.

Head selection relies on multiple criteria to identify the governing word in a dependency. Semantically, the head determines the overall meaning of the construction, with dependents serving to specify or restrict it, as the head acts as the predicate to which arguments relate. Morphologically, the head imposes agreement or government on its dependents, such as through inflectional marking that aligns the dependent's form to the head's requirements. Syntactically, the head subcategorizes for its dependents via valency frames, dictating which dependents are obligatory and their positions relative to the head. Prosodically, the head contributes to the unit's rhythmic integrity, often ensuring prosodic unity where clitics or enclitics attach to the head for stress or boundary alignment. These criteria, drawn from frameworks like Meaning-Text Theory, collectively ensure consistent identification of heads across languages.

Dependents fall into two primary types based on their relation to the head's valency. Arguments are required elements specified by the head's valency, such as direct objects that complete the head verb's meaning and cannot be omitted without affecting grammaticality; they are non-repeatable and controlled by the head's semantic roles. Modifiers, or adjuncts, are optional and add non-essential information, such as adjectives describing a head noun or adverbs qualifying a verb; these can be repeated and are not valence-bound, allowing flexible attachment to the head. This distinction underscores how dependencies encode both propositional content (via arguments) and elaboration (via modifiers).

Dependency directionality concerns the linear order of heads and dependents, varying by language typology while the tree arcs always point from head to dependent. Head-initial languages position the head before its dependents in certain constructions, as seen in English verb phrases where the verb precedes its object (e.g., "eat apples"). Conversely, head-final languages place the head after dependents, characteristic of Japanese, where verbs follow their arguments and modifiers precede the head noun (e.g., postpositional phrases). English exhibits mixed patterns, with head-final modifiers in noun phrases (adjectives before nouns), reflecting language-specific conventions overlaid on the universal dependency direction.

Formally, a dependency tree is modeled as a rooted directed tree, where nodes represent words and directed edges indicate dependencies from heads to dependents. The root (often the finite verb or main predicate) has no incoming edge, while every non-root node has exactly one incoming edge (in-degree of 1), ensuring projective or non-projective structures without cycles. Heads may have unbounded outgoing edges (out-degree), allowing multiple dependents, which supports hierarchical yet flat representations of sentence structure.
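The graph-theoretic conditions just listed (exactly one root, in-degree 1 for every other node, no cycles, all words reachable from the root) can be verified directly. The sketch below is illustrative; it represents a tree simply as a dictionary mapping each word position to its head position.

```python
# Sketch of the well-formedness conditions: every non-root node has in-degree 1
# (enforced by the dict representation), exactly one node attaches to the
# artificial position 0, and following head links from any word reaches the
# root without revisiting a node (no cycles), which also guarantees connectedness.

def is_well_formed_tree(heads):
    roots = [w for w, h in heads.items() if h == 0]
    if len(roots) != 1:
        return False                     # exactly one root
    for w in heads:
        seen, cur = set(), w
        while cur != 0:
            if cur in seen:
                return False             # cycle detected
            seen.add(cur)
            cur = heads[cur]
    return True

# "The cat sleeps": The -> cat, cat -> sleeps, sleeps -> ROOT
print(is_well_formed_tree({1: 2, 2: 3, 3: 0}))   # True
print(is_well_formed_tree({1: 2, 2: 1, 3: 0}))   # False (cycle between 1 and 2)
```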

Historical Development

Origins in Traditional Grammar

The roots of dependency grammar can be traced to ancient linguistic traditions, particularly Pāṇini's Aṣṭādhyāyī, a foundational grammar of Sanskrit composed around the 4th century BCE. Pāṇini formalized syntactic-semantic relations through the kāraka system, which identifies roles such as agent (kartā), patient (karma), and instrument (karaṇa) as dependencies linking nouns to verbs in a sentence. These kāraka relations function as proto-dependencies, emphasizing direct word-to-word connections centered on the verb, much like the head-dependent structures in modern dependency grammar, without relying on phrase-level hierarchies. This approach prioritized the verb's centrality in sentence construction, providing an early model for asymmetrical syntactic relations. In antiquity, Stoic logicians further developed verb-centered analyses, viewing the sentence as governed by the verb with dependencies among words, influencing later theories.

In European linguistic traditions, dependency-like ideas emerged through medieval and early modern grammars. Influenced by figures like Boethius (c. 480–524 CE), who introduced concepts of determinatio to describe semantic roles and head-dependent specifications across word classes, these developments emphasized modifier-modified relations. Building on concepts of government and dependentia from 12th-century grammarians, the Port-Royal Grammar of 1660 further described how subordinate elements modify a principal one. Lucien Tesnière's pre-dependency contributions in the early 20th century developed these ideas further; in his 1934 article "Comment construire une syntaxe?", he proposed the stemma technique—a graphical representation of sentence structure as branching dependencies from a central verb—to support structural analysis in translation and teaching. Tesnière refined this method by 1936, classifying structural patterns across 190 languages and drawing on influences like Antoine Meillet's work to emphasize verb-driven hierarchies over linear sequences.

Non-Western grammatical traditions also paralleled dependency concepts by focusing on inter-word relations rather than phrasal units. In the Arabic grammatical tradition, as codified in Ibn al-Sarrāğ's Kitāb al-Uṣūl (d. 928 CE), syntax centered on the ʿāmil (head or governor) and its maʿmūl fīhi (dependent), establishing hierarchical word dependencies that influenced later linguistic thought through cultural exchanges. These approaches highlighted relational asymmetries at the word level, aligning with dependency principles and contrasting with the emerging phrase-structure focus in 20th-century linguistics.

A pivotal shift occurred in the mid-20th century with Lucien Tesnière's Éléments de syntaxe structurale (1959), which posthumously formalized valency-based dependencies as the core of structural syntax. Drawing on his earlier stemma work, Tesnière defined valency as the verb's capacity to govern actants (obligatory dependents) and circumstants (optional ones), providing a systematic framework for analyzing syntactic connections across languages. This publication synthesized traditional relational ideas into a cohesive theory, marking the transition from informal historical precedents to modern dependency grammar.

Modern Formulations and Key Theorists

The modern era of dependency grammar (DG) began in the mid-20th century with Lucien Tesnière's seminal work, Éléments de syntaxe structurale (1959), which formalized the dependency relation as the core of syntactic structure and introduced valency theory to describe the combinatorial properties of words as governors and dependents. Tesnière's approach emphasized dependency links over constituency, using dependency trees (stemmata) to represent hierarchical relations without phrase boundaries, influencing subsequent computational and theoretical developments. Parallel advancements emerged in computational linguistics through David G. Hays's early algorithmic approaches to dependency analysis, as outlined in his 1964 paper on dependency theory, which provided a formalism for dependency grammars and results on their equivalence to immediate constituent grammars. Zellig Harris contributed indirectly through his distributional and transformational analyses in the 1950s, where inter-word dependencies informed mappings between sentence forms, bridging structuralist methods to generative paradigms. In the Prague School tradition, Petr Sgall and collaborators developed functional dependency grammar via the Functional Generative Description (FGD) framework starting in the 1960s, integrating tectogrammatical representations to capture underlying functional relations beyond surface syntax.

Post-1980s formulations expanded DG's scope with cognitive and semantic orientations. Richard Hudson's Word Grammar (1984) reframed dependencies as a network of word-to-word relations, emphasizing psychological reality and inheritance hierarchies to model syntactic generalizations without phrase structure. Igor Mel'čuk's Meaning-Text Theory (MTT), initiated in the 1970s and refined through the 1990s, incorporated deep syntactic dependencies into a multi-level model linking semantics to surface text, using lexical functions to handle valency and semantic integration. In the 21st century, DG has been related to minimalist syntax, as seen in work deriving dependency structures from merge operations in Minimalist Grammars, offering alternatives to constituency by prioritizing head-dependent asymmetries. Corpus-driven advancements culminated in the Universal Dependencies (UD) project, launched in 2014, which standardizes DG annotations for cross-linguistic comparability; as of 2025, UD encompasses 319 treebanks across 179 languages, facilitating multilingual parsing and typological research.

Comparison to Phrase Structure Grammar

Structural Differences

Dependency grammar (DG) and phrase structure grammar (PSG) differ fundamentally in their representational architecture, with DG emphasizing direct binary relations between words without intermediate non-terminal nodes, while PSG relies on hierarchical binary-branching structures composed of phrasal constituents. In DG, syntax is captured through head-dependent relations among lexical items alone, treating phrases as derived rather than primitive elements, as articulated in Lucien Tesnière's foundational framework. Conversely, PSG, as formalized by Noam Chomsky, posits constituency as a core primitive, where words are organized into nested phrases such as noun phrases (NPs) and verb phrases (VPs) to build the syntactic tree. This rejection of explicit constituency in DG allows it to view phrasal groupings as emergent properties arising from the network of dependencies, a perspective further developed by Igor Mel'čuk.

To illustrate these differences, consider the sentence "The big dog barked." In a DG analysis, "barked" serves as the head verb, with "dog" directly dependent on it as its subject, "big" as a modifier of "dog," and "the" as a determiner of "dog," forming a flat structure of binary links without grouping into intermediate phrases. In PSG, however, "the big dog" is first bundled into a noun phrase (NP) constituent via binary branching before the NP attaches to the verb "barked" within a verb phrase (VP). This contrast highlights DG's avoidance of phrase-level nodes in favor of word-to-word asymmetries.

A key structural implication of DG's design is its independence from linear precedence in defining core relations, enabling more straightforward handling of free word order languages compared to PSG's reliance on fixed constituent boundaries and order. In DG, dependencies are primarily relational rather than positional, allowing variations in word order to alter surface linearization without disrupting the underlying head-dependent structure, whereas PSG often requires additional mechanisms like movement rules to accommodate such flexibility.
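The contrast between the two analyses of "The big dog barked" can be written down directly. The encodings below are hypothetical illustrations, not the output format of any particular grammar or toolkit.

```python
# Illustrative contrast for "The big dog barked": DG uses one node per word
# with direct labeled arcs, while PSG groups words into nested phrasal
# constituents (here as (label, children...) tuples in a PTB-like bracketing).

# Dependency analysis: (dependent, head, relation); "barked" is the root.
dg_arcs = [
    ("The", "dog", "det"),
    ("big", "dog", "amod"),
    ("dog", "barked", "nsubj"),
    ("barked", None, "root"),
]

# Phrase-structure analysis: nested constituents with NP and VP nodes.
psg_tree = ("S",
            ("NP", ("DT", "The"), ("JJ", "big"), ("NN", "dog")),
            ("VP", ("VBD", "barked")))

print(f"{len(dg_arcs)} word-to-word arcs vs. a nested tree rooted in {psg_tree[0]}")
```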

Analytical Advantages and Limitations

Dependency grammar offers several analytical advantages over phrase structure grammar (PSG), particularly in its structural simplicity and applicability to diverse language types. By representing sentences as trees with one node per word and direct head-dependent relations, dependency grammar avoids the intermediate phrasal nodes required in PSG, resulting in flatter structures that reduce representational complexity and facilitate clearer identification of syntactic roles. This parsimony is especially beneficial for non-configurational languages with free word order, where PSG's emphasis on fixed constituency can impose artificial constraints, whereas dependency grammar directly links words regardless of linear position, enabling more flexible analyses. Computationally, projective dependency parsing algorithms, such as the Eisner algorithm, achieve O(n^3) complexity similar to CKY parsing for PSG but with lower constant factors due to the absence of non-terminal categories, making dependency grammar more efficient for large-scale processing in practice.

Despite these strengths, dependency grammar faces limitations in handling certain phenomena that PSG captures more intuitively. Long-distance dependencies often require non-projective structures, where arcs cross, complicating parsing and increasing computational demands beyond polynomial time for general cases, necessitating extensions like graph-based representations. Similarly, phrase-based phenomena like coordination challenge dependency grammar's tree-based format, as conjuncts may not form clear head-dependent hierarchies without additional mechanisms such as dedicated coordination relations or multi-head analyses, potentially leading to less intuitive analyses compared to PSG's constituent groupings.

Cross-linguistically, dependency grammar demonstrates particular efficacy in head-marking languages, where grammatical relations are primarily indicated on heads (e.g., verbs marking arguments via affixes), aligning naturally with its head-centric approach; for instance, Turkish, an agglutinative language with flexible word order, benefits from dependency representations that accommodate frequent non-projectivity without relying on rigid phrases. In contrast, PSG may align better with dependent-marking languages like English, where case and agreement markers appear on dependents, making configurational phrases a natural vehicle for encoding grammatical relations. This typological contrast underscores dependency grammar's flexibility in modeling head-initial or head-final dependencies across languages.

A notable debate surrounds dependency grammar's theoretical adequacy, with critics in the generative tradition favoring phrase structure models on the grounds that dependency grammar's strong generative capacity cannot capture the full range of hierarchical structures, even though projective dependency grammars are weakly equivalent to context-free grammars in terms of the languages they generate. Proponents counter that dependency grammar achieves robust empirical coverage in multilingual corpora, as evidenced by high attachment scores in Universal Dependencies treebanks (e.g., 80-90% labeled accuracy across languages), demonstrating practical adequacy without the added layers of phrase-structure formalism.
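The attachment scores cited above are standard evaluation metrics for dependency parsers: unlabeled attachment score (UAS) counts words whose head is predicted correctly, and labeled attachment score (LAS) additionally requires the correct relation label. A minimal sketch of the computation, with trees represented as dictionaries for illustration:

```python
# Hedged sketch of attachment-score metrics: trees map word index -> (head, label).

def attachment_scores(gold, predicted):
    correct_heads = correct_labeled = 0
    for i, (g_head, g_label) in gold.items():
        p_head, p_label = predicted[i]
        if p_head == g_head:
            correct_heads += 1
            if p_label == g_label:
                correct_labeled += 1
    n = len(gold)
    return correct_heads / n, correct_labeled / n   # (UAS, LAS)

gold = {1: (2, "det"), 2: (3, "nsubj"), 3: (0, "root")}
pred = {1: (2, "det"), 2: (3, "obj"),  3: (0, "root")}
print(attachment_scores(gold, pred))   # (1.0, 0.666...): one label is wrong
```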

Formal Models

Dependency Grammar Formalisms

Dependency grammar can be formalized as a generative system that defines well-formed sentences through binary relations between words, without recourse to intermediate constituents. A grammar G can be defined by components including a set of words W, a set of dependency relations R ⊆ W × W (asymmetric and directed), and a set of labels Σ for dependency types (e.g., subject, object). This structure generates dependency trees via head-selection functions specified in the lexicon, where each lexical entry includes a head word and its possible dependents.

The generative rules of a dependency grammar proceed recursively, starting with the selection of a root word, which serves as the sentence's main head and has no incoming arc. Dependents are then attached to this root or to subsequently selected heads, guided by subcategorization frames or valency specifications in the lexicon; these frames define the permissible number, category, and linear order of complements (obligatory dependents) and modifiers (optional dependents) for each head. For instance, a verbal head might specify a valency frame requiring one subject and one object dependent, with attachments forming directed arcs from head to dependent. This process continues until all words in the sentence are incorporated, yielding a complete dependency tree.

Key properties of dependency grammars include the guarantee of tree-like structures, where the resulting graph is connected, acyclic, and single-rooted. In projective dependency grammars, which enforce non-crossing arcs to align with surface word order, a well-formed grammar often produces a unique tree per sentence under deterministic rules, though real-world grammars may exhibit local ambiguities resolvable via preference mechanisms such as attachment heuristics or scoring functions.

A dependency tree T is mathematically represented as T = (V, E), where V ⊆ W is the set of vertices corresponding to the n words in the sentence, and E ⊆ V × V × Σ is the set of labeled directed edges denoting dependencies, satisfying |E| = n − 1 to ensure the structure forms a tree with exactly one root and no cycles. This formalization aligns with the head-dependent principles outlined in core dependency theory, emphasizing asymmetric relations between heads and their dependents.
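The recursive generation procedure described above can be sketched with a toy lexicon whose entries pair each head with a frame of dependency relations. The lexicon, the relation labels, and the filler assignment below are hypothetical and serve only to illustrate how attachments satisfy frames; word forms stand in for tokens.

```python
# Toy illustration of the generative procedure: a hypothetical lexicon assigns
# each head a frame (a list of required dependency relations), and a tree is
# built by recursively attaching one filler word per slot of each head.

LEXICON = {
    "sleeps": ["nsubj"],            # intransitive verb: one subject slot
    "sees":   ["nsubj", "obj"],     # transitive verb: subject and object slots
    "cat":    ["det"],              # nouns take a determiner in this toy grammar
    "dog":    ["det"],
    "the":    [],
}

def generate(head, fillers):
    """Attach one filler per slot in the head's frame; return labeled arcs."""
    arcs = []
    for relation, dependent in zip(LEXICON[head], fillers.get(head, [])):
        arcs.append((dependent, head, relation))
        arcs.extend(generate(dependent, fillers))   # expand the dependent in turn
    return arcs

# Build "the cat sees the dog" with "sees" as the root word.
fillers = {"sees": ["cat", "dog"], "cat": ["the"], "dog": ["the"]}
for dep, head, rel in generate("sees", fillers):
    print(f"{head} --{rel}--> {dep}")
```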

Variants and Extensions

One prominent variant of dependency grammar is Link Grammar, introduced by Sleator and Temperley, which represents syntactic relations as bidirectional links between words while enforcing a no-crossing constraint to ensure planarity in the resulting structures. This approach differs from traditional dependency grammar by allowing links to connect words without strict head-directionality, facilitating the modeling of complex phenomena like coordination and discontinuous constituents through typed links that attach words to their left or right, such as determiners to nouns or subjects to verbs. Another key variant in computational applications is the arc-standard transition system, a transition-based framework for dependency parsing that builds dependency trees incrementally using a stack and buffer with three operations: shift, left-arc, and right-arc. Developed by Nivre in 2004, this system assumes projectivity and enables efficient, deterministic parsing suitable for real-time tasks.

Extensions of dependency grammar often integrate with other formalisms to address multilevel linguistic analysis. For instance, hybrids with Lexical Functional Grammar (LFG) map dependency structures onto LFG's functional structures (f-structures), combining dependency relations for syntactic heads with attribute-value matrices for functional roles like subject and object. Such integrations, as explored in conversions from LFG treebanks to dependency formats, enhance cross-framework compatibility while preserving LFG's parallelism between constituent and functional projections. Integrations with construction grammar treat constructions as dependency catenae—continuous or discontinuous strings of words linked by dependencies—allowing dependency grammar to incorporate construction-specific meanings and idiomatic patterns without relying solely on lexical rules. This synthesis, proposed by Osborne and Groß, bridges the gap between form-meaning pairings in construction grammar and the relational focus of dependency grammar, enabling analyses of non-compositional elements like phrasal verbs.

In modern developments, Universal Dependencies (UD) serves as a typological standard for dependency annotation, standardizing relation labels across languages to support multilingual parsing and treebank development. As of the v2.16 release in May 2025, UD includes 319 treebanks covering 179 languages, with the v2.17 release on November 15, 2025, continuing to expand a framework that addresses typological variations in dependency patterns.
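The arc-standard transition system described above can be illustrated with a minimal sketch in which the transition sequence is supplied by hand; in a real parser the next transition would be predicted by a classifier over the stack and buffer configuration.

```python
# Minimal arc-standard sketch (after Nivre 2004): a stack and buffer are
# manipulated by SHIFT, LEFT-ARC, and RIGHT-ARC transitions. LEFT-ARC makes the
# second stack item a dependent of the top item; RIGHT-ARC does the reverse.

def arc_standard(words, transitions):
    stack, buffer, arcs = [], list(range(len(words))), []
    for t in transitions:
        if t == "SHIFT":
            stack.append(buffer.pop(0))
        elif t == "LEFT-ARC":                 # second item depends on the top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif t == "RIGHT-ARC":                # top item depends on the second
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return [(words[h], words[d]) for h, d in arcs]

words = ["The", "cat", "sleeps"]
seq = ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "LEFT-ARC"]
print(arc_standard(words, seq))   # [('cat', 'The'), ('sleeps', 'cat')]
```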

Representation Methods

Dependency Trees and Graphs

In dependency grammar, sentences are represented as rooted trees or graphs in which words serve as nodes and directed arcs connect heads to their dependents. The root node, typically the main verb or a designated artificial root, has no incoming arc, while every other word has exactly one incoming arc from its head, ensuring a tree structure without cycles. This directed structure captures head-dependent relations, where the head governs the dependent syntactically.

From a graph-theoretic perspective, dependency representations form connected, acyclic directed graphs, specifically rooted trees, with a unique path from the root to each node. This acyclicity prevents loops, maintaining a strict hierarchy, while connectedness ensures all words are linked within a single structure. Non-projective graphs extend this by allowing arcs that cross in linear order, accommodating languages with freer word orders, though projective graphs maintain non-crossing arcs for simpler parsing.

Graphically, these structures are depicted with words aligned horizontally in sentence order and directed arcs drawn upward as curves or lines above the baseline, running from the head to the dependent. This layout facilitates visualization of projectivity, since non-crossing arcs stack neatly without intersections, emphasizing the planar nature of the tree. For example, in the sentence "She saw the man," the root "saw" governs "She" as subject and "man" as object, while "the" depends on "man" as determiner. This convention highlights the dependency hierarchy without phrasal intermediaries.

Tools like DepViz provide interactive visualization of these structures, rendering nodes with color-coded parts of speech and clickable arcs to explore subtrees. Such software aids in educational and analytical tasks by generating dynamic graphs from parsed sentences. Additionally, treebanks from the Universal Dependencies project support standardized annotation and visualization across languages.
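A plain-text approximation of the arcs-above-the-words convention can be produced for the "She saw the man" example without any visualization tool; the rendering below is a simple illustrative printout, not the output of DepViz or any other software mentioned above.

```python
# Textual rendering of the "She saw the man" example: words in sentence order,
# each arc printed from head to dependent with its relation label.

words = ["ROOT", "She", "saw", "the", "man"]          # position 0 is a notional root
arcs = [(2, 1, "nsubj"), (0, 2, "root"), (4, 3, "det"), (2, 4, "obj")]

print("  ".join(words[1:]))
for head, dep, rel in sorted(arcs, key=lambda a: a[1]):   # order by dependent position
    print(f"{words[head]:>5} --{rel}--> {words[dep]}")
```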

Labeling and Annotation Schemes

In dependency grammar, labeling schemes assign grammatical relations to the arcs connecting heads and dependents in dependency trees, enabling the encoding of syntactic roles such as subject, object, and modifier. One prominent system is provided by Universal Dependencies (UD), a multilingual framework that standardizes 37 core dependency labels to promote cross-linguistic consistency while allowing language-specific subtypes. Common UD labels include nsubj for nominal subjects, obj for direct objects, det for determiners, and advmod for adverbial modifiers.

Conversions from phrase structure annotations, such as those in the Penn Treebank (PTB), to dependency labels often involve rule-based pipelines that map constituent functions to UD relations, achieving labeled attachment accuracies above 99% in optimized cases with additional annotations like entity types. These conversions prioritize content words as heads and reassign function words (e.g., determiners) as flat dependents to align with UD's lexicalist principles.

Annotation guidelines in UD enforce a single head-per-word rule, where every non-root word depends on exactly one head, forming a tree with a notional root node at the top. Label consistency is maintained through universal categories that minimize variation across languages, though subtypes (e.g., nsubj:pass for passive subjects) accommodate differences like case marking or voice. For instance, in the English sentence "She eats an apple," the dependency arc from "eats" (head) to "apple" (dependent) is labeled obj to indicate a direct object relation.

Challenges in these schemes include label proliferation, where an excess of fine-grained subtypes can lead to data sparsity and complicate parser training, as noted in annotations for low-resource languages. Inter-annotator agreement for dependency labels in UD corpora typically reaches agreement values around 0.8, varying by language and expertise level, with higher rates (e.g., 0.92) achieved post-adjudication in expert settings.
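UD-labeled trees are distributed in the CoNLL-U format, a tab-separated layout with ten columns per word (ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC). The sketch below reads the head and label columns for the "She eats an apple" example; the morphological columns are left underspecified ("_") purely for brevity.

```python
# Hedged sketch: extracting (id, form, head, deprel) from a CoNLL-U sentence.

CONLLU = """\
# text = She eats an apple
1\tShe\tshe\tPRON\t_\t_\t2\tnsubj\t_\t_
2\teats\teat\tVERB\t_\t_\t0\troot\t_\t_
3\tan\ta\tDET\t_\t_\t4\tdet\t_\t_
4\tapple\tapple\tNOUN\t_\t_\t2\tobj\t_\t_
"""

def read_arcs(conllu):
    arcs = []
    for line in conllu.splitlines():
        if not line or line.startswith("#"):
            continue                               # skip comments and blank lines
        cols = line.split("\t")
        if "-" in cols[0] or "." in cols[0]:
            continue                               # skip multiword and empty tokens
        arcs.append((int(cols[0]), cols[1], int(cols[6]), cols[7]))
    return arcs

for wid, form, head, deprel in read_arcs(CONLLU):
    print(f"{form}: head={head}, relation={deprel}")
```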

Types of Dependencies

Syntactic Dependencies

Syntactic dependencies form the core of dependency grammar by establishing directed relations between words based exclusively on grammatical structure, linking a head (governor) to its dependent (subordinate) without reference to meaning. These relations capture grammatical structure in sentences, where each word except the root depends on exactly one other word, forming a tree that reflects the hierarchical organization of syntactic relations. Pioneered by Lucien Tesnière in his 1959 work Éléments de syntaxe structurale, syntactic dependencies emphasize the verb as the central node, with other elements attaching via binary connections like subject-verb or modifier-head.

In modern formalizations such as Universal Dependencies (UD), syntactic relations are standardized into a typology of 37 universal labels, focusing on structural roles across languages. Core nominal relations include determiners (det), where a determiner modifies a nominal head (e.g., "the" depends on "book" in "the book"), and possessives (nmod:poss), linking a possessor to the possessed noun (e.g., "John's" depends on "car" in "John's car"). Verbal relations encompass nominal subjects (nsubj), connecting a noun phrase to the verb that subcategorizes for it (e.g., "birds" depends on "sing" in "Birds sing"), clausal subjects (csubj), attaching subordinate clauses to predicates (e.g., "that it rains" depends on "likely" in "That it rains is likely"), and auxiliaries (aux), where helping verbs support main verbs (e.g., "are" depends on "flying" in "Birds are flying"). Adjectival relations feature adjectival modifiers (amod), adjectives attaching to nouns (e.g., "blue" depends on "sky" in "blue sky"), and clausal modifiers of nouns (acl), relative clauses linking to head nouns (e.g., "who sings" depends on "bird" in "the bird who sings"). These relations prioritize grammatical function over linear position or semantics.

An illustrative example is the sentence "Birds sing loudly": here, "sing" serves as the root, with "birds" as its nsubj dependent (subject relation) and "loudly" as an adverbial modifier (advmod) dependent on "sing," forming a simple projective tree that highlights pure syntactic linkage. Such dependencies underpin the representation of sentence structure in dependency grammar, enabling parsers to reconstruct grammatical hierarchies efficiently. Analysis of UD treebanks for English reveals that these syntactic dependencies yield projective structures—where subtrees do not cross—in approximately 96% of sentences across 14 corpora comprising over 32,000 examples, underscoring their prevalence in head-initial languages like English.

Semantic and Functional Dependencies

Semantic dependencies in dependency grammar extend beyond structural relations to capture meaning-based connections between predicates and their arguments, often incorporating thematic or theta roles such as agent, patient, theme, and recipient. These roles, originally formalized in case grammar by Fillmore (1968) and elaborated as proto-roles by Dowty (1991), represent the semantic contributions of dependents to the overall proposition, where the head (typically a verb) assigns roles to its arguments based on their interpretive function. In frameworks like Meaning-Text Theory, semantic dependencies form a distinct layer above syntactic ones, linking words via logical predicates and arguments, as in Mel'čuk's (1988) multi-stratal model.

Predicate-argument structures, as in PropBank, provide a practical encoding of semantic roles within dependency-based analyses, annotating verbs with numbered argument roles (e.g., A0 for prototypical agent, A1 for patient or theme) that align with dependency trees. This allows dependency parsers to incorporate semantic role labeling (SRL) using features like dependency paths between predicates and argument candidates, enabling global inference to resolve argument assignments consistently. For instance, semantic dependencies facilitate cross-linguistic comparisons by abstracting from surface syntax to core event structures, as seen in resources like the Proposition Bank.

Functional dependencies, in contrast, emphasize grammatical roles tied to heads, such as subject or object, which govern agreement, case marking, and syntactic behavior. These relations, rooted in Tesnière's (1959) foundational work on dependency and junction, treat function words (e.g., determiners, prepositions) as dependents that specify the grammatical function of content words, often directionally from head to dependent. In Universal Dependencies, functional labels like nsubj (nominal subject) or obj (direct object) encode these roles, with case features (e.g., nominative for subjects) annotated alongside the arcs to reflect morphological dependencies. This layer bridges syntax and semantics by ensuring that case roles align with theta assignments, as in head-final languages where accusative markers signal patienthood.

A representative example is the sentence "John gave Mary a book," where the verb gave serves as the semantic head with dependencies: gave → John (agent/A0, nominative subject), gave → Mary (recipient/A2, dative object), and gave → a book (theme/A1, accusative object). Here, semantic dependencies highlight the event's participants via PropBank roles, while functional dependencies specify grammatical cases and positions relative to the head.

Multilayer dependency grammars, such as those in ParDeepBank, explicitly stack semantic layers above syntactic ones to represent these relations, using Minimal Recursion Semantics (MRS) to derive logical forms from dependency trees. This approach, applied to parallel corpora in English, Portuguese, and Bulgarian, aligns predicate-argument structures across languages, achieving high coverage (e.g., 82% for English sentences) by propagating semantic heads from syntactic dependencies. Such frameworks address limitations of single-layer models by allowing independent resolution of semantic and functional ambiguities.
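The two layers for "John gave Mary a book" can be written out side by side. The sketch below is a hand-written illustration mirroring the example above; the role assignments are stipulated rather than produced by a semantic role labeling system.

```python
# Illustrative two-layer encoding of "John gave Mary a book": the syntactic
# layer uses UD-style functional relations, the semantic layer PropBank-style
# numbered arguments attached to the predicate "gave".

syntactic = {          # dependent -> (head, functional relation)
    "John": ("gave", "nsubj"),
    "Mary": ("gave", "iobj"),
    "book": ("gave", "obj"),
    "a":    ("book", "det"),
}

semantic = {           # dependent of "gave" -> PropBank-style role
    "John": "A0 (agent)",
    "Mary": "A2 (recipient)",
    "book": "A1 (theme)",
}

for word, (head, rel) in syntactic.items():
    role = semantic.get(word, "-")
    print(f"{head} -> {word}: syntax={rel}, semantics={role}")
```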

Linearization and Non-Projectivity

Projectivity in Dependency Structures

In dependency grammar, projectivity is a fundamental property that constrains the structure of dependency trees to align with the surface linear order of words in a sentence. A dependency tree is projective if, for every word, the set of words in its subtree forms a contiguous substring of the sentence, ensuring that no dependency arcs cross when the tree is projected above the word sequence. This property simplifies parsing by guaranteeing that subtrees correspond to intervals in the linear order, avoiding discontinuities that complicate computational models.

Formally, projectivity can be tested by examining pairs of dependency arcs. For any two arcs, from head h_1 to dependent d_1 and from head h_2 to dependent d_2, the spans [min(pos(h_1), pos(d_1)), max(pos(h_1), pos(d_1))] and [min(pos(h_2), pos(d_2)), max(pos(h_2), pos(d_2))] must either be disjoint or have one fully contained within the other, without partial interleaving of words. This condition ensures that all dependencies respect subtree contiguity, with modifiers and their subconstituents remaining adjacent in the sentence string.

A classic illustration of projectivity appears in the English sentence "I saw the man with the telescope," where one syntactic reading attaches the prepositional phrase "with the telescope" to the verb "saw," forming a contiguous subtree that includes "saw," "the," "man," and "with the telescope" as an interval from position 2 to position 7. This projective structure maintains adjacency within the verb's subtree, contrasting with attachments that would violate projectivity.

Empirical studies of projectivity in annotated corpora reveal language-specific patterns. In head-initial languages like English, approximately 95% of sentences in Universal Dependencies treebanks exhibit projective structures, reflecting a relatively rigid word order that aligns dependencies with linear contiguity. Languages with freer word order show lower rates, with only about 89% of sentences being projective in some treebanks, because greater flexibility in constituent placement more frequently induces crossing dependencies.
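The pairwise span test described above translates almost directly into code. The sketch below adds one detail the prose leaves implicit: arcs that share a word (for example, two dependents of the same head) never cross, so such pairs are skipped. The arc list for "I saw the man with the telescope" assumes the verb attachment of the prepositional phrase, as in the example.

```python
# Pairwise projectivity test: a tree is projective if, for every pair of arcs
# with no word in common, their spans are either disjoint or nested.

from itertools import combinations

def is_projective(arcs):
    for (h1, d1), (h2, d2) in combinations(arcs, 2):
        if {h1, d1} & {h2, d2}:
            continue                      # arcs sharing a word cannot cross
        a1, b1 = min(h1, d1), max(h1, d1)
        a2, b2 = min(h2, d2), max(h2, d2)
        disjoint = b1 < a2 or b2 < a1
        nested = (a1 < a2 and b2 < b1) or (a2 < a1 and b1 < b2)
        if not (disjoint or nested):
            return False                  # partially interleaved spans: crossing arcs
    return True

# "I saw the man with the telescope", PP attached to the verb "saw" (position 2).
arcs = [(2, 1), (2, 4), (4, 3), (2, 5), (5, 7), (7, 6)]
print(is_projective(arcs))            # True
print(is_projective([(1, 3), (2, 4)]))  # False: the two arcs cross
```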

Handling Discontinuities and Word Order

Dependency grammars address discontinuities in syntactic structures, such as those resulting from extraposition and scrambling, by permitting non-projective structures in which dependents appear non-contiguously relative to their heads. These phenomena disrupt the contiguous spans typical of projective trees, but dependency grammar models them through crossing dependencies or transformations that preserve the underlying head-dependent relations while accommodating the observed linear disruptions. For instance, extraposition in English with intervening material, as in "I saw a man yesterday who I know," can be represented with a non-adjacent arc from the head noun "man" to the relative clause "who I know," where "yesterday" (attached to "saw") intervenes, causing a crossing dependency and avoiding the need for empty categories or movement rules.

Scrambling, common in languages like German, exemplifies how dependency grammar captures flexible constituent reordering without altering the hierarchical structure. While simple cases like "Das Buch las ich" ("The book read I") remain projective, scrambling often contributes to non-projectivity when combined with other phenomena such as extraposed relative clauses or intervening elements, resulting in crossing dependencies. For example, in sentences where a relative clause is dislocated past the verb, such as "Obendrein hat Connectix ein Verfahren eingebaut, das eingelegte PlayStation-CDs erkennt" ("Moreover, Connectix has built in a procedure that recognizes inserted PlayStation CDs"), the clausal dependency from "Verfahren" to "das ..." crosses the main verbal arc, reflecting German's relatively free constituent order. This approach contrasts with phrase structure grammars, which often require additional mechanisms like adjunction or traces to handle such reorderings, whereas dependency grammar directly encodes the dependency while treating the discontinuity as a linear variation.

A fundamental feature of dependency grammar is its separation of the hierarchical dependency relations from linear precedence, enabling the same dependency structure to support varying word orders across or within a language. Unlike phrase structure grammars, where constituency imposes strict ordering constraints, dependency grammar linearizes trees via independent ordering principles, such as head-initial or head-final rules applied to subtrees. This decoupling facilitates the analysis of free-word-order languages, where precedence is determined after dependency assignment rather than being integral to the tree.

In computational implementations, handling these discontinuities and word orders has evolved through extensions like pseudo-projective parsing in tools such as MaltParser, which transforms non-projective structures into projective ones for efficient arc-eager processing. Originally limited to projective cases in early deterministic algorithms, MaltParser's handling of non-projectivity uses encodings in the arc labels to recover crossing dependencies after transition-based parsing. More recent graph-based models, leveraging maximum spanning tree algorithms, directly optimize non-projective parses and improve accuracy for discontinuous structures, achieving higher unlabeled attachment scores on datasets with frequent non-projectivity.
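The core idea behind the pseudo-projective transformation is "lifting": a dependent whose arc is non-projective is reattached to its head's head until the tree becomes projective, and the original attachment is recorded so it can be restored after parsing. The sketch below shows only the lifting step on a toy tree of bare word indices; the label-encoding used to undo the transformation is omitted, so this is a simplified illustration rather than the full scheme.

```python
# Simplified lifting sketch: reattach non-projective dependents to their
# grandparent until every arc is projective. heads maps word position -> head
# position, with 0 as the artificial root.

def arc_is_projective(heads, dep):
    """True if every word between dep and its head descends from that head."""
    head = heads[dep]
    lo, hi = sorted((head, dep))
    for w in range(lo + 1, hi):
        cur = w
        while cur not in (0, head):
            cur = heads[cur]
        if cur != head:
            return False
    return True

def pseudo_projectivize(heads):
    heads = dict(heads)
    changed = True
    while changed:
        changed = False
        for dep in heads:
            if heads[dep] != 0 and not arc_is_projective(heads, dep):
                heads[dep] = heads[heads[dep]]   # lift to the grandparent
                changed = True
    return heads

# Toy non-projective tree: the arc between words 2 and 5 crosses the arc 4-8.
nonproj = {1: 2, 2: 4, 3: 4, 4: 0, 5: 2, 6: 7, 7: 5, 8: 4}
print(pseudo_projectivize(nonproj))   # word 5 is lifted from head 2 to head 4
```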

Syntactic Functions

Argument Structure and Valency

In dependency grammar, valency denotes the capacity of a lexical head, typically a verb, to govern a specific number and type of syntactic dependents, analogous to the bonding capacity of an atom in chemistry. This concept, introduced by Lucien Tesnière, underscores that verbs are the central elements of clauses, attracting obligatory complements known as actants to saturate their valency slots. For instance, transitive verbs exhibit a valency of at least two, requiring both a subject and an object, while intransitive verbs have a valency of one, needing only a subject.

Argument structure in dependency grammar specifies the configuration of these valency slots, distinguishing between core arguments, which are obligatory dependents essential to the predicate's meaning, and peripheral arguments, which are optional adjuncts that add circumstantial information without fulfilling the head's minimal requirements. Valency frames capture the permissible combinations of these arguments for a given head, encoding the syntactic relations it licenses, such as subject, direct object, or indirect object. Core arguments directly realize the predicate's semantic roles as actants, whereas peripheral ones, like adverbials, attach more loosely and can often be omitted without rendering the structure incomplete.

A classic example is the English verb give, which has a trivalent valency frame requiring three core arguments: the giver (subject), the receiver (indirect object), and the gift (direct object), as in "She gave him the book." In contrast, intransitive verbs like run exhibit underspecification, with a monovalent frame that mandates only a subject, as in "She runs," leaving no slots for additional core objects. This underspecification highlights how valency frames adapt to individual predicates, ensuring that only the required dependents are projected in the dependency tree.

Formally, the valency frame of a head h can be represented as a set V(h) = {d_1 : rel_1, d_2 : rel_2, ...}, where each d_i denotes a dependent slot and rel_i specifies the syntactic relation it must bear to h, such as nominal subject or oblique object. In processes of lexical derivation, such as forming deverbal nouns, valency inheritance allows the derived form to retain or modify the original frame's requirements, propagating argument slots across morphological categories while preserving core-peripheral distinctions. This mechanism ensures consistency in how predicates license dependents within the broader system of relations in the grammar.
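A valency frame can be checked mechanically: a head's core slots must each be filled exactly once, while peripheral dependents are ignored. The frame entries below are illustrative rather than drawn from an actual valency lexicon.

```python
# Sketch of a valency check: V(h) lists the core relations a head requires;
# a set of dependents satisfies the frame if each required slot is filled once.
# Adjuncts such as advmod are not counted against the frame.

VALENCY = {
    "give": {"nsubj", "obj", "iobj"},   # trivalent: giver, gift, receiver
    "run":  {"nsubj"},                  # monovalent: only a subject
}

def frame_satisfied(head, dependents):
    """dependents: list of (word, relation) pairs attached to the head."""
    core = [rel for _, rel in dependents if rel in VALENCY[head]]
    return sorted(core) == sorted(VALENCY[head])

# "She gave him the book" plus an optional adverb, which does not count.
deps = [("She", "nsubj"), ("him", "iobj"), ("book", "obj"), ("yesterday", "advmod")]
print(frame_satisfied("give", deps))                 # True
print(frame_satisfied("run", [("She", "nsubj")]))    # True
print(frame_satisfied("give", [("She", "nsubj")]))   # False: obj and iobj unfilled
```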

Dependency Roles and Relations

In dependency grammar, dependents are categorized into grammatical and functional roles that specify their syntactic and semantic connections to the head word. Tesnière's foundational framework distinguishes between actants, which are obligatory core arguments such as subjects and objects that fill essential slots in the verb's valency, and circonstants, optional modifiers that add circumstantial details like manner or time. These roles emphasize the verb as the central governor, with dependents linking directly to it or to other heads to form hierarchical structures. Building on valency considerations, such roles specify how words fulfill relational functions within the sentence.

Common roles include the nominal subject (nsubj), defined as the syntactic subject and proto-agent of a clause, typically a noun or pronoun that performs the action; the direct object (obj), the patient or theme affected by the action; clausal complements (ccomp), finite clauses that serve as arguments of the head verb, often conveying reported speech or embedded propositions; and modifiers such as adverbial modifiers (advmod) for adverbs specifying time or degree, or adjectival modifiers (amod) for attributive adjectives describing nouns. For instance, in the phrase "the book that I read," the relative pronoun "that" assumes the obj role of the embedded verb "read," linking the relative clause to the head noun "book" while maintaining its patient function within the subordinate structure.

Dependency relations extend to non-argument links, including coordination (conj), a relation linking parallel elements like conjoined verbs or nouns, where the first conjunct is conventionally treated as the head; apposition (appos), which equates two noun phrases in a flat structure, such as proper names sharing reference; and flat dependencies for multi-word units, treating idioms or compounds as layered dependents without deep internal structure to preserve semantic unity. These relations address symmetric or associative phenomena that challenge strict head-dependent asymmetry. Cross-linguistically, role assignment shows flexibility, especially in polysynthetic languages, where verb complexes incorporate multiple morphemes as dependents, allowing roles such as subject or object to be encoded via derivational affixes rather than separate words, necessitating morpheme-level annotations to capture variation in syntactic elaboration.

Applications in Linguistics and Computation

Theoretical and Cross-Linguistic Uses

Dependency grammar has been employed in linguistic typology to test universals related to head-directionality, which classifies languages based on whether heads precede or follow their dependents in linear order. By analyzing dependency directions in treebanks across multiple languages, researchers have demonstrated that languages form a continuum from predominantly head-initial to head-final patterns, providing empirical support for typological universals in word order. This approach refines traditional binary classifications by quantifying the proportion of head-initial versus head-final dependencies, revealing mixed tendencies in all languages studied.

In formal semantics, dependency grammar interfaces with compositional semantic frameworks through adaptations like Glue Semantics, which map dependency structures to logical forms while preserving lexical integrity and handling non-binary branching. This interface enables analyses of phenomena such as quantifier scope ambiguities, control infinitives, and relative clauses by composing meanings directly from dependency relations. Such integrations bridge syntax and semantics without relying on intermediate phrase structures, facilitating precise semantic derivations from dependency trees.

Cross-linguistically, dependency grammar supports typological comparisons by aligning dependency relations with features in resources like the World Atlas of Language Structures (WALS), such as subject-verb and adjective-noun orders. Dependency-sensitive metrics, applied to WALS data, measure typological distances while accounting for predictable feature dependencies, enhancing the accuracy of comparisons across morphosyntactic and phonological traits. For agglutinative languages, which feature extensive suffixation for grammatical encoding, dependency grammar offers advantages in typology by focusing on syntactic relations between words rather than internal segmentation, allowing clearer cross-language comparisons of dependency patterns in languages like Turkish.

A key example of dependency grammar's universal applicability is the principle of dependency distance minimization, whereby syntactically linked words tend to be positioned close together in sentences to reduce processing effort, as evidenced by analyses of dependency treebanks. This minimization holds across languages, supporting claims of structural universals in human syntax. Recent extensions of Universal Dependencies, a dependency-based annotation framework, have advanced the documentation of endangered languages as of 2025, such as the development of treebanks for Suansu, an endangered Tibeto-Burman language. These efforts adapt UD schemas to handle language-specific morphosyntactic deviations, aiding preservation and typological insights for under-resourced varieties.
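The dependency-distance measure underlying the minimization claim is simply the absolute difference between the linear positions of head and dependent, averaged over the arcs of a tree. A minimal sketch, using the same head-dictionary representation as earlier examples:

```python
# Sketch of mean dependency distance: average |position(dependent) - position(head)|
# over all non-root arcs of a tree (heads maps word position -> head position).

def mean_dependency_distance(heads):
    distances = [abs(dep - head) for dep, head in heads.items() if head != 0]
    return sum(distances) / len(distances)

# "The cat sleeps": The -> cat (distance 1), cat -> sleeps (distance 1).
print(mean_dependency_distance({1: 2, 2: 3, 3: 0}))   # 1.0
```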

Computational Parsing and NLP Integration

Dependency grammar has been central to computational parsing since the early 2000s, with algorithms designed to efficiently construct dependency trees from sentences. Transition-based parsers, such as the arc-standard system, process input words sequentially using a stack and buffer to build projective trees in linear time, O(n), by applying transitions like shift, left-arc, and right-arc. This approach, introduced by Nivre, enables greedy or beam-search decoding, making it suitable for real-time applications despite potential error propagation in long-range dependencies. In contrast, graph-based parsers model dependency structures as weighted directed graphs and find the maximum spanning tree (MST) to yield the highest-scoring tree, accommodating both projective and non-projective attachments via the Chu-Liu-Edmonds algorithm. This method, as formalized by McDonald et al., operates in O(n^3) time but can be approximated to O(n^2) for practical efficiency, excelling in global optimization over arc scores learned from training data. Graph-based approaches handle non-projectivity more naturally than transition-based ones, though hybrid systems combine elements of both for improved accuracy.

Integration of dependency parsing into natural language processing (NLP) pipelines often serves as preprocessing for tasks like machine translation (MT), where syntactic dependencies inform alignment and reordering. For instance, dependency-aware models enhance neural MT by incorporating syntactic knowledge to better capture argument structure, leading to improved translation quality in low-resource languages. Hybrid systems also merge dependency parsing with part-of-speech (POS) tagging, using joint models to refine both annotations iteratively, since POS errors directly affect dependency accuracy.

Prominent tools implement these algorithms with high performance; the Stanford Parser employs neural network-based dependency parsing to output typed dependencies, achieving robust results across languages. Similarly, UDPipe provides an end-to-end trainable pipeline for tokenization, tagging, lemmatization, and parsing in CoNLL-U format, supporting over 100 languages via Universal Dependencies (UD). On the English Penn Treebank (PTB), state-of-the-art models reach unlabeled attachment scores (UAS) of approximately 96%, demonstrating the maturity of these systems for English while highlighting remaining challenges in multilingual settings.

Recent advances as of 2025 leverage transformers for neural dependency parsing, with self-attentive biaffine models outperforming prior LSTM-based architectures by encoding contextual representations directly into arc prediction. UDPipe's evolution incorporates transformer-based embeddings for enhanced morphological analysis and parsing, boosting multilingual robustness. Following UD version 2.17's release on November 15, 2025, which expanded treebanks to 339 across 186 languages, parsers exhibit greater cross-linguistic consistency due to unified annotation guidelines.
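In application code, the output of such parsers is typically consumed as a list of head-relation pairs per token. The sketch below shows this with spaCy, assuming the library and its small English model (en_core_web_sm) are installed; other toolkits mentioned above, such as UDPipe or the Stanford Parser, expose comparable head and relation information in their own APIs.

```python
# Consuming a dependency parse from an off-the-shelf pipeline (spaCy shown here;
# requires: pip install spacy && python -m spacy download en_core_web_sm).

import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Birds sing loudly in the morning.")

for token in doc:
    # Each token carries its relation label and a pointer to its head token;
    # the root token points to itself with the label "ROOT".
    print(f"{token.text:>8} --{token.dep_}--> {token.head.text}")
```

Evaluating such output against a gold treebank amounts to comparing each token's predicted head index with the gold head, as in the UAS/LAS sketch shown earlier.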

    A syntactic dependency tree is projective if the yield of each node is a substring of the sentence—or equivalently, if no dependencies cross when drawn above ...
  58. [58]
    [PDF] An Efficient Algorithm for Projective Dependency Parsing
    First of all, it must be noted that projectivity is not a property of the dependency graph in itself but only in relation to the linear ordering of tokens in ...
  59. [59]
    [PDF] A Survey of Non-Projective Dependencies and a Novel Approach to ...
    This thesis surveys non-projective dependencies in English, German, and Czech, and creates a projectivization scheme to improve parsing performance.
  60. [60]
    Fast and Accurate Non-Projective Dependency Tree Linearization
    We propose a graph-based method to tackle the dependency tree linearization task. We formulate the task as a Traveling Salesman Problem (TSP), and use a ...
  61. [61]
    Valency - Brill Reference Works
    Valency as a concept was first fully developed within the framework of Dependency Grammar by Tesnière (1959). He compared the verb to an atom that has the ...
  62. [62]
    [PDF] Dependency Grammar
    – In PSG, phrase structure provides the scope for linear precedence, or word order domains. – In DG, we have no phrases, only lexical nodes ... • With ...
  63. [63]
    [PDF] Dependency Grammar and Valency Theory - SciSpace
    Tesniere's DG thus contains all elements of a modern phrase structure grammar: the concept of a phrase corresponds to the concept of a node. The structural ...
  64. [64]
  65. [65]
    Formal Semantics for Dependency Grammar - ACL Anthology
    In this paper, we provide an explicit interface to formal semantics for Dependency Grammar, based on Glue Semantics.
  66. [66]
    Morphology: General Principles - Universal Dependencies
    If a language is agglutinative, this is typically the form with no inflectional affixes; in fusional languages, the lemma is usually the result of a ...
  67. [67]
    Universal Dependencies for Suansu - ACL Anthology
    Aug 5, 2025 · This contribution presents the Naga-Suansu Universal Dependencies (UD) treebank, the first resource of this kind for Suansu, an endangered and ...
  68. [68]
    Non-Projective Dependency Parsing using Spanning Tree Algorithms
    Ryan McDonald, Fernando Pereira, Kiril Ribarov, and Jan Hajič. 2005. Non-Projective Dependency Parsing using Spanning Tree Algorithms. In Proceedings of Human ...
  69. [69]
    Software > Stanford Parser
    In version 3.5.0 (October 2014) we released a high-performance dependency parser powered by a neural network. The parser outputs typed dependency parses for ...About · Citing · Questions
  70. [70]
    UDPipe: Trainable Pipeline for Processing CoNLL-U Files ...
    UDPipe, a pipeline processing CoNLL-U-formatted files, performs tokenization, morphological analysis, part-of-speech tagging, lemmatization and dependency ...<|separator|>
  71. [71]
    Improved Dependency Parsing using Implicit Word Connections ...
    Furthermore, by combining with a pre-trained language model, our model gets state-of-the-art performance on the English PTB dataset, achieving 96.35% UAS and ...
  72. [72]
    [PDF] Self-attentive Biaffine Dependency Parsing - IJCAI
    The current state-of-the-art dependency parsing approaches employ BiLSTMs to encode input sentences. Motivated by the success of the transformer-based ...
  73. [73]
    [PDF] Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing ...
    Aug 17, 2018 · This volume contains papers describing systems submitted to the CoNLL 2018 Shared Task: Multilingual. Parsing from Raw Text to Universal ...