
Lexical functional grammar

Lexical Functional Grammar (LFG) is a constraint-based, non-derivational theory of linguistic structure that models syntax, morphology, and semantics through parallel levels of representation, primarily constituent structure (c-structure) for surface syntactic form and functional structure (f-structure) for grammatical functions and features. Developed in the late 1970s by Joan Bresnan and Ronald M. Kaplan as an alternative to transformational grammar, LFG emphasizes lexical specification of syntactic properties and avoids movement rules or deep structures, instead using unification of attribute-value matrices to link form and function. Its core architecture separates the organization of words and phrases in c-structure—typically represented as phrase structure trees generated by context-free rules—from the abstract encoding of predicate-argument relations, tense, agreement, and other features in f-structure. The framework originated from Bresnan's critique of transformational approaches, particularly in her 1978 work on psychological plausibility in syntax, and was formalized in Kaplan and Bresnan's 1982 paper, which introduced LFG as a mathematically tractable system compatible with computational processing. Over time, LFG has expanded to include additional projection levels, such as argument structure (a-structure) for thematic roles and prosodic structure (p-structure) for phonological encoding, while maintaining its declarative, modular design. Key well-formedness conditions ensure structural integrity: the Uniqueness Condition prohibits conflicting values in f-structures, Completeness requires all grammatically governed functions to be realized, and Coherence mandates that only those functions appear. Unlike transformational theories, which rely on derivations from underlying representations, LFG employs parallel correspondence via projection functions (e.g., φ mapping c-structure nodes to f-structures) and Lexical Mapping Theory to handle phenomena like passives and unergatives without transformations. This lexicalist approach, in which much of the grammar's complexity is encoded in lexical entries and rules, supports functional uncertainty for long-distance dependencies and enables analyses of non-configurational languages, such as Warlpiri, with flexible word order. LFG's monotonicity and economy-of-expression principles promote surface-oriented explanations, aligning with cross-linguistic typology and integrating with semantics via frameworks like Glue Semantics. In practice, LFG has influenced natural language processing through tools like the Xerox Linguistic Environment (XLE) for parsing and generation, and it accommodates diverse languages, including those with complex predicates, pro-drop, and voice systems in Austronesian languages. Its emphasis on universality in f-structure alongside language-specific c-structure variation makes it a powerful tool for theoretical and applied research, as detailed in major works like Dalrymple's edited 2023 Handbook of Lexical Functional Grammar.

Introduction

Overview

Lexical functional grammar (LFG) is a constraint-based, non-transformational theory of syntax in linguistics that models grammatical structure through multiple parallel levels of representation, allowing lexical information to be integrated with syntactic rules without sequential derivations. This approach emphasizes declarative constraints over procedural transformations, enabling a modular view of grammar in which syntactic, morphological, and semantic components interact independently yet cohesively. Central to LFG is the separation of surface syntactic form, which reflects observable phrase structure and linear order, from abstract grammatical functions, such as subject and object, that represent underlying relational properties of clauses. By distinguishing these dimensions, LFG captures how languages encode meaning and grammatical relations independently of their superficial arrangement, facilitating the treatment of phenomena like argument alternations and agreement. Developed in the late 1970s by Joan Bresnan and Ronald Kaplan, LFG was proposed as an alternative to transformational grammar, prioritizing lexical specification and parallel constraint satisfaction to model syntax. The theory's primary goals include accounting for cross-linguistic diversity, such as non-configurationality in languages with free word order, and explaining long-distance dependencies through functional equations rather than movement operations. This parallel architecture supports a typology-sensitive framework that accommodates variation while maintaining universal principles of grammatical organization.

Basic principles

Lexical functional grammar (LFG) is fundamentally lexicalist, positing that the lexicon serves as the primary repository for syntactic information, with complex words derived directly from lexical entries without post-lexical transformations that alter word forms. This principle of lexical integrity holds that morphologically complex words are atomic in syntactic representations, preventing syntactic rules from accessing or manipulating their internal morphological structure. As a result, all idiosyncrasies of inflection and derivation are encoded lexically, ensuring that syntax operates only on whole words as indivisible leaves in phrase structure trees. LFG adopts a declarative formalism, treating grammars as sets of declarative constraints rather than procedural rules that generate derivations step by step. This approach rejects the notion of deep structure and transformational movement operations, which are central to earlier generative models, in favor of direct mappings from lexical specifications to surface-oriented representations of sentence structure. Instead of deriving one level of representation from another through transformational operations, LFG posits that grammatical well-formedness is determined by the simultaneous satisfaction of constraints across multiple parallel levels of structure. A core tenet of LFG is the universality of grammatical functions, such as subject (SUBJ) and object (OBJ), which are treated as primitive relational categories independent of phrase structure configurations and applicable across diverse languages. These functions capture universal aspects of predicate-argument relations, allowing LFG to model syntactic phenomena without relying on language-specific constituent positions to define roles like subjecthood. For instance, grammatical functions provide a consistent framework for analyzing argument alternations that vary cross-linguistically, emphasizing their role in encoding syntactic relations directly from the lexicon. The projection architecture of LFG enables lexical entries to project information simultaneously to multiple representational levels, ensuring coherence through constraint-based mappings rather than sequential derivations. In this parallel system, lexical specifications drive the construction of surface forms while satisfying functional requirements, promoting both linguistic universality and computational efficiency in grammar implementation. This design underscores LFG's commitment to a modular yet integrated view of grammar, where lexical projections align diverse structural dimensions without intermediate abstract levels.

History and development

Origins in the 1970s

Lexical Functional Grammar (LFG) emerged in the mid-1970s through collaborative efforts at MIT and Xerox PARC, driven by linguists and computational researchers seeking alternatives to prevailing syntactic theories. Key figures Joan Bresnan, then at MIT, and Ronald M. Kaplan, at Xerox PARC after earlier work at Harvard, initiated the framework's development in response to the limitations of Noam Chomsky's Extended Standard Theory (EST), which emphasized complex transformational derivations that struggled to accommodate empirical observations from language use. This period marked a pivotal shift toward a grammar model that prioritized psychological realism and cross-linguistic applicability over abstract rule transformations. The initial motivations for LFG were rooted in psycholinguistic evidence indicating that human language processing involves parallel mechanisms rather than strictly serial derivations, challenging the EST's assumptions about mental representations of grammar. Researchers aimed to address these findings by designing a system that could handle diverse linguistic phenomena without invoking movement rules, particularly to explain cross-linguistic variations in word order, such as free or flexible arrangements in non-configurational languages. This focus on parallel processing and typological diversity sought to bridge theoretical linguistics with experimental data from comprehension and production studies, fostering a more unified account of syntactic relations. A foundational precursor to LFG was Bresnan's 1978 paper "A Realistic Transformational Grammar," which critiqued the derivational complexity and psychological implausibility of transformational rules, arguing that such mechanisms overburdened models of human processing and failed to align with psycholinguistic evidence. In its place, Bresnan proposed functional annotations as a direct way to encode grammatical relations like subject and object, emphasizing lexical specification over syntactic transformations to achieve greater efficiency and realism. This approach highlighted the lexicon's central role in determining syntactic behavior, setting the stage for LFG's non-derivational architecture. Kaplan's contributions in the mid-1970s emphasized computational feasibility, drawing on his expertise in psycholinguistics and computational linguistics to explore finite-state and related approximations for grammatical processing. At Xerox PARC, he investigated how constraint-based representations could enable efficient, implementable models of parsing, ensuring that theoretical innovations remained practical for machine processing and empirical testing. These efforts complemented Bresnan's linguistic insights, promoting a framework amenable to both human cognition and computational simulation. Early collaborations among Bresnan, Kaplan, and associates at these institutions integrated linguistic, psychological, and computational perspectives, while shifting away from the arc-based relational networks of emerging relational grammar. Relational grammar, developed concurrently by David Perlmutter, Paul Postal, and others, used arcs to represent grammatical relations changing across derivational strata, but LFG proponents moved toward a lexicalist architecture that encoded relations statically through functional specifications, avoiding multi-stratal derivations altogether. This evolution reflected a broader dissatisfaction with layered syntactic theories, favoring a unified, constraint-driven system for grammatical analysis.

Key publications and milestones

The foundational formalization of Lexical Functional Grammar (LFG) occurred with the publication of Kaplan and Bresnan's 1982 paper "Lexical-Functional Grammar: A Formal System for Grammatical Representation," which introduced the core architecture and constraint-based approach. The paper appeared in Joan Bresnan's edited volume The Mental Representation of Grammatical Relations (1982), a collection of key papers that articulated the theory's core principles and marked the shift from informal proposals to a rigorous framework. A pivotal advancement in the late 1980s was the introduction of functional uncertainty by Ronald M. Kaplan and Annie Zaenen in their 1989 paper "Long-Distance Dependencies, Constituent Structure, and Functional Uncertainty," enabling the theory to handle long-distance dependencies through constraints on functional structures without relying on transformations. In the 1990s, LFG gained traction in computational linguistics through grammar engineering projects. This period also saw the launch of the ParGram initiative in 1996, an international collaboration aimed at creating parallel grammars across typologically diverse languages using LFG to ensure consistent functional structure representations for cross-linguistic comparison and machine translation. The 2000s featured significant expansions in integrating LFG with semantics, highlighted by Mary Dalrymple's 1999 edited collection Semantics and Syntax in Lexical Functional Grammar: The Resource Logic Approach, which developed a glue semantics framework using linear logic to compose meanings from lexical resources. Tools such as the Xerox Linguistic Environment (XLE), developed at PARC, implement efficient chart-based parsing for LFG grammars and have been widely used in grammar engineering projects; in the 2010s, open-source alternatives broadened access further. Concurrently, the annual LFG conference series, inaugurated in 1996, has fostered ongoing research and community collaboration, with published proceedings documenting theoretical and applied advancements.

Theoretical foundations

Parallel architecture

Lexical functional grammar (LFG) employs a parallel architecture in which multiple levels of linguistic representation are projected simultaneously from lexical items, rather than derived sequentially through transformations. This model posits two primary autonomous dimensions: constituent structure (c-structure), which captures phrase structure and linear order; and functional structure (f-structure), which encodes grammatical functions such as subject and object along with morphosyntactic features. An additional dimension, argument structure (a-structure), specifies predicate-argument relations including thematic roles. These levels are generated in parallel from the lexicon, where lexical entries provide the core predicates and constraints that populate each level, ensuring that the grammar operates as a declarative system of constraints rather than a procedural one. Unlike serial derivation models, LFG's parallel projection avoids stepwise transformations between levels, treating them as independent yet interconnected through correspondence functions. For instance, the mapping function φ links c-structure nodes to f-structure elements in a many-to-one fashion, allowing a single f-structure to correspond to multiple possible c-structures without altering the functional relations. This autonomy enables the levels to obey their own principles—syntactic for c-structure, functional for f-structure, and semantic for a-structure—while remaining constrained by the correspondence functions to ensure coherence across representations. The formal definition of an LFG grammar thus views it as a specification of the set of well-formed structures at each level that satisfy the imposed constraints, defining grammaticality through simultaneous constraint satisfaction rather than a generative derivation from one level to another. The parallel architecture promotes flexibility by permitting mismatches between phrase structure and functional or argument relations, which is particularly advantageous for analyzing non-configurational languages where word order is free and constituency is flat, yet grammatical functions remain robustly encoded. In such languages, the same f-structure can align with diverse c-structures, avoiding the need to derive one from the other and allowing language-specific syntactic variation without impacting universal functional principles. For example, the English verb give projects a ditransitive a-structure with roles for agent, theme, and goal (e.g., PRED 'give<SUBJ, OBJ, OBLgoal>'), which maps to a configurational c-structure like "gave the book to the baby." In contrast, the same a-structure in a non-configurational language like Warlpiri can correspond to a flat c-structure with freer word order, such as discontinuous noun phrases, while preserving the functional relations. This flexibility highlights LFG's capacity to model cross-linguistic diversity through parallel constraints rather than uniform derivations.
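The many-to-one character of φ can be made concrete with a small sketch. The following Python fragment is purely illustrative (the dictionary-based f-structures and tuple-valued node addresses are assumptions of the sketch, not LFG notation) and shows two c-structures of different shape projecting onto a single f-structure:

# Toy sketch: two c-structures with different constituency and word order
# project, via phi, onto one and the same f-structure.

# Shared f-structure for "The girl gave the book to the baby."
f = {
    "PRED": "give<SUBJ, OBJ, OBLgoal>",
    "TENSE": "PAST",
    "SUBJ": {"PRED": "girl"},
    "OBJ": {"PRED": "book"},
    "OBLgoal": {"PRED": "baby"},
}

# Configurational (English-like) tree: phi keyed by tuple node addresses.
phi_english = {
    (0,): f["SUBJ"],       # [S [NP the girl] ...]
    (1, 1): f["OBJ"],      # [S ... [VP gave [NP the book] ...]]
    (1, 2): f["OBLgoal"],  # [S ... [VP ... [PP to the baby]]]
}

# Flat (Warlpiri-style) tree: same functions at different node addresses.
phi_flat = {
    (2,): f["SUBJ"],       # ergative-marked NP in clause-final position
    (0,): f["OBJ"],        # absolutive-marked NP in clause-initial position
}

# Both mappings target the very same functional objects:
assert phi_english[(0,)] is phi_flat[(2,)]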

Lexicalist approach

Lexical functional grammar (LFG) adopts a lexicalist approach by centralizing the bulk of grammatical information within the lexicon, thereby reducing the need for idiom-specific rules in the syntactic component. This design posits the lexicon as the primary repository for irregularities and language-specific idiosyncrasies, allowing syntactic rules to remain largely universal and productive, and ensuring that deviations from regular patterns are encoded at the lexical level rather than dispersed across construction-specific mechanisms. Central to this approach is the specification of subcategorization frames, argument linking, and morphological rules for each lexical entry, which captures the unique combinatorial properties of words without invoking phrasal idioms. For instance, verbs like "eat" are lexically defined with frames such as 〈SUBJ, OBJ〉, dictating their argument requirements and mappings to grammatical functions. Morphological processes, including inflection and derivation, are handled through lexical rules that operate within the lexicon, in keeping with lexical integrity and avoiding post-syntactic adjustments. This integration ensures that morphology remains modular and constrained by lexical integrity, prohibiting syntactic operations from intruding into morphological domains. LFG's commitment to lexicalism extends to the principle of no syntactic idioms, where apparent constructional irregularities, such as passives, are derived via lexical alternations rather than dedicated phrasal rules. Passivization, for example, is treated as a lexical process that adjusts argument linking without altering the core syntactic architecture. This lexical treatment of relation changes underscores the theory's emphasis on the idea that all relation-changing processes are lexical, promoting a uniform treatment of grammatical functions across languages. The lexicalist approach yields benefits in predictability for language acquisition, as learners can generalize from lexical specifications rather than memorizing construction-specific exceptions, and in computational efficiency for parsing, where constraint-based evaluation of lexical entries streamlines ambiguity resolution. A representative example involves causative verbs, which are lexically specified with adjusted argument structures and linking rules to grammatical functions, such as demoting the original subject while introducing a new external causer as subject; this handles cross-linguistic variation in causative formation without resorting to syntactic transformations.
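Because relation-changing processes like passivization are lexical in LFG, they can be pictured as operations over subcategorization frames. The sketch below is a simplified illustration under that assumption; the frame encoding and function names are the sketch's own, not a published implementation:

# A lexical entry pairs a predicate with its grammatical-function frame.
eat_active = {"PRED": "eat", "FRAME": ("SUBJ", "OBJ")}

def passivize(entry):
    """Lexical rule: demote SUBJ to an oblique agent and promote OBJ
    to SUBJ, yielding a new lexical entry; no syntactic movement."""
    frame = entry["FRAME"]
    if "OBJ" not in frame:
        raise ValueError("passivization requires a transitive frame")
    new_frame = tuple(
        "SUBJ" if gf == "OBJ" else "OBL_AG" if gf == "SUBJ" else gf
        for gf in frame)
    return {"PRED": entry["PRED"], "FRAME": new_frame, "VOICE": "PASSIVE"}

print(passivize(eat_active))
# {'PRED': 'eat', 'FRAME': ('OBL_AG', 'SUBJ'), 'VOICE': 'PASSIVE'}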

Core components

Constituent structure (c-structure)

In Lexical Functional Grammar, the constituent structure, or c-structure, represents the surface syntactic organization of a sentence through a phrase structure tree that captures linear precedence, dominance, and constituency relations among words and phrases. This tree adheres to context-free phrase structure rules and is typically binary-branching, though the framework permits multi-branching nodes to accommodate language-specific syntactic patterns. Each terminal node corresponds to a morphologically complete word, in accordance with the principle of lexical integrity. Nodes in the c-structure are labeled with syntactic categories, such as NP for noun phrase, VP for verb phrase, and S or IP for sentence or inflectional phrase. Additionally, nodes bear functional annotations using up-arrow (↑) and down-arrow (↓) symbols, which encode equations linking the c-structure to other levels of representation; ↓ denotes the functional structure (f-structure) of the current node, while ↑ refers to that of its mother node. For instance, in an English sentence, the subject NP might be annotated as (↑ SUBJ) = ↓, indicating that its f-structure fills the SUBJ function of the mother node's f-structure. The c-structure's design emphasizes flexibility to model typological variation without enforcing a universal template like strict X-bar configuration. For non-configurational languages, such as Warlpiri, flat or multi-branching trees are permitted, avoiding obligatory VP constituents and allowing adjuncts or arguments to attach directly to higher nodes. This language-particular approach enables the representation of diverse word orders and constituency patterns while maintaining a surface-true depiction of syntax. In the projection architecture, the c-structure provides the overt syntactic input to the correspondence functions that derive deeper levels of representation, anchoring how surface forms realize phonological and morphological properties. For example, the English subject-verb-object sentence "A linguist saw the proof" can be licensed by the annotated rule S → NP ((↑ SUBJ) = ↓) VP (↑ = ↓), with the VP expanding to V NP ((↑ OBJ) = ↓) and the verb's lexical entry contributing (↑ PRED) = 'see<SUBJ, OBJ>'. In non-configurational languages, free word order—where arguments can reorder, as in object-subject-verb sequences—is handled via flat c-structures like S → NP* V NP*, with annotations assigning grammatical functions (e.g., SUBJ, OBJ) independently of linear position to preserve semantic relations across permutations. These mappings from c-structure interface with f-structure to ensure grammatical coherence.
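How annotated rules give rise to an f-structure can be simulated in miniature. The following sketch assumes a toy tree encoding and a simplified annotation scheme (only (↑ GF) = ↓ and ↑ = ↓ are modeled); it is not XLE rule syntax:

# Toy instantiation of functional annotations on a fixed c-structure tree.
# "UP=DOWN" means the node shares its mother's f-structure; a function
# name like "SUBJ" means (UP SUBJ) = DOWN. Leaves carry lexical features.

def project(node, fstruct):
    """Recursively instantiate annotations, filling `fstruct` (UP)."""
    category, annotation, children = node
    if annotation == "UP=DOWN":
        own = fstruct                      # share the mother's f-structure
    else:                                  # e.g. "SUBJ" or "OBJ"
        own = fstruct.setdefault(annotation, {})
    if isinstance(children, dict):         # leaf: lexical contribution
        own.update(children)
    else:
        for child in children:
            project(child, own)
    return fstruct

tree = ("S", "UP=DOWN", [
    ("NP", "SUBJ", {"PRED": "linguist", "NUM": "SG"}),
    ("VP", "UP=DOWN", [
        ("V", "UP=DOWN", {"PRED": "see<SUBJ, OBJ>", "TENSE": "PAST"}),
        ("NP", "OBJ", {"PRED": "proof", "NUM": "SG"})])])

print(project(tree, {}))
# {'SUBJ': {'PRED': 'linguist', 'NUM': 'SG'}, 'PRED': 'see<SUBJ, OBJ>',
#  'TENSE': 'PAST', 'OBJ': {'PRED': 'proof', 'NUM': 'SG'}}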

Functional structure (f-structure)

In Lexical Functional Grammar (LFG), the functional structure, or f-structure, represents the abstract syntactic relations of a sentence as an attribute-value matrix (AVM), where attributes correspond to grammatical properties and functions, and values are atomic symbols, semantic forms, or subsidiary f-structures. This structure encodes key grammatical functions such as subject (SUBJ), object (OBJ), and complement (COMP), along with morphosyntactic features like tense, number, person, and the predicate (PRED). The f-structure organizes information hierarchically, allowing shared values across attributes to capture phenomena like agreement and coreference, forming a directed acyclic graph rather than a simple tree. The f-structure is subject to strict well-formedness conditions: the Uniqueness Condition prohibits conflicting values for any attribute; Completeness requires that every grammatical function governed by the predicate—such as SUBJ or OBJ for a transitive verb—be realized in the f-structure, ensuring no required arguments are missing; and Coherence prohibits extraneous governable functions that the predicate does not govern, preventing irrelevant elements from appearing. These constraints together verify that the f-structure fully and appropriately represents the sentence's grammatical relations. Feature values in the f-structure are resolved through unification, a process that merges compatible information from lexical entries, rules, and annotations into the minimal consistent structure containing all of it. If conflicting values arise—such as singular and plural for the same number attribute—unification fails, rendering the structure ill-formed. This allows incremental construction of the f-structure from diverse sources while maintaining consistency. To connect the f-structure to other levels, LFG employs path expressions in functional annotations, such as (↑ SUBJ) = ↓, which specifies that the f-structure of the current constituent (↓) fills the SUBJ attribute of its mother's f-structure (↑). These expressions enable precise mapping of surface forms to abstract functions without relying on linear order. A representative example is the passive sentence "The toy was given to the baby by the girl," whose f-structure demotes the original subject (the agent) to an oblique function while promoting the theme to SUBJ, with the goal and agent realized as distinct obliques distinguished by their prepositional case (PCASE) values:
[ PRED     'give <SUBJ, OBLgoal, OBLag>'
  TENSE    PAST
  SUBJ     [ PRED 'toy'
             NUM  SG ]
  OBLgoal  [ PCASE TO
             PRED  'baby'
             NUM   SG ]
  OBLag    [ PCASE BY
             PRED  'girl'
             NUM   SG ] ]
Here, Completeness ensures the governed functions (SUBJ and the goal and agent obliques) are present, Coherence excludes ungoverned ones, and unification resolves features like number across the matrix.
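Unification itself can be illustrated with a toy implementation over nested dictionaries; real LFG systems use graph unification with structure sharing, so this is only a sketch of the failure-on-conflict behavior mandated by the Uniqueness Condition:

def unify(f, g):
    """Merge two f-structures; raise on conflicting atomic values."""
    if isinstance(f, dict) and isinstance(g, dict):
        result = dict(f)
        for attr, value in g.items():
            result[attr] = unify(result[attr], value) if attr in result else value
        return result
    if f == g:                    # identical atoms unify vacuously
        return f
    raise ValueError(f"unification failure: {f!r} vs {g!r}")

np_info = {"SUBJ": {"PRED": "toy", "NUM": "SG"}}
verb_info = {"SUBJ": {"NUM": "SG", "PERS": 3}, "TENSE": "PAST"}
print(unify(np_info, verb_info))
# {'SUBJ': {'PRED': 'toy', 'NUM': 'SG', 'PERS': 3}, 'TENSE': 'PAST'}

try:
    unify({"NUM": "SG"}, {"NUM": "PL"})
except ValueError as e:
    print(e)                      # unification failure: 'SG' vs 'PL'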

Argument structure (a-structure)

In Lexical Functional Grammar (LFG), argument structure (a-structure) represents the semantic predicate-argument relations of a predicate, encoding the core participants in an event as a hierarchically ordered list derived from the verb's lexical entry. This structure interfaces between semantics and syntax, specifying the number and types of arguments without reference to their syntactic positions. The arguments in a-structure are associated with theta-roles, such as agent (the instigator of the event), theme (the entity undergoing change or motion), and goal (the endpoint or recipient), which capture the semantic relations between the predicate and its participants. These roles follow a universal thematic hierarchy, typically ordered as agent > beneficiary > recipient/experiencer > instrument > theme/patient > locative, which guides the assignment of grammatical functions. Argument linking rules systematically map theta-roles from a-structure to grammatical functions in functional structure (f-structure), such as subject (SUBJ) or object (OBJ), through Lexical Mapping Theory. This theory employs intrinsic role classifications—non-objective ([-o]) for agents and locatives, or unrestricted ([-r]) for themes and patients—and defaults like assigning the highest-ranked role to SUBJ, ensuring biuniqueness where each role links to at most one function. An animacy hierarchy further influences assignments in some languages, prioritizing animate arguments for core functions like SUBJ over inanimates. A-structure influences morphological realizations, including case marking and agreement, by determining how arguments are morphologically encoded to reflect their semantic roles and grammatical functions. For instance, in ergative languages such as Warlpiri, case suffixes directly reflect a-structure roles, with ergative case for the most agent-like argument of a transitive verb and absolutive for themes. A-structure distinguishes core arguments, which are subcategorized by the verb and obligatory, from adjuncts, which are optional non-arguments like locatives or instruments that do not bear theta-roles but add modifying information. Core arguments must link to grammatical functions, while adjuncts remain outside this mapping. A representative example is the ditransitive verb give in English, whose a-structure is <agent, theme, goal>, linking the agent to SUBJ ("Mary"), the goal to OBJ ("John"), and the theme to a secondary object (OBJθ or OBJ2, "the book") in the double-object construction "Mary gave John the book." This mapping adheres to the thematic hierarchy, with the agent as the highest-ranked role assigned to SUBJ.
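The default mappings of Lexical Mapping Theory can be approximated in a few lines; the hierarchy, intrinsic classifications, and tie-breaking below are simplified assumptions for illustration rather than the full theory:

HIERARCHY = ["agent", "beneficiary", "recipient", "instrument",
             "theme", "locative"]
INTRINSIC = {"agent": "-o", "locative": "-o",       # non-objective
             "theme": "-r", "recipient": "-r"}      # unrestricted

def link(theta_roles):
    """Map theta-roles to grammatical functions: the highest [-o] role
    becomes SUBJ; [-r] roles become OBJ, then a secondary object."""
    mapping, objects = {}, ["OBJ", "OBJ_TH"]
    for role in sorted(theta_roles, key=HIERARCHY.index):
        if "SUBJ" not in mapping.values() and INTRINSIC.get(role) == "-o":
            mapping[role] = "SUBJ"
        elif INTRINSIC.get(role) == "-r" and objects:
            mapping[role] = objects.pop(0)
        else:
            mapping[role] = "OBL_" + role.upper()
    return mapping

print(link(["agent", "theme", "recipient"]))
# {'agent': 'SUBJ', 'recipient': 'OBJ', 'theme': 'OBJ_TH'}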

Grammar rules and mappings

Structure mappings

In Lexical Functional Grammar (LFG), structure mappings establish relations between the parallel levels of representation—constituent structure (c-structure), functional structure (f-structure), and argument structure (a-structure)—without relying on transformational derivations. These mappings are defined as constraint-based functions that ensure the well-formedness of structures through monotonic unification of attribute-value matrices, allowing flexible syntactic realizations while preserving invariant functional and argument relations. The φ (phi) function serves as the primary correspondence between c-structure and f-structure, mapping nodes in the phrase structure tree to objects in the f-structure. Formally, it is a many-to-one function; in annotations, ↑ abbreviates the f-structure of a node's mother, φ(M(n)), and ↓ abbreviates the node's own f-structure, φ(n), facilitating equation resolution in annotated rules such as (↑ SUBJ) = ↓. For instance, in the sentence "Anna wrote books," the φ function maps the NP "Anna" to the SUBJ attribute in the f-structure, unifying relevant features like person and number. This mapping supports the projection architecture by allowing multiple c-structure nodes, such as heads and their projections, to contribute to a single f-structure unit without altering its core properties. The λ (lambda) function links a-structure to f-structure, specifying how thematic roles from the predicate's argument structure are realized as grammatical functions through morphosyntactic rules. It operates via lexical mapping principles, such as those using binary features [±r] (restricted) and [±o] (objective), to assign roles like agent to SUBJ or theme to OBJ, ensuring cross-linguistic consistency in argument realization. For example, in predicates with case alternations, the λ function constrains case assignment, as when an agent maps to a nominative SUBJ while a theme maps to a genitive OBJ. This function integrates with the φ mapping in extended architectures, sometimes composed as φ = λ ∘ α, where α projects from c-structure to a-structure. Outside-in functional uncertainty provides a mechanism for handling unbounded dependencies within these mappings, using regular expressions over f-structure paths to relate distant elements without traces or movement. The equation (↑ COMP* PRED) = 'say', for instance, allows the predicate of an embedded complement to unify with the matrix verb's requirements, capturing phenomena like wh-extraction across clauses. Constraints on path length or domain, such as GF* where GF ranges over SUBJ, OBJ, or COMP, ensure decidability by restricting searches to accessible f-structure attributes. A key advantage of this design is its application to free-word-order languages, where c-structure scrambling yields an invariant f-structure via the φ function. In Warlpiri, for example, NPs marked for case (e.g., ergative for SUBJ, absolutive for OBJ) can appear in any linear order within a flat clause, but the φ mapping consistently assigns them to the same grammatical functions in f-structure, preserving argument relations regardless of surface position, as sketched below. This relational approach, enforced by unification constraints, contrasts with derivational theories by treating mappings as declarative specifications rather than sequential operations.
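The invariance of f-structure under scrambling can be sketched as case-driven function assignment; the two-case table and morpheme glosses below are simplified assumptions for illustration:

CASE_TO_GF = {"ERG": "SUBJ", "ABS": "OBJ"}   # simplified two-way system

def phi(words):
    """Assign grammatical functions from case morphology, ignoring order."""
    fstruct = {}
    for word in words:
        stem, _, tag = word.partition(".")
        if tag in CASE_TO_GF:
            fstruct[CASE_TO_GF[tag]] = {"PRED": stem, "CASE": tag}
        else:
            fstruct["PRED"] = stem           # the verb supplies the predicate
    return fstruct

order1 = ["kurdu.ERG", "wajilipinyi.V", "maliki.ABS"]   # 'child chases dog'
order2 = ["maliki.ABS", "kurdu.ERG", "wajilipinyi.V"]   # scrambled order
assert phi(order1) == phi(order2)   # scrambling leaves f-structure invariant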

Functional uncertainty and cohesion

In Lexical Functional Grammar (LFG), functional uncertainty provides a mechanism for capturing long-distance dependencies, such as those arising in wh-questions, relative clauses, and topicalization, without relying on transformational rules or traces. This approach employs path-based equations in functional structures (f-structures) to relate non-local elements, allowing flexible yet constrained associations between a filler and its gap. A general form of such an equation is (↑ GF₁* GF₂) = v, where ↑ denotes the f-structure of the current node, each GF represents a grammatical function like subject (SUBJ) or complement (COMP), and the Kleene star (*) indicates zero or more intermediate functions, enabling the path to traverse multiple levels of embedding. The Coherence condition in LFG ensures that f-structures are well-formed by requiring every governable grammatical function they contain to be subcategorized for by a predicate, preventing extraneous elements; together with Completeness, which demands that each subcategorized function be realized, it guarantees a tight integration between lexical specifications and syntactic realization. These conditions apply universally to core arguments, guaranteeing that the functional description matches the verb's requirements without allowing spurious or incomplete projections. Adjunct cohesion extends these principles to modifiers, such as adjuncts, topics, and foci, by incorporating an Extended Coherence Condition that mandates their linkage to an appropriate function in the f-structure without altering subcategorization requirements. Under this extension, adjuncts and discourse functions must be coherently attached via paths that respect the predicate's frame, ensuring modifiers contribute additional information while maintaining the integrity of required arguments. For instance, temporal or locative adjuncts unify with the f-structure of the clause they modify, satisfying cohesion without introducing unbound functions. LFG extends functional uncertainty to handle anaphora and binding through f-structure paths that enforce locality constraints in a non-derivational manner. Binding equations, such as (↑ SUBJ) = (↑ COMP GF), allow an anaphor to corefer with its antecedent across embedded structures, where the path traverses complements while respecting principles analogous to those in generative binding theory. This approach accommodates cross-linguistic variation in binding domains by parameterizing the allowable paths, ensuring anaphors bind appropriately within their minimal domain. A representative example of functional uncertainty appears in Turkish prenominal participial relative clauses, which exhibit long-distance dependencies between the relativized noun and its clause-internal function. In constructions like kitab-ı oku-yan adam ("the man who reads the book"), uncertainty equations link the head noun adam to the gapped grammatical function inside the participial clause headed by oku- "read" (here its subject), unifying the f-structures without movement operations and accommodating the language's flexible word order. This mechanism highlights LFG's efficacy in agglutinative languages with flexible syntax.
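Functional uncertainty amounts to regular-expression search over f-structure paths, which the following self-contained sketch makes explicit (the nested-dict encoding and space-joined path strings are assumptions of the sketch, not a published solver):

import re

def paths(fstruct, prefix=()):
    """Enumerate every attribute path through a nested-dict f-structure."""
    for attr, value in fstruct.items():
        yield prefix + (attr,)
        if isinstance(value, dict):
            yield from paths(value, prefix + (attr,))

def resolve(fstruct, pattern):
    """Return the paths whose attribute names match a regular expression,
    e.g. '(COMP )*OBJ' for the equation (UP TOPIC) = (UP COMP* OBJ)."""
    return [p for p in paths(fstruct) if re.fullmatch(pattern, " ".join(p))]

# "Who did Mary say that John saw?" -- the gap is OBJ under one COMP.
f = {"PRED": "say<SUBJ, COMP>",
     "TOPIC": {"PRED": "who"},
     "SUBJ": {"PRED": "Mary"},
     "COMP": {"PRED": "see<SUBJ, OBJ>",
              "SUBJ": {"PRED": "John"},
              "OBJ": {"PRED": "who"}}}

print(resolve(f, r"(COMP )*OBJ"))   # [('COMP', 'OBJ')] -- the gap's path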

Applications

Computational implementations

Lexical functional grammar (LFG) has been implemented computationally since its inception, with a focus on efficient parsing algorithms and tools that support its parallel structure representations. Early implementations emphasized declarative processing of constituent (c-) and functional (f-) structures, enabling robust natural language processing applications. These systems integrate constraint-based unification for f-structures while handling the ambiguity inherent in c-structures through specialized algorithms. Chart-based parsing forms the core of many LFG implementations, allowing integrated building of c- and f-structures during parsing. This approach uses variants of the Earley algorithm to construct parse charts that represent possible derivations efficiently. To manage the exponential growth of ambiguous parses, packed shared forests are employed, where common substructures are shared across alternative analyses, reducing redundancy and computational cost. This technique, developed by Maxwell and Kaplan, enables processing of sentences with high ambiguity while preserving all viable interpretations for subsequent unification; a toy illustration of packing appears after this section's survey. Finite-state approximations provide another key computational strategy in LFG, particularly for handling morphology and simpler syntactic phenomena. Kaplan and colleagues demonstrated how LFG grammars can be complemented or approximated by finite-state transducers (FSTs), which preprocess input for tokenization, morphological analysis, and even limited long-distance dependencies. These FSTs integrate seamlessly with deeper LFG parsing, allowing efficient handling of regularities in word forms and basic phrase structures without full context-free expansion. Such approximations are especially useful in resource-constrained environments or for preprocessing in wide-coverage grammars. The Xerox Linguistic Environment (XLE) stands as the primary tool for LFG grammar development, offering a comprehensive platform for writing, testing, and optimizing annotated c-structure rules and lexical entries. XLE supports chart parsing, FST integration, and generation, facilitating the creation of deep grammars for practical applications. Open-source alternatives, such as the Free Linguistic Environment (FLE) and eXLEpse, extend accessibility by providing similar editing and parsing capabilities without proprietary restrictions; FLE, for instance, emphasizes collaborative grammar engineering, while eXLEpse integrates with Eclipse for enhanced usability. LFG implementations have proven effective in machine translation through projects like ParGram, which develops parallel grammars maintaining consistent f-structure alignments across languages. ParGram grammars, implemented in XLE, cover numerous languages including English, French, German, Norwegian, Urdu, and Japanese, enabling robust parsing for transfer-based translation systems. These grammars achieve high coverage and accuracy in applications, such as English-Norwegian MT prototypes, by leveraging LFG's cross-linguistic universality. A persistent challenge in LFG computational implementations is handling ambiguity during f-structure unification, where multiple c-structure paths may lead to conflicting functional equations. Packing techniques mitigate this by delaying full expansion until necessary, often combined with optimization mechanisms such as Optimality Theory-style marks to prune suboptimal analyses efficiently. Despite these advances, scaling to very high ambiguity remains computationally intensive, requiring ongoing refinements in packed representation and search strategies.
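The idea behind packing can be conveyed with a toy context-tagged structure in the spirit of Maxwell and Kaplan's packed representations; the encoding below is drastically simplified and purely illustrative:

# "I saw the man with the telescope": the PP attaches to the verb (A1)
# or to the object noun (A2); shared material is stored only once.
PP = {"PRED": "with<OBJ>", "OBJ": {"PRED": "telescope"}}

packed = {
    "PRED": "see<SUBJ, OBJ>",
    "SUBJ": {"PRED": "pro"},
    "_choices": {
        "A1": {"OBJ": {"PRED": "man"}, "ADJ": [PP]},   # PP modifies the verb
        "A2": {"OBJ": {"PRED": "man", "ADJ": [PP]}},   # PP modifies the noun
    },
}

def unpack(packed_fs):
    """Expand a packed structure into its individual analyses."""
    base = {k: v for k, v in packed_fs.items() if k != "_choices"}
    for label, delta in packed_fs["_choices"].items():
        analysis = dict(base)
        analysis.update(delta)
        yield label, analysis

for label, analysis in unpack(packed):
    print(label, sorted(analysis))
# A1 ['ADJ', 'OBJ', 'PRED', 'SUBJ']
# A2 ['OBJ', 'PRED', 'SUBJ']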

Typological and cross-linguistic studies

The ParGram project, a collaborative effort to develop parallel LFG grammars across typologically diverse languages, has produced implementations for languages including English, French, German, Norwegian, Urdu, and Japanese. These grammars align functional structures (f-structures) to reveal universal patterns in predicate-argument relations and grammatical functions, while allowing constituent structures (c-structures) to vary parametrically according to language-specific word order and constituency. For instance, Japanese's head-final syntax contrasts with English's head-initial order, yet both yield comparable f-structures for equivalent sentences, demonstrating LFG's capacity to model cross-linguistic parallelism without imposing a universal c-structure. LFG's argument structure (a-structure) and mapping principles provide a unified account of ergative and split-intransitive systems by linking thematic roles to grammatical functions and case assignment independently of linear order. In Warlpiri, an Australian language with ergative case marking, transitive subjects receive ergative case while intransitive subjects bear unmarked absolutive case; a-structure distinguishes unergative and unaccusative intransitives, ensuring consistent f-structure realization despite morphological splits. Similarly, in languages with split ergativity, where transitive clauses show ergative alignment while intransitives pattern accusatively, a-structure linking assigns ergative case to external arguments (agents) and absolutive to internal ones (patients or themes), capturing auxiliary selection and agreement patterns without relying on configurational hierarchies. LFG addresses free word order phenomena, such as scrambling, through functional annotations on flexible c-structure rules, which permit multiple c-structure positions to map to the same f-structure attributes. In Urdu, scrambling of objects or adjuncts to pre-verbal positions for topic or focus is analyzed without movement rules; instead, annotation equations correlate non-canonical orders with invariant f-structures, preserving completeness and coherence constraints. Languages with flexible SVO/SOV/OVS possibilities driven by information structure employ similar annotations to license variation while maintaining consistent predicate-argument relations in f-structures. These applications highlight LFG's typological adequacy in capturing functional universals—such as subject-object asymmetries and predicate-argument coherence—amid parametric c-structure variation, making it particularly valuable for fieldwork on underdescribed languages where surface forms diverge widely. For example, in Irish, a VSO language, the c-structure places subject and object as sisters under the clause node without a VP constituent, but mapping rules project them as SUBJ and OBJ in the f-structure, aligning with the functional patterns found in SVO languages like English.

Comparisons with other theories

Versus generative grammar

Lexical functional grammar (LFG) fundamentally differs from Chomskyan generative grammar in its non-transformational architecture. While generative theories, such as Government and Binding (GB) and the Minimalist Program (MP), rely on transformational rules to derive surface structures from underlying deep structures—exemplified by operations like NP-movement or V-to-I raising—LFG eschews such derivations entirely, instead employing declarative constraints and lexical mappings to relate different levels of representation directly. This approach avoids the complexity of movement rules, which in transformational systems can lead to overgeneration of ungrammatical forms. In terms of levels of representation, LFG posits multiple parallel structures—constituent structure (c-structure) for phrase organization and functional structure (f-structure) for grammatical relations—that correspond via projection functions, contrasting with generative grammar's serial derivation from a single underlying phrase structure to interface levels like Logical Form (LF) and Phonetic Form (PF) in Minimalism. This parallelism, akin to aspects of Jackendoff's parallel architecture, supports more efficient models of language comprehension and production. LFG's explanatory power lies in its lexicalist emphasis, where much syntactic information is encoded in the lexicon, reducing reliance on abstract parameters and enabling straightforward accounts of phenomena like passives without invoking unobservable movements, unlike generative grammar's potential for overgeneration in complex derivations. Cross-linguistic data further highlight these differences; for instance, LFG handles non-configurational languages such as Warlpiri through flexible c-structures that map to uniform f-structures, predicting patterns based on functional cues rather than hierarchical structure alone, which generative models struggle to unify across typological diversity. Despite these contrasts, both frameworks share a universalist orientation, positing formal models of innate linguistic knowledge to explain rapid acquisition and grammaticality judgments, though LFG achieves this through a more lexical and less parametrically driven system that prioritizes surface-oriented constraints over deep abstract rules.

Versus head-driven phrase structure grammar

Lexical Functional Grammar (LFG) and Head-Driven Phrase Structure Grammar (HPSG) are both lexicalist, constraint-based frameworks that eschew transformations in favor of declarative representations, yet they diverge significantly in their architectural approaches to linguistic structure. A core difference lies in the separation of representational levels: LFG posits distinct structures—constituent structure (c-structure) for surface syntactic constituency and linearity, functional structure (f-structure) for grammatical relations and predicate-argument structure, and argument structure (a-structure) for thematic roles—linked by mapping functions that allow for mismatches between form and function. In contrast, HPSG employs a single, unified sign-based feature structure that integrates phonological, syntactic, and semantic information within a typed feature system, avoiding separate levels by encoding all relations through attribute-value matrices. This separation in LFG facilitates modularity, enabling independent variation across structures to model typological diversity, such as free word order languages, more straightforwardly. HPSG, however, excels in providing a richer, more detailed feature geometry that captures fine-grained inheritance relations and constraints within a single hierarchy. Regarding grammar formalisms, LFG relies on relational constraints expressed through functional equations and annotations on phrase structure rules, which project information from c-structure to f-structure without deep embedding of lexical details in syntactic rules. HPSG, by comparison, utilizes type hierarchies for signs and lexical rules that inherit and modify feature values, allowing a more integrated treatment of lexical exceptions and paradigmatic relations. The handling of word order further highlights these contrasts: LFG encodes linearity primarily through its c-structure, a context-free phrase structure tree that directly reflects surface order, with mechanisms like functional uncertainty permitting non-local dependencies in f-structure. HPSG addresses order via linear precedence principles and order domains within the valence specifications of the head, often requiring additional operations like domain union or compaction to linearize constituents without a dedicated phrase structure level. For instance, in analyzing German clause structure, LFG can flatten the Mittelfeld into a single c-structure node while maintaining f-structure relations, whereas HPSG uses ordered domains to group and sequence elements like NPs and VPs. An illustrative example is the treatment of passive constructions. In LFG, passives are derived through lexical mapping rules that suppress the external argument in a-structure and promote the internal argument to SUBJ in f-structure, without altering c-structure principles—e.g., "The ball was kicked by John" maps the patient to SUBJ while demoting the agent to an oblique. HPSG handles passives via lexical rules that derive passive entries from active ones, adjusting the valence list to remove the subject slot and add an optional by-phrase complement, with the promoted object filling the subject position through type constraints. This lexical treatment in both frameworks underscores their shared emphasis on lexicon-driven syntax, but LFG's multi-level mappings offer greater flexibility for cross-linguistic passive variations, while HPSG's unified structure ensures tighter constraint integration.

Criticisms and extensions

Limitations and debates

One empirical challenge to Lexical Functional Grammar (LFG) lies in accounting for island effects, which constrain long-distance dependencies such as wh-extraction or quantifier scope. In standard LFG analyses, capturing these effects often requires extensions like functional uncertainty paths or multi-modal glue semantics, where mode assignments (e.g., blocking operators for finite clauses) are tailored to specific clause types, introducing ad-hoc elements to limit scoping out of islands like factive complements while permitting it in others such as rogative clauses. Debates also surround LFG's reliance on grammatical functions (GFs) for semantic composition, where primitives like subject and object directly inform predicate-argument structure in glue semantics. Critics argue this syntactic grounding of semantics risks over-dependence on GFs, potentially underplaying thematic roles or information structures that emerge independently of grammatical functions. Theoretically, LFG's parallel projection architecture—positing independent c(onstituent)-structure and f(unctional)-structure levels—has faced scrutiny from psycholinguistic studies questioning its alignment with neurolinguistic processing data. For instance, evidence suggests incremental, non-modular integration of syntactic and semantic information during comprehension, challenging the strict separation of projections in LFG as psychologically implausible for real-time language use. Debates on the universality of GFs further highlight tensions, as LFG treats them as primitive and invariant across languages, yet typological evidence from agglutinative languages like Turkish or polysynthetic ones like Central Alaskan Yup'ik reveals variable relational hierarchies that resist uniform GF mapping, suggesting diachronic grammaticalization patterns undermine claims of innateness. Such counterexamples from typological studies underscore challenges to LFG's cross-linguistic applicability. In response, proponents emphasize LFG's flexibility through extensions like optimality-theoretic constraints or enriched glue semantics to accommodate variation, though its adoption in formal semantics remains slower than alternatives like type-logical grammars, partly due to the complexity of integrating GFs with dynamic predicate logic. For example, agreement mismatches in Bantu languages, such as ϕ-feature discrepancies between subjects and postverbal objects, strain LFG's extended coherence condition by complicating PRED value unification across non-canonical word orders, often necessitating language-specific lexical rules.

Recent developments

In the 2020s, semantic extensions to Lexical Functional Grammar (LFG) have advanced through deeper integration with Glue Semantics, particularly in Mary Dalrymple's work, which facilitates monotonic meaning composition directly from functional structures (f-structures). The XLE+Glue system, developed by Dalrymple, Patejuk, and Zymla, embeds Glue Semantics into the Xerox Linguistic Environment (XLE) parser, allowing grammar engineers to compute semantic representations alongside syntactic analyses in a declarative manner. This integration supports resource-sensitive semantics via linear logic, enabling efficient handling of phenomena like scope ambiguity in reciprocals, as explored in Asudeh and Dalrymple's 2022 analysis of reciprocal scope using LFG+Glue. Dalrymple's 2023 handbook chapter further elaborates Glue Semantics as a syntax-independent framework for LFG, emphasizing its role in composing meanings monotonically without backtracking. Hybrid approaches combining LFG with machine learning have emerged to enhance parsing robustness, particularly through neural architectures. For instance, neural network language generation systems have incorporated LFG f-structures to improve spoken dialogue outputs, as in a 2016 multi-domain model that uses LFG-style structured representations for semantic transfer before neural refinement. More recent efforts, such as discussions at the 2022 BigScience workshop, examine Transformer-based models' emergent syntactic structures in relation to LFG's parallel projections, suggesting potential for hybrid inference in low-resource settings. These hybrids leverage transformers for initial sequence labeling while enforcing LFG constraints for functional coherence, improving accuracy in tasks like dependency recovery. In 2025, Miriam Butt argued that LFG remains robust in the era of large language models (LLMs), advocating for hybrid rule-based and probabilistic approaches to enhance NLP applications. LFG has also been applied to modeling sign language grammars, notably in work on American Sign Language (ASL). Neidle's analysis of ASL syntax highlights non-manual markers and agreement, integrating them into f-structures without relying on linear order. This approach supports machine translation efforts, as in Huenerfauth's 2003 survey and later ASL generation work, where LFG handles spatial syntax for natural signing avatars. In creole studies, LFG's typological flexibility aids in analyzing contact-induced grammars; for example, Plag's creoles-as-interlanguages model uses LFG terminology to describe pidgin-to-creole transitions, emphasizing invariant lexical forms and functional mappings in languages like Sranan. Theoretical updates in LFG have incorporated prosody as a dedicated level via p-structure in multi-factor grammars. Bögel et al.'s 2009 proposal for prosodic phonology in LFG introduced p-structure as a projection from c-structure, co-described by constraints for syllable-based phrasing. Recent work extends this: a 2022 computational implementation integrates p-structure into computational LFG grammars, modeling prosody-syntax mismatches via alignment functions. A 2023 overview in the LFG handbook surveys p-structure and its interfaces, advocating syllable-based p-diagrams for cross-linguistic prosodic typology. Jones's 2024 analysis further refines the syntax-prosody interface, addressing experimental evidence for gradient effects in English intonation using p-structure constraints.
Ongoing advancements include the LFG25 conference, the 30th International Lexical Functional Grammar Conference, held in July 2025, which emphasized computational and typological work in the spirit of LFG, including AI-linguistics interfaces for hybrid models. Pre-conference workshops focused on multilingual grammar engineering, building on open resources like the ParGram project, which provides parallel LFG grammars and datasets for over 20 languages to support engineering and evaluation.
