
Dynamic semantics

Dynamic semantics is a framework within formal semantics and logic that conceptualizes the meaning of linguistic expressions, particularly sentences in discourse, as their capacity to update or modify an ongoing discourse context or information state, rather than as fixed truth-conditional content in isolation. This approach addresses limitations of static semantic theories by accounting for contextual dependencies, such as anaphora and presupposition, through dynamic processes that accumulate information across utterances.

The foundational ideas of dynamic semantics emerged in the early 1980s, primarily through two independent but closely related contributions. Hans Kamp introduced Discourse Representation Theory (DRT) in his paper "A Theory of Truth and Semantic Representation," presented in 1981, which models discourse interpretation via mental representations that evolve with each sentence, introducing discourse referents for entities and relations that bind pronouns and resolve anaphoric links. Concurrently, Irene Heim developed File Change Semantics in her 1982 doctoral dissertation "The Semantics of Definite and Indefinite Noun Phrases," framing semantic interpretation as incremental updates to a "file" of information cards representing the common ground, with indefinites adding new entries and definites requiring familiarity conditions. These theories shifted focus from sentence-level truth to inter-sentential coherence, enabling compositional analyses of complex discourses.

Subsequent developments refined and generalized dynamic semantics. In 1991, Jeroen Groenendijk and Martin Stokhof proposed Dynamic Predicate Logic (DPL), a non-representational, compositional extension of first-order predicate logic in which connectives and quantifiers are interpreted dynamically, allowing variables to bind across clause boundaries and handling phenomena like donkey anaphora without intermediate representational structures. DPL's emphasis on variable assignments as information states influenced later work on presupposition projection, plurals, and tense.
In recognition of their pioneering contributions, Hans Kamp and Irene Heim were jointly awarded the 2024 Rolf Schock Prize in Logic and Philosophy. Dynamic approaches have also been integrated with type-theoretic frameworks, as in Reinhard Muskens' 1996 work combining Montague semantics with discourse representation, facilitating broader applications in computational linguistics and cognitive modeling.

Key features of dynamic semantics include its treatment of conjunction as non-commutative (order affects update outcomes), a dynamic notion of entailment (updating with the premises must yield a state that supports the conclusion), and mechanisms for accommodation when presuppositions are not satisfied. It excels in explaining discourse-level phenomena, such as conditional perfection and evidentials, by viewing meaning as a transformative process akin to program execution in computational terms. While primarily applied to natural language, dynamic semantics draws on dynamic epistemic logics in philosophy and computer science and has inspired extensions to questions, imperatives, and other types of discourse.

Introduction

Definition and Scope

Dynamic semantics is a theoretical approach in semantics that conceptualizes the meaning of a sentence as its potential to update or modify an information state, rather than as a static truth condition assigned independently of context. In this framework, interpretation involves a dynamic process where each utterance incrementally builds upon and alters the shared knowledge or common ground, treating sentences as functions that transform input contexts into output contexts. This perspective shifts focus from isolated propositional content to the evolving nature of meaning in ongoing communication.

The scope of dynamic semantics encompasses context-dependent linguistic phenomena, including anaphora, presupposition, and the maintenance of discourse coherence, where the interpretation of expressions relies on prior utterances. It stands in contrast to static semantics, such as classical Montague semantics, which evaluates sentences compositionally for truth conditions relative to a fixed model, without accounting for inter-sentential dependencies. By modeling meaning as context change, dynamic semantics addresses how a discourse accumulates information over sequences of sentences, enabling a unified treatment of both local and global interpretive effects.

A primary motivation for dynamic semantics arises from the limitations of compositional static approaches in handling context-sensitive elements, like pronouns that refer back to antecedents in previous discourse. For instance, consider the pair: "A woman walks in the park. She sits down." In a static framework, the pronoun "she" would lack a clear referent without external resolution, but dynamic semantics interprets the first sentence as introducing a discourse referent for "a woman," which the second sentence then updates by binding "she" to it, ensuring coherent interpretation across the discourse. This mechanism highlights how dynamic theories capture the incremental, interactive buildup of meaning in natural language use.
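The idea of sentences as context-transforming functions can be sketched minimally in code. In this illustrative encoding (not part of any standard formalism), a world is a set of atomic facts and a context is the set of worlds still compatible with the discourse:

```python
from typing import Callable, FrozenSet

# A minimal encoding: a world is a frozenset of the atomic facts true in it,
# and a context is the set of worlds still compatible with the discourse.
# The type names and facts below are illustrative assumptions.
World = FrozenSet[str]
Context = FrozenSet[World]
Update = Callable[[Context], Context]

def assert_fact(p: str) -> Update:
    """The meaning of an atomic assertion: keep only worlds where p holds."""
    return lambda c: frozenset(w for w in c if p in w)

# The initial (ignorant) context: all four combinations of two facts.
c0: Context = frozenset(frozenset(s) for s in
                        ({"rain", "wind"}, {"rain"}, {"wind"}, set()))
c1 = assert_fact("rain")(c0)        # "It is raining."
c2 = assert_fact("wind")(c1)        # "It is windy."
print(len(c0), len(c1), len(c2))    # 4 2 1 -- each update narrows the context
```

Each utterance maps an input context to a smaller output context, which is the core intuition behind treating meanings as context change potentials.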

Historical Background

The origins of dynamic semantics trace back to the 1970s, when linguists began addressing limitations in static truth-conditional approaches, particularly in handling anaphora and discourse structure. Lauri Karttunen's introduction of discourse referents in 1976 provided an early mechanism for tracking entities introduced in discourse, enabling incremental interpretation beyond isolated sentences. This work laid the groundwork for viewing semantics as a process of updating contextual information, influencing subsequent dynamic models. Meanwhile, Richard Montague's formal semantics in the early 1970s established a compositional, truth-conditional framework but struggled with phenomena like anaphora resolution across sentences.

A pivotal milestone came in 1981 with Hans Kamp's Discourse Representation Theory (DRT), which formalized semantics as the construction and update of discourse representation structures to account for anaphora and truth conditions in connected discourse. Kamp's "A Theory of Truth and Semantic Representation" shifted focus from static denotations to dynamic processes, integrating insights from Karttunen and enabling compositional analysis of multi-sentence discourse. Independently, Irene Heim developed File Change Semantics in her 1982 doctoral dissertation "The Semantics of Definite and Indefinite Noun Phrases," offering a similar framework that models context as a "file" of information states updated by sentences, particularly refining the treatment of definite and indefinite noun phrases. Heim's approach emphasized familiarity conditions for referents, bridging definiteness and anaphora in a unified dynamic framework.

In the 1990s, dynamic semantics advanced through the work of Jeroen Groenendijk, Martin Stokhof, and Frank Veltman, who developed update semantics to formalize context as evolving information states. Groenendijk and Stokhof's 1991 Dynamic Predicate Logic (DPL) recast the meanings of formulas as relations between input and output variable assignments, enhancing the handling of quantification and anaphora in discourse.
Veltman's 1996 update semantics further incorporated defaults, drawing on non-monotonic reasoning in artificial intelligence to model epistemic updates and conditionals. These developments marked a maturation of dynamic approaches, emphasizing incremental interpretation over Montague-style static models and establishing core tools for later linguistic theories.

Foundational Concepts

Context and Information States

In dynamic semantics, the context is formally represented as an evolving information state, typically modeled as a set of possible worlds, variable assignments, or discourse referents that captures the partial knowledge accumulated during discourse. This representation allows sentences to update the context incrementally, reflecting how linguistic expressions modify the shared information among interlocutors rather than merely describing static truths. Information states embody this partial knowledge about the world, serving as the input to semantic interpretation and the output of updates.

Assertions add constraints by intersecting the current state with new possibilities, thereby narrowing the set of compatible scenarios, while tests filter states by checking consistency without adding new information. Formally, if C denotes the current context (an information state) and \phi a sentence, the updated context is C[\phi], the result of applying \phi's context change potential to C; alternatively, the notation [[\phi]]_{C} denotes the set of output states reachable from C via \phi. In systems using possible worlds, such as update semantics, an information state \sigma is a subset of the set of worlds W, updated by an atomic assertion p as \sigma[p] = \{ w \in \sigma \mid p holds at w \}. In assignment-based approaches, states are sets of variable assignments, updated relationally to track referents.

These updates exhibit key properties that ensure systematic information growth. Monotonicity guarantees that if one state \sigma is a subset of another \tau, then \sigma[\phi] \subseteq \tau[\phi], meaning additional prior information does not lead to less information post-update. Compositionality holds dynamically, where the meaning of complex expressions is derived from the composition of their parts' context change potentials, such as sequential application for conjunction.
For example, updating an initial context C (all possible worlds) with "It is raining" yields C' = \{ w \in C \mid raining holds at w \}, eliminating worlds incompatible with rain and thereby refining the interlocutors' shared information. This mechanism also accommodates discourse bindings, which extend states to link pronouns or definite descriptions to previously introduced elements.
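The eliminative update clause and the monotonicity property above can be checked directly in a small sketch (the worlds and atoms are illustrative assumptions):

```python
from itertools import chain, combinations

# Worlds as frozensets of atomic propositions; a state is a set of worlds.
ATOMS = ("rain", "cold")
WORLDS = [frozenset(s) for s in chain.from_iterable(
    combinations(ATOMS, r) for r in range(len(ATOMS) + 1))]

def update(state, p):
    """Eliminative update: sigma[p] = {w in sigma | p holds at w}."""
    return frozenset(w for w in state if p in w)

top = frozenset(WORLDS)        # the ignorant state: all four worlds
sigma = update(top, "cold")    # a more informed state (a subset of top)
# Monotonicity: sigma <= tau implies sigma[p] <= tau[p].
assert sigma <= top and update(sigma, "rain") <= update(top, "rain")
print(sorted(len(s) for s in (top, sigma, update(sigma, "rain"))))  # [1, 2, 4]
```

The assertion confirms that starting from a more informed state never yields a less informed result after the same update.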

Bindings and Anaphora Resolution

In dynamic semantics, indefinite noun phrases introduce variables that bind to new discourse referents, updating the information state to make these referents available for reference in subsequent discourse. This process contrasts with static semantics, where indefinites function as existential quantifiers with scope limited to their immediate clause, preventing cross-sentential accessibility. For instance, in the discourse "John has a book. It is interesting," the indefinite "a book" establishes a discourse referent for the variable, which the pronoun "it" can then access through dynamic binding, ensuring coherent anaphora resolution.

Anaphora resolution in dynamic semantics relies on pronouns being treated as free variables whose values are determined by the dynamic scope of prior bindings, rather than as unbound elements requiring separate resolution mechanisms. This approach allows pronouns to link to antecedents introduced earlier in the discourse, as the information state propagates updated assignments forward. Formally, indefinites operate as existential quantifiers whose scope extends dynamically across sentences, introducing variable-value assignments that persist; anaphoric pronouns then resolve by selecting compatible values from this updated state.

A key challenge addressed by dynamic bindings is the interpretation of donkey sentences, such as "If a farmer owns a donkey, he beats it," where the pronoun "it" must covary with the indefinite "a donkey" embedded under a conditional. In static frameworks, this leads to problems with quantifier scope and unbound pronouns, but dynamic semantics resolves it by quantifying over sequences of assignments, allowing the indefinite to bind the pronoun universally within the conditional's dynamic scope. This mechanism ensures that the referent for the donkey is available to "it" in a way that captures the intended conditional dependency without requiring intermediate quantifier raising.
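Cross-sentential binding can be sketched in a DPL-flavored toy model (these are not the official DPL definitions): an information state is a list of variable assignments, an indefinite extends every assignment with a candidate value for a new referent, predicates test assignments, and sequencing sentences simply chains the updates. The entities and facts below are invented for illustration:

```python
# A DPL-flavored sketch: states are lists of variable assignments (dicts).
# The entity domain and fact base are illustrative assumptions.
ENTITIES = ["mary", "sue", "rex"]
FACTS = {("woman", "mary"), ("woman", "sue"),
         ("walk", "mary"), ("sit", "mary"), ("dog", "rex")}

def exists(var, state):
    """'a N' introduces referent var: try every entity as its value."""
    return [dict(g, **{var: e}) for g in state for e in ENTITIES]

def pred(name, var, state):
    """Test: keep assignments whose value for var satisfies the predicate."""
    return [g for g in state if (name, g[var]) in FACTS]

s0 = [{}]                                                    # empty initial state
s1 = pred("walk", "x", pred("woman", "x", exists("x", s0)))  # "A woman walks."
s2 = pred("sit", "x", s1)                                    # "She sits." -- x still bound
print(s2)                                                    # [{'x': 'mary'}]
```

Because the assignment for x survives the first sentence's update, the pronoun in the second sentence can simply test the same variable, which is exactly the cross-sentential accessibility that static existential quantification lacks.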

Key Frameworks

Discourse Representation Theory

Discourse Representation Theory (DRT), developed by Hans Kamp in 1981, is a foundational framework in dynamic semantics that models the interpretation of discourses through structured representations of information. It addresses how successive sentences in a discourse contribute to an evolving representation of the described situation, emphasizing the compositional buildup of meaning across utterances.

At the core of DRT are Discourse Representation Structures (DRSs), which serve as the primary representational units. A DRS is visualized as a "box" containing two components: a universe of discourse referents, typically represented as variables (e.g., x, y) that stand for entities introduced in the discourse, and a set of conditions, which are atomic predicates or equations relating these referents (e.g., \textit{man}(x), x = y). These structures capture the partial information state resulting from processing the discourse up to a given point, allowing for the representation of unresolved references and dependencies.

The construction of DRSs proceeds incrementally as sentences are interpreted. Indefinite noun phrases, such as "a man," introduce new discourse referents into the main DRS universe and add corresponding conditions, thereby expanding the information state. Quantifiers like universals ("every") trigger the embedding of subordinate DRSs within the main structure, creating scoped environments where referents in the superior DRS are accessible to the embedded one, thus handling scope interactions dynamically. This incremental process ensures that each sentence updates the overall DRS in a way that reflects the discourse's evolving context.

Anaphora resolution in DRT relies on the accessibility of discourse referents across embeddings. Pronouns, such as "he," are interpreted by coreferring to an appropriate antecedent referent in the same or an accessible superior DRS, provided no conflicting conditions block the resolution.
This mechanism naturally accounts for phenomena like donkey anaphora without requiring separate static binding rules, as the hierarchical structure of DRSs enforces resolution constraints based on accessibility relations.

Formally, the semantics of DRT defines truth conditions for DRSs relative to models through a recursive verification procedure. A DRS is true in a model if there exists an embedding function mapping its discourse referents to model entities such that all conditions are satisfied; for embedded DRSs, this extends by requiring successful embeddings for both the superior and subordinate structures. This model-theoretic interpretation aligns DRT with classical truth-conditional semantics while extending it to discourse dynamics.

A classic example illustrates DRS construction and anaphora resolution. Consider the discourse "A man whistles. He walks." The first sentence introduces a referent x and the conditions \textit{man}(x), \textit{whistles}(x) in the main DRS. The second sentence resolves "he" to x (the most accessible antecedent), adding the condition \textit{walks}(x) to the same DRS, yielding a unified structure where x satisfies all predicates. DRT's key advantage lies in its ability to handle discourse-level phenomena, such as anaphora and tense, compositionally within a single representational framework, providing a unified account of how discourses build coherent interpretations.
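The "A man whistles. He walks." example can be mirrored in a toy DRS data structure. This sketch flattens accessibility to a single unembedded box and resolves pronouns to the most recent referent; the class design and naming are illustrative assumptions, not standard DRT machinery:

```python
from dataclasses import dataclass, field

@dataclass
class DRS:
    """A toy DRS 'box': a universe of referents plus a list of conditions."""
    universe: list = field(default_factory=list)
    conditions: list = field(default_factory=list)

    def new_referent(self, base="x"):
        # Indefinites introduce a fresh discourse referent.
        ref = f"{base}{len(self.universe)}"
        self.universe.append(ref)
        return ref

    def resolve_pronoun(self):
        # Simplified resolution: the most recently introduced referent.
        return self.universe[-1]

drs = DRS()
x = drs.new_referent()                       # "a man" introduces x0
drs.conditions += [("man", x), ("whistles", x)]
he = drs.resolve_pronoun()                   # "he" resolves to x0
drs.conditions.append(("walks", he))
print(drs.universe, drs.conditions)
```

The resulting box contains one referent satisfying all three conditions, matching the unified structure described above.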

Update Semantics

Update semantics is a procedural framework within dynamic semantics, developed by Jeroen Groenendijk, Martin Stokhof, and Frank Veltman, in which the meaning of a sentence is understood as a context change potential: a function that transforms an input information state (context) into an updated output state. This approach shifts focus from static truth conditions to how utterances incrementally modify the shared information in discourse.

The basic update rule treats atomic sentences as eliminative updates on the current information state: an atomic proposition p updates a state s by eliminating possibilities in s where p fails to hold, resulting in s[p] = \{ w \in s \mid p is true at w \}. Complex sentences compose these updates either sequentially or in parallel; for instance, \neg \phi updates by excluding the possibilities that survive \phi, yielding s[\neg \phi] = s \setminus s[\phi]. At its core, update semantics employs intersective updates, where an assertion intersects the current state with the propositional content of the sentence, thereby refining the set of possible worlds (or other indices) to those compatible with the new information. This mechanism ensures monotonic refinement of the context, assuming no update failure.

Compositionality in update semantics defines logical connectives dynamically: \phi \land \psi performs sequential update, first applying \phi to the input state and then \psi to the result, formalized as s[\phi \land \psi] = s[\phi][\psi]. Disjunction \phi \lor \psi takes the union of the possibilities updated by \phi and those updated by \psi after excluding \phi, expressed as s[\phi \lor \psi] = s[\phi] \cup s[\neg \phi][\psi], allowing non-monotonic effects in certain cases.

A representative example is the definite description in "The king of France is bald," which updates an information state by filtering to possibilities where there is exactly one salient individual satisfying "king of France" and that individual is bald; states lacking such a unique individual yield an undefined or empty update, accommodating presuppositional behavior.
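The connective clauses above translate directly into higher-order updates over sets of worlds. This is a sketch under illustrative assumptions (a four-world model over two atoms), not an official formulation:

```python
# Toy update-semantics connectives, following the clauses in the text:
# s[not phi] = s - s[phi];  s[phi and psi] = s[phi][psi];
# s[phi or psi] = s[phi] | s[not phi][psi].
WORLDS = frozenset(frozenset(w) for w in [{"p", "q"}, {"p"}, {"q"}, set()])

def atom(p):
    """Eliminative update with an atomic proposition."""
    return lambda s: frozenset(w for w in s if p in w)

def neg(phi):
    return lambda s: s - phi(s)

def conj(phi, psi):
    return lambda s: psi(phi(s))          # sequential update

def disj(phi, psi):
    return lambda s: phi(s) | psi(neg(phi)(s))

s = WORLDS
print(len(conj(atom("p"), atom("q"))(s)))   # 1: only the {p, q} world survives
print(len(disj(atom("p"), atom("q"))(s)))   # 3: every world where p or q holds
```

Note that conjunction is defined as function composition, which is what makes update order significant (non-commutativity) once context-sensitive expressions enter the picture.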

Advanced Topics

Handling Presuppositions

In dynamic semantics, presuppositions are background assumptions triggered by specific linguistic expressions, such as definite descriptions (e.g., "the king of France") or factive verbs (e.g., "regret"), which must be satisfied by the context for the sentence to be appropriately uttered. These assumptions differ from assertions in that they project through embeddings like negation or questions, remaining as requirements on the global context. Dynamic semantics treats presuppositions as preconditions on context change potentials (CCPs), where a sentence's update to an information state is only defined if the presuppositions hold in that state; otherwise, the resulting context is undefined. This approach, formalized in Heim's satisfaction theory, extends file change semantics by requiring local satisfaction: for embedded sentences, presuppositions must be entailed by a temporary context derived from the embedding operator. Failure to satisfy these preconditions halts the update, modeling the infelicity of presupposition failure.

A key mechanism for handling unsatisfied presuppositions is accommodation, whereby a presupposition not satisfied by the current context is added to the context to make the update possible, often without explicit assertion. This process can be global (adding the presupposition to the entire common ground) or local (within a specific embedded context), depending on the embedding environment and plausibility considerations. In contrast to static theories, dynamic approaches distinguish filtering, where a presupposition of a constituent is entailed by the preceding discourse and thus filtered out, from strengthening, where the presupposition is incorporated into the assertion of the sentence to ensure satisfaction. For instance, in "If John has a son, his son is bald," the presupposition of "his son" (that John has a son) is filtered by the antecedent, avoiding a global commitment.

Consider the sentence "John regrets smoking," where the factive verb "regrets" presupposes that John smokes. In a dynamic framework, the update succeeds only if the input context entails this presupposition; if not, accommodation adds "John smokes" to the context before applying the assertion that John has a negative attitude toward it.
This ensures the sentence's felicity while distinguishing the presupposed content from the at-issue update. The projection problem addresses how presuppositions "project" from embedded positions to become global requirements, as in "John does not regret smoking," which still presupposes that John smokes despite the negation. The satisfaction theory resolves this through context inheritance: operators like negation create a local context that inherits the global one, requiring satisfaction in both local and global states without filtering the trigger's presupposition. This compositional mechanism predicts projection patterns for connectives and embeddings, unifying presupposition handling with general context updates.
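The "John regrets smoking" example can be sketched as a partial update whose definedness condition is the presupposition, with global accommodation as a repair step. The worlds, the exception convention, and the accommodate() helper are illustrative assumptions, not a standard implementation:

```python
# Presupposition as a definedness condition on an update, in the spirit of
# the satisfaction theory. Worlds are frozensets of illustrative atoms.
WORLDS = frozenset(frozenset(w) for w in
                   [{"smokes", "regrets"}, {"smokes"}, {"quit"}])

def entails(state, p):
    """A state entails p iff p holds in every remaining world."""
    return all(p in w for w in state)

def regret_update(state):
    """'John regrets smoking': defined only if 'John smokes' is entailed."""
    if not entails(state, "smokes"):
        raise ValueError("presupposition failure: 'John smokes'")
    return frozenset(w for w in state if "regrets" in w)

def accommodate(state, p):
    """Global accommodation: quietly add the presupposition first."""
    return frozenset(w for w in state if p in w)

try:
    regret_update(WORLDS)              # fails: one world lacks "smokes"
except ValueError as e:
    print(e)
print(len(regret_update(accommodate(WORLDS, "smokes"))))   # 1
```

Separating the definedness check from the eliminative step mirrors the distinction between presupposed and at-issue content: only the latter narrows the state once the former is satisfied.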

Modals and Conditionals

In dynamic semantics, modal operators such as necessity (□φ) and possibility (◇φ) are analyzed through test semantics, which differs from the intersective updates used for declarative assertions. The necessity operator □φ functions as a test on the current information state s, succeeding only if φ holds in every world compatible with s; formally, s[□φ] = s whenever s[φ] = s, and undefined (or the absurd state) otherwise. This filtering mechanism ensures that □φ verifies epistemic necessity without altering the underlying possibilities, thereby preserving the context for further discourse updates. Such tests reflect how modals express commitments relative to accessible worlds, emphasizing compatibility rather than outright assertion. However, pure test semantics faces challenges in handling modal subordination, where subsequent utterances build on the possibilities introduced by modals (e.g., "A wolf might come in. It will eat you if it does"), leading to extensions in dynamic epistemic logic or alternative approaches using selection functions.

The possibility operator ◇φ is similarly treated as a test: s[◇φ] = s if s[φ] ≠ ∅ (i.e., φ is consistent with s), and the absurd state otherwise. This tests for epistemic possibility without expanding the state, though extensions address subordination by allowing hypothetical branches. Unlike standard assertions that narrow the state via intersection (s[φ] = s ∩ ||φ||), modals like ◇φ merely verify compatibility, which in more advanced frameworks accommodates modal subordination, where later utterances build on the introduced possibilities.

Conditionals, such as "if φ then ψ," extend this dynamic approach by combining hypothetical updating and testing: the state is hypothetically updated with the antecedent φ, and the conditional succeeds if the consequent ψ is supported in the resulting hypothetical context. In test-style formulations, s["if φ then ψ"] = s if s[φ][ψ] = s[φ], and the absurd state otherwise: the test passes just in case updating the hypothetical φ-state with ψ changes nothing.
This semantics captures the conditional's role in restricting outcomes selectively, as in "If it rains, we stay home": the update hypothetically restricts the current state to rain-compatible worlds and checks that staying home holds throughout them, without committing to rain occurring outright. Alternative formulations employ selection functions to pick a closest φ-world for each world in s and evaluate ψ relative to it, yielding s["if φ then ψ"] = {w ∈ s | f(w, φ) ∈ ||ψ||}, where f is a contextual selector; this integrates hypothetical reasoning while avoiding global intersections.

These treatments highlight key differences from intersective semantics: modals and conditionals do not merely narrow the state but test or branch into selected subcontexts, facilitating complex interactions like embedded modals and non-monotonic updates. For instance, a conditional can project possibilities outward, allowing "If John comes, he might bring Mary" to introduce a new possibility without collapsing antecedent and consequent worlds. This selective updating ensures that modals and conditionals support incremental information growth, central to discourse interpretation.
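The test-style clauses for modals and conditionals discussed above can be sketched over a small set of worlds (an illustrative four-world model; failed tests map to the empty, absurd state):

```python
# Test semantics for "might", "must", and a test-style conditional.
# Worlds and atoms are invented for illustration.
WORLDS = frozenset(frozenset(w) for w in
                   [{"rain", "home"}, {"rain"}, {"sun", "home"}, {"sun"}])
EMPTY = frozenset()

def atom(p):
    return lambda s: frozenset(w for w in s if p in w)

def might(phi):
    """Passes iff phi is compatible with the state; never removes worlds."""
    return lambda s: s if phi(s) != EMPTY else EMPTY

def must(phi):
    """Passes iff phi already holds throughout the state."""
    return lambda s: s if phi(s) == s else EMPTY

def cond(phi, psi):
    """'if phi then psi': hypothetically update with phi, then test that
    adding psi changes nothing in the hypothetical state."""
    return lambda s: s if psi(phi(s)) == phi(s) else EMPTY

s = WORLDS
assert might(atom("rain"))(s) == s      # rain is still an open possibility
assert must(atom("rain"))(s) == EMPTY   # but it is not yet settled
# "If it rains, we stay home" fails here: {rain} is a rain world without home.
assert cond(atom("rain"), atom("home"))(s) == EMPTY
```

The tests either return the input state unchanged or crash to the absurd state, which is exactly the contrast with intersective assertions that the section draws: modals and conditionals check the state rather than narrow it.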
