
Substructural logic

Substructural logics are a class of nonclassical logical systems defined by the restriction or omission of one or more structural rules—such as weakening, contraction, and exchange—that are standard in the sequent calculi of classical and intuitionistic logics. These rules, originally formalized by Gerhard Gentzen in the 1930s, govern the manipulation of premises in proofs, including the ability to discard irrelevant assumptions (weakening), reuse assumptions multiple times (contraction), and reorder them freely (exchange). By limiting these operations, substructural logics treat logical resources as consumable and ordered, avoiding paradoxes like those of material implication where irrelevant premises lead to arbitrary conclusions. The concept of substructural logic gained prominence in the late 20th century, building on earlier foundational work in relevant and noncommutative systems. Pioneering contributions include Ivan Orlov's 1928 propositional relevant logic, which rejected weakening to ensure relevance between premises and conclusions; Wilhelm Ackermann's entailment system, which further limited permutation and assertion; and Joachim Lambek's 1958 associative syntactic calculus, a noncommutative system for syntactic categories in linguistics. A major milestone was Jean-Yves Girard's introduction of linear logic in 1987, presented not as an alternative to classical logic but as an extension that refines it by forbidding weakening and contraction as global rules, instead controlling resource reuse through special exponential connectives like ! and ?. This innovation highlighted substructural logics' potential for modeling resource-sensitive processes, such as computation and concurrency. Key variants of substructural logics include relevant logics (e.g., R and E, which primarily reject weakening), linear logics (which reject both weakening and contraction), and affine logics (which allow weakening but not contraction).
These systems have structural formulations that can be monotonic or nonmonotonic, reflexive or irreflexive, depending on the rules retained, and they often employ frame-based semantics like Routley-Meyer frames with ternary accessibility relations to capture relevance. In applications, substructural logics underpin type theories in programming languages for safe resource management, categorial grammars in linguistics via the Lambek calculus, and partial correctness reasoning in Hoare logic extensions, where assertions are modeled using intuitionistic linear implications without full exchange. Their study also intersects with philosophical logic, addressing metainferences and hierarchies of systems that approximate classical logic while resolving semantic paradoxes like the Liar.

Fundamentals

Definition and Motivation

Substructural logics are a family of non-classical logical systems formulated in Gentzen-style sequent calculi where some or all structural rules—such as weakening, contraction, exchange, and associativity—are omitted or restricted, thereby altering the ways in which premises can be manipulated during inference. This restriction distinguishes them fundamentally from classical logic, in which premises can be freely added (via weakening), duplicated or reused (via contraction), or reordered (via exchange) without affecting validity, treating logical resources as abundant and interchangeable. In substructural logics, by contrast, premises are viewed as consumable or limited resources that must be tracked precisely, preventing arbitrary duplication or discard and enabling finer control over inference steps. The motivation for substructural logics stems from diverse disciplinary needs to model resource-sensitive reasoning beyond the assumptions of classical logic. In philosophy, they address limitations of material implication, such as its paradoxes where irrelevant premises yield valid conditionals (e.g., a false antecedent implying anything), by enforcing relevance between premises and conclusions in systems like relevant logic. In computer science, these logics support resource-aware reasoning in concurrent and functional programming, where assumptions or data cannot be freely copied or discarded, as exemplified in linear logic's treatment of proofs as processes consuming inputs to produce outputs. Similarly, in linguistics, substructural systems facilitate dependency tracking in syntactic composition, ensuring that grammatical elements combine directionally without extraneous reuse, as in categorial grammars for sentence structure. Substructural logics emerged in the late 20th century as a unified framework for resolving issues of relevance and resource use, with foundational contributions including Lambek's 1958 calculus for syntax, Anderson, Belnap, and Dunn's 1975 developments in relevant entailment, and Girard's 1987 introduction of linear logic.

Structural Rules in Classical Logic

In classical sequent calculus, as introduced by Gentzen in his foundational work on logical deduction, sequents are expressed in the notation \Gamma \vdash \Delta, where \Gamma represents a multiset of formulas serving as the antecedents or premises on the left-hand side, and \Delta represents a multiset of formulas serving as the consequents or conclusions on the right-hand side. This notation treats formulas as resources that can be used without inherent restrictions on multiplicity or order, reflecting the unrestricted nature of classical deduction. The core structural rules in this system—weakening, contraction, exchange, and associativity—govern the manipulation of these multisets, enabling flexible handling of premises and conclusions. Weakening allows the addition of irrelevant formulas to either side of the sequent without affecting derivability; for instance, from \Gamma \vdash \Delta, one may derive \Gamma, A \vdash \Delta (left weakening) or \Gamma \vdash \Delta, A (right weakening), where A is any formula not contributing to the proof. Contraction permits the elimination of duplicate formulas in the antecedents by merging multiple occurrences into one; specifically, from \Gamma, A, A \vdash \Delta, the sequent \Gamma, A \vdash \Delta follows (left contraction), with a symmetric rule for the right side. Exchange, also known as permutation, ensures that the order of formulas within \Gamma or \Delta is immaterial, allowing arbitrary reordering such as swapping adjacent formulas on either side. Associativity further reinforces the multiset structure by making the grouping of formulas irrelevant; thus, sequents like (\Gamma, A), B \vdash \Delta and \Gamma, (A, B) \vdash \Delta are equivalent, as the comma operation is associative. These rules collectively underpin the system's permissiveness, allowing proofs to ignore limitations on resource usage.
In classical logic, these structural rules play a pivotal role in achieving key theoretical properties, including the cut-elimination theorem, which guarantees that any proof using the cut rule (a generalized form of modus ponens) can be transformed into an equivalent cut-free proof, and the soundness and completeness of the system relative to classical semantics. They facilitate unrestricted manipulation of premises, ensuring that the sequent calculus fully captures the expressive power of classical propositional and predicate logic.
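Because classical antecedents behave as multisets, the structural rules amount to simple multiset operations. The following minimal Python sketch illustrates this using `collections.Counter` as the multiset type (formulas are plain strings; the encoding is illustrative only, not a proof system):

```python
from collections import Counter

# Antecedents as multisets: Counter ignores order, so exchange is implicit
# (Gamma, A, B and Gamma, B, A denote the same multiset).

def weaken_left(gamma: Counter, a: str) -> Counter:
    """Left weakening: from Gamma |- Delta, derive Gamma, A |- Delta."""
    g = gamma.copy()
    g[a] += 1
    return g

def contract_left(gamma: Counter, a: str) -> Counter:
    """Left contraction: from Gamma, A, A |- Delta, derive Gamma, A |- Delta."""
    assert gamma[a] >= 2, "contraction needs two occurrences of A"
    g = gamma.copy()
    g[a] -= 1
    return g

gamma = Counter({"A": 2, "B": 1})  # the antecedent A, A, B
assert weaken_left(gamma, "C") == Counter({"A": 2, "B": 1, "C": 1})
assert contract_left(gamma, "A") == Counter({"A": 1, "B": 1})
```

Substructural systems can be read as withdrawing one or more of these operations: dropping `weaken_left` forbids adding unused premises, and dropping `contract_left` forces each occurrence to be consumed separately.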

Core Concepts

Sequent Calculus Framework

The sequent calculus framework serves as the foundational proof theory for substructural logics, providing a structured way to represent logical deductions through sequents and inference rules. Introduced by Gerhard Gentzen in his seminal work on logical inference, this system formalizes proofs as trees built from initial axioms and rule applications, enabling precise analysis of deduction processes. A sequent takes the form \Gamma \vdash \Delta, where \Gamma is a multiset of formulas on the left (antecedent, representing assumptions) and \Delta a multiset on the right (succedent, representing conclusions); the turnstile \vdash indicates that the assumptions entail the conclusions. Proofs begin with axioms of the form A \vdash A, where a formula A appears on both sides, and are extended via inference rules that preserve the validity of sequents. Inference rules in sequent calculus are categorized into logical rules, which handle the introduction and elimination of logical connectives, and structural rules, which permit manipulations of the antecedent and succedent such as weakening, contraction, and exchange. Logical rules include left and right rules for connectives like implication (\to), conjunction (\land), and disjunction (\lor). For implication, the right rule (\to_R) derives \Gamma \vdash A \to B from \Gamma, A \vdash B, while the left rule (\to_L) derives \Gamma, A \to B, \Delta \vdash C from \Gamma \vdash A and \Delta, B \vdash C. Similarly, for conjunction, the right rule (\land_R) allows \Gamma \vdash A \land B from \Gamma \vdash A and \Gamma \vdash B, and the left rules (\land_{L1}, \land_{L2}) project components accordingly. The cut rule, a key structural-logical hybrid, splices derivations by combining two sequents through an intermediate formula: from \Gamma \vdash A and \Delta, A \vdash B, it yields \Gamma, \Delta \vdash B, facilitating the connection of subproofs but often targeted for elimination in normalized proofs.
A cornerstone of the framework is proof normalization, embodied in Gentzen's cut-elimination theorem (Hauptsatz), which demonstrates that any proof using the cut rule can be transformed into an equivalent cut-free proof through a series of reductions. This theorem, proven for both classical and intuitionistic logics, ensures the consistency of the system and highlights the subformula property in cut-free proofs, where all formulas in a derivation are subformulas of the end-sequent. In classical and intuitionistic settings, cut-elimination preserves provability and supports automated proof search by avoiding non-local cuts. Sequent calculus relates to natural deduction as a complementary system, both originating from Gentzen's efforts to model deduction more analytically; while natural deduction emphasizes introduction and elimination rules in a tree-like structure mimicking informal reasoning, sequent calculus offers a symmetric treatment of antecedents and succedents, better suited for hypothetical judgments and multiple conclusions. This symmetry facilitates the study of structural properties in proofs, providing a versatile foundation for extensions in substructural logics.
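The premise bookkeeping performed by the cut rule can be sketched directly. In this toy encoding, a single-conclusion sequent \Gamma \vdash A is a pair of a premise tuple and a conclusion formula (an illustrative simplification of Gentzen's full calculus, not a faithful implementation of it):

```python
# A sequent Gamma |- A is encoded as (tuple_of_premises, conclusion_formula).
# Formulas are plain strings; this only demonstrates the cut rule's shape.

def cut(s1, s2):
    """From Gamma |- A and Delta, A |- B, derive Gamma, Delta |- B."""
    gamma, a = s1
    delta_with_a, b = s2
    delta = list(delta_with_a)
    delta.remove(a)  # exactly one occurrence of the cut formula A is consumed
    return (gamma + tuple(delta), b)

s1 = (("P",), "Q")                        # P |- Q
s2 = (("Q", "R"), "S")                    # Q, R |- S
assert cut(s1, s2) == (("P", "R"), "S")   # cut yields P, R |- S
```

Cut-elimination says that any conclusion reachable with `cut` is also reachable without it, at the cost of a possibly much larger derivation.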

Relaxation of Structural Rules

Substructural logics achieve resource sensitivity by relaxing the structural rules of classical sequent calculus, which typically include weakening, contraction, exchange, and associativity. These relaxations prevent unrestricted manipulation of premises, modeling scenarios where assumptions represent limited resources that must be tracked precisely. One common relaxation omits weakening, the rule allowing arbitrary addition of premises (e.g., from \Gamma \vdash C to \Gamma, B \vdash C), ensuring all formulas in the antecedent \Gamma must be used in the derivation. Omitting contraction, which merges multiple occurrences of a formula (e.g., from \Gamma, A, A \vdash C to \Gamma, A \vdash C), treats repeated premises as distinct, preventing duplication and enforcing single-use consumption. Restricting exchange makes premise order significant, as in non-commutative systems where reordering (e.g., from \Gamma, A, B \vdash C to \Gamma, B, A \vdash C) is invalid. Non-associativity further relaxes grouping, treating antecedents as ordered trees rather than flat multisets, so structures like A, (B, C) differ from (A, B), C. These relaxations yield distinct systems: affine logics permit weakening but omit contraction, allowing discard but not reuse of resources; relevant logics omit weakening (and, in stricter variants, contraction) to enforce premise relevance; linear logics omit both weakening and contraction while retaining exchange, and non-commutative systems such as the Lambek calculus further restrict exchange for order sensitivity. For instance, classical logic derives A, B \vdash B from the axiom B \vdash B via weakening, but substructural variants without weakening fail this inference, requiring explicit resource accounting.
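The effect of dropping weakening can be seen already at the level of axioms. A minimal Python check (formulas as strings, antecedents as lists; an illustrative encoding rather than a full proof system):

```python
from collections import Counter

def axiom_ok(antecedent, conclusion):
    """In a weakening-free system, the axiom A |- A requires the antecedent
    to be exactly the multiset {A}: no unused premises may be present."""
    return Counter(antecedent) == Counter([conclusion])

assert axiom_ok(["B"], "B")             # B |- B is an axiom
assert not axiom_ok(["A", "B"], "B")    # A, B |- B needs weakening, so it fails here
```

In classical logic the second sequent is derivable by left weakening; a weakening-free system instead demands that every premise be accounted for by some rule application.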

Varieties

Linear Logic

Linear logic, introduced by Jean-Yves Girard in 1987, represents a refinement of classical and intuitionistic logics by treating logical resources as consumable entities that cannot be freely duplicated or discarded. This approach arises from a relaxation of structural rules in the sequent calculus, specifically by prohibiting weakening and contraction for linear formulas while permitting exchange. A central innovation is the introduction of the exponential modality !, which marks formulas as reusable resources; only !A allows contraction (duplication) and weakening (deletion), enabling the recovery of classical behavior for such persistent propositions. Linear logic further distinguishes between multiplicative connectives, which split the context between subproofs, and additive connectives, which share it: the multiplicative conjunction ⊗ (tensor) and multiplicative disjunction ⅋ (par, denoted \parr) emphasize resource consumption, while the additives & (with) and ⊕ (plus) express choice without resource splitting. The full set of connectives includes these alongside linear implication (⊸), linear negation (written A^⊥), and the dual exponential ? for the "why not" modality in the succedent. In practice, these features ensure precise resource tracking. For instance, to derive the sequent A, B \vdash A \otimes B from the axioms A \vdash A and B \vdash B, the tensor introduction rule requires both premises to be combined exactly once, without duplication or omission of either resource: \frac{A \vdash A \quad B \vdash B}{A, B \vdash A \otimes B} \text{($\otimes$R)} This contrasts with classical conjunction, where resources could be reused freely.
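The context-splitting discipline of the tensor rule can be sketched as a search over ways to divide the antecedent between the two premises, assuming only the identity axiom as a base case (a toy illustration, not a full linear-logic prover):

```python
from itertools import combinations

# The tensor-right rule must SPLIT the antecedent between its two premises;
# no formula may be shared (contraction) or dropped (weakening).

def splits(antecedent):
    """Yield every way to divide the antecedent list between two premises."""
    idx = range(len(antecedent))
    for r in range(len(antecedent) + 1):
        for chosen in combinations(idx, r):
            part1 = [antecedent[i] for i in chosen]
            part2 = [antecedent[i] for i in idx if i not in chosen]
            yield part1, part2

def provable_tensor(antecedent, a, b, prove):
    """Gamma |- A (x) B holds iff some split Gamma1, Gamma2 gives
    Gamma1 |- A and Gamma2 |- B."""
    return any(prove(g1, a) and prove(g2, b) for g1, g2 in splits(antecedent))

# Base case: only the identity axiom C |- C is accepted.
prove = lambda g, c: g == [c]
assert provable_tensor(["A", "B"], "A", "B", prove)   # A, B |- A (x) B succeeds
assert not provable_tensor(["A"], "A", "A", prove)    # A |- A (x) A fails: no duplication
```

The failing second example is exactly where classical conjunction would succeed, since \land_R reuses the whole context in both premises.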

Relevant Logic

Relevant logic, also known as relevance logic, emerged in the mid-20th century as a response to perceived inadequacies in classical and modal logics regarding the notion of entailment. It was primarily developed during the 1950s and 1960s by the philosophers Alan Ross Anderson and Nuel D. Belnap, Jr., who sought to formalize a stricter criterion for logical entailment under which premises must genuinely contribute to the conclusion. Their collaborative work culminated in the seminal two-volume treatise Entailment: The Logic of Relevance and Necessity, which established the foundational systems and philosophical motivations for the field. A central feature of relevant logic is the rejection of the weakening rule, which in classical sequent calculus permits the addition of irrelevant premises without affecting validity; this ensures that all premises in an entailment are meaningfully used to derive the conclusion, embodying the principle of relevance. Unlike stricter substructural systems, contraction—the rule allowing multiple uses of a premise—is typically permitted in core relevant logics like R, subject to the constraints enforced by the system's axioms and rules. Exchange (permuting premises) and associativity (regrouping premises) are generally preserved, maintaining a flexible yet disciplined handling of premise combinations. The logic introduces specialized connectives to capture relevant relationships. The relevant implication, denoted →, requires that the antecedent and consequent share at least one propositional variable (or content, in semantic terms), preventing implications from holding vacuously; for instance, if A and B have no common variables, A → B is not a theorem. Fusion, symbolized as • (or sometimes ◦), serves as a multiplicative conjunction that combines premises in a way that preserves relevance, behaving like an intensional "and" whose operands must both be utilized without dilution.
Relevant logic as a substructural system specifically rejects the weakening rule, prioritizing the substantive connection between premises and conclusions over unrestricted premise manipulation. A key illustration of its distinctiveness is the avoidance of the classical paradoxes of material implication that arise from irrelevant premises: the positive paradox P \to (Q \to P) fails as a theorem because the added assumption Q plays no role in deriving P, and variable-disjoint implications such as P \to (Q \to Q) are rejected because P does not meaningfully support the conclusion.
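The variable-sharing requirement is easy to check mechanically. A toy Python filter is sketched below (propositional variables are single uppercase letters in this made-up syntax; the check is only the necessary sharing condition, not a decision procedure for theoremhood in R):

```python
import re

def variables(formula: str) -> set:
    """Propositional variables, taken to be single uppercase letters here."""
    return set(re.findall(r"[A-Z]", formula))

def passes_variable_sharing(antecedent: str, consequent: str) -> bool:
    """Necessary condition for 'antecedent -> consequent' to be a theorem of R:
    the two sides must share at least one propositional variable."""
    return bool(variables(antecedent) & variables(consequent))

assert passes_variable_sharing("P & Q", "Q | R")    # shares Q: not filtered out
assert not passes_variable_sharing("P", "Q -> Q")   # classical P -> (Q -> Q) is filtered out
```

Formulas that fail the filter, like the positive-paradox instances above, cannot be theorems of R; passing the filter is necessary but not sufficient.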

Other Substructural Systems

BCI logic is a substructural system that omits the structural rules of contraction and weakening, thereby treating implications as non-duplicable and non-discardable resources in its implicational fragment. This logic corresponds, via the Curry-Howard interpretation, to combinatory logic built from the combinators B, C, and I, enabling variable-free term manipulation without resource duplication. BCK logic is defined by the axioms B, C, and K, permitting weakening but prohibiting contraction, with exchange retained via the C axiom; this results in a system where resources are used at most once but can be discarded. It models fragments of the lambda calculus that prohibit resource duplication, aligning with BCK-lambda-terms in which each variable occurs at most once, thus supporting precise control over resource consumption in computational proofs. Non-commutative linear logic, often denoted NL, relaxes the exchange rule in the multiplicative fragment of linear logic to enforce ordered resource usage, distinguishing it from commutative variants by preserving sequence in premise combinations. This makes it suitable for modeling linguistic structures or concurrent processes where the order of hypotheses affects inference outcomes. Affine logic permits weakening but prohibits contraction, allowing resources to be discarded but preventing their duplication, which contrasts with linear logic's stricter exact-use requirement. It proves useful in type systems for representing optional arguments or computations where unused resources can be ignored without penalty. These systems illustrate a hierarchy of substructural restrictions, in which logics are ordered by the structural rules they admit, with heavily restricted variants like BCI at one extreme and progressively more permissive ones like affine logic approaching classical behavior.
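The correspondence between these axiom names and combinators can be made concrete. A small Python sketch, with the standard principal type of each combinator noted in comments (the runtime encoding is illustrative):

```python
# Combinators whose principal types are the implicational axioms B, C, I, K.
# In B, C, and I every bound variable is used exactly once (linear discipline);
# K discards its second argument, which is exactly what weakening licenses.

B = lambda f: lambda g: lambda x: f(g(x))   # (b -> c) -> (a -> b) -> (a -> c)
C = lambda f: lambda x: lambda y: f(y)(x)   # (a -> b -> c) -> (b -> a -> c)
I = lambda x: x                             # a -> a
K = lambda x: lambda y: x                   # a -> (b -> a), valid in BCK, not BCI

inc = lambda n: n + 1
dbl = lambda n: 2 * n
assert B(inc)(dbl)(3) == 7                  # composition: inc(dbl(3))
assert C(lambda a: lambda b: a - b)(1)(5) == 4   # argument swap: 5 - 1
assert I(42) == 42
assert K("kept")("dropped") == "kept"       # second resource is thrown away
```

BCI logic types exactly the terms built from B, C, and I; adding K (and hence weakening) yields BCK, and adding the duplicating combinator W (contraction) recovers the full implicational fragment of intuitionistic logic.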

Formal Semantics

Residuated Lattices

A residuated lattice is an algebraic structure that serves as the canonical semantics for the multiplicative fragment of substructural logics. Formally, it is a tuple \langle L, \wedge, \vee, \otimes, \mathbf{1}, \multimap, \multimap^\mathrm{op} \rangle, where \langle L, \wedge, \vee \rangle is a lattice, \langle L, \otimes, \mathbf{1} \rangle is a monoid, and the binary operations \multimap (right residual) and \multimap^\mathrm{op} (left residual) satisfy the residuation condition: for all a, b, c \in L, a \otimes b \leq c \quad \iff \quad b \leq a \multimap c \quad \iff \quad a \leq c \multimap^\mathrm{op} b, where \leq is the lattice order. In substructural logics, residuated lattices model the multiplicative connectives, such as fusion (\otimes) for premise combination and the implications (\multimap, \multimap^\mathrm{op}) for conditionals, capturing resource-sensitive reasoning where premises are not freely reusable or discardable. The absence of structural rules in these logics corresponds to specific properties of the algebra: for instance, the lack of contraction (non-duplication of premises) aligns with the failure of the square-increasing law a \leq a \otimes a, while the absence of weakening (non-discardability) relates to the unit \mathbf{1} not being the top element. Key variants of residuated lattices tailor the semantics to particular substructural systems. Full Lambek logic (FL) is semantically characterized by general residuated lattices (RL), which impose no additional restrictions beyond residuation. Integral residuated lattices (IL) add the condition that \mathbf{1} is the top element (x \leq \mathbf{1} for all x), validating weakening and modeling intuitionistic-like systems. Non-commutative variants (as in RL and IL) distinguish left and right residuals to reflect ordered resource combination, whereas commutative residuated lattices enforce a \otimes b = b \otimes a, corresponding to logics with exchange (permutation of premises).
Many substructural logics are sound and complete with respect to classes of residuated lattices. For example, BCK logic is complete relative to commutative integral residuated lattices, while the multiplicative fragment of linear logic is complete with respect to *-autonomous structures (a variant with a dualizing object). Soundness follows from the interpretation of sequents as inequalities in the lattice, and completeness is established via canonical extensions or cut-elimination theorems for the corresponding sequent calculi. The residuation condition embodies a Galois connection between the monoid operation and the residuals, ensuring that \otimes is left adjoint to the residuals in the lattice order: \begin{aligned} & a \otimes (-) \dashv a \multimap (-), \\ & (-) \otimes a \dashv (-) \multimap^\mathrm{op} a. \end{aligned} This adjunction provides the foundational link between syntactic implications and algebraic inequalities in substructural semantics. Semantics for specific varieties of substructural logics, such as linear logic, often extend residuated lattices with phase spaces or coherence conditions to model the additive connectives alongside the multiplicatives.
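A concrete structure makes the residuation condition checkable. The following sketch verifies it numerically for the Łukasiewicz operations on [0, 1], a standard example of a commutative integral residuated lattice in which both residuals coincide (the grid granularity and tolerance are arbitrary choices):

```python
# Lukasiewicz structure on [0, 1]:
#   a (x) b = max(0, a + b - 1),   a -o c = min(1, 1 - a + c).

def tensor(a, b):
    return max(0.0, a + b - 1.0)

def residual(a, c):
    return min(1.0, 1.0 - a + c)

# Check residuation on a grid: a (x) b <= c  iff  b <= a -o c.
grid = [i / 10 for i in range(11)]
for a in grid:
    for b in grid:
        for c in grid:
            lhs = tensor(a, b) <= c + 1e-9
            rhs = b <= residual(a, c) + 1e-9
            assert lhs == rhs

# (x) is not square-increasing: 0.5 (x) 0.5 = 0 < 0.5,
# mirroring the failure of contraction in the corresponding logic.
assert tensor(0.5, 0.5) == 0.0
```

Here the unit of the monoid is 1.0, which is also the top element, so this particular algebra validates weakening (integrality) while refuting contraction.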

Premise Composition Methods

In substructural logics, premises in sequent calculi are composed into antecedent structures that reflect restrictions on structural rules such as weakening, contraction, exchange, and associativity, unlike the flat sets used in classical logic. These compositions enable precise modeling of resource sensitivity, relevance, and order in proofs. Classical logic employs sets for antecedents, where formulas are unordered and duplicates are identified, allowing unrestricted weakening (introduction of irrelevant premises) and contraction (reuse of premises). This set-based composition supports the full structural rules, ensuring that the order and multiplicity of premises do not affect provability. Multisets extend sets by accounting for the multiplicity of formula occurrences without regard to order, prohibiting contraction to track each premise's exact usage as a limited resource. In linear logic, antecedents are formalized as multisets, where operations like fusion (juxtaposition) combine them without merging duplicates, enforcing that each premise is consumed exactly once in derivations. For instance, Girard's sequent calculus for multiplicative linear logic uses multiset antecedents to model resource consumption, as in sequents of the form \Gamma \vdash \Delta, where \Gamma and \Delta are multisets. This approach implies that proofs maintain a strict accounting of assumptions, preventing the paradoxes that arise from irrelevant or overused premises. Similarly, in relevant logics, multisets in antecedents ensure that premises are relevant to the conclusion by disallowing weakening, as formalized in systems like R, where antecedents represent collections with counted occurrences to enforce the variable-sharing principle. Sequences introduce order sensitivity by treating antecedents as linear arrangements of formulas, omitting the exchange rule to preserve positional dependencies. In the Lambek calculus, antecedents are sequences, reflecting non-commutativity to model word order in natural language, where the order of premises corresponds to phrase composition.
Formalization involves division operators (left and right slash) that operate on these ordered structures, with application rules ensuring that premise order is maintained: for example, from A, B \vdash C one derives A \vdash C / B or B \vdash A \backslash C, without permuting elements. This has implications for grammatical modeling, as sequences capture directional dependencies in categorial grammars. Trees generalize sequences to non-associative, hierarchical structures, allowing antecedents to branch without implicit grouping, which omits the associativity rule. In non-associative variants of the Lambek calculus, such as NL, antecedents are represented as binary trees to model non-flat constituency in linguistic structures, with operations like extraction respecting the tree's branching. This formalization restricts merging to explicit nodes, preventing associative regrouping of premises. Such tree-based compositions are used in multimodal substructural systems to handle complex nesting without associativity assumptions. These premise composition methods provide advantages in controlling resource flow within proofs, enabling substructural logics to model phenomena like computational resources in linear logic or syntactic hierarchies in Lambek systems, where classical sets would allow overly permissive manipulations. Semantically, these syntactic structures align with operations in residuated lattices, offering a corresponding algebraic interpretation.
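The order sensitivity of sequence-based antecedents can be sketched with a toy implementation of forward and backward application over Lambek-style category strings (the mini-lexicon and flat string encoding are hypothetical simplifications; real implementations parse categories into trees):

```python
# Categories as strings: "A/B" seeks a B to its RIGHT, "B\\A" seeks a B to
# its LEFT. No exchange: swapping the two words changes the outcome.

def strip_parens(cat: str) -> str:
    """Drop one pair of outer parentheses, if present."""
    return cat[1:-1] if cat.startswith("(") and cat.endswith(")") else cat

def reduce_pair(left: str, right: str):
    if left.endswith("/" + right):               # forward application: A/B, B => A
        return strip_parens(left[: -len(right) - 1])
    if right.startswith(left + "\\"):            # backward application: B, B\A => A
        return strip_parens(right[len(left) + 1:])
    return None                                  # the pair does not combine

lexicon = {"John": "NP", "Mary": "NP", "loves": "(NP\\S)/NP"}

vp = reduce_pair(lexicon["loves"], lexicon["Mary"])   # (NP\S)/NP, NP => NP\S
assert vp == "NP\\S"
assert reduce_pair(lexicon["John"], vp) == "S"        # John (loves Mary) => S
assert reduce_pair(vp, lexicon["John"]) is None       # wrong order: no derivation
```

The final failing case is the point of the exercise: with exchange available, the two orders would be interderivable, erasing exactly the word-order distinctions the calculus is meant to capture.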

History and Applications

Historical Development

The origins of substructural logic trace back to at least the 1920s, with Ivan Orlov's 1928 formulation of a propositional relevant logic that rejected weakening to ensure relevance between premises and conclusions. Significant developments occurred in the mid-20th century, particularly in relevant logics motivated by concerns over the paradoxes of material implication in classical logic. In the 1950s, Alan Ross Anderson and Nuel D. Belnap began systematizing relevant logics, emphasizing that premises must be genuinely used in derivations to avoid irrelevant implications; their foundational work culminated in the multi-volume Entailment: The Logic of Relevance and Necessity (1975, 1992), which established proof-theoretic frameworks restricting structural rules such as weakening to ensure resource sensitivity. Independently, in 1958, Joachim Lambek introduced the syntactic calculus as a noncommutative, associative system for modeling syntactic types in natural language, omitting weakening and contraction to track linguistic resources precisely; this calculus, detailed in "The Mathematics of Sentence Structure," laid groundwork for categorial grammars and substructural approaches in formal semantics. Interest in these resource-aware logics surged in the late 1980s, particularly with Jean-Yves Girard's introduction of linear logic in 1987, which reframed deduction by treating hypotheses as consumable resources and incorporating modalities for controlled reuse, as presented in his seminal paper "Linear Logic" in Theoretical Computer Science. This revival highlighted substructural logics' potential beyond philosophy, influencing computer science and linguistics. Around the same time, Kosta Došen advanced the abstract study of logics via structural rule restrictions in papers like "Logical Constants as Punctuation Marks" (1988) and "Sequent Calculi for Some Consequence Relations" (1990), and he helped coin the term "substructural logics" at the 1990 Tübingen workshop to unify systems deviating from full Gentzen-style structural rules.
The field's consolidation occurred in 1990 with the First International Workshop on Substructural Logics in Tübingen, Germany, which fostered dialogue among researchers in relevant, linear, and Lambek-style systems; proceedings edited by Peter Schroeder-Heister and Kosta Došen (1993) marked a key milestone in community formation. Through the 1990s and 2000s, substructural logics integrated with computer science and mathematics, particularly in type theory, concurrency, and algebraic semantics, as evidenced by Greg Restall's An Introduction to Substructural Logics (2000), which synthesized proof and model theories. A comprehensive algebraic treatment emerged in Residuated Lattices by Nikolaos Galatos, Peter Jipsen, Tomasz Kowalski, and Hiroakira Ono (2007), providing a unified framework for many substructural varieties and underscoring their mathematical depth.

Modern Applications

In computer science, substructural logics, particularly linear logic, have been integrated into type systems to enforce resource-sensitive computation and to provide complexity bounds. Girard's Light Linear Logic (LLL), introduced in 1998, refines linear logic by restricting the rules governing the exponentials to characterize the polynomial-time functions through the proofs-as-programs correspondence, enabling type-based analysis of complexity without relying on external bounds. This approach has influenced implicit computational complexity, where types directly control resource usage in programming languages. Additionally, session types, rooted in linear logic, model concurrency in message-passing systems by ensuring linear use of communication channels, preventing errors like deadlocks or race conditions in distributed protocols. The correspondence between linear-logic propositions and session types, formalized in works like Caires and Pfenning's 2013 framework, allows processes to be verified as adhering to expected interaction sequences, enhancing safety in concurrent programming languages such as those based on the π-calculus. In linguistics, the Lambek calculus serves as a foundational substructural system for categorial grammars, enabling the modeling of syntax through resource-sensitive composition. Developed by Lambek in 1958, this non-commutative logic treats syntactic categories as types and parsing as deduction, capturing word order and directional dependencies; modern extensions, including multimodal categorial grammars, address discontinuous phenomena such as cross-serial dependencies that exceed context-free power while preserving directional constraints on argument combination. In philosophy, substructural logics support models of defeasible reasoning and belief revision by incorporating relevance constraints that prevent irrelevant assumptions from propagating, addressing limitations of monotonic classical systems.
Relevant logics, a key substructural variety, ensure that premises bear directly on conclusions, facilitating defeasible inferences where beliefs can be overridden by new evidence without global inconsistency. For instance, in belief revision frameworks, substructural approaches reformulate conditional logics to handle dynamic updates under resource scarcity, as explored in integrations of substructural sequent systems with revision operators. This is particularly useful for modeling practical reasoning in artificial intelligence and epistemology, where default rules yield to exceptions without requiring full recomputation. Emerging applications since 2010 highlight substructural logics in quantum computing and AI proof assistants. In quantum computing, linear logic models the no-cloning theorem by prohibiting contraction, reflecting the impossibility of duplicating unknown quantum states and enabling resource-aware protocols for quantum circuits and error correction. Abramsky and Coecke's 2004 framework provides a categorical semantics for quantum protocols, drawing on monoidal structures akin to those of linear logic to capture quantum processes, entanglement, and measurement while respecting resource limits. In AI, linear types have been mechanized in proof assistants to verify resource-bounded proofs, with formalizations of linear logic supporting cut-elimination and focusing for certified programming. Nigam et al.'s 2017 encoding demonstrates how substructural types ensure precise control over proof resources, aiding the development of secure, efficient theorem provers. Compared to classical logic, substructural logics excel in real-world modeling by explicitly handling resource consumption and dependency, where resources like assumptions or data cannot be freely duplicated or discarded, thus avoiding paradoxes in resource-intensive domains such as concurrency and quantum computation. This resource sensitivity provides finer-grained control, enabling accurate representations of consumption and ordering in proofs and processes.

References

  1. [1]
    [PDF] Editorial Introduction: Substructural Logics and Metainferences - HAL
    Jan 7, 2023 · Abstract. The concept of substructural logic was originally introduced in relation to limitations of Gentzen's structural rules of ...
  2. [2]
    [PDF] Relevant and Substructural Logics - Greg Restall
    with an eye to proofs, or with an eye to models.2 Relevant and substructural logics are no ...
  3. [3]
    [PDF] LINEAR LOGIC : ITS SYNTAX AND SEMANTICS - Jean-Yves GIRARD
    Linear logic is not an alternative logic ; it should rather be seen as an exten- sion of usual logic. Since there is no hope to modify the extant classical or.
  4. [4]
    [PDF] Substructural Logic and Partial Correctness - CS@Cornell
    We formulate a noncommutative sequent calculus for partial correctness that subsumes proposi- tional Hoare Logic. Partial correctness assertions are ...
  5. [5]
    An Introduction to Substructural Logics - ResearchGate
    Substructural logic [7, 19] is a general term for a family of logics that prohibit or limit the use of some of the structural rules. Based on various ...
  6. [6]
  7. [7]
    [PDF] The Mathematics of Sentence Structure Joachim Lambek
    Mar 11, 2008 · The calculus presented here is formally identical with a calculus constructed by G. D. Findlay and the present author for a discussion of ...
  8. [8]
    Structural Proof Theory - Cambridge University Press & Assessment
    Sara Negri, University of Helsinki, Jan von Plato, University of Helsinki. Appendix by Aarne Ranta. Publisher: Cambridge University Press. Online publication ...Missing: rules | Show results with:rules
  9. [9]
    [PDF] Untersuchungen über das logische Schließen I - Digizeitschriften
    Titel: Untersuchungen über das logische Schließen I. Autor: Gentzen, G. Ort: Berlin. Jahr: 1935. PURL: https://resolver.sub.uni-goettingen.de/purl ...
  10. [10]
    [PDF] Sequent Calculus
    The sequent calculus was originally introduced by Gentzen [Gen35], primarily as a technical device for proving consistency of predicate logic. Our goal of ...
  11. [11]
    [PDF] A Tutorial on Computational Classical Logic and the Sequent Calculus
    Gentzen's sequent calculus provides a native language for classical logic which admits ... corresponds one-for-one with the structural rules of Gentzen's LK ...
  12. [12]
    [PDF] Talk Notes: Substructural Logics
    Linear logic, by forgoing weakening and contraction, requires every assumption to be used exactly once; affine logic forgoes only contraction (use each assumption at most once).
  13. [13]
    Linear logic - ScienceDirect.com
    1987, Pages 1-101. Theoretical Computer Science. Linear logic. Jean-Yves Girard. Because of its length and novelty this paper ...
  14. [14]
    Entailment and Relevance - jstor
    Volume 25, Number 2, June 1960. Entailment and Relevance. Nuel D. Belnap, Jr. Those who object to the identification of strict implication and entailment ...
  15. [15]
    BCK and BCI Logics, Condensed Detachment and the 2-Property
    A BCI-λ-term with no free variables is called a BCI-combinator. There is a precise correspondence between BCK-combinators in CL and those in λ-calculus (and ...
  16. [16]
    Non‐commutative intuitionistic linear logic - Wiley Online Library
    Abrusci, V. M., Sequent calculus for intuitionistic linear propositional logic. In: Proceedings of the Summer School and Conference on Mathematical Logic, ...
  17. [17]
    Semantics of weakening and contraction - ScienceDirect.com
    These give rise to two logics which are "in between" linear and intuitionistic logic: in affine (or weakening) logic one always has a weakening ...
  18. [18]
    Substructural Logics and Residuated Lattices — an Introduction
    This is an introductory survey of substructural logics and of residuated lattices which are algebraic structures for substructural logics.
  19. [19]
    [PDF] A Survey of Residuated Lattices1 - Vanderbilt University
    In the language of residuated lattices, this subvariety is defined relative to RL by the identity x(x\e) = e. Other well known subvarieties of RL include ...
  20. [20]
    Substructural Logics - Stanford Encyclopedia of Philosophy
    Aug 15, 2024 · Substructural logics are non-classical logics notable for the absence of one or more structural rules present in classical logic.
  21. [21]
    Relevance Logic - Stanford Encyclopedia of Philosophy
    Jun 17, 1998 · In the work of Anderson and Belnap the central systems of relevance logic were the logic E of relevant entailment and the system ...
  22. [22]
    [2501.00496] Semi-Substructural Logics à la Lambek - arXiv
    Dec 31, 2024 · ... non-associative Lambek calculus, with trees as antecedents. Each calculus is respectively equivalent to the sequent calculus with stoup (for ...
  23. [23]
    Non-associative, Non-commutative Multi-modal Linear Logic
    Aug 1, 2022 · Non-associative contexts will be organized via binary trees, here called structures. Definition 1. (Structured sequents). Structures are ...
  24. [24]
    A Historical Introduction to Substructural Logics - Oxford Academic
    Oct 31, 2023 · It is as if the structural part of logic were more fundamental: to change logic, we have to change this part. Logical constants are in principle ...
  25. [25]
    [PDF] LIGHT LINEAR LOGIC - Jean-Yves GIRARD
    Light Linear Logic is a purely logical system with a more careful handling of structural rules: this system is strong enough to represent all polytime ...
  26. [26]
    [PDF] Linear Logic Propositions as Session Types
    In this paper we present a type system for the π-calculus that exactly corresponds to the standard sequent calculus proof system for dual intuitionistic linear ...
  27. [27]
    A Logic for Categorial Grammars: Lambek's Syntactic Calculus
    Our second chapter is a rather complete study of the Lambek calculus, which enables a completely logical treatment of categorial grammar.
  28. [28]
    [PDF] When Conditional Logic and Belief Revision Meet Substructural ...
    Conditional logic and belief revision theory are prominent theories in artificial intelligence dealing with common sense reasoning. We show in this article ...
  29. [29]
    [PDF] Linear Logic for Generalized Quantum Mechanics
    Linear logic is used as a dynamic quantum logic, extending quantum logic with time, and is a generalized dynamic quantum logic.
  30. [30]
    [PDF] Mechanizing Linear Logic in Coq - Vivek Nigam
    Feb 16, 2018 · This paper formalizes linear logic in Coq and mechanizes the proof of cut-elimination and the completeness of focusing. Moreover, the ...