linear logic

Idea

Linear logic is a substructural logic in which the contraction rule and the weakening rule are omitted, or at least have their applicability restricted.

Absence of weakening and contraction

Notice that contraction and weakening are the two structural inference rules which exhibit context extensions $\Gamma \mapsto \Gamma, P$ as admitting natural diagonal and projection maps, respectively, hence as admitting interpretation as cartesian products $\Gamma \times P$ (cf. Jacobs 1994).

Therefore, when these inference rules are dropped, the resulting multiplicative conjunction $\otimes$ of contexts may still have an interpretation as a symmetric monoidal tensor product (such as known from the tensor product of vector spaces) but not necessarily as a cartesian monoidal product (such as known from the product of topological spaces).

Accordingly, if implication still follows the usual inference rule of function types, but now with respect to a non-cartesian multiplicative conjunction $\otimes$

$$\frac{ \Gamma \otimes P \;\;\vdash\;\; t_p \,\colon\, T }{ \Gamma \;\;\vdash\;\; (p \mapsto t_p) \,\colon\, P \multimap T }$$

etc., then the function type $P \multimap T$ may be interpreted as an internal hom in a non-cartesian closed monoidal category with symmetric monoidal tensor product $\otimes$ — such as the category of finite-dimensional vector spaces, in which case the terms $\Gamma \vdash f \colon P \multimap T$ are interpreted as linear functions — whence the logical connective “$\multimap$” is also called linear implication.

Which exact flavour of monoidal category provides categorical semantics for linear logic depends on exactly which further inference rules are considered (see also at bunched logic): In the original definition of Girard (1987), linear logic is the internal logic of/has categorical semantics in star-autonomous categories (Seely 89, prop. 1.5). But more generally, linear logic came to refer to the internal logic of any possibly-non-cartesian symmetric closed monoidal category (then usually called multiplicative intuitionistic linear logic, MILL) or even polycategory (Szabo 78; see the history section, and see also de Paiva 89, Blute 91, Benton–Bierman–de Paiva–Hyland 92, Hyland–de Paiva 93, Bierman 95, Barber 97, Schalk 04, Melliès 09). Under this interpretation, proof nets (or the associated Kelly–Mac Lane graphs) of linear logic are similar to string diagrams for monoidal categories.

Indeed, these more general senses of linear logic still faithfully follow the original motivation for the term “linear” as connoting “resource availability” (explained below), since the non-cartesianness of the tensor product means the absence of a diagonal map and hence the impossibility of a function depending on more than a single (linear) copy of its variables.
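For readers who prefer code, here is a minimal sketch of this single-use discipline, assuming GHC ≥ 9.0 with the LinearTypes extension, whose multiplicity-annotated arrow `%1 ->` plays the role of $\multimap$ (this is an illustration, not part of the logic itself):

```haskell
{-# LANGUAGE LinearTypes #-}

-- A linear function must consume its argument exactly once:
swap :: (a, b) %1 -> (b, a)
swap (x, y) = (y, x)

-- Rejected by the type checker (contraction): 'x' would be used twice.
-- dup :: a %1 -> (a, a)
-- dup x = (x, x)

-- Rejected by the type checker (weakening): 'x' would never be used.
-- discard :: a %1 -> ()
-- discard x = ()
```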

In addition to these usual “denotational” categorical semantics, linear logic also has an “operational” categorical semantics, called the Geometry of Interaction, in traced monoidal categories.

Although linear logic is traditionally presented in terms of inference rules, it was apparently discovered by Girard while studying coherence spaces.

As a quantum logic

With the archetypical categorical semantics of linear logic in the compact closed category of finite dimensional vector spaces in mind, which is the habitat of quantum information theory (see there for more), one may naturally understand linear logic as a form of quantum logic [Pratt (1992)] and in its natural enhancement to linear type theory also as a quantum circuit-language [Abramsky & Duncan (2006); Duncan (2006)].

[Dal Lago & Faggian (2012)]: “It’s more and more clear that strong relationships exist between linear logic and quantum computation. This seems to go well beyond the easy observation that the intrinsic resource-consciousness of linear logic copes well with the impossibility of cloning and erasing qubits.”

[Staton (2015), p. 1]: “A quantum programming language captures the ideas of quantum computation in a linear type theory.”

Indeed:

  1. the absence of the contraction rule above, whose categorical semantics is the duplication of data, exactly reflects the no-cloning theorem of quantum physics

  2. the absence of the weakening rule above, whose categorical semantics is the erasure of data, exactly reflects the no-deleting theorem,

  3. the non-cartesian multiplicative conjunction $\otimes$, whose categorical semantics is given by tensor products (such as of Hilbert spaces), reflects the existence of entangled terms [Baez (2004)].

Historically, linear logic was not explicitly motivated as a quantum logic in this way, but as a logic keeping track of “resource availability” (see below), where (again due to the absence of the contraction rule) a variable could, a priori, be “used only once” and in fact (due to the absence of the weakening rule) “exactly once”. However, the resource concept has meanwhile also (independently) penetrated quantum information theory (for example, the quantum teleportation-protocol uses Bell states as a resource that allows for and is consumed by the intended “teleportation” operation); see the references there.

As a logic of resource availability

Unlike many traditional formal systems of logic, which deal with the truth of propositions, linear logic is often described and was originally motivated [Girard (1987)] as dealing with the availability of resources.

A proposition, if it is true, remains true no matter how we use that fact in proving other propositions. By contrast, in using a resource AA to make available a resource BB, AA itself may be consumed or otherwise modified. Linear logic deals with this by restricting our ability to duplicate or discard resources freely.

For example:


We have

$$\text{have cake} \;\vdash\; \text{eat cake}$$

from which we can prove

$$\text{have cake},\; \text{have cake} \;\vdash\; \text{have cake} \wedge \text{eat cake}$$

which by left contraction (duplication of inputs) in classical logic yields

$$\text{have cake} \;\vdash\; \text{have cake} \wedge \text{eat cake}$$

Linear logic would disallow the contraction step and treat $\text{have cake},\; \text{have cake} \vdash A$ as explicitly meaning that two slices of cake yield $A$. Disallowing contraction then corresponds to the fact that we can't turn one slice of cake into two (more's the pity), so you can't have your cake and eat it too.
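Under a linear-types reading, the cake example can be phrased as the following sketch (again assuming GHC's LinearTypes; the types `Cake` and `EatenCake` are hypothetical names introduced here for illustration):

```haskell
{-# LANGUAGE LinearTypes #-}

data Cake      = Cake
data EatenCake = EatenCake

-- have cake ⊢ eat cake: eating consumes the slice.
eat :: Cake %1 -> EatenCake
eat Cake = EatenCake

-- have cake, have cake ⊢ have cake ⊗ eat cake: two slices suffice.
haveAndEat :: Cake %1 -> Cake %1 -> (Cake, EatenCake)
haveAndEat c c' = (c, eat c')

-- The contraction step is rejected: one slice cannot be used twice.
-- haveYourCakeAndEatItToo :: Cake %1 -> (Cake, EatenCake)
-- haveYourCakeAndEatItToo c = (c, eat c)
```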

Game semantics

Linear logic can also be interpreted using a semantics of “games” or “interactions”. Under this interpretation, each proposition in a sequent represents a game being played or a transaction protocol being executed. An assertion of, for instance,

$$P, Q \;\vdash\; R$$

means roughly that if I am playing three simultaneous games of $P$, $Q$, and $R$, in which I am the left player in $P$ and $Q$ and the right player in $R$, then I have a strategy which will enable me to win at least one of them. Now the above statements about “resources” translate into saying that I have to play in all the games I am given and can't invent new ones on the fly.

For example, $P \vdash P$ can be won by using the copycat strategy, where one makes the two games identical except with left and right players reversed.
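In programming terms, the copycat strategy corresponds to the linear identity function, the one strategy available at every type; a one-line sketch under the same LinearTypes assumptions as above:

```haskell
{-# LANGUAGE LinearTypes #-}

-- Copycat for P ⊢ P: replay their move unchanged.
copycat :: p %1 -> p
copycat x = x
```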

As a relevance logic

Linear logic is closely related to notions of relevance logic, which have been studied for much longer. The goal of relevance logic is to disallow statements like “if pigs can fly, then grass is green” which are true, under the usual logical interpretation of implication, but in which the hypothesis has nothing to do with the conclusion. Clearly there is a relationship with the “resource semantics”: if we want to require that all hypotheses are “used” in a proof then we need to disallow weakening. In linear logic, all hypotheses must be used exactly once, whereas in relevance logic(s), all hypotheses must be used at least once.

Definition

Linear logic is usually given in terms of sequent calculus. There is a set of propositions (although, as remarked above, these are to be thought of more as resources to be acquired than as statements to be proved) which we construct through recursion. Each pair of lists of propositions is a sequent (written as usual with ‘$\vdash$’ between the lists), some of which are valid; we determine which are valid also through recursion. Technically, the propositional calculus of linear logic also requires a set of propositional variables from which to start; this is usually identified with the set of natural numbers (so the variables are $p_0$, $p_1$, etc.), although one can also consider the linear logic $LL(V)$ where $V$ is any initial set of propositional variables.

Here we define the set of propositions:

  • Every propositional variable is a proposition.

  • For each proposition $A$, there is a proposition $A^\perp$, the negation of $A$.

  • For each proposition $A$ and proposition $B$, there are four additional propositions:

    • $A \& B$ (read ‘with’), the additive conjunction;

    • $A \oplus B$ (read ‘plus’), the additive disjunction;

    • $A \otimes B$ (read ‘times’), the multiplicative conjunction;

    • $A \parr B$ (read ‘par’), the multiplicative disjunction.

  • There are also four constants to go with the four binary operations above:

    • $\top$ (read ‘top’), the additive truth;

    • $\mathbf{0}$ (read ‘zero’), the additive falsity;

    • $\mathbf{1}$ (read ‘one’), the multiplicative truth;

    • $\bot$ (read ‘bottom’), the multiplicative falsity.

  • For each proposition $A$, there are two additional propositions:

    • $!{A}$ (read ‘of course’), the exponential conjunction;

    • $?{A}$ (read ‘why not’), the exponential disjunction.

The terms “exponential”, “multiplicative”, and “additive” come from the fact that “exponentiation converts addition to multiplication”: we have $!{(A \& B)} \equiv {!A} \otimes {!B}$ and so on (see below).

However, the connectives and constants can also be grouped in different ways. For instance, the multiplicative conjunction $\otimes$ and additive disjunction $\oplus$ are both positive types, while the additive conjunction $\&$ and multiplicative disjunction $\parr$ are negative types. Similarly, the multiplicative truth $\mathbf{1}$ and the additive falsity $\mathbf{0}$ are positive, while the additive truth $\top$ and multiplicative falsity $\bot$ are negative. This grouping has the advantage that the similarity of symbols matches the adjective used (because the symbols were chosen with this grouping in mind).

But conversely, the natural grouping by multiplicative/additive, or equivalently by de Morgan dual pairs, has led many authors to alter Girard's notation, in particular reverting to the category-theoretic $\times$ and $+$ for the additives $\&$ and $\oplus$, and introducing a different symbol such as $\odot$, $\bullet$ or (confusingly) $\oplus$ for Girard's $\parr$. But on this page we will stick to Girard's conventions for consistency.

In relevance logic, the terms “conjunction” and “disjunction” are often reserved for the additive versions $\&$ and $\oplus$, which are written with the traditional notations $\wedge$ and $\vee$. In this case, the multiplicative conjunction $A \otimes B$ is called fusion and denoted $A \circ B$, while the multiplicative disjunction $A \parr B$ is called fission and denoted $A + B$ (or sometimes, confusingly, $A \oplus B$). In relevance logic the symbol $\bot$ may also be used for the additive falsity, here denoted $\mathbf{0}$. Also, sometimes the additive connectives are called extensional and the multiplicatives intensional.

Sometimes one does not define the operation of negation, defining only $p^\perp$ for a propositional variable $p$. It is a theorem that every proposition above is equivalent (in the sense defined below) to a proposition in which negation is applied only to propositional variables.
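For example, using the de Morgan dualities listed further below, negation can be pushed inward until it reaches the variables:

$$\big( !(p \otimes q^\perp) \big)^\perp \;\equiv\; ?\big( (p \otimes q^\perp)^\perp \big) \;\equiv\; ?\big( p^\perp \parr (q^\perp)^\perp \big) \;\equiv\; ?\,( p^\perp \parr q ) ,$$

in which negation is applied only to the propositional variable $p$.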

We now define the valid sequents, where we write $A, B, C \vdash D, E, F$ to state the validity of the sequent consisting of the list $(A, B, C)$ and the list $(D, E, F)$, and we use Greek letters for sublists:

  • The structural rules:

    • The exchange rule: If a sequent is valid, then any permutation of it (created by permuting its left and right sides independently) is valid.

    • The restricted weakening rule: If $\Gamma, \Delta \vdash \Theta$, then $\Gamma, !{A}, \Delta \vdash \Theta$, for any $A$; conversely and dually, if $\Gamma \vdash \Delta, \Theta$, then $\Gamma \vdash \Delta, ?{A}, \Theta$ for any $A$.

    • The restricted contraction rule: If $\Gamma, !{A}, !{A}, \Delta \vdash \Theta$, then $\Gamma, !{A}, \Delta \vdash \Theta$; conversely and dually, if $\Gamma \vdash \Delta, ?{A}, ?{A}, \Theta$, then $\Gamma \vdash \Delta, ?{A}, \Theta$.

    • The variable rule: Always, $A \vdash A$.

    • The cut rule: If $\Gamma \vdash A, \Phi$ and $\Psi, A \vdash \Delta$, then $\Psi, \Gamma \vdash \Delta, \Phi$.

  • The inference rules for each operation:

    • If $\Gamma \vdash A, \Delta$, then $\Gamma, A^\perp \vdash \Delta$; conversely and dually, if $\Gamma, A \vdash \Delta$, then $\Gamma \vdash A^\perp, \Delta$.

    • If $\Gamma, A, \Delta \vdash \Theta$ or $\Gamma, B, \Delta \vdash \Theta$, then $\Gamma, A \& B, \Delta \vdash \Theta$; conversely, if $\Gamma \vdash \Delta, A, \Theta$ and $\Gamma \vdash \Delta, B, \Theta$, then $\Gamma \vdash \Delta, A \& B, \Theta$.

    • Dually, if $\Gamma \vdash \Delta, A, \Theta$ or $\Gamma \vdash \Delta, B, \Theta$, then $\Gamma \vdash \Delta, A \oplus B, \Theta$; conversely, if $\Gamma, A, \Delta \vdash \Theta$ and $\Gamma, B, \Delta \vdash \Theta$, then $\Gamma, A \oplus B, \Delta \vdash \Theta$.

    • If $\Gamma, A, B, \Delta \vdash \Theta$, then $\Gamma, A \otimes B, \Delta \vdash \Theta$; conversely, if $\Gamma \vdash \Delta, A$ and $\Lambda \vdash B, \Theta$, then $\Gamma, \Lambda \vdash \Delta, A \otimes B, \Theta$.

    • Dually, if $\Gamma \vdash \Delta, A, B, \Theta$, then $\Gamma \vdash \Delta, A \parr B, \Theta$; conversely, if $\Gamma, A \vdash \Delta$ and $B, \Theta \vdash \Lambda$, then $\Gamma, A \parr B, \Theta \vdash \Delta, \Lambda$.

    • Always $\Gamma \vdash \Delta, \top, \Theta$; dually (there is no converse), $\Gamma, \mathbf{0}, \Delta \vdash \Theta$.

    • If $\Gamma, \Delta \vdash \Theta$, then $\Gamma, \mathbf{1}, \Delta \vdash \Theta$; conversely, $\vdash \mathbf{1}$.

    • Dually, if $\Gamma \vdash \Delta, \Theta$, then $\Gamma \vdash \Delta, \bot, \Theta$; conversely, $\bot \vdash$.

    • If $\Gamma, A, \Delta \vdash \Theta$, then $\Gamma, !{A}, \Delta \vdash \Theta$; conversely, if $\Gamma \vdash \Delta, A, \Theta$, then $\Gamma \vdash \Delta, !{A}, \Theta$, whenever $\Gamma$ consists entirely of propositions of the form $!{-}$ while $\Delta$ and $\Theta$ consist entirely of propositions of the form $?{-}$.

    • Dually, if $\Gamma \vdash \Delta, A, \Theta$, then $\Gamma \vdash \Delta, ?{A}, \Theta$; conversely, if $\Gamma, A, \Delta \vdash \Theta$, then $\Gamma, ?{A}, \Delta \vdash \Theta$, whenever $\Gamma$ and $\Delta$ consist entirely of propositions of the form $!{-}$ while $\Theta$ consists entirely of propositions of the form $?{-}$.
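For reference, the rules governing $!$ just stated (weakening, contraction, dereliction, promotion) read as follows in standard sequent-calculus notation, with $!\Gamma$ and $?\Theta$ abbreviating contexts all of whose formulas have the indicated form:

$$\frac{\Gamma, \Delta \vdash \Theta}{\Gamma, !{A}, \Delta \vdash \Theta} \qquad \frac{\Gamma, !{A}, !{A}, \Delta \vdash \Theta}{\Gamma, !{A}, \Delta \vdash \Theta} \qquad \frac{\Gamma, A, \Delta \vdash \Theta}{\Gamma, !{A}, \Delta \vdash \Theta} \qquad \frac{!\Gamma \vdash A, ?\Theta}{!\Gamma \vdash !{A}, ?\Theta}$$

(the rules for $?$ are dual).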

The main point of linear logic is the restricted use of the weakening and contraction rules; if these were universally valid (applying to any $A$ rather than only to $!{A}$ or $?{A}$), then the additive and multiplicative operations would be equivalent (in the sense defined below), and similarly $!{A}$ and $?{A}$ would be equivalent to $A$, which would give us classical logic. On the other hand, one can also remove the exchange rule to get a variety of noncommutative logic; one must then be careful about how to write the other rules (as we have been above).

As usual, there is a theorem of cut elimination, showing that the cut rule and the identity rule follow from all the other rules and the special cases of the identity rule of the form $p \vdash p$ for a propositional variable $p$.

The propositions $A$ and $B$ are (propositionally) equivalent if $A \vdash B$ and $B \vdash A$ are both valid, which we express by writing $A \equiv B$. It is then a theorem that either may be swapped for the other anywhere in a sequent without affecting its validity. Up to equivalence, negation is an involution, and the operations $\&$, $\oplus$, $\otimes$, and $\parr$ are all associative, with respective identity elements $\top$, $\mathbf{0}$, $\mathbf{1}$, and $\bot$. These operations are also commutative (although this fails for the multiplicative connectives if we drop the exchange rule). The additive connectives are also idempotent (but the multiplicative ones are not).

Remark

There is a more refined notion of equivalence, where we pay attention to specific derivations $\pi \colon A \vdash B$ of sequents, and deem two derivations $\pi, \pi'$ of $A \vdash B$ Lambek-equivalent if they map to the same morphism under any categorical semantics $S$; see below. Given a pair of derivations $\pi \colon A \vdash B$ and $\rho \colon B \vdash A$, it then makes sense to ask whether they are Lambek-inverse to one another (i.e., whether $S(\rho) = S(\pi)^{-1}$ under any semantics), so that the derivations exhibit an isomorphism $S(A) \cong S(B)$ under any semantics $S$. This equivalence relation $A \equiv_{Lambek} B$ is strictly stronger than propositional equivalence. It should be observed that all equivalences $A \equiv B$ listed below are in fact Lambek equivalences.

We also have distributive laws that explain the adjectives ‘additive’, ‘multiplicative’, and ‘exponential’:

  • Multiplication distributes over addition if one is a conjunction and one is a disjunction:
    • $A \otimes (B \oplus C) \equiv (A \otimes B) \oplus (A \otimes C)$ (and on the other side);
    • $A \parr (B \& C) \equiv (A \parr B) \& (A \parr C)$ (and on the other side);
    • $A \otimes \mathbf{0} \equiv \mathbf{0}$ (and on the other side);
    • $A \parr \top \equiv \top$ (and on the other side).
  • Exponentiation converts addition into multiplication if all are conjunctions or all are disjunctions:
    • $!{(A \& B)} \equiv {!A} \otimes {!B}$;
    • $?{(A \oplus B)} \equiv {?A} \parr {?B}$;
    • $!{\top} \equiv \mathbf{1}$;
    • $?{\mathbf{0}} \equiv \bot$.

It is also a theorem that negation (except for the negations of propositional variables) can be defined (up to equivalence) recursively as follows:

  • $(A^\perp)^\perp \equiv A$;
  • $(A \& B)^\perp \equiv A^\perp \oplus B^\perp$ and vice versa;
  • $(A \otimes B)^\perp \equiv A^\perp \parr B^\perp$ and vice versa;
  • $\top^\perp \equiv \mathbf{0}$ and vice versa;
  • $\mathbf{1}^\perp \equiv \bot$ and vice versa;
  • $(!{A})^\perp \equiv ?{A^\perp}$ and vice versa.

The logical rules for negation can then be proved.

In this way, linear logic in the original sense (interpreted in star-autonomous categories) has a perfect de Morgan duality. But observe that more general variants (interpreted in more general symmetric monoidal categories) need not; see for instance (Hyland–de Paiva 93).

We can also restrict attention to sequents with one term on either side as follows: $\Gamma \vdash \Delta$ is valid if and only if $\bigotimes \Gamma \vdash \parr \Delta$ is valid, where $\bigotimes (A, B, C) \coloneqq A \otimes B \otimes C$, etc., and similarly for $\parr$ (using implicitly that these are associative, with identity elements to handle the empty sequence).

We can even restrict attention to sequents with no term on the left side and one term on the right; $A \vdash B$ is valid if and only if $\vdash A \multimap B$ is valid, where $A \multimap B \coloneqq A^\perp \parr B$. In this way, it's possible to ignore sequents entirely and speak only of propositions and valid propositions, eliminating half of the logical rules in the process. However, this approach is not as beautifully symmetric as the full sequent calculus.
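For example, the two-sided sequent $A, B \vdash C, D$ is valid iff $A \otimes B \vdash C \parr D$ is, iff $\vdash (A \otimes B) \multimap (C \parr D)$ is; unfolding the definition of $\multimap$, the last proposition is $(A^\perp \parr B^\perp) \parr (C \parr D)$, i.e. (up to associativity) $A^\perp \parr B^\perp \parr C \parr D$.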

Variants

The logic described above is full classical linear logic. There are many important fragments and variants of linear logic, such as:

  • multiplicative linear logic (MLL), which contains only $\otimes, \parr$ and their units $\mathbf{1}, \bot$, as well as the negation $(-)^\perp$.

  • multiplicative-exponential linear logic (MELL), which contains only $\otimes, \parr, \mathbf{1}, \bot, (-)^\perp$ and the exponential modalities $!, ?$.

  • multiplicative-additive linear logic (MALL), which contains everything except the exponential modalities $!, ?$.

  • multiplicative intuitionistic linear logic (MILL), which contains only $\otimes, \mathbf{1}, \multimap$ (the latter now a primitive operation); in particular there is no longer the involutive negation $(-)^\perp$. The sequents are also restricted to have only one formula on the right.

  • intuitionistic linear logic (ILL), which contains all the additive connectives $\&, \oplus, \mathbf{0}, \top$ as well as the intuitionistic multiplicatives $\otimes, \mathbf{1}, \multimap$ and the exponential $!$, with one formula on the right as above. In this case all the connectives are independent of each other.

  • full intuitionistic linear logic (FILL), which in addition to ILL contains the multiplicative disjunction $\parr$, and perhaps the exponential $?$. (Sometimes ILL without $\parr, ?$ is also called “full” intuitionistic linear logic.)

  • non-commutative linear logics (braided or not)

  • “light” and “soft” linear logics, which limit the use of $!$ to constrain the computational complexity of proofs

  • first-order linear logic, which adds quantifiers $\exists$ and $\forall$ (sometimes denoted $\bigvee$ and $\bigwedge$), either over a fixed domain or over varying types. These quantifiers are usually considered “additive”; for a theory that has a certain kind of “multiplicative quantifier”, see bunched implication.

One can also consider adding additional rules to linear logic. For instance, by adding the weakening rule (but not the contraction rule) one obtains affine logic, whereas by adding contraction but not weakening one obtains relevance logic. Another rule that is sometimes considered is the mix rule.

Linear-non-linear logic is an equivalent presentation of intuitionistic linear logic that decomposes the $!$ modality into an adjunction between a cartesian logic and a linear one in which cartesian variables can also appear.

Some models of linear logic allow for a codereliction operation, which allows one to take the “linear approximation” of a proof $!{A} \to B$. These models lead to the development of differential linear logic, the categorical semantics of which was laid out in (Blute–Cockett–Seely).

Categorical semantics

We discuss the categorical semantics of linear logic. See also at relation between type theory and category theory.

*-autonomous categories

One way to explain linear logic to a category theorist is to say that its models are $*$-autonomous categories with extra structure (Seely 1989, prop. 1.5). (If the underlying category is a suplattice, then these are commutative quantales (Yetter 90).)

Firstly, there is a monoidal ‘tensor’ connective $A \otimes B$. Negation $A^\bot$ is modelled by the dual object involution $(-)^*$, while linear implication $A \multimap B$ corresponds to the internal hom, which can be defined as $(A \otimes B^\bot)^\bot$. There is a de Morgan dual of the tensor called ‘par’, with $A \parr B = (A^\bot \otimes B^\bot)^\bot$. Tensor and par are the ‘multiplicative’ connectives, which roughly speaking represent the parallel availability of resources.

The ‘additive’ connectives $\&$ and $\oplus$, which correspond in another way to traditional conjunction and disjunction, are modelled as usual by products and coproducts. Seely (1989) notes that products are sufficient, as $*$-autonomy then guarantees the existence of coproducts; that is, they are also linked by de Morgan duality.

Recall also that linear logic recaptures the notion of a resource that can be discarded or copied arbitrarily by use of the exponential modality $!$: the proposition $!{A}$ denotes an ‘$A$-factory’, a resource that can produce zero or more $A$s on demand. It is modelled using a suitably monoidal comonad $!$ on the underlying $*$-autonomous category. There are various inequivalent ways to make this precise, however; see !-modality for discussion.

An LL sequent

$$A_1, \ldots, A_n \;\vdash\; B_1, \ldots, B_m$$

is interpreted as a morphism

$$\otimes_i A_i \;\to\; \parr_j B_j$$

The comonoid structure on $!A$ then yields the weakening

$$\Gamma \otimes {!A} \;\to\; \Gamma \otimes I \;\to\; \Gamma$$

and contraction

$$\Gamma \otimes {!A} \;\to\; \Gamma \otimes {!A} \otimes {!A}$$

maps. The corresponding rules are interpreted by precomposing the interpretation of a sequent with one of these maps.

The (co-)Kleisli category of $!$ is cartesian closed, and the product there coincides with the product in the base category. The exponential (unsurprisingly for a Kleisli category) is $B^A \cong {!A} \multimap B$.
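Here is a small sketch of this co-Kleisli correspondence in code, assuming GHC's LinearTypes and assuming an `Ur` (“unrestricted”) wrapper in the style of the linear-base library as a stand-in for $!$:

```haskell
{-# LANGUAGE LinearTypes #-}
{-# LANGUAGE GADTs       #-}

-- 'Ur a' wraps an ordinary (nonlinear) payload, so a linear consumer
-- of 'Ur a' may use the payload any number of times: this plays !A.
data Ur a where
  Ur :: a -> Ur a

derelict :: Ur a %1 -> a              -- dereliction: !A ⊸ A
derelict (Ur x) = x

duplicate :: Ur a %1 -> (Ur a, Ur a)  -- contraction: !A ⊸ !A ⊗ !A
duplicate (Ur x) = (Ur x, Ur x)

erase :: Ur a %1 -> ()                -- weakening: !A ⊸ 1
erase (Ur _) = ()

-- B^A ≅ !A ⊸ B: ordinary functions correspond to linear
-- functions out of the exponential.
toCoKleisli :: (a -> b) -> (Ur a %1 -> b)
toCoKleisli f (Ur x) = f x

fromCoKleisli :: (Ur a %1 -> b) -> (a -> b)
fromCoKleisli g x = g (Ur x)
```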

Particular monoidal and $*$-autonomous posets for modeling linear logic can be obtained by Day convolution from ternary frames. This includes Girard's phase spaces as a particular example.

First-order linear logic is correspondingly modeled in a linear hyperdoctrine.

Polycategories

A different way to explain linear logic categorically (though equivalent, in the end) is to start with a categorical structure which lacks any of the connectives, but has sufficient structure to enable us to characterize them with universal properties. If we ignore the exponentials for now, such a structure is given by a polycategory. The polymorphisms

$$(A, B, C) \to (D, E, F)$$

in a polycategory correspond to sequents $A, B, C \vdash D, E, F$ in linear logic. The multiplicative connectives are then characterized by representability and corepresentability properties:

$$\frac{(A, B) \to C}{A \otimes B \to C}$$

and

$$\frac{A \to (B, C)}{A \to B \parr C}$$

(Actually, we should allow arbitrarily many unrelated objects to carry through in both cases.) The additives are similarly characterized as categorical products and coproducts, in a polycategorically suitable sense.

Finally, dual objects can be recovered as a sort of “adjoint”:

$$\frac{(A, B) \to C}{A \to (B^*, C)}$$

If all these representing objects exist, then we recover a $*$-autonomous category.

One merit of the polycategory approach is that it makes the role of the structural rules clearer, and also helps explain why $\parr$ sometimes seems like a disjunction and sometimes like a conjunction. Allowing contraction and weakening on the left corresponds to our polycategory being “left cartesian”; that is, we have “diagonal” and “projection” operations such as $Hom(A, A; B) \to Hom(A; B)$ and $Hom(; B) \to Hom(A; B)$. In the presence of these operations, a representing object is automatically a cartesian product; thus $\otimes$ coincides with $\&$. Similarly, allowing contraction and weakening on the right makes the polycategory “right cocartesian”, which causes corepresenting objects to be coproducts and thus $\parr$ to coincide with $\oplus$.

On the other hand, if we allow “multi-composition” in our polycategory, i.e. we can compose a morphism $A \to (B, C)$ with one $(B, C) \to D$ to obtain a morphism $A \to D$, then our polycategory becomes a PROP, and representing and corepresenting objects must coincide; thus $\otimes$ and $\parr$ become the same. This explains why $\parr$ has both a disjunctive and a conjunctive aspect. Of course, if in addition to multi-composition we have the left and right cartesian properties, then all four connectives coincide (including the categorical product and coproduct) and we have an additive category.

Game semantics

We can interpret any proposition in linear logic as a game between two players: we and they. The overall rules are perfectly symmetric between us and them, although no individual game is. At any given moment in a game, exactly one of these four situations obtains: it is our turn, it is their turn, we have won, or they have won; the last two states continue forever afterwards (and the game is over). If it is our turn (or if they have won), then they are winning; if it is their turn (or if we have won), then we are winning. So there are two ways to win: because the game is over (and a winner has been decided), or because it is forever the other players’ turn (either because the other players have no move or because every move results in the other players’ turn again).

This is a little complicated, but it's important in order to be able to distinguish the four constants:

  • In $\top$, it is their turn, but they have no moves; the game never ends, but we win.
  • Dually, in $\mathbf{0}$, it is our turn, but we have no moves; the game never ends, but they win.
  • In contrast, in $\mathbf{1}$, the game ends immediately, and we have won.
  • Dually, in $\bot$, the game ends immediately, and they have won.

The binary operators show how to combine two games into a larger game:

  • In $A \& B$, it is their turn, and they must choose to play either $A$ or $B$. Once they make their choice, play continues in the chosen game, with ending and winning conditions as in that game.
  • Dually, in $A \oplus B$, it is our turn, and we must choose to play either $A$ or $B$. Once we make our choice, play continues in the chosen game, with ending and winning conditions as in that game.
  • In $A \otimes B$, play continues with both games in parallel. If it is our turn in either game, then it is our turn overall; if it is their turn in both games, then it is their turn overall. If either game ends, then play continues in the other game; if both games end, then the overall game ends. If we have won both games, then we have won overall; if they have won either game, then they have won overall.
  • Dually, in $A \parr B$, play continues with both games in parallel. If it is their turn in either game, then it is their turn overall; if it is our turn in both games, then it is our turn overall. If either game ends, then play continues in the other game; if both games end, then the overall game ends. If they have won both games, then they have won overall; if we have won either game, then we have won overall.

So we can classify things as follows:

  • In a conjunction, they choose what game to play; in a disjunction, we have control. Whoever has control must win at least one game to win overall.
  • In an addition, one game must be played; in a multiplication, all games must be played.

To further clarify the difference between $\top$ and $\mathbf{1}$ (the additive and multiplicative versions of truth, both of which we win), consider $A \parr \top$ and $A \parr \mathbf{1}$. In $A \parr \top$, it is always their move (since it is their move in $\top$, hence their move in at least one game), so we win just as we win $\top$. (In fact, $A \parr \top \equiv \top$.) However, in $A \parr \mathbf{1}$, the game $\mathbf{1}$ ends immediately, so play continues as in $A$. We have won $\mathbf{1}$, so we only have to end the game to win overall, but there is no guarantee that this will happen. Indeed, in $\mathbf{0} \parr \mathbf{1}$, the game never ends and it is always our turn, so they win. (In $\bot \parr \mathbf{1}$, both games end immediately, and we win. In $A \otimes \mathbf{1}$, we must win both games to win overall, so this reduces to $A$; indeed, $A \otimes \mathbf{1} \equiv A$.)

Negation is easy:

  • To play $A^\perp$, simply swap roles and play $A$.

There are several ways to think of the exponentials. As before, they have control in a conjunction, while we have control in a disjunction. Whoever has control of $!{A}$ or $?{A}$ chooses how many copies of $A$ to play and must win them all to win overall. There are many variations on whether the player in control can spawn new copies of $A$ or close old copies of $A$ prematurely, and whether the other player can play different moves in different copies (whenever the player in control plays the same moves).

Other than the decisions made by the player in control of a game, all moves are made by transmitting resources. Ultimately, these come down to the propositional variables; in the game $p$, we must transmit a $p$ to them, while they must transmit a $p$ to us in $p^\perp$.

A game is valid if we have a strategy to win (whether by putting the game in a state where we have won or by guaranteeing that it is forever their turn). The soundness and completeness of this interpretation is the theorem that $A$ is a valid game if and only if $\vdash A$ is a valid sequent. (Recall that all questions of validity of sequents can be reduced to the validity of single propositions.)

Game semantics for linear logic was first proposed by Andreas Blass, in Blass (1992). The semantics here is essentially the same as that proposed by Blass.

Multiple exponential operators

Much as there are many exponential functions (say from $\mathbb{R}$ to $\mathbb{R}$), even though there is only one addition operation and one multiplication operation, so there can be many versions of the exponential operators $!$ and $?$. (However, there doesn't seem to be any analogue of the logarithm to convert between them.)

More precisely, if we add to the language of linear logic two more operators, $!'$ and $?'$, and postulate of them the same rules as for $!$ and $?$, we cannot prove that $!{A} \equiv !'{A}$ and $?{A} \equiv ?'{A}$. In contrast, if we introduce $\&'$, $\bot'$, etc., we can prove that the new operators are equivalent to the old ones.

In terms of the categorial interpretation above, there may be many comonads $!$; the comonad is not determined by the underlying $*$-autonomous category. In terms of game/resource semantics, there are several slightly different interpretations of the exponentials.

One sometimes thinks of the exponentials as coming from infinitary applications of the other operations. For example:

  • $!{A} \coloneqq \mathbf{1} \& A \& (A \otimes A) \& (A \otimes A \otimes A) \& \cdots$,
  • $!{A} \coloneqq (\mathbf{1} \& A) \otimes (\mathbf{1} \& A) \otimes (\mathbf{1} \& A) \otimes \cdots$,
  • $!{A} \coloneqq \mathbf{1} \& k A \& (k^2/2) (A \otimes A) \& (k^3/6) (A \otimes A \otimes A) \& \cdots$ (which is $\mathrm{e}^{k A}$ in an appropriate sense), where $n A$ means an $n$-fold additive conjunction $A \& \cdots \& A$ for $n$ a natural number, and we pretend that $k$ is a positive number such that $k^n/n!$ is always a natural number (which of course is impossible).

All of these justify the rules for the exponentials, so again we see that there may be many ways to satisfy these rules.

Interpretation in constructive mathematics

Mike Shulman (Shulman 2018) has proposed an interpretation of linear logic for use in constructive mathematics, called the antithesis interpretation.

The motivation here is that in much of constructive mathematics, especially (but not only) in constructive analysis, statements of interest seem to come in pairs: a statement $P^+$ that amounts to a constructive version of a well-known statement $P^{\mathrm{C}}$ from classical mathematics, and a statement $P^-$ that amounts to a constructive version of $\neg P^{\mathrm{C}}$, and which is equivalent to $\neg P^+$ in classical logic, but which is not the same as $\neg P^+$ in intuitionistic logic (being sometimes stronger, sometimes weaker, sometimes neither, and similarly for $P^+$ vis-à-vis $\neg P^-$). For example, if $P^{\mathrm{C}}$ is the claim that some real number $x$ is rational, then $P^+$ is

$$\exists\, q \in \mathbb{Q},\; x = q$$

(the conventional meaning of ‘$x$ is rational’ in constructive analysis), while $P^-$ is

$$\forall\, q \in \mathbb{Q},\; x \# q$$

(the conventional meaning of ‘$x$ is irrational’ in constructive analysis); here, $\#$ is the standard apartness relation between real numbers, whose negation is the equality relation $=$. As you can see, the two statements are de Morgan dual to one another, and so are negations of each other in classical logic, but neither is the negation of the other in intuitionistic logic.

That such pairs of statements commonly arise is a truism in constructive mathematics. The point of the antithesis interpretation is that the pair $(P^+, P^-)$ may be derived systematically from a single statement $P^{\mathrm{L}}$ in linear logic. And this is a sound interpretation, in that any reasoning valid in linear logic (or even affine logic) will be constructively valid for both statements; that is, if $P^{\mathrm{L}} \vdash Q^{\mathrm{L}}$ in linear (or even affine) logic, then $P^+ \vdash Q^+$ and $Q^- \vdash P^-$ in intuitionistic logic. Therefore, as long as the reasoning is linear (or at least affine), one can prove intuitionistic theorems about both at once.

The antithesis interpretation also explains why there are often both weak and strong versions of $P^-$ (or even of $P^+$ sometimes), having especially to do with the interpretation of disjunction. This comes up already in defining what a set is, because of the nature of the equality relation on a set $S$. As in the example above about rational and irrational real numbers, every set should also be equipped with an antithesis inequality relation $\#$, and the axioms for this may be derived from the axioms for an equality relation. The transitivity axiom for equality has two obvious interpretations in linear logic, which lead to two different interpretations in intuitionistic logic as axioms for $\#$: a weak version (interpreting $\parr$) stating (for elements $x, y, z$ of $S$) that if $x \# z$, then $y \# z$ if $x = y$, and $x \# y$ if $y = z$; and a strong version (interpreting $\oplus$) stating that if $x \# z$, then $x \# y$ or $y \# z$. (The weak version really does follow from the strong version, because we also automatically have that $x = y$ and $x \# y$ can never both be true.) Ultimately, the weak version says only that $\#$ is an inequality relation on the set $S$, while the strong version says that $\#$ is an apartness relation.
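Schematically, the two linear readings of transitivity dualize to

$$x \# z \;\multimap\; (x \# y \parr y \# z) \qquad\text{versus}\qquad x \# z \;\multimap\; (x \# y \oplus y \# z) ,$$

whose intuitionistic interpretations are the weak and strong axioms just stated.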

Since every proposition comes with an antithesis proposition in the antithesis interpretation, the natural notion of a subset is actually a pair of disjoint subsets, leading to a potentially far-reaching reinterpretation of the role of higher-order logic in constructive mathematics, with concepts like a family of collections of subsets becoming a disjoint pair of families of disjoint pairs of collections of disjoint pairs of subsets.

| symbol | in logic |
| --- | --- |
| $\in$ | element relation |
| $:$ | typing relation |
| $=$ | equality |
| $\vdash$ | entailment / sequent |
| $\top$ | true / top |
| $\bot$ | false / bottom |
| $\Rightarrow$ | implication |
| $\Leftrightarrow$ | logical equivalence |
| $\neg$ | negation |
| $\neq$ | negation of equality / apartness |
| $\notin$ | negation of element relation |
| $\neg\neg$ | negation of negation |
| $\exists$ | existential quantification |
| $\forall$ | universal quantification |
| $\wedge$ | logical conjunction |
| $\vee$ | logical disjunction |

| symbol | in type theory (propositions as types) |
| --- | --- |
| $\to$ | function type (implication) |
| $\times$ | product type (conjunction) |
| $+$ | sum type (disjunction) |
| $0$ | empty type (false) |
| $1$ | unit type (true) |
| $=$ | identity type (equality) |
| $\simeq$ | equivalence of types (logical equivalence) |
| $\sum$ | dependent sum type (existential quantifier) |
| $\prod$ | dependent product type (universal quantifier) |

| symbol | in linear logic |
| --- | --- |
| $\multimap$ | linear implication |
| $\otimes$ | multiplicative conjunction |
| $\oplus$ | additive disjunction |
| $\&$ | additive conjunction |
| $\parr$ | multiplicative disjunction |
| $!$ | exponential conjunction |

References

The original article:

  • Jean-Yves Girard, Linear logic, Theoretical Computer Science 50:1 (1987) 1–101.

Review:

and making explicit the categorical semantics of linear logic in the category FinDimVect of finite-dimensional vector spaces:

following a remark in section 2.4.2 of Hyland & Schalk (2003).

The categorical semantics of linear logic in general star-autonomous categories originally appeared in

  • R. A. G. Seely, Linear logic, $*$-autonomous categories and cofree coalgebras, in Categories in Computer Science and Logic, Contemporary Mathematics 92, AMS (1989).

and for the special case of quantales in

  • David Yetter, Quantales and (noncommutative) linear logic, Journal of Symbolic Logic 55 (1990), 41–64.

and further in view of the Chu construction:

The more general case of multiplicative intuitionistic linear logic, interpreted in general symmetric monoidal categories, was systematically developed in

  • M.E. Szabo, Algebra of Proofs, Studies in Logic and the Foundations of Mathematics, vol. 88 (1978), North-Holland.

(that was before Girard introduced the “linear” terminology).

More recent articles exploring this include

  • Valeria de Paiva, The Dialectica Categories, Contemporary Mathematics 92, 1989. (web)

  • Richard Blute, Linear logic, coherence, and dinaturality, dissertation, University of Pennsylvania (1991); published in Theoretical Computer Science 115:1 (1993) 3–41.

  • Nick Benton, Gavin Bierman, Valeria de Paiva, Martin Hyland, Term assignments for intuitionistic linear logic. Technical report 262, Cambridge 1992 (citeseer, ps)

  • Martin Hyland, Valeria de Paiva, Full Intuitionistic Linear Logic (extended abstract). Annals of Pure and Applied Logic, 64(3), pp. 273–291, 1993. (pdf)

  • G. Bierman, On Intuitionistic Linear Logic PhD thesis, Computing Laboratory, University of Cambridge, 1995 (pdf)

  • Andrew Graham Barber, Linear Type Theories, Semantics and Action Calculi, PhD thesis, Edinburgh 1997 (web, pdf)

This is reviewed/further discussed in

The relation of dual intuitionistic linear logic and π\pi-calculus is given in

  • Luis Caires, Frank Pfenning. Session Types as Intuitionistic Linear Propositions.

Noncommutative linear logic is discussed for instance in

  • Richard Blute, F. Lamarche, Paul Ruet, Entropic Hopf algebras and models of non-commutative logic, Theory and Applications of Categories, Vol. 10, No. 17, 2002, pp. 424–460. (pdf)

Further discussion of linear type theory is for instance in

  • Chapter 7, Linear type theory pdf

  • Anders Schack-Nielsen, Carsten Schürmann, Linear contextual modal type theory pdf

See also

  • Andreas Blass, A game semantics for linear logic, Annals of Pure and Applied Logic 56 (1992) 183–220.

  • The article on the English Wikipedia has good summaries of the meanings of the logical operators and of the commonly studied fragments.

Discussion of application of linear logic to quantum logic, quantum computing and generally to quantum physics:

See also at quantum programming languages

Application of linear logic to matrix factorization in Landau–Ginzburg models:

Discussion of linear type theory without units is in

Discussion of inductive types in the context of linear type theory is in

  • Stéphane Gimenez, Towards Generic Inductive Constructions in Systems of Nets (pdf)

The categorical semantics of differential linear logic is introduced in:

  • Richard Blute, Robin Cockett, Robert Seely, Differential categories, Mathematical Structures in Computer Science 16:6 (2006) 1049–1083.

The antithesis interpretation is

  • Michael Shulman, Linear logic for constructive mathematics (arXiv:1805.07518)

(This is a prepublication version that does not yet include the name ‘antithesis interpretation’, which was suggested by a referee. We must link to the published version when available.)
