nLab
linear logic

Idea

Linear logic is sometimes thought of as a logic for reasoning about resource-sensitive issues, but it can also be thought of categorically, or interpreted using game semantics, or as being related to Petri nets, or as a particular form of quantum logic. A bit more formally:

Linear logic is a substructural logic in which the contraction rule and the weakening rule are omitted, or at least have their applicability restricted.

In the original definition of Girard 87, linear logic is the internal logic of / has categorical semantics in star-autonomous categories (Seely 89, prop. 1.5). More generally, however, linear logic came to refer to the internal logic of any possibly-non-cartesian closed symmetric monoidal category (then usually called multiplicative intuitionistic linear logic, MILL) or even polycategory (Szabo 78; see the history section, and see also de Paiva 89, Blute 91, Benton-Bierman-de Paiva-Hyland 92, Hyland-de Paiva 93, Bierman 95, Barber 97, Schalk 04, Melliès 09). Under this interpretation, proof nets (or the associated Kelly-Mac Lane graphs) of linear logic are similar to string diagrams for monoidal categories.

Indeed, these more general senses of linear logic still faithfully follow the original motivation for the term “linear” as connoting “resource availability” explained below (and in Girard 87): the non-cartesianness of the tensor product means there is no diagonal map, and hence no way for a function to depend on more than a single (linear) copy of any of its variables.

In addition to these usual “denotational” categorical semantics, linear logic also has an “operational” categorical semantics, called the Geometry of Interaction, in traced monoidal categories.

Although linear logic is traditionally presented in terms of inference rules, it was apparently discovered by Girard while studying coherent spaces.

Quantum logic

Linear logic and linear type theory can be argued to be the proper incarnation of quantum logic (see there for references). In this context the linearity of the logic, hence the absence of diagonal maps in its categorical semantics (in non-cartesian symmetric monoidal categories) reflects the no-cloning theorem of quantum physics.

Resource availability

Unlike traditional logics, which deal with the truth of propositions, linear logic is often described as dealing with the availability of resources. A proposition, if it is true, remains true no matter how we use that fact in proving other propositions. By contrast, in using a resource A to make available a resource B, A itself may be consumed or otherwise modified. Linear logic deals with this by restricting our ability to duplicate or discard resources freely. For example, we have

\text{have cake} \vdash \text{eat cake}

from which we can prove

\text{have cake},\; \text{have cake} \vdash \text{have cake} \wedge \text{eat cake}

which by left contraction (duplication of inputs) in classical logic yields

\text{have cake} \vdash \text{have cake} \wedge \text{eat cake}

Linear logic would disallow the contraction step and treat \text{have cake},\; \text{have cake} \vdash A as explicitly meaning that two slices of cake yield A. Disallowing contraction then corresponds to the fact that we can’t turn one slice of cake into two (more’s the pity), so you can't have your cake and eat it too.
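As an illustrative sketch (not part of the formal calculus), a linear context can be modeled as a multiset of resources: using a hypothesis consumes one occurrence, and without contraction occurrences cannot be duplicated. The encoding and names here are our own.

```python
from collections import Counter

def consume(ctx, resource):
    """Use one occurrence of `resource`, removing it from the context."""
    if ctx[resource] == 0:
        raise ValueError(f"no {resource!r} available")
    out = ctx.copy()
    out[resource] -= 1
    return out

# One slice of cake: once eaten, it is gone.
ctx = consume(Counter({"cake": 1}), "cake")
print(ctx["cake"])  # 0
```

A second `consume(ctx, "cake")` would fail: having eaten the cake, we no longer have it.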

Game semantics

Linear logic can also be interpreted using a semantics of “games” or “interactions”. Under this interpretation, each proposition in a sequent represents a game being played or a transaction protocol being executed. An assertion of, for instance,

P, Q \vdash R

means roughly that if I am playing three simultaneous games of P, Q, and R, in which I am the left player in P and Q and the right player in R, then I have a strategy which will enable me to win at least one of them. Now the above statements about “resources” translate into saying that I have to play in all the games I am given and can’t invent new ones on the fly.

As a relevant logic

Linear logic is closely related to notions of relevant logic, which have been studied for much longer. The goal of relevant logic is to disallow statements like “if pigs can fly, then grass is green”, which are true under the usual logical interpretation of implication but in which the hypothesis has nothing to do with the conclusion. Clearly there is a relationship with the “resource semantics”: if we want to require that all hypotheses are “used” in a proof, then we need to disallow weakening.

Definition

Linear logic is usually given in terms of sequent calculus. There is a set of propositions (although, as remarked above, these are to be thought of more as resources to be acquired than as statements to be proved) which we construct through recursion. Each pair of lists of propositions is a sequent (written as usual with ‘\vdash’ between the lists), some of which are valid; we determine which are valid also through recursion. Technically, the propositional calculus of linear logic also requires a set of propositional variables from which to start; this is usually identified with the set of natural numbers (so the variables are p_0, p_1, etc), although one can also consider the linear logic LL(V) where V is any initial set of propositional variables.

Here we define the set of propositions:

  • Every propositional variable is a proposition.
  • For each proposition A, there is a proposition A^\perp, the negation of A.
  • For each proposition A and proposition B, there are four additional propositions:
    • A \& B (read ‘with’), the additive conjunction of A and B;
    • A \oplus B (read ‘plus’), the additive disjunction of A and B;
    • A \otimes B (read ‘times’), the multiplicative conjunction of A and B;
    • A \parr B (read ‘par’ and sometimes written A \mid B), the multiplicative disjunction of A and B.
  • There are also four constants to go with the four binary operations above:
    • \top (read ‘top’), the additive truth;
    • \mathbf{0} (read ‘zero’), the additive falsity;
    • \mathbf{1} (read ‘one’), the multiplicative truth;
    • \bot (read ‘bottom’), the multiplicative falsity.
  • For each proposition A, there are two additional propositions:
    • !{A} (read ‘of course’), the exponential conjunction of A;
    • ?{A} (read ‘why not’), the exponential disjunction of A.

The terms “exponential”, “multiplicative”, and “additive” come from the fact that “exponentiation converts addition to multiplication”: we have !{(A \& B)} \equiv !{A} \otimes !{B} and so on (see below).
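The proposition grammar above can be transcribed as a small abstract syntax tree; this is only an illustrative sketch, and the Python class names and operator strings are our own choices, not standard.

```python
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Var:
    name: str                      # propositional variable p

@dataclass(frozen=True)
class Neg:
    a: "Prop"                      # A^\perp

@dataclass(frozen=True)
class Bin:
    op: str                        # one of '&', '+', '*', 'par'
    a: "Prop"
    b: "Prop"

@dataclass(frozen=True)
class Const:
    name: str                      # one of 'top', '0', '1', 'bot'

@dataclass(frozen=True)
class Exp:
    op: str                        # '!' (of course) or '?' (why not)
    a: "Prop"

Prop = Union[Var, Neg, Bin, Const, Exp]

# !(A & B), the left-hand side of !(A & B) ≡ !A ⊗ !B:
p = Exp("!", Bin("&", Var("A"), Var("B")))
```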

However, the connectives and constants can also be grouped in different ways. For instance, the multiplicative conjunction \otimes and additive disjunction \oplus are both positive types, while the additive conjunction \& and multiplicative disjunction \parr are negative types. Similarly, the multiplicative truth \mathbf{1} and the additive falsity \mathbf{0} are positive, while the additive truth \top and multiplicative falsity \bot are negative. This grouping has the advantage that the similarity of symbols matches the adjective used.

In relevant logic, the terms “conjunction” and “disjunction” are often reserved for the additive versions \& and \oplus, which are written with the traditional notations \wedge and \vee. In this case, the multiplicative conjunction A \otimes B is called fusion and denoted A \circ B, while the multiplicative disjunction A \parr B is called fission and denoted A + B (or sometimes, confusingly, A \oplus B). In relevant logic the symbol \bot may also be used for the additive falsity, here denoted \mathbf{0}. Also, sometimes the additive connectives are called extensional and the multiplicatives intensional.

Sometimes one does not define the operation of negation, defining only p^\perp for a propositional variable p. It is a theorem that every proposition above is equivalent (in the sense defined below) to a proposition in which negation is applied only to propositional variables.

We now define the valid sequents, where we write A, B, C \vdash D, E, F to state the validity of the sequent consisting of the list (A, B, C) and the list (D, E, F), and use Greek letters for sublists:

  • The structural rules:
    • The exchange rule: If a sequent is valid, then any permutation of it (created by permuting its left and right sides independently) is valid.
    • The restricted weakening rule: If \Gamma, \Delta \vdash \Theta, then \Gamma, !{A}, \Delta \vdash \Theta, for any A; conversely and dually, if \Gamma \vdash \Delta, \Theta, then \Gamma \vdash \Delta, ?{A}, \Theta for any A.
    • The restricted contraction rule: If \Gamma, !{A}, !{A}, \Delta \vdash \Theta, then \Gamma, !{A}, \Delta \vdash \Theta; conversely and dually, if \Gamma \vdash \Delta, ?{A}, ?{A}, \Theta, then \Gamma \vdash \Delta, ?{A}, \Theta.
    • The identity rule: Always, A \vdash A.
    • The cut rule: If \Gamma \vdash A and A \vdash \Delta, then \Gamma \vdash \Delta.
  • The logical rules for each operation:
    • If \Gamma \vdash A, \Delta, then \Gamma, A^\perp \vdash \Delta; conversely and dually, if \Gamma, A \vdash \Delta, then \Gamma \vdash A^\perp, \Delta.
    • If \Gamma, A, \Delta \vdash \Theta or \Gamma, B, \Delta \vdash \Theta, then \Gamma, A \& B, \Delta \vdash \Theta; conversely, if \Gamma \vdash \Delta, A, \Theta and \Gamma \vdash \Delta, B, \Theta, then \Gamma \vdash \Delta, A \& B, \Theta.
    • Dually, if \Gamma \vdash \Delta, A, \Theta or \Gamma \vdash \Delta, B, \Theta, then \Gamma \vdash \Delta, A \oplus B, \Theta; conversely, if \Gamma, A, \Delta \vdash \Theta and \Gamma, B, \Delta \vdash \Theta, then \Gamma, A \oplus B, \Delta \vdash \Theta.
    • If \Gamma, A, B, \Delta \vdash \Theta, then \Gamma, A \otimes B, \Delta \vdash \Theta; conversely, if \Gamma \vdash \Delta, A and \Lambda \vdash B, \Theta, then \Gamma, \Lambda \vdash \Delta, A \otimes B, \Theta.
    • Dually, if \Gamma \vdash \Delta, A, B, \Theta, then \Gamma \vdash \Delta, A \parr B, \Theta; conversely, if \Gamma, A \vdash \Delta and B, \Theta \vdash \Lambda, then \Gamma, A \parr B, \Theta \vdash \Delta, \Lambda.
    • Always \Gamma \vdash \Delta, \top, \Theta; dually (there is no converse), \Gamma, \mathbf{0}, \Delta \vdash \Theta.
    • If \Gamma, \Delta \vdash \Theta, then \Gamma, \mathbf{1}, \Delta \vdash \Theta; conversely, \vdash \mathbf{1}.
    • Dually, if \Gamma \vdash \Delta, \Theta, then \Gamma \vdash \Delta, \bot, \Theta; conversely, \bot \vdash.
    • If \Gamma, A, \Delta \vdash \Theta, then \Gamma, !{A}, \Delta \vdash \Theta; conversely, if \Gamma \vdash \Delta, A, \Theta, then \Gamma \vdash \Delta, !{A}, \Theta, whenever \Gamma consists entirely of propositions of the form !{-} while \Delta and \Theta consist entirely of propositions of the form ?{-}.
    • Dually, if \Gamma \vdash \Delta, A, \Theta, then \Gamma \vdash \Delta, ?{A}, \Theta; conversely, if \Gamma, A, \Delta \vdash \Theta, then \Gamma, ?{A}, \Delta \vdash \Theta, whenever \Gamma and \Delta consist entirely of propositions of the form !{-} while \Theta consists entirely of propositions of the form ?{-}.

The main point of linear logic is the restricted use of the weakening and contraction rules; if these were universally valid (applying to any A rather than only to !{A} or ?{A}), then the additive and multiplicative operations would be equivalent (in the sense defined below) and similarly !{A} and ?{A} would be equivalent to A, which would give us classical logic. On the other hand, one can also remove the exchange rule to get a variety of noncommutative logic; one must then be careful about how to write the other rules (as we have been above).
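To make the restriction concrete, here is a toy sketch (our own encoding, not standard) in which contexts are multisets and the structural rules check for the !-marker before duplicating or discarding:

```python
from collections import Counter

def weaken(ctx, a):
    """Restricted weakening: add an unused hypothesis !a to the context."""
    out = ctx.copy()
    out[("!", a)] += 1
    return out

def contract(ctx, a):
    """Restricted contraction: merge two copies of !a into one."""
    if ctx[("!", a)] < 2:
        raise ValueError("contraction needs two copies of !A")
    out = ctx.copy()
    out[("!", a)] -= 1
    return out

ctx = weaken(weaken(Counter(), "A"), "A")  # context !A, !A
ctx = contract(ctx, "A")                   # context !A
print(ctx[("!", "A")])  # 1
```

An unbanged hypothesis is simply never eligible: `contract` and `weaken` only ever touch keys of the form `("!", a)`.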

As usual, there is a theorem of cut elimination showing that the cut rule and identity rule follow from all other rules and the special cases of the identity rule of the form p \vdash p for a propositional variable p.

The propositions A and B are (propositionally) equivalent if A \vdash B and B \vdash A are both valid, which we express by writing A \equiv B. It is then a theorem that either may be swapped for the other anywhere in a sequent without affecting its validity. Up to equivalence, negation is an involution, and the operations \&, \oplus, \otimes, and \parr are all associative, with respective identity elements \top, \mathbf{0}, \mathbf{1}, and \bot. These operations are also commutative (although this fails for the multiplicative connectives if we drop the exchange rule). The additive connectives are also idempotent (but the multiplicative ones are not).

Remark

There is a more refined notion of equivalence, where we pay attention to specific derivations \pi: A \vdash B of sequents, and deem two derivations \pi, \pi' of A \vdash B Lambek-equivalent if they map to the same morphism under any categorical semantics S; see below. Given a pair of derivations \pi: A \vdash B and \rho: B \vdash A, it then makes sense to ask whether they are Lambek-inverse to one another (i.e., whether S(\rho) = S(\pi)^{-1} under any semantics), so that the derivations exhibit an isomorphism S(A) \cong S(B) under any semantics S. This equivalence relation A \equiv_{Lambek} B is strictly stronger than propositional equivalence. It should be observed that all equivalences A \equiv B listed below are in fact Lambek equivalences.

We also have distributive laws that explain the adjectives ‘additive’, ‘multiplicative’, and ‘exponential’:

  • Multiplication distributes over addition if one is a conjunction and one is a disjunction:
    • A \otimes (B \oplus C) \equiv (A \otimes B) \oplus (A \otimes C) (and on the other side);
    • A \parr (B \& C) \equiv (A \parr B) \& (A \parr C) (and on the other side);
    • A \otimes \mathbf{0} \equiv \mathbf{0} (and on the other side);
    • A \parr \top \equiv \top (and on the other side).
  • Exponentiation converts addition into multiplication if all are conjunctions or all are disjunctions:
    • !{(A \& B)} \equiv !{A} \otimes !{B};
    • ?{(A \oplus B)} \equiv ?{A} \parr ?{B};
    • !{\top} \equiv \mathbf{1};
    • ?{\mathbf{0}} \equiv \bot.

It is also a theorem that negation (except for the negations of propositional variables) can be defined (up to equivalence) recursively as follows:

  • (A^\perp)^\perp \equiv A;
  • (A \& B)^\perp \equiv A^\perp \oplus B^\perp and vice versa;
  • (A \otimes B)^\perp \equiv A^\perp \parr B^\perp and vice versa;
  • \top^\perp \equiv \mathbf{0} and vice versa;
  • \mathbf{1}^\perp \equiv \bot and vice versa;
  • (!{A})^\perp \equiv ?{A^\perp} and vice versa.

The logical rules for negation can then be proved.
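The recursion above is easy to implement. The following sketch (with our own tuple encoding of propositions, not a standard one) pushes negation down to the variables using exactly the de Morgan clauses just listed:

```python
# Propositions as nested tuples: ("var", p), ("not", A), ("&", A, B),
# ("+", A, B) for ⊕, ("*", A, B) for ⊗, ("par", A, B), ("!", A), ("?", A),
# and nullary ("top",), ("0",), ("1",), ("bot",).
DUAL = {"&": "+", "+": "&", "*": "par", "par": "*", "!": "?", "?": "!",
        "top": "0", "0": "top", "1": "bot", "bot": "1"}

def neg(p):
    """Negation of p, with negation applied only to propositional variables."""
    tag = p[0]
    if tag == "var":
        return ("not", p)                 # negation stops at variables
    if tag == "not":
        return p[1]                       # (A^perp)^perp ≡ A
    if tag in ("top", "0", "1", "bot"):
        return (DUAL[tag],)
    if tag in ("!", "?"):
        return (DUAL[tag], neg(p[1]))     # (!A)^perp ≡ ?(A^perp)
    return (DUAL[tag], neg(p[1]), neg(p[2]))

# (A & B)^perp ≡ A^perp ⊕ B^perp:
a, b = ("var", "a"), ("var", "b")
print(neg(("&", a, b)) == ("+", ("not", a), ("not", b)))  # True
```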

In this way, linear logic in the original sense (interpreted in star-autonomous categories) has a perfect de Morgan duality. But observe that more general variants (interpreted in more general symmetric monoidal categories) need not; see for instance (Hyland-de Paiva 93).

We can also restrict attention to sequents with one term on either side as follows: \Gamma \vdash \Delta is valid if and only if \bigotimes \Gamma \vdash \parr \Delta is valid, where \bigotimes(A, B, C) \coloneqq A \otimes B \otimes C, etc, and similarly for \parr (using implicitly that these are associative, with identity elements to handle the empty sequence).

We can even restrict attention to sequents with no term on the left side and one term on the right; A \vdash B is valid if and only if \vdash A \multimap B is valid, where A \multimap B \coloneqq A^\perp \parr B. In this way, it's possible to ignore sequents entirely and speak only of propositions and valid propositions, eliminating half of the logical rules in the process. However, this approach is not as beautifully symmetric as the full sequent calculus.

Variants

The logic described above is full classical linear logic. There are many important fragments and variants of linear logic, such as multiplicative linear logic, intuitionistic linear logic (in which \multimap is a primitive operation), full intuitionistic linear logic (where the multiplicative and additive connectives are all independent of each other), non-commutative linear logics (braided or not), light linear logics, etc.

Categorial semantics

We discuss the categorical semantics of linear logic. See also at relation between type theory and category theory.

*-autonomous categories

One way to explain linear logic to a category theorist is to say that its models are *-autonomous categories with extra structure (Seely 89, prop. 1.5). (If the underlying category is a suplattice, then these are commutative quantales (Yetter 90).)

Firstly, there is a monoidal ‘tensor’ connective A \otimes B. Negation A^\bot is modelled by the dual object involution (-)^*, while linear implication A \multimap B corresponds to the internal hom, which can be defined as (A \otimes B^\bot)^\bot. There is a de Morgan dual of the tensor called ‘par’, with A \parr B = (A^\bot \otimes B^\bot)^\bot. Tensor and par are the ‘multiplicative’ connectives, which roughly speaking represent the parallel availability of resources.

The ‘additive’ connectives \& and \oplus, which correspond in another way to traditional conjunction and disjunction, are modelled as usual by products and coproducts. Seely (1989) notes that products are sufficient, as *-autonomy then guarantees the existence of coproducts; that is, they are also linked by de Morgan duality.

LL recaptures the notion of a resource that can be discarded or copied arbitrarily by the use of the modal operator !, the !-modality: !A denotes an ‘A-factory’, a resource that can produce zero or more As on demand. It is modelled using a comonad ! on the underlying *-autonomous category that is (symmetric) monoidal, in the sense that there is an isomorphism !A \otimes !B \cong !(A \& B). Since every A is canonically a symmetric \&-comonoid (remembering that \& is the product), !A is then a symmetric \otimes-comonoid.
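For intuition only (a sketch under the “resource” reading, not the categorical model itself): the comonoid structure on !A is what licenses extracting and duplicating banged resources. The class below is our own intuition pump.

```python
class Factory:
    """!A as an 'A-factory': extract one A, or duplicate the factory."""

    def __init__(self, make):
        self.make = make            # counit-like: produce one A

    def duplicate(self):            # comultiplication-like: !A -> !A ⊗ !A
        return Factory(self.make), Factory(self.make)

f = Factory(lambda: "A")
g, h = f.duplicate()
print(g.make(), h.make())  # A A
```

A bare resource, by contrast, would have no `duplicate` method: that is precisely the restriction on contraction.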

An LL sequent

A_1, \ldots, A_n \vdash B_1, \ldots, B_m

is interpreted as a morphism

\otimes_i A_i \to \parr_j B_j

The comonoid structure on !A!A then yields the weakening

\Gamma \otimes !A \to \Gamma \otimes I \to \Gamma

and contraction

\Gamma \otimes !A \to \Gamma \otimes !A \otimes !A

maps. The corresponding rules are interpreted by precomposing the interpretation of a sequent with one of these maps.

The (co-)Kleisli category of ! is cartesian closed, and the product there coincides with the product in the base category. The exponential (unsurprisingly for a Kleisli category) is B^A \cong !A \multimap B.

Particular monoidal and *-autonomous posets for modeling linear logic can be obtained by Day convolution from ternary frames. This includes Girard’s phase spaces as a particular example.

Polycategories

A different way to explain linear logic categorically (though equivalent, in the end) is to start with a categorical structure which lacks any of the connectives, but has sufficient structure to enable us to characterize them with universal properties. If we ignore the exponentials for now, such a structure is given by a polycategory. The polymorphisms

(A, B, C) \to (D, E, F)

in a polycategory correspond to sequents A, B, C \vdash D, E, F in linear logic. The multiplicative connectives are then characterized by representability and corepresentability properties:

\frac{(A,B) \to C}{A \otimes B \to C}

and

\frac{A \to (B,C)}{A \to B \parr C}

(Actually, we should allow arbitrarily many unrelated objects to carry through in both cases.) The additives are similarly characterized as categorical products and coproducts, in a polycategorically suitable sense.

Finally, dual objects can be recovered as a sort of “adjoint”:

\frac{(A,B) \to C}{A \to (B^*, C)}

If all these representing objects exist, then we recover a *-autonomous category.

One merit of the polycategory approach is that it makes the role of the structural rules clearer, and also helps explain why \parr sometimes seems like a disjunction and sometimes like a conjunction. Allowing contraction and weakening on the left corresponds to our polycategory being “left cartesian”; that is, we have “diagonal” and “projection” operations such as Hom(A, A; B) \to Hom(A; B) and Hom(; B) \to Hom(A; B). In the presence of these operations, a representing object is automatically a cartesian product; thus \otimes coincides with \&. Similarly, allowing contraction and weakening on the right makes the polycategory “right cocartesian”, which causes corepresenting objects to be coproducts and thus \parr to coincide with \oplus.

On the other hand, if we allow “multi-composition” in our polycategory, i.e. we can compose a morphism A \to (B, C) with one (B, C) \to D to obtain a morphism A \to D, then our polycategory becomes a PROP, and representing and corepresenting objects must coincide; thus \otimes and \parr become the same. This explains why \parr has both a disjunctive and a conjunctive aspect. Of course, if in addition to multi-composition we have the left and right cartesian properties, then all four connectives coincide (including the categorical product and coproduct) and we have an additive category.

Game semantics

We can interpret any proposition in linear logic as a game between two players: we and they. The overall rules are perfectly symmetric between us and them, although no individual game is. At any given moment in a game, exactly one of these four situations obtains: it is our turn, it is their turn, we have won, or they have won; the last two states continue forever afterwards (and the game is over). If it is our turn, then they are winning; if it is their turn, then we are winning. So there are two ways to win: because the game is over (and a winner has been decided), or because it is forever the other player's turn (either because they have no move or because every move results in its still being their turn).

This is a little complicated, but it's important in order to be able to distinguish the four constants:

  • In \top, it is their turn, but they have no moves; the game never ends, but we win.
  • Dually, in \mathbf{0}, it is our turn, but we have no moves; the game never ends, but they win.
  • In contrast, in \mathbf{1}, the game ends immediately, and we have won.
  • Dually, in \bot, the game ends immediately, and they have won.

The binary operators show how to combine two games into a larger game:

  • In A \& B, it is their turn, and they must choose to play either A or B. Once they make their choice, play continues in the chosen game, with ending and winning conditions as in that game.
  • Dually, in A \oplus B, it is our turn, and we must choose to play either A or B. Once we make our choice, play continues in the chosen game, with ending and winning conditions as in that game.
  • In A \otimes B, play continues with both games in parallel. If it is our turn in either game, then it is our turn overall; if it is their turn in both games, then it is their turn overall. If either game ends, then play continues in the other game; if both games end, then the overall game ends. If we have won both games, then we have won overall; if they have won either game, then they have won overall.
  • Dually, in A \parr B, play continues with both games in parallel. If it is their turn in either game, then it is their turn overall; if it is our turn in both games, then it is our turn overall. If either game ends, then play continues in the other game; if both games end, then the overall game ends. If they have won both games, then they have won overall; if we have won either game, then we have won overall.

So we can classify things as follows:

  • In a conjunction, they choose what game to play; in a disjunction, we have control. Whoever has control must win at least one game to win overall.
  • In an addition, one game must be played; in a multiplication, all games must be played.
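The turn conditions for the multiplicative games can be stated as two tiny functions (an illustrative sketch only, with our own "us"/"them" encoding of whose turn it is in each component):

```python
def turn_tensor(a, b):
    """In A ⊗ B, it is our turn if it is our turn in either component."""
    return "us" if "us" in (a, b) else "them"

def turn_par(a, b):
    """In A ⅋ B, it is their turn if it is their turn in either component."""
    return "them" if "them" in (a, b) else "us"

print(turn_tensor("us", "them"))  # us: we still owe a move somewhere
print(turn_par("us", "them"))     # them: they still owe a move somewhere
```

The two functions are exact duals, matching the de Morgan duality of \otimes and \parr.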

To further clarify the difference between \top and \mathbf{1} (the additive and multiplicative versions of truth, both of which we win), consider A \parr \top and A \parr \mathbf{1}. In A \parr \top, it is always their move (since it is their move in \top, hence their move in at least one game), so we win just as we win \top. (In fact, A \parr \top \equiv \top.) However, in A \parr \mathbf{1}, the game \mathbf{1} ends immediately, so play continues as in A. We have won \mathbf{1}, so we only have to end the game to win overall, but there is no guarantee that this will happen. Indeed, in \mathbf{0} \parr \mathbf{1}, the game never ends and it is always our turn, so they win. (In \bot \parr \mathbf{1}, both games end immediately, and we win. In A \otimes \mathbf{1}, we must win both games to win overall, so this reduces to A; indeed, A \otimes \mathbf{1} \equiv A.)

Negation is easy:

  • To play A^\perp, simply swap roles and play A.

There are several ways to think of the exponentials. As before, they have control in a conjunction, while we have control in a disjunction. Whoever has control of !{A} or ?{A} chooses how many copies of A to play and must win them all to win overall. There are many variations on whether the player in control can spawn new copies of A or close old copies of A prematurely, and whether the other player can play different moves in different copies (whenever the player in control plays the same moves).

Other than the decisions made by the player in control of a game, all moves are made by transmitting resources. Ultimately, these come down to the propositional variables; in the game p, we must transmit a p to them, while they must transmit a p to us in p^\perp.

A game is valid if we have a strategy to win (whether by putting the game in a state where we have won or by guaranteeing that it is forever their turn). The soundness and completeness of this interpretation is the theorem that A is a valid game if and only if \vdash A is a valid sequent. (Recall that all questions of validity of sequents can be reduced to the validity of single propositions.)

Game semantics for linear logic was first proposed by Andreas Blass, I believe in Blass (1992). The semantics here may or may not be the same as proposed by Blass.

Multiple exponential operators

Much as there are many exponential functions (say from \mathbb{R} to \mathbb{R}), even though there is only one addition operation and one multiplication operation, so there can be many versions of the exponential operators ! and ?. (However, there doesn't seem to be any analogue of the logarithm to convert between them.)

More precisely, if we add to the language of linear logic two more operators, !' and ?', and postulate of them the same rules as for ! and ?, we cannot prove that !{A} \equiv !'{A} and ?{A} \equiv ?'{A}. In contrast, if we introduce \&', \bot', etc, we can prove that the new operators are equivalent to the old ones.

In terms of the categorial interpretation above, there may be many comonads !; the comonad is not determined by the underlying *-autonomous category. In terms of game/resource semantics, there are several slightly different interpretations of the exponentials.

One sometimes thinks of the exponentials as coming from infinitary applications of the other operations. For example:

  • !{A} \coloneqq 1 \& A \& (A \otimes A) \& (A \otimes A \otimes A) \& \cdots;
  • !{A} \coloneqq (1 \& A) \otimes (1 \& A) \otimes (1 \& A) \otimes \cdots;
  • !{A} \coloneqq 1 \& k A \& (k^2/2)(A \otimes A) \& (k^3/6)(A \otimes A \otimes A) \& \cdots (which is \mathrm{e}^{k A} in an appropriate sense), where n A means an n-fold additive conjunction A \& \cdots \& A for n a natural number, and we pretend that k is a positive number such that k^n/n! is always a natural number (which of course is impossible).

All of these justify the rules for the exponentials, so again we see that there may be many ways to satisfy these rules.

References

The original article is

  • Jean-Yves Girard, Linear logic, Theoretical Computer Science 50:1 (1987) 1–101.

Reviews include

The categorical semantics of linear logic in star-autonomous categories originally appeared in

  • R. A. G. Seely, Linear logic, \ast-autonomous categories and cofree coalgebras, Contemporary Mathematics 92, 1989. (pdf, ps.gz)

and for the special case of quantales in

  • David Yetter, Quantales and (noncommutative) linear logic, Journal of Symbolic Logic 55 (1990), 41-64.

The more general case of multiplicative intuitionistic linear logic, interpreted more generally in symmetric monoidal categories, was systematically developed in

  • M.E. Szabo, Algebra of Proofs, Studies in Logic and the Foundations of Mathematics, vol. 88 (1978), North-Holland.

(that was before Girard introduced the “linear” terminology).

More recent articles exploring this include

  • Richard Blute, Linear logic, coherence, and dinaturality, Dissertation, University of Pennsylvania 1991; published in Theoretical Computer Science 115:1 (1993) 3–41.
  • G. Bierman, On Intuitionistic Linear Logic PhD thesis, Computing Laboratory, University of Cambridge, 1995 (pdf)
  • Andrew Graham Barber, Linear Type Theories, Semantics and Action Calculi, PhD thesis, Edinburgh 1997 (web, pdf)

This is reviewed/further discussed in

  • Andrea Schalk, What is a categorical model for linear logic? (pdf)
  • Richard Blute, Philip Scott, Category theory for linear logicians, 2004 (citeseer)

  • Paul-André Melliès, Categorial Semantics of Linear Logic, in Interactive models of computation and program behaviour, Panoramas et synthèses 27, 2009 (pdf)

Noncommutative linear logic is discussed for instance in

  • Richard Blute, F. Lamarche, Paul Ruet, Entropic Hopf algebras and models of non-commutative logic, Theory and Applications of Categories, Vol. 10, No. 17, 2002, pp. 424–460. (pdf)

Further discussion of linear type theory is for instance in

  • Chapter 7, Linear type theory (pdf)

  • Anders Schack-Nielsen, Carsten Schürmann, Linear contextual modal type theory pdf

See also

  • Andreas Blass, A game semantics for linear logic, Annals of Pure and Applied Logic 56: 183–220, 1992.
  • The article on the English Wikipedia has good summaries of the meanings of the logical operators and of the commonly studied fragments.

Discussion of application of linear logic to quantum logic, quantum computing and generally to quantum physics includes

  • Vaughan Pratt, Linear logic for generalized quantum mechanics, in Proc. of Workshop on Physics and Computation (PhysComp’92) (pdf, web)

  • Ross Duncan, Types for quantum mechanics, 2006 (pdf)

  • Gianpiero Cattaneo, Maria Luisa Dalla Chiara, Roberto Giuntini and Francesco Paoli, section 9 of Quantum Logic and Nonclassical Logics, p. 127 in Kurt Engesser, Dov M. Gabbay, Daniel Lehmann (eds.) Handbook of Quantum Logic and Quantum Structures: Quantum Logic, 2009 North Holland

  • Ugo Dal Lago, Claudia Faggian, On Multiplicative Linear Logic, Modality and Quantum Circuits (arXiv:1210.0613)

Discussion of matrix factorization in Landau-Ginzburg models is in

Discussion of linear type theory without units is in

Discussion of inductive types in the context of linear type theory is in

  • Stéphane Gimenez, Towards Generic Inductive Constructions in Systems of Nets (pdf)

Revised on November 11, 2014 21:08:17 by Urs Schreiber (92.151.246.61)