nLab cut rule

Idea

The cut rule in sequent calculus (formal logic) is the structural inference rule that from sequents of the form

\Gamma \vdash A, \Delta

and

\Pi, A \vdash \Lambda

the new sequent

\Pi, \Gamma \vdash \Lambda, \Delta

may be deduced. This is often written in the form

\frac{\Gamma \vdash A, \Delta \quad \Pi, A \vdash \Lambda}{\Pi, \Gamma \vdash \Lambda, \Delta} \; cut.

In the categorical semantics where each sequent here is interpreted as a morphism in a category, the cut rule asserts the existence of composition of morphisms.
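
In this semantics the cut rule, restricted to sequents with a single formula on each side, is just composition. The following Haskell lines are a minimal illustrative sketch of that reading (my own illustration, not code from the article or any library), interpreting a sequent A \vdash B as a function between the denotations of A and B:

    -- naive "sets and functions" semantics, restricted to sequents with a
    -- single formula on each side: a sequent  a ⊢ b  denotes a function a -> b
    type Interp a b = a -> b

    -- cut on the formula b:  from  a ⊢ b  and  b ⊢ c  deduce  a ⊢ c
    cut :: Interp a b -> Interp b c -> Interp a c
    cut f g = g . f

    -- the identity sequent  a ⊢ a  denotes the identity morphism
    identityRule :: Interp a a
    identityRule = id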

Cut elimination

“A sequent calculus without cut-elimination is like a car without an engine” – Jean-Yves Girard (in Linear logic)

The cut-elimination theorem (“Gerhard Gentzen’s Hauptsatz”) asserts that every judgement which has a proof using the cut-rule also has a proof not using it (a “cut-free proof”). While Gentzen's original theorem was for a few particular sequent calculi that he was considering, it is true of many other sequent calculi and is generally seen as desirable. (That said, there are some useful sequent calculi in which it fails.)

Intuitively, the problem in deciding whether a formula B follows from a formula A, i.e., deriving A \vdash B, is that there could be very complicated steps in the middle, i.e., in typical mathematical arguments one puts together steps A \vdash C and C \vdash B where C is potentially a complicated or large formula. For an automated theorem prover, the search space for such C is potentially infinite. By establishing a cut-elimination theorem for formal systems, one circumvents this problem, and it is quite typical that cut-free proofs build up complex sequents from less complex sequents (cf. subformula property), so that one can decide whether a sequent is provable or derivable by following an inductive procedure.
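
As a concrete illustration of this point (a small sketch of my own, under simplifying assumptions, not code from the literature), the following Haskell function decides derivability in the cut-free fragment with atoms and conjunction only, using the invertible form of the left conjunction rule. Since each rule's premises mention only subformulas of its conclusion, the recursion terminates, whereas a backward application of the cut rule would require guessing an arbitrary cut formula C.

    import Data.List (inits, tails)

    data Formula = Atom String | Formula :/\: Formula
      deriving (Eq, Show)

    -- provable gamma a  ==  "the sequent  gamma ⊢ a  has a cut-free derivation"
    provable :: [Formula] -> Formula -> Bool
    provable gamma (a :/\: b)    = provable gamma a && provable gamma b     -- ∧R
    provable gamma goal@(Atom _) =
         goal `elem` gamma                                                  -- identity axiom
      || or [ provable (a : b : rest) goal                                  -- invertible ∧L
            | (a :/\: b, rest) <- focusEach gamma ]

    -- all ways of singling out one formula from the context
    focusEach :: [Formula] -> [(Formula, [Formula])]
    focusEach xs =
      [ (f, before ++ after) | (before, f : after) <- zip (inits xs) (tails xs) ]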

Cut-elimination is also a key step in deciding whether two proofs of a sequent are the “same” in some suitable sense. In type theory, for instance, the issue is not merely whether A \vdash B is provable or whether the function type A \multimap B is inhabited (has a proof or a term witnessing that fact), but also the nature of the space of such proofs. Since any proof has a trivial cut-free formulation in a system where all provable sequents in the original system are simply postulated as axioms, a cut-elimination result worthy of the name will not merely replace a proof with one which is cut-free, but with a cut-free proof which is equivalent to the original. This idea is used for instance in proving coherence theorems.

Cut-elimination may also be used to give independent proof-theoretic motivation for the definition of a category and other basic category-theoretic notions, e.g. adjunction (see Došen 99).

Connection to identities

In the analogy between the composition and the cut rule, the analogue of identity morphisms (or nullary compositions) is the identity rule

A \vdash A

Typically, a cut-elimination algorithm goes hand-in-hand with an algorithm which eliminates the identity rule, or rather which pushes back identities as far as possible, down to identities for basic propositional variables (so for example, p \wedge q \vdash p \wedge q may be proved using p \vdash p and q \vdash q, in addition to the rules for \wedge, but p \vdash p itself must be adopted as an axiom).

In fact, there is a sense in which elimination of cuts is seen as dual to elimination of identities, analogous to the sense in which beta reduction is seen as dual to eta expansion. Very typically, a normalization scheme on terms first applies eta expansions as far as they will go, and then applies beta reductions as far as they will go, so as to at last reach a normal form. The same goes for rewrite systems on sequent deductions, which first eliminate identities, then eliminate cuts.
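
For instance, at a function type, a term of the form \lambda y.\, t is first eta-expanded and the resulting beta-redex is then contracted, yielding its long normal form:

\lambda y.\, t \;\stackrel{\eta}{\to}\; \lambda x.\, (\lambda y.\, t)\, x \;\stackrel{\beta}{\to}\; \lambda x.\, t[x/y]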

Examples of elimination steps

The conversion

\frac{\displaystyle \frac{\Gamma, A \vdash B, \Delta}{\Gamma \vdash A \multimap B,\Delta} \;\;\; \frac{\Pi_1 \vdash A,\Lambda_1 \;\;\; \Pi_2,B \vdash \Lambda_2}{\Pi_2,\Pi_1, A\multimap B \vdash \Lambda_2,\Lambda_1}}{\Pi_2,\Pi_1, \Gamma \vdash \Lambda_2,\Lambda_1, \Delta} \quad\to\quad \frac{\displaystyle \frac{\Gamma, A \vdash B, \Delta \;\;\; \Pi_1 \vdash A,\Lambda_1}{\Pi_1,\Gamma \vdash \Lambda_1,B,\Delta} \;\;\; \Pi_2,B \vdash \Lambda_2}{\Pi_2,\Pi_1, \Gamma \vdash \Lambda_2, \Lambda_1,\Delta}

replaces a single cut on the formula A \multimap B with a pair of cuts on the formulas A and B, in the process eliminating the use of the logical rules {\multimap}R and {\multimap}L.

Although this step replaces one cut by two, the cuts have been in effect pushed up the proof tree, to formulas of lower complexity. Cuts are finally eliminated when they have been pushed all the way up to identity axioms on propositional variables, by applying conversions of type

\frac{\displaystyle \frac{}{x \vdash x}\; axiom \;\;\;\; \frac{}{x \vdash x}\; axiom}{x \vdash x} \; cut \quad\to\quad \frac{}{x \vdash x}\; axiom.

Likewise, the conversion

A \wedge B \vdash A \wedge B \quad\to\quad \frac { \displaystyle \frac { A \vdash A } { A \wedge B \vdash A } \;\;\; \frac { B \vdash B } { A \wedge B \vdash B} } { A \wedge B \vdash A \wedge B }

reconstructs the identity on A \wedge B from identities on A and on B, by first applying the {\wedge}R rule followed by the two {\wedge}L rules (reading the derivation on the right bottom-up).

(Compare these two conversions arising from cut- and identity-elimination to the lambda calculus conversions (\lambda x.t_1)\, t_2 \to t_1[t_2/x] and t \to \langle\pi_1 t, \pi_2 t\rangle, i.e., a \beta reduction and an \eta expansion respectively.)

Alternative forms of the cut rule

In linear logic (for instance), one sometimes sees sequents written in one-sided form:

\vdash \Gamma.

Here the negation operator is used to mediate between classical two-sided sequents and one-sided sequents, according to a scheme where a sequent \Gamma, A \vdash \Delta is associated with a sequent \Gamma \vdash \Delta, \neg A (each being derivable from the other). Thus one can contemplate sequents where all formulae have been pushed to the right of the entailment symbol \vdash.
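
For example, iterating this scheme, a two-sided sequent such as A, B \vdash C corresponds (up to exchange) to the one-sided sequent

\vdash \neg A, \neg B, C.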

For such one-sided sequents, say in multiplicative linear logic, the cut rule may be expressed in the form

\frac{\vdash \Gamma, \neg A \;\;\; \vdash \Delta, A}{\vdash \Gamma, \Delta} \; cut

and this rule is ‘dual’ to one which introduces an identity:

\frac{}{\vdash \neg A, A} \; identity.

Categorically, the cut rule in this form corresponds to the arrow \neg A \otimes A \to \bot that implements an evaluation, and the identity rule corresponds to an arrow \top \to \neg A \wp A = A \multimap A that names an identity morphism. These two arrows are de Morgan dual to one another.
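
As a rough sketch of the shape of these two arrows (my own illustration: Haskell's function space is cartesian rather than linear, so the ordinary function type merely stands in for the linear connectives, with \neg A modelled as a map into \bot):

    -- ⊥ modelled as a type with no closed inhabitants; ¬A as maps into it
    data Bottom
    type Neg a = a -> Bottom

    -- the arrow interpreting the one-sided cut rule: evaluation ¬A ⊗ A → ⊥
    evalCut :: (Neg a, a) -> Bottom
    evalCut (k, a) = k a

    -- the arrow interpreting the one-sided identity rule: ⊤ → (A ⊸ A),
    -- naming the identity morphism
    nameIdentity :: () -> (a -> a)
    nameIdentity () = id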

Cut elimination as a computation

Gerhard Gentzen’s Hauptsatz is often formulated in the form: “The cut rule is an admissible rule”, i.e. the logical system obtained by removing the cut rule proves exactly the same sequents as the original system with the cut rule. It is important that, very often, a stronger statement is true.

Gentzen’s theorem was formulated for classical logic, but it also holds in other logics with a cut rule, such as intuitionistic logic or linear logic. In these two latter systems one has more: cut elimination can be formulated as “There exists a cut-elimination algorithm specified as follows:

  • Input: a proof \pi_1 of a sequent \Gamma \vdash \Delta
  • Output: a proof \pi_2 of the sequent \Gamma \vdash \Delta which does not use the cut rule”

A better way to present this would be: “Here is an algorithm which takes as input a proof \pi_1 of \Gamma \vdash \Delta and returns a proof \pi_2 of \Gamma \vdash \Delta which does not use the cut rule”, because an actual algorithm is in fact given, and it is important to know what this algorithm is; we are not interested only in its existence.

The best way to state it would in fact be:

“Here is a rewriting system \rightarrow on the set of proofs of this logic such that if \pi_1 \rightarrow \pi_2 then \pi_1, \pi_2 are proofs of the same sequent, and for every proof \pi of \Gamma \vdash \Delta, there exists a finite sequence \pi_0, \pi_1, \ldots, \pi_n of proofs of \Gamma \vdash \Delta such that \pi = \pi_0 \rightarrow \pi_1 \rightarrow \ldots \rightarrow \pi_n and \pi_n does not use the cut rule.”
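
The following Haskell sketch (a toy {axiom, ∧} fragment with names of my own choosing, not the article's algorithm) makes this rewriting-system reading concrete: proofs form a data type with an explicit Cut node, step implements the axiom conversions and the principal conversion for \wedge in the style of the examples displayed earlier, and eliminateCuts iterates the rewriting. A full calculus would additionally need the commutative conversions that permute cuts past the remaining rules, which are omitted here.

    data Formula = Atom String | Formula :/\: Formula
      deriving Show

    data Proof                         -- proof trees with an explicit cut node
      = Ax Formula                     -- identity axiom  A ⊢ A
      | AndR Proof Proof               -- ∧R
      | AndL1 Proof | AndL2 Proof      -- the two ∧L rules
      | Cut Proof Proof                -- cut on the distinguished formula
      deriving Show

    -- one rewriting step: principal and axiom conversions at the root,
    -- otherwise try to rewrite inside a subproof
    step :: Proof -> Maybe Proof
    step (Cut (Ax _) q)              = Just q             -- cut against an axiom
    step (Cut p (Ax _))              = Just p
    step (Cut (AndR p1 _) (AndL1 q)) = Just (Cut p1 q)    -- principal ∧ conversion
    step (Cut (AndR _ p2) (AndL2 q)) = Just (Cut p2 q)
    step (AndR p q) = ((\p' -> AndR p' q) <$> step p) `orElse` (AndR p <$> step q)
    step (AndL1 p)  = AndL1 <$> step p
    step (AndL2 p)  = AndL2 <$> step p
    step (Cut p q)  = ((\p' -> Cut p' q) <$> step p) `orElse` (Cut p <$> step q)
    step (Ax _)     = Nothing

    orElse :: Maybe a -> Maybe a -> Maybe a
    orElse (Just x) _ = Just x
    orElse Nothing  y = y

    -- iterate the one-step relation; each step strictly shrinks the proof
    -- tree, so the rewriting terminates
    eliminateCuts :: Proof -> Proof
    eliminateCuts p = maybe p eliminateCuts (step p)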

One can then study the properties of this particular rewriting system: Is it confluent? Is it normalizing? Strongly normalizing? Does it provide a deterministic or a nondeterministic algorithm? What is the computational complexity of the algorithms it provides for eliminating cuts?

Unfortunately, the algorithmic aspect of cut elimination is rarely presented this explicitly. Most of the time cut elimination is presented as the admissibility of the cut rule, and the concrete algorithm for eliminating the instances of cut in a given proof \pi must be extracted by the reader from the proof of cut elimination, which nevertheless relies on it.

It is not clear whether such a cut-elimination algorithm exists for classical logic, and this is an ongoing subject of research. It may be that the usual sequent calculus of classical logic must be modified, using ideas such as polarization and focusing, in order to understand the computational content of cut elimination for classical logic.

References

The notion of the cut rule as a structural inference rule originates (under the German name Schnitt) with:
