Linear logic
Linear logic is a substructural logic proposed by Jean-Yves Girard as a refinement of classical and intuitionistic logic, joining the dualities of the former with many of the constructive properties of the latter. Although the logic has also been studied for its own sake, more broadly, ideas from linear logic have been influential in fields such as programming languages, game semantics, and quantum physics (because linear logic can be seen as the logic of quantum information theory), as well as linguistics, particularly because of its emphasis on resource-boundedness, duality, and interaction.

Linear logic lends itself to many different presentations, explanations, and intuitions. Proof-theoretically, it derives from an analysis of classical sequent calculus in which uses of the structural rules contraction and weakening are carefully controlled. Operationally, this means that logical deduction is no longer merely about an ever-expanding collection of persistent "truths", but also a way of manipulating ''resources'' that cannot always be duplicated or thrown away at will. In terms of simple denotational models, linear logic may be seen as refining the interpretation of intuitionistic logic by replacing cartesian (closed) categories by symmetric monoidal (closed) categories, or the interpretation of classical logic by replacing Boolean algebras by C*-algebras.


Connectives, duality, and polarity


Syntax

The language of ''classical linear logic'' (CLL) is defined inductively by the BNF notation

: ''A'' ::= ''p'' ∣ ''p''⊥ ∣ ''A'' ⊗ ''A'' ∣ ''A'' ⊕ ''A'' ∣ ''A'' & ''A'' ∣ ''A'' ⅋ ''A'' ∣ 1 ∣ 0 ∣ ⊤ ∣ ⊥ ∣ !''A'' ∣ ?''A''

Here ''p'' and ''p''⊥ range over logical atoms. For reasons to be explained below, the connectives ⊗, ⅋, 1, and ⊥ are called ''multiplicatives'', the connectives &, ⊕, ⊤, and 0 are called ''additives'', and the connectives ! and ? are called ''exponentials''. We can further employ the following terminology: binary connectives ⊗, ⊕, & and ⅋ are associative and commutative; 1 is the unit for ⊗, 0 is the unit for ⊕, ⊥ is the unit for ⅋, and ⊤ is the unit for &.

Every proposition ''A'' in CLL has a dual ''A''⊥, defined as follows:

: (''p'')⊥ = ''p''⊥   (''p''⊥)⊥ = ''p''
: (''A'' ⊗ ''B'')⊥ = ''A''⊥ ⅋ ''B''⊥   (''A'' ⅋ ''B'')⊥ = ''A''⊥ ⊗ ''B''⊥
: (''A'' ⊕ ''B'')⊥ = ''A''⊥ & ''B''⊥   (''A'' & ''B'')⊥ = ''A''⊥ ⊕ ''B''⊥
: (1)⊥ = ⊥   (⊥)⊥ = 1
: (0)⊥ = ⊤   (⊤)⊥ = 0
: (!''A'')⊥ = ?(''A''⊥)   (?''A'')⊥ = !(''A''⊥)

Observe that (·)⊥ is an involution, i.e., ''A''⊥⊥ = ''A'' for all propositions. ''A''⊥ is also called the ''linear negation'' of ''A''. The columns of the table suggest another way of classifying the connectives of linear logic, termed ''polarity'': the connectives negated in the left column (⊗, ⊕, 1, 0, !) are called ''positive'', while their duals on the right (⅋, &, ⊥, ⊤, ?) are called ''negative''; cf. the table above. Linear implication is not included in the grammar of connectives, but is definable in CLL using linear negation and multiplicative disjunction, by ''A'' ⊸ ''B'' := ''A''⊥ ⅋ ''B''. The connective ⊸ is sometimes pronounced "lollipop", owing to its shape.
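The grammar and the duality table above are entirely mechanical, which a short program can make concrete. The following sketch (our own illustration, not part of any standard presentation; the names `Formula` and `dual` are hypothetical choices) encodes CLL formulas as a Rust data type and implements linear negation by the De Morgan-style table:

```rust
// Sketch (our own encoding): CLL formulas as a Rust data type, with
// linear negation `dual` given by the De Morgan-style table.

#[derive(Clone, Debug, PartialEq)]
enum Formula {
    Atom(String),                        // p
    NegAtom(String),                     // p⊥
    Tensor(Box<Formula>, Box<Formula>),  // A ⊗ B  multiplicative conjunction
    Par(Box<Formula>, Box<Formula>),     // A ⅋ B  multiplicative disjunction
    Plus(Box<Formula>, Box<Formula>),    // A ⊕ B  additive disjunction
    With(Box<Formula>, Box<Formula>),    // A & B  additive conjunction
    One,                                 // 1, unit of ⊗
    Bottom,                              // ⊥, unit of ⅋
    Zero,                                // 0, unit of ⊕
    Top,                                 // ⊤, unit of &
    OfCourse(Box<Formula>),              // !A
    WhyNot(Box<Formula>),                // ?A
}

use Formula::*;

// Linear negation: each positive connective (⊗, ⊕, 1, 0, !) is swapped
// with its negative dual (⅋, &, ⊥, ⊤, ?), and negation is pushed to atoms.
fn dual(f: &Formula) -> Formula {
    match f {
        Atom(p) => NegAtom(p.clone()),
        NegAtom(p) => Atom(p.clone()),
        Tensor(a, b) => Par(Box::new(dual(a)), Box::new(dual(b))),
        Par(a, b) => Tensor(Box::new(dual(a)), Box::new(dual(b))),
        Plus(a, b) => With(Box::new(dual(a)), Box::new(dual(b))),
        With(a, b) => Plus(Box::new(dual(a)), Box::new(dual(b))),
        One => Bottom,
        Bottom => One,
        Zero => Top,
        Top => Zero,
        OfCourse(a) => WhyNot(Box::new(dual(a))),
        WhyNot(a) => OfCourse(Box::new(dual(a))),
    }
}

fn main() {
    // (p ⊗ q)⊥ = p⊥ ⅋ q⊥
    let f = Tensor(Box::new(Atom("p".into())), Box::new(Atom("q".into())));
    assert_eq!(
        dual(&f),
        Par(Box::new(NegAtom("p".into())), Box::new(NegAtom("q".into())))
    );
    // linear negation is an involution: A⊥⊥ = A
    assert_eq!(dual(&dual(&f)), f);
}
```

The involution property ''A''⊥⊥ = ''A'' then becomes something one can check mechanically over any formula, since `dual` swaps each column of the table with the other and `dual` applied twice restores every case.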


Sequent calculus presentation

One way of defining linear logic is as a sequent calculus. We use the letters Γ and Δ to range over lists of propositions ''A''1, ..., ''A''n, also called ''contexts''. A ''sequent'' places a context to the left and the right of the turnstile, written Γ ⊢ Δ. Intuitively, the sequent asserts that the conjunction of Γ entails the disjunction of Δ (though we mean the "multiplicative" conjunction and disjunction, as explained below). Girard describes classical linear logic using only ''one-sided'' sequents (where the left-hand context is empty), written ⊢ Δ, and we follow here that more economical presentation. This is possible because any premises to the left of a turnstile can always be moved to the other side and dualised.

We now give inference rules describing how to build proofs of sequents. First, to formalize the fact that we do not care about the order of propositions inside a context, we add the structural rule of exchange:

: from ⊢ Γ, ''A'', ''B'', Δ infer ⊢ Γ, ''B'', ''A'', Δ   (exchange)

Note that we do ''not'' add the structural rules of weakening and contraction, because we do care about the absence of propositions in a sequent, and the number of copies present. Next we add ''initial sequents'' and ''cuts'':

: ⊢ ''A'', ''A''⊥   (init)
: from ⊢ Γ, ''A'' and ⊢ ''A''⊥, Δ infer ⊢ Γ, Δ   (cut)

The cut rule can be seen as a way of composing proofs, and initial sequents serve as the units for composition. In a certain sense these rules are redundant: as we introduce additional rules for building proofs below, we will maintain the property that arbitrary initial sequents can be derived from atomic initial sequents, and that whenever a sequent is provable it can be given a cut-free proof. Ultimately, this canonical form property (which can be divided into the completeness of atomic initial sequents and the cut-elimination theorem, inducing a notion of analytic proof) lies behind the applications of linear logic in computer science, since it allows the logic to be used in proof search and as a resource-aware lambda calculus.

Now, we explain the connectives by giving ''logical rules''. Typically in sequent calculus one gives both "right-rules" and "left-rules" for each connective, essentially describing two modes of reasoning about propositions involving that connective (e.g., verification and falsification). In a one-sided presentation, one instead makes use of negation: the right-rules for a connective (say ⅋) effectively play the role of left-rules for its dual (⊗). So, we should expect a certain "harmony" between the rule(s) for a connective and the rule(s) for its dual.


Multiplicatives

The rules for multiplicative conjunction (⊗) and disjunction (⅋):

: from ⊢ Γ, ''A'' and ⊢ Δ, ''B'' infer ⊢ Γ, Δ, ''A'' ⊗ ''B''   (⊗)
: from ⊢ Γ, ''A'', ''B'' infer ⊢ Γ, ''A'' ⅋ ''B''   (⅋)

and for their units:

: ⊢ 1   (1)
: from ⊢ Γ infer ⊢ Γ, ⊥   (⊥)

Observe that the rules for multiplicative conjunction and disjunction are admissible for plain ''conjunction'' and ''disjunction'' under a classical interpretation (i.e., they are admissible rules in LK).
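As a small worked check of these rules, the linear identity ''A'' ⊸ ''A'', i.e. ⊢ ''A''⊥ ⅋ ''A'', is derivable from a single initial sequent followed by the ⅋-rule. A sketch in LaTeX inference notation (assuming the `bussproofs` and `cmll` packages, which supply `\parr`):

```latex
% Derivation of ⊢ A⊥ ⅋ A, i.e. A ⊸ A.
\begin{prooftree}
  \AxiomC{}
  \RightLabel{(init)}
  \UnaryInfC{$\vdash A^{\perp},\; A$}
  \RightLabel{($\parr$)}
  \UnaryInfC{$\vdash A^{\perp} \parr A$}
\end{prooftree}
```

Note that the ⅋-rule simply fuses two formulas already present in the same context, which is why identity needs no structural rules at all.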


Additives

The rules for additive conjunction (&) and disjunction (⊕):

: from ⊢ Γ, ''A'' and ⊢ Γ, ''B'' infer ⊢ Γ, ''A'' & ''B''   (&)
: from ⊢ Γ, ''A'' infer ⊢ Γ, ''A'' ⊕ ''B''   (⊕1)
: from ⊢ Γ, ''B'' infer ⊢ Γ, ''A'' ⊕ ''B''   (⊕2)

and for their units:

: ⊢ Γ, ⊤   (⊤)
: (there is no rule for 0)

Observe that the rules for additive conjunction and disjunction are again admissible under a classical interpretation. But now we can explain the basis for the multiplicative/additive distinction in the rules for the two different versions of conjunction: for the multiplicative connective (⊗), the context of the conclusion (Γ, Δ) is split up between the premises, whereas for the additive connective (&) the context of the conclusion (Γ) is carried whole into both premises.


Exponentials

The exponentials are used to give controlled access to weakening and contraction. Specifically, we add structural rules of weakening and contraction for ?'d propositions:

: from ⊢ Γ infer ⊢ Γ, ?''A''   (weakening)
: from ⊢ Γ, ?''A'', ?''A'' infer ⊢ Γ, ?''A''   (contraction)

and use the following logical rules:

: from ⊢ Γ, ''A'' infer ⊢ Γ, ?''A''   (dereliction)
: from ⊢ ?Γ, ''A'' infer ⊢ ?Γ, !''A''   (promotion)

(In the promotion rule, ?Γ indicates a context in which every formula is prefixed with ?.) One might observe that the rules for the exponentials follow a different pattern from the rules for the other connectives, resembling the inference rules governing modalities in sequent calculus formalisations of the normal modal logic S4, and that there is no longer such a clear symmetry between the duals ! and ?. This situation is remedied in alternative presentations of CLL (e.g., the LU presentation).
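These rules are just enough to reintroduce duplication where it is explicitly licensed. For example, the sequent ⊢ ?(''A''⊥), ''A'' ⊗ ''A'', i.e. !''A'' ⊸ ''A'' ⊗ ''A'', is derivable using dereliction, the ⊗-rule, and one contraction (a sketch in the same `bussproofs` notation as above):

```latex
% Derivation of ⊢ ?(A⊥), A ⊗ A, i.e. !A ⊸ A ⊗ A.
\begin{prooftree}
  \AxiomC{$\vdash A^{\perp}, A$}
  \RightLabel{(dereliction)}
  \UnaryInfC{$\vdash \mathord{?}(A^{\perp}),\; A$}
  \AxiomC{$\vdash A^{\perp}, A$}
  \RightLabel{(dereliction)}
  \UnaryInfC{$\vdash \mathord{?}(A^{\perp}),\; A$}
  \RightLabel{($\otimes$)}
  \BinaryInfC{$\vdash \mathord{?}(A^{\perp}),\; \mathord{?}(A^{\perp}),\; A \otimes A$}
  \RightLabel{(contraction)}
  \UnaryInfC{$\vdash \mathord{?}(A^{\perp}),\; A \otimes A$}
\end{prooftree}
```

Without the exponentials the final contraction step would be unavailable, and ''A'' ⊸ ''A'' ⊗ ''A'' is indeed not derivable in general.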


Remarkable formulas

In addition to the De Morgan dualities described above, some important equivalences in linear logic include:

; Distributivity
: ''A'' ⊗ (''B'' ⊕ ''C'') ≣ (''A'' ⊗ ''B'') ⊕ (''A'' ⊗ ''C'')
: ''A'' ⅋ (''B'' & ''C'') ≣ (''A'' ⅋ ''B'') & (''A'' ⅋ ''C'')
: By definition of ''A'' ⊸ ''B'' as ''A''⊥ ⅋ ''B'', these distributivity laws also give:
: ''A'' ⊸ (''B'' & ''C'') ≣ (''A'' ⊸ ''B'') & (''A'' ⊸ ''C'')
: (''A'' ⊕ ''B'') ⊸ ''C'' ≣ (''A'' ⊸ ''C'') & (''B'' ⊸ ''C'')
: (Here ''A'' ≣ ''B'' is (''A'' ⊸ ''B'') & (''B'' ⊸ ''A'').)

; Exponential isomorphism
: !(''A'' & ''B'') ≣ !''A'' ⊗ !''B''

; Linear distributions
: A map that is not an isomorphism yet plays a crucial role in linear logic is:
: ''A'' ⊗ (''B'' ⅋ ''C'') ⊸ (''A'' ⊗ ''B'') ⅋ ''C''
: Linear distributions are fundamental in the proof theory of linear logic. The consequences of this map were first investigated and called a "weak distribution"; in subsequent work it was renamed to "linear distribution" to reflect the fundamental connection to linear logic.

; Other implications
: The following distributivity formulas are not in general equivalences, only implications:
: ''A'' ⊗ (''B'' & ''C'') ⊸ (''A'' ⊗ ''B'') & (''A'' ⊗ ''C'')
: (''A'' ⅋ ''B'') ⊕ (''A'' ⅋ ''C'') ⊸ ''A'' ⅋ (''B'' ⊕ ''C'')
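For instance, the first implication-form distributivity law follows by unfolding the definition of ⊸ and applying the ⅋/& distributivity, a one-line calculation (LaTeX notation, assuming the `cmll` package for `\parr`, `\with`, `\multimap`):

```latex
A \multimap (B \with C)
  \;=\; A^{\perp} \parr (B \with C)
  \;\equiv\; (A^{\perp} \parr B) \with (A^{\perp} \parr C)
  \;=\; (A \multimap B) \with (A \multimap C)
```

The middle step is exactly the distributivity of ⅋ over &, and the outer steps are the definition ''A'' ⊸ ''B'' := ''A''⊥ ⅋ ''B'' applied in each direction.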


Encoding classical/intuitionistic logic in linear logic

Both intuitionistic and classical implication can be recovered from linear implication by inserting exponentials: intuitionistic implication ''A'' ⇒ ''B'' is encoded as !''A'' ⊸ ''B'', while classical implication can be encoded as !?''A'' ⊸ ?''B'' or !''A'' ⊸ ?!''B'' (or via a variety of alternative possible translations). The idea is that exponentials allow a formula to be used as many times as needed, which is always possible in classical and intuitionistic logic. Formally, there exists a translation of formulas of intuitionistic logic to formulas of linear logic that guarantees that the original formula is provable in intuitionistic logic if and only if the translated formula is provable in linear logic. Using the Gödel–Gentzen negative translation, we can thus embed classical first-order logic into linear first-order logic.


The resource interpretation

Lafont (1993) first showed how intuitionistic linear logic can be explained as a logic of resources, so providing the logical language with access to formalisms that can be used for reasoning about resources within the logic itself, rather than, as in classical logic, by means of non-logical predicates and relations. Tony Hoare (1985)'s classic example of the vending machine can be used to illustrate this idea.

Suppose we represent having a candy bar by the atomic proposition ''candy'', and having a dollar by ''$1''. To state the fact that a dollar will buy you one candy bar, we might write the implication ''$1'' ⇒ ''candy''. But in ordinary (classical or intuitionistic) logic, from ''A'' and ''A'' ⇒ ''B'' one can conclude ''A'' ∧ ''B''. So, ordinary logic leads us to believe that we can buy the candy bar and keep our dollar! Of course, we can avoid this problem by using more sophisticated encodings, although typically such encodings suffer from the frame problem. However, the rejection of weakening and contraction allows linear logic to avoid this kind of spurious reasoning even with the "naive" rule. Rather than ''$1'' ⇒ ''candy'', we express the property of the vending machine as a ''linear'' implication ''$1'' ⊸ ''candy''. From ''$1'' and this fact, we can conclude ''candy'', but ''not'' ''$1'' ⊗ ''candy''. In general, we can use the linear logic proposition ''A'' ⊸ ''B'' to express the validity of transforming resource ''A'' into resource ''B''.

Running with the example of the vending machine, consider the "resource interpretations" of the other multiplicative and additive connectives. (The exponentials provide the means to combine this resource interpretation with the usual notion of persistent logical truth.)

Multiplicative conjunction (''A'' ⊗ ''B'') denotes simultaneous occurrence of resources, to be used as the consumer directs. For example, if you buy a stick of gum and a bottle of soft drink, then you are requesting ''gum'' ⊗ ''drink''. The constant 1 denotes the absence of any resource, and so functions as the unit of ⊗.

Additive conjunction (''A'' & ''B'') represents alternative occurrence of resources, the choice of which the consumer controls. If in the vending machine there is a packet of chips, a candy bar, and a can of soft drink, each costing one dollar, then for that price you can buy exactly one of these products. Thus we write ''$1'' ⊸ (''candy'' & ''chips'' & ''drink''). We do ''not'' write ''$1'' ⊸ (''candy'' ⊗ ''chips'' ⊗ ''drink''), which would imply that one dollar suffices for buying all three products together. However, from ''$1'' ⊸ (''candy'' & ''chips'' & ''drink''), we can correctly deduce ''$3'' ⊸ (''candy'' ⊗ ''chips'' ⊗ ''drink''), where ''$3'' := ''$1'' ⊗ ''$1'' ⊗ ''$1''. The unit ⊤ of additive conjunction can be seen as a wastebasket for unneeded resources. For example, we can write ''$3'' ⊸ (''candy'' ⊗ ⊤) to express that with three dollars you can get a candy bar and some other stuff, without being more specific (for example, chips and a drink, or $2, or $1 and chips, etc.).

Additive disjunction (''A'' ⊕ ''B'') represents alternative occurrence of resources, the choice of which the machine controls. For example, suppose the vending machine permits gambling: insert a dollar and the machine may dispense a candy bar, a packet of chips, or a soft drink. We can express this situation as ''$1'' ⊸ (''candy'' ⊕ ''chips'' ⊕ ''drink''). The constant 0 represents a product that cannot be made, and thus serves as the unit of ⊕ (a machine that might produce ''A'' or 0 is as good as a machine that always produces ''A'', because it will never succeed in producing a 0). So unlike above, we cannot deduce ''$3'' ⊸ (''candy'' ⊗ ''chips'' ⊗ ''drink'') from this.

Multiplicative disjunction (''A'' ⅋ ''B'') is more difficult to gloss in terms of the resource interpretation, although we can encode it back into linear implication, either as ''A''⊥ ⊸ ''B'' or ''B''⊥ ⊸ ''A''.
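The resource reading has a direct echo in programming languages with ownership. The following sketch (our own illustration; `Dollar`, `Candy`, and `buy_candy` are hypothetical names) uses Rust's move semantics, which enforce an affine discipline: the dollar is consumed by the purchase and cannot be used again, mirroring ''$1'' ⊸ ''candy'' rather than ''$1'' ⇒ ''candy'':

```rust
// Sketch (hypothetical names): Rust move semantics as an affine
// approximation of the resource interpretation.

#[derive(Debug, PartialEq)]
struct Dollar; // the resource $1

#[derive(Debug, PartialEq)]
struct Candy; // the resource candy

// The vending machine takes the Dollar by value, consuming it, and
// produces a Candy: a function of type Dollar ⊸ Candy, so to speak.
fn buy_candy(_d: Dollar) -> Candy {
    Candy
}

fn main() {
    let d = Dollar;
    let c = buy_candy(d);
    assert_eq!(c, Candy);
    // buy_candy(d); // would not compile: `d` was moved, so we cannot
    //               // "buy the candy bar and keep our dollar"
}
```

Uncommenting the second call makes the program fail to compile, the type-level analogue of rejecting contraction. (Rust is affine rather than linear: values may also be silently dropped, which corresponds to admitting weakening.)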


Other proof systems


Proof nets

Introduced by Jean-Yves Girard, proof nets were created to avoid the ''bureaucracy'' of sequent calculus, that is, all the things that make two derivations different from the logical point of view, but not from a "moral" point of view. For instance, two proofs that apply the same rules in a different but irrelevant order are "morally" identical. The goal of proof nets is to make such proofs literally identical by giving them a single graphical representation.


Semantics


Algebraic semantics


Decidability/complexity of entailment

The entailment relation in full CLL is undecidable. When considering fragments of CLL, the decision problem has varying complexity:

* Multiplicative linear logic (MLL): only the multiplicative connectives. MLL entailment is NP-complete, even when restricted to Horn clauses in the purely implicative fragment, or to atom-free formulas.
* Multiplicative-additive linear logic (MALL): only multiplicatives and additives (i.e., exponential-free). MALL entailment is PSPACE-complete.
* Multiplicative-exponential linear logic (MELL): only multiplicatives and exponentials. By reduction from the reachability problem for Petri nets, MELL entailment must be at least EXPSPACE-hard, although decidability itself has had the status of a longstanding open problem. In 2015, a proof of decidability was published in the journal ''TCS'', but was later shown to be erroneous.
* Affine linear logic (that is, linear logic with weakening, an extension rather than a fragment) was shown to be decidable in 1995.


Variants

Many variations of linear logic arise by further tinkering with the structural rules:

* Affine logic, which forbids contraction but allows global weakening (a decidable extension).
* Strict logic or relevant logic, which forbids weakening but allows global contraction.
* Non-commutative logic or ordered logic, which removes the rule of exchange, in addition to barring weakening and contraction. In ordered logic, linear implication divides further into left-implication and right-implication.

Different intuitionistic variants of linear logic have been considered. When based on a single-conclusion sequent calculus presentation, as in ILL (Intuitionistic Linear Logic), the connectives ⅋, ⊥, and ? are absent, and linear implication is treated as a primitive connective. In FILL (Full Intuitionistic Linear Logic) the connectives ⅋, ⊥, and ? are present, linear implication is a primitive connective and, similarly to what happens in intuitionistic logic, all connectives (except linear negation) are independent. There are also first- and higher-order extensions of linear logic, whose formal development is somewhat standard (see first-order logic and higher-order logic).


See also

* Chu spaces
* Computability logic
* Game semantics
* Geometry of interaction
* Intuitionistic logic
* Linear logic programming
* Linear type system, a substructural type system
* Logic of unity (LU)
* Ludics
* Proof nets
* Uniqueness type


References


Further reading

* Girard, Jean-Yves. ''Linear logic'', Theoretical Computer Science, Vol 50, no 1, pp. 1–102, 1987.
* Girard, Jean-Yves, Lafont, Yves, and Taylor, Paul. ''Proofs and Types''. Cambridge University Press, 1989.
* Hoare, C. A. R., 1985. ''Communicating Sequential Processes''. Prentice-Hall International.
* Lafont, Yves, 1993. ''Introduction to Linear Logic''. Lecture notes from TEMPUS Summer School on ''Algebraic and Categorical Methods in Computer Science'', Brno, Czech Republic.
* Troelstra, A. S. ''Lectures on Linear Logic''. CSLI (Center for the Study of Language and Information) Lecture Notes No. 29. Stanford, 1992.
* Troelstra, A. S., and Schwichtenberg, H. (1996). ''Basic Proof Theory''. In series ''Cambridge Tracts in Theoretical Computer Science'', Cambridge University Press.
* Di Cosmo, Roberto, and Danos, Vincent. ''The linear logic primer''.
* ''Introduction to Linear Logic'' (Postscript) by Patrick Lincoln.
* ''Introduction to Linear Logic'' by Torben Braüner.
* Papers on linear logic by Philip Wadler.
* ''Linear Logic'' by Roberto Di Cosmo and Dale Miller. The Stanford Encyclopedia of Philosophy (Fall 2006 Edition), Edward N. Zalta (ed.).
* ''Overview of linear logic programming'' by Dale Miller. In ''Linear Logic in Computer Science'', edited by Ehrhard, Girard, Ruet, and Scott. Cambridge University Press. London Mathematical Society Lecture Notes, Volume 316, 2004.
* Linear Logic Wiki.

External links

* A Linear Logic Prover (llprover), available for use online, from Naoyuki Tamura, Dept. of CS, Kobe University, Japan.