
In logic and proof theory, natural deduction is a kind of proof calculus in which logical reasoning is expressed by inference rules closely related to the "natural" way of reasoning. This contrasts with Hilbert-style systems, which instead use axioms as much as possible to express the logical laws of deductive reasoning.


History

Natural deduction grew out of a context of dissatisfaction with the axiomatizations of deductive reasoning common to the systems of Hilbert, Frege, and Russell (see, e.g., Hilbert system). Such axiomatizations were most famously used by Russell and Whitehead in their mathematical treatise ''Principia Mathematica''. Spurred on by a series of seminars in Poland in 1926 by Łukasiewicz that advocated a more natural treatment of logic, Jaśkowski made the earliest attempts at defining a more natural deduction, first in 1929 using a diagrammatic notation, and later updating his proposal in a sequence of papers in 1934 and 1935. His proposals led to different notations such as Fitch notation or Suppes' method, for which Lemmon gave a variant now known as Suppes–Lemmon notation. Natural deduction in its modern form was independently proposed by the German mathematician Gerhard Gentzen in 1933, in a dissertation delivered to the faculty of mathematical sciences of the University of Göttingen. The term ''natural deduction'' (or rather, its German equivalent ''natürliches Schließen'') was coined in that paper: Gentzen was motivated by a desire to establish the consistency of number theory. He was unable to prove the main result required for the consistency result, the cut elimination theorem (the Hauptsatz), directly for natural deduction. For this reason he introduced his alternative system, the sequent calculus, for which he proved the Hauptsatz both for classical and intuitionistic logic. In a series of seminars in 1961 and 1962 Prawitz gave a comprehensive summary of natural deduction calculi, and transported much of Gentzen's work with sequent calculi into the natural deduction framework. His 1965 monograph ''Natural deduction: a proof-theoretical study'' was to become a reference work on natural deduction, and included applications for modal and second-order logic.

In natural deduction, a proposition is deduced from a collection of premises by applying inference rules repeatedly. The system presented in this article is a minor variation of Gentzen's or Prawitz's formulation, but with a closer adherence to Martin-Löf's description of logical judgments and connectives.


History of notation styles

Natural deduction has had a large variety of notation styles, which can make it difficult to recognize a proof if one is not familiar with them. To help with this situation, this article has a section explaining how to read all the notation that it will actually use. This section explains only the historical evolution of notation styles, most of which cannot be shown because there are no illustrations available under a public copyright license; the reader is pointed to the SEP and IEP for pictures.
* Gentzen invented natural deduction using tree-shaped proofs; see the section on Gentzen's tree notation below for details.
* Jaśkowski changed this to a notation that used various nested boxes.
* Fitch changed Jaśkowski's method of drawing the boxes, creating Fitch notation.
* 1940: In a textbook, Quine indicated antecedent dependencies by line numbers in square brackets, anticipating Suppes' 1957 line-number notation.
* 1950: In a textbook, Quine demonstrated a method of using one or more asterisks to the left of each line of proof to indicate dependencies. This is equivalent to Kleene's vertical bars. (It is not totally clear if Quine's asterisk notation appeared in the original 1950 edition or was added in a later edition.)
* 1957: Suppes gave an introduction to practical logic theorem proving in a textbook, indicating dependencies (i.e. antecedent propositions) by line numbers at the left of each line.
* 1963: A textbook uses sets of line numbers to indicate antecedent dependencies of the lines of sequential logical arguments based on natural deduction inference rules.
* 1965: The entire textbook by Lemmon is an introduction to logic proofs using a method based on that of Suppes, now known as Suppes–Lemmon notation.
* 1967: In a textbook, Kleene briefly demonstrated two kinds of practical logic proofs: one system using explicit quotations of antecedent propositions on the left of each line, the other using vertical bar-lines on the left to indicate dependencies.


Notation

Here is a table with the most common notational variants for logical connectives.
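
The table itself is not reproduced in this copy; as an illustrative selection (not the original table), commonly seen variants include:

{| class="wikitable" style="margin:auto;"
|+ Common notational variants (illustrative)
! Connective !! Common symbols
|-
| negation || ¬A, ~A, −A, A′
|-
| conjunction || A ∧ B, A & B, A · B
|-
| disjunction || A ∨ B, A + B
|-
| implication || A → B, A ⊃ B
|-
| equivalence || A ↔ B, A ≡ B
|}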


Gentzen's tree notation

Gentzen, who invented natural deduction, had his own notation style for arguments. This will be exemplified by a simple argument below. Let's say we have a simple example argument in propositional logic, such as, "if it's raining then it's cloudy; it is raining; therefore it's cloudy". (This is in modus ponens.) Representing this as a list of propositions, as is common, we would have:
:1) ~ P \to Q
:2) ~ P
:\therefore ~ Q
In Gentzen's notation, this would be written like this:
:\frac{P \to Q, \; P}{Q}
The premises are shown above a line, called the inference line, separated by a comma, which indicates ''combination'' of premises. The conclusion is written below the inference line. The inference line represents ''syntactic consequence'', sometimes called ''deductive consequence'', which is also symbolized with ⊢. So the above can also be written in one line as P \to Q, P \vdash Q. (The turnstile, for syntactic consequence, is of lower precedence than the comma, which represents premise combination, which in turn is of lower precedence than the arrow, used for material implication; so no parentheses are needed to interpret this formula.) Syntactic consequence is contrasted with ''semantic consequence'', which is symbolized with ⊧. In this case, the conclusion follows ''syntactically'' because natural deduction is a syntactic proof system, which assumes inference rules as primitives.

Gentzen's style will be used in much of this article. Gentzen's discharging annotations used to internalise hypothetical judgments can be avoided by representing proofs as a tree of sequents ''Γ ⊢ A'' instead of a tree of judgments that A (is true).
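
For instance, the modus ponens argument above, rendered as a tree of sequents rather than a tree of judgments (an illustrative rendering, not taken from the original article), looks like this:
: \cfrac{\cfrac{}{P \to Q,\, P \vdash P \to Q}\ \text{hyp} \qquad \cfrac{}{P \to Q,\, P \vdash P}\ \text{hyp}}{P \to Q,\, P \vdash Q}\ \to_E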


Suppes–Lemmon notation

Many textbooks use Suppes–Lemmon notation, so this article will also give that, although as of now it is only included for propositional logic, and the rest of the coverage is given only in Gentzen style. A proof, laid out in accordance with the Suppes–Lemmon notation style, is a sequence of lines containing sentences, where each sentence is either an assumption, or the result of applying a rule of proof to earlier sentences in the sequence. Each line of proof is made up of a sentence of proof, together with its annotation, its assumption set, and the current line number. The assumption set lists the assumptions on which the given sentence of proof depends, which are referenced by the line numbers. The annotation specifies which rule of proof was applied, and to which earlier lines, to yield the current sentence. An example proof will be clearer once the inference rules and their appropriate annotations have been specified; see the section on Suppes–Lemmon-style inference rules below.


Propositional language syntax

This section defines the formal syntax for a propositional logic language, contrasting the common ways of doing so with a Gentzen-style approach.


Common definition styles

In classical propositional calculus, the formal language \mathcal{L} is usually defined (here: by recursion) as follows:
# Each propositional variable is a formula.
# "\bot" is a formula.
# If \varphi and \psi are formulae, so are (\varphi \land \psi), (\varphi \lor \psi), (\varphi \to \psi), (\varphi \leftrightarrow \psi).
# Nothing else is a formula.
Negation (\neg) is defined as implication to falsity:
:\neg \phi \; \overset{\text{def}}{=} \; \phi \to \bot,
where \bot (falsum) represents a contradiction or absolute falsehood. Older publications, and publications that do not focus on logical systems like minimal, intuitionistic or Hilbert systems, take negation as a primitive logical connective, meaning it is assumed as a basic operation and not defined in terms of other connectives. Some authors, such as Bostock, use \bot and \top, and also take \neg as primitive.
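
To make the recursive definition concrete, here is a minimal Haskell sketch of the same grammar; the names (Formula, neg, and so on) are illustrative and not taken from the article.

 -- A propositional formula, following the recursive definition above:
 -- propositional variables, falsity, and the four binary connectives.
 data Formula
   = Var String             -- propositional variable
   | Bottom                  -- ⊥
   | And Formula Formula     -- (φ ∧ ψ)
   | Or  Formula Formula     -- (φ ∨ ψ)
   | Imp Formula Formula     -- (φ → ψ)
   | Iff Formula Formula     -- (φ ↔ ψ)
   deriving (Eq, Show)
 
 -- Negation is not primitive here; it is defined as implication to falsity.
 neg :: Formula -> Formula
 neg phi = Imp phi Bottom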


Gentzen-style definition

A syntax definition can also be given in Gentzen style, by writing well-formed formulas below the inference line and any schematic variables used by those formulas above it. For instance, the equivalent of rules 3 and 4, from Bostock's definition above, is written as follows:
:\frac{\varphi}{\neg\varphi} \quad \frac{\varphi \quad \psi}{(\varphi \land \psi)} \quad \frac{\varphi \quad \psi}{(\varphi \lor \psi)} \quad \frac{\varphi \quad \psi}{(\varphi \to \psi)} \quad \frac{\varphi \quad \psi}{(\varphi \leftrightarrow \psi)}.
A different notational convention sees the language's syntax as a categorial grammar with the single category "formula", denoted by the symbol \mathcal{F}. So any elements of the syntax are introduced by categorizations, for which the notation is \varphi : \mathcal{F}, meaning "\varphi is an expression for an object in the category \mathcal{F}." The sentence-letters, then, are introduced by categorizations such as P : \mathcal{F}, Q : \mathcal{F}, R : \mathcal{F}, and so on; the connectives, in turn, are defined by statements similar to the above, but using categorization notation, as seen below:
:\frac{\varphi : \mathcal{F}}{\neg\varphi : \mathcal{F}} \quad \frac{\varphi : \mathcal{F} \quad \psi : \mathcal{F}}{(\varphi \land \psi) : \mathcal{F}} \quad \frac{\varphi : \mathcal{F} \quad \psi : \mathcal{F}}{(\varphi \lor \psi) : \mathcal{F}} \quad \frac{\varphi : \mathcal{F} \quad \psi : \mathcal{F}}{(\varphi \to \psi) : \mathcal{F}} \quad \frac{\varphi : \mathcal{F} \quad \psi : \mathcal{F}}{(\varphi \leftrightarrow \psi) : \mathcal{F}}.
In the rest of this article, the \varphi : \mathcal{F} categorization notation will be used for any Gentzen-notation statements defining the language's grammar; any other statements in Gentzen notation will be inferences, asserting that a sequent follows rather than that an expression is a well-formed formula.


Gentzen-style propositional logic


Gentzen-style inference rules

Let the propositional language \mathcal{L} be inductively defined as
:\Phi ::= p_1, p_2, \dots \mid \bot \mid (\Phi \to \Phi) \mid (\Phi \land \Phi) \mid (\Phi \lor \Phi).
Define negation as \neg \Phi \, \overset{\text{def}}{=} \, (\Phi \to \bot). The following is a list of primitive inference rules for natural deduction in propositional logic (a sketch of the rules in their standard form is given after the list below). In this table the Greek letters \varphi, \psi, \chi are ''schemata'', which range over formulas rather than only over atomic propositions. The name of a rule is given to the right of its formula tree. For instance, the first introduction rule is named \land_I, which is short for "conjunction introduction".
* Minimal logic: the natural deduction rules are ND_{\mathrm{m}} = \{\land_I, \land_E, \lor_I, \lor_E, \to_I, \to_E\}.
:Without the rules \bot_E and \neg\neg_E the system defines minimal logic (as discussed by Johansson).
* Intuitionistic logic: the natural deduction rules are ND_{\mathrm{int}} = ND_{\mathrm{m}} \cup \{\bot_E\}.
:When the rule \bot_E (principle of explosion) is added to the rules for minimal logic, the system defines intuitionistic logic.
:The statement P \to \neg \neg P is valid (already in minimal logic, see example 1 below), unlike the reverse implication, which would entail the law of excluded middle.
* Classical logic: the natural deduction rules are ND_{\mathrm{c}} = ND_{\mathrm{int}} \cup \{\neg\neg_E\}.
:When all listed natural deduction rules are admitted, the system defines classical logic.
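
The table of rules referred to above is not reproduced in this copy. As a sketch, assuming the usual formulation of natural deduction, the rules named in the sets above can be written as follows (square brackets mark hypotheses discharged by the rule):
:\frac{\varphi \qquad \psi}{\varphi \land \psi}\ \land_I \qquad \frac{\varphi \land \psi}{\varphi}\ \land_{E1} \qquad \frac{\varphi \land \psi}{\psi}\ \land_{E2}
:\frac{\varphi}{\varphi \lor \psi}\ \lor_{I1} \qquad \frac{\psi}{\varphi \lor \psi}\ \lor_{I2} \qquad \cfrac{\varphi \lor \psi \qquad \begin{matrix}[\varphi]^u \\ \vdots \\ \chi\end{matrix} \qquad \begin{matrix}[\psi]^v \\ \vdots \\ \chi\end{matrix}}{\chi}\ \lor_{E^{u,v}}
:\cfrac{\begin{matrix}[\varphi]^u \\ \vdots \\ \psi\end{matrix}}{\varphi \to \psi}\ \to_{I^u} \qquad \frac{\varphi \to \psi \qquad \varphi}{\psi}\ \to_E
:\frac{\bot}{\varphi}\ \bot_E \qquad \frac{\neg\neg\varphi}{\varphi}\ \neg\neg_E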


Gentzen-style example proofs

Example 1: Proof, within minimal logic, of P \to \neg \neg P. Goal: P \to ((P \to \bot) \to \bot)
Proof:
: \cfrac{\cfrac{\cfrac{[P \to \bot]^v \qquad [P]^u}{\bot}\ \to_E}{(P \to \bot) \to \bot}\ \to_{I^v}}{P \to ((P \to \bot) \to \bot)}\ \to_{I^u}

Example 2: Proof, within minimal logic, of A \to \left( B \to \left( A \land B \right) \right):
: \cfrac{\cfrac{\cfrac{[A]^u \qquad [B]^v}{A \land B}\ \land_I}{B \to (A \land B)}\ \to_{I^v}}{A \to (B \to (A \land B))}\ \to_{I^u}


Fitch-style propositional logic

Fitch developed a system of natural deduction which is characterized by
* linear presentation of the proof, instead of presentation as a tree;
* subordinate proofs, where assumptions can be opened within a subderivation and discharged later.
Later logicians and educators such as Patrick Suppes and E. J. Lemmon rebranded Fitch's system. While they introduced graphical changes, such as replacing indentation with vertical bars, the underlying structure of Fitch-style natural deduction remained intact. These variations are often referred to as the Suppes–Lemmon format, though they are fundamentally based on Fitch's original notation.


Suppes–Lemmon-style propositional logic


Suppes–Lemmon-style inference rules

The linear presentation used in Fitch- and Suppes–Lemmon-style proofs (with line numbers and vertical alignment or assumption sets) makes subproofs clearly visible. Fitch used derived rules sparingly and cautiously. Suppes and Lemmon went further and added derived rules to the toolbox of natural deduction rules.

Suppes introduced natural deduction using Gentzen-style rules.
* He defined negation in terms of contradiction: \neg P \equiv (P \to \bot).
* He discussed derived rules explicitly, though not always distinguishing them clearly from primitive ones in layout.
* His system is close to minimal logic, but allows derived steps for brevity.

Lemmon formalized more derived rules. He likewise defined negation as implication to falsity: \neg P \equiv P \to \bot. This is not stated as a formal definition in ''Beginning Logic'', but it is implicitly assumed throughout the system. Here is how we know:
* Use of RAA (reductio ad absurdum): Lemmon regularly used RAA in the form: assume P, derive \bot, then conclude \neg P.
: This only works if \neg P is understood as P \to \bot.
* Proofs involving contradiction: Lemmon used the fact that from \neg P \land P one can derive \bot.
: This requires treating \neg P as P \to \bot, so that modus ponens yields the contradiction.
* Absence of a primitive "¬" rule: Lemmon did not include a standalone rule for introducing or eliminating ¬. Instead, he derived negation using implication and contradiction.

The table below is based on Lemmon (1978) and Allen & Hand (2022); Lemmon's derived rules can be derived from the primitive Gentzen-style rules. There are nine primitive rules of proof: the rule ''assumption'', plus four pairs of introduction and elimination rules for the binary connectives, and the rules of ''double negation'' and ''reductio ad absurdum'', of which only one is needed. ''Disjunctive syllogism'' can be used as an easier alternative to the proper ∨-elimination, and MTT is a commonly given rule, although it is not primitive.

{| class="wikitable" style="margin:auto;"
|+ List of Inference Rules
|-
! Rule Name !! Alternative names !! Annotation !! Assumption set !! Statement
|-
| Rule of Assumptions || Assumption || A || The current line number. || At any stage of the argument, introduce a proposition as an assumption of the argument.
|-
| Conjunction introduction || Ampersand introduction, conjunction (CONJ) || m, n &I || The union of the assumption sets at lines m and n. || From \varphi and \psi at lines m and n, infer \varphi \land \psi.
|-
| Conjunction elimination || Simplification (S), ampersand elimination || m &E || The same as at line m. || From \varphi \land \psi at line m, infer \varphi and \psi.
|-
| Disjunction introduction || Addition (ADD) || m ∨I || The same as at line m. || From \varphi at line m, infer \varphi \lor \psi, whatever \psi may be.
|-
| Disjunction elimination || Wedge elimination, dilemma (DL) || j, k, l, m, n ∨E || The union of the assumption sets at lines j, l and n, excluding k and m (the discharged assumptions). || From \varphi \lor \psi at line j, and an assumption of \varphi at line k, and a derivation of \chi from \varphi at line l, and an assumption of \psi at line m, and a derivation of \chi from \psi at line n, infer \chi.
|-
| Arrow introduction || Conditional proof (CP), conditional introduction || n, →I (m) || Everything in the assumption set at line n, excepting m, the line where the antecedent was assumed. || From \psi at line n, following from the assumption of \varphi at line m, infer \varphi \to \psi.
|-
| Arrow elimination || Modus ponendo ponens (MPP), modus ponens (MP), conditional elimination || m, n →E || The union of the assumption sets at lines m and n. || From \varphi \to \psi at line m, and \varphi at line n, infer \psi.
|-
| Double negation || Double negation elimination || m DN || The same as at line m. || From \neg \neg \varphi at line m, infer \varphi.
|-
| Reductio ad absurdum || Indirect proof (IP), negation introduction (¬I), negation elimination (¬E) || m, n RAA (k) || The union of the assumption sets at lines m and n, excluding k (the denied assumption). || From a sentence and its denial at lines m and n, infer the denial of any assumption appearing in the proof (at line k).
|}
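
To illustrate how these rules and annotations are used (a worked example supplied for illustration; it is not the example table from the original article), here is a Suppes–Lemmon-style derivation of modus tollens:

{| class="wikitable" style="margin:auto;"
|+ Example: P → Q, ¬Q ⊢ ¬P
! Assumption set !! Line !! Sentence !! Annotation
|-
| 1 || (1) || P → Q || A
|-
| 2 || (2) || ¬Q || A
|-
| 3 || (3) || P || A
|-
| 1, 3 || (4) || Q || 1, 3 →E
|-
| 1, 2 || (5) || ¬P || 4, 2 RAA (3)
|}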


Consistency, completeness, and normal forms

The introduction and elimination rules for each connective come in matched pairs, and this pairing supports two local correctness criteria: a detour in which an introduction is immediately followed by the corresponding elimination can be simplified away (local soundness), and any derivation of a compound formula can be expanded so that its principal connective is explicitly introduced (local completeness). These notions correspond exactly to β-reduction (beta reduction) and η-conversion (eta conversion) in the lambda calculus
, using the Curry–Howard isomorphism. By local completeness, we see that every derivation can be converted to an equivalent derivation where the principal connective is introduced. In fact, if the entire derivation obeys this ordering of eliminations followed by introductions, then it is said to be ''normal''. In a normal derivation all eliminations happen above introductions. In most logics, every derivation has an equivalent normal derivation, called a '' normal form''. The existence of normal forms is generally hard to prove using natural deduction alone, though such accounts do exist in the literature, most notably by
Dag Prawitz
in 1961. It is much easier to show this indirectly by means of a cut-free
sequent calculus
presentation.
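
As a rough Haskell illustration of this correspondence (a sketch under the usual Curry–Howard reading, with illustrative names), conjunction becomes a pair type and implication a function type; the local soundness reduction for conjunction is then just β-reduction for pairs.

 -- Under the Curry–Howard reading, a proof of A ∧ B is a pair and a proof of
 -- A → B is a function. Example 2 above, A → (B → (A ∧ B)), becomes pairing:
 pairProof :: a -> (b -> (a, b))
 pairProof x y = (x, y)
 
 -- A "detour" (∧-introduction immediately followed by ∧-elimination)
 -- normalises away: fst (x, y) β-reduces to x.
 detour :: a -> b -> a
 detour x y = fst (x, y)    -- equivalent, after reduction, to \x y -> x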


First and higher-order extensions

The logic of the earlier section is an example of a ''single-sorted'' logic, ''i.e.'', a logic with a single kind of object: propositions. Many extensions of this simple framework have been proposed; in this section we will extend it with a second sort of ''individuals'' or ''terms''. More precisely, we will add a new category, "term", denoted \mathcal{T}. We shall fix a countable set ''V'' of ''variables'', another countable set ''F'' of ''function symbols'', and construct terms with the following formation rules:
: \frac{v \in V}{v : \mathcal{T}}\ \hbox{var}_F
and
: \frac{f \in F \qquad t_1 : \mathcal{T} \qquad t_2 : \mathcal{T} \qquad \cdots \qquad t_n : \mathcal{T}}{f(t_1, t_2, \cdots, t_n) : \mathcal{T}}\ \hbox{app}_F
For propositions, we consider a third countable set ''P'' of ''predicates'', and define ''atomic predicates over terms'' with the following formation rule:
: \frac{\phi \in P \qquad t_1 : \mathcal{T} \qquad t_2 : \mathcal{T} \qquad \cdots \qquad t_n : \mathcal{T}}{\phi(t_1, t_2, \cdots, t_n) : \mathcal{F}}\ \hbox{pred}_F
The first two rules of formation provide a definition of a term that is effectively the same as that defined in term algebra and model theory, although the focus of those fields of study is quite different from natural deduction. The third rule of formation effectively defines an atomic formula, as in first-order logic, and again in model theory. To these are added a pair of formation rules, defining the notation for ''quantified'' propositions; one for universal (∀) and existential (∃) quantification:
: \frac{x \in V \qquad A : \mathcal{F}}{\forall x.A : \mathcal{F}}\ \forall_F \qquad\qquad \frac{x \in V \qquad A : \mathcal{F}}{\exists x.A : \mathcal{F}}\ \exists_F
The universal quantifier has the introduction and elimination rules:
: \cfrac{ \begin{array}{c} \cfrac{}{a : \mathcal{T}}\text{ u} \\ \vdots \\ [a/x]A \end{array} }{\forall x.A}\ \forall_{I^{u,a}} \qquad\qquad \frac{\forall x.A \qquad t : \mathcal{T}}{[t/x]A}\ \forall_{E}
The existential quantifier has the introduction and elimination rules:
: \frac{[t/x]A}{\exists x.A}\ \exists_{I} \qquad\qquad \cfrac{ \begin{array}{cc} & \underbrace{\,\cfrac{}{a : \mathcal{T}}\hbox{ u} \quad \cfrac{}{[a/x]A}\hbox{ v}\,} \\ & \vdots \\ \exists x.A \quad & C \\ \end{array} }{C}\ \exists_{E^{a,u,v}}
In these rules, the notation [''t''/''x'']''A'' stands for the substitution of ''t'' for every (visible) instance of ''x'' in ''A'', avoiding capture. As before, the superscripts on the name stand for the components that are discharged: the term ''a'' cannot occur in the conclusion of ∀I (such terms are known as ''eigenvariables'' or ''parameters''), and the hypotheses named ''u'' and ''v'' in ∃E are localised to the second premise in a hypothetical derivation. Although the propositional logic of earlier sections was decidable, adding the quantifiers makes the logic undecidable.

So far, the quantified extensions are ''first-order'': they distinguish propositions from the kinds of objects quantified over. Higher-order logic takes a different approach and has only a single sort of propositions. The quantifiers have as the domain of quantification the very same sort of propositions, as reflected in the formation rules:
: \cfrac{ \begin{matrix} \cfrac{}{p : \mathcal{F}}\hbox{ u} \\ \vdots \\ A : \mathcal{F} \\ \end{matrix} }{\forall p.A : \mathcal{F}}\ \forall_{F^u} \qquad\qquad \cfrac{ \begin{matrix} \cfrac{}{p : \mathcal{F}}\hbox{ u} \\ \vdots \\ A : \mathcal{F} \\ \end{matrix} }{\exists p.A : \mathcal{F}}\ \exists_{F^u}
A discussion of the introduction and elimination forms for higher-order logic is beyond the scope of this article. It is possible to be in-between first-order and higher-order logics. For example, second-order logic has two kinds of propositions, one kind quantifying over terms, and the second kind quantifying over propositions of the first kind.
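
To make the two-sorted syntax concrete, here is a minimal Haskell sketch of terms, atomic formulas and quantifiers, with a naive substitution; a real implementation of [t/x]A must additionally rename bound variables to avoid capture, which is omitted here. All names are illustrative.

 -- Terms: variables and applications of function symbols (the var_F and app_F rules).
 data Term = TVar String | TApp String [Term]
   deriving (Eq, Show)
 
 -- Formulas: atomic predicates over terms (pred_F), one connective, and the quantifiers.
 data FOFormula
   = Pred String [Term]
   | FOImp FOFormula FOFormula
   | Forall String FOFormula
   | Exists String FOFormula
   deriving (Eq, Show)
 
 -- Substitution of a term for a variable inside a term.
 substTerm :: String -> Term -> Term -> Term
 substTerm x t (TVar y)
   | x == y    = t
   | otherwise = TVar y
 substTerm x t (TApp f ts) = TApp f (map (substTerm x t) ts)
 
 -- Naive substitution [t/x]A on formulas: it stops at a binder for x but does
 -- NOT rename bound variables to avoid capture (as the text above requires).
 subst :: String -> Term -> FOFormula -> FOFormula
 subst x t (Pred p ts)  = Pred p (map (substTerm x t) ts)
 subst x t (FOImp a b)  = FOImp (subst x t a) (subst x t b)
 subst x t (Forall y a)
   | x == y    = Forall y a
   | otherwise = Forall y (subst x t a)
 subst x t (Exists y a)
   | x == y    = Exists y a
   | otherwise = Exists y (subst x t a)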


Proofs and type theory

The presentation of natural deduction so far has concentrated on the nature of propositions without giving a formal definition of a ''proof''. To formalise the notion of proof, we alter the presentation of hypothetical derivations slightly. We label the antecedents with ''proof variables'' (from some countable set ''V'' of variables), and decorate the succedent with the actual proof. The antecedents or ''hypotheses'' are separated from the succedent by means of a ''
turnstile'' (⊢). This modification sometimes goes under the name of ''localised hypotheses''. The following diagram summarises the change.

 ──── u1  ──── u2  ...  ──── un
  J1       J2            Jn
               ⋮
               J
                                   ⇒      u1:J1, u2:J2, ..., un:Jn ⊢ J

The collection of hypotheses will be written as Γ when their exact composition is not relevant. To make proofs explicit, we move from the proof-less judgment "''A''" to a judgment: "π ''is a proof of (A)''", which is written symbolically as "π : ''A''". Following the standard approach, proofs are specified with their own formation rules for the judgment "π ''proof''". The simplest possible proof is the use of a labelled hypothesis; in this case the evidence is the label itself.

 u ∈ V
 ─────── proof-F                  ───────────── hyp
 u proof                          u:A ⊢ u : A

Let us re-examine some of the connectives with explicit proofs. For conjunction, we look at the introduction rule ∧I to discover the form of proofs of conjunction: they must be a pair of proofs of the two conjuncts. Thus:

 π1 proof    π2 proof             Γ ⊢ π1 : A    Γ ⊢ π2 : B
 ──────────────────── pair-F      ───────────────────────── ∧I
 (π1, π2) proof                   Γ ⊢ (π1, π2) : A ∧ B

The elimination rules ∧E1 and ∧E2 select either the left or the right conjunct; thus the proofs are a pair of projections: first (fst) and second (snd).

 π proof                          Γ ⊢ π : A ∧ B
 ─────────── fst-F                ───────────── ∧E1
 fst π proof                      Γ ⊢ fst π : A

 π proof                          Γ ⊢ π : A ∧ B
 ─────────── snd-F                ───────────── ∧E2
 snd π proof                      Γ ⊢ snd π : B

For implication, the introduction form localises or ''binds'' the hypothesis, written using a λ; this corresponds to the discharged label. In the rule, "Γ, ''u'':''A''" stands for the collection of hypotheses Γ, together with the additional hypothesis ''u''.

 π proof                          Γ, u:A ⊢ π : B
 ──────────── λ-F                 ───────────────── ⊃I
 λu. π proof                      Γ ⊢ λu. π : A ⊃ B

 π1 proof    π2 proof             Γ ⊢ π1 : A ⊃ B    Γ ⊢ π2 : A
 ─────────────────── app-F        ──────────────────────────── ⊃E
 π1 π2 proof                      Γ ⊢ π1 π2 : B

With proofs available explicitly, one can manipulate and reason about proofs. The key operation on proofs is the substitution of one proof for an assumption used in another proof. This is commonly known as a ''substitution theorem'', and can be proved by induction on the depth (or structure) of the second judgment.
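
To make the judgment Γ ⊢ π : A concrete, here is a minimal Haskell sketch of proof terms for the ∧/⊃ fragment, together with a checker for that judgment; the datatype and function names are illustrative, not from the article.

 -- Propositions of the ∧/⊃ fragment.
 data Prop
   = Atom String
   | Prop :/\: Prop      -- conjunction A ∧ B
   | Prop :->: Prop      -- implication A ⊃ B
   deriving (Eq, Show)
 
 -- Proof terms, mirroring the formation rules above.
 data Proof
   = Hyp String             -- use of a labelled hypothesis u
   | Pair Proof Proof       -- ∧I : (π1, π2)
   | Fst Proof              -- ∧E1 : fst π
   | Snd Proof              -- ∧E2 : snd π
   | Lam String Prop Proof  -- ⊃I : λu. π (annotated with the hypothesis type)
   | App Proof Proof        -- ⊃E : π1 π2
   deriving Show
 
 type Ctx = [(String, Prop)]   -- Γ : labelled hypotheses u : A
 
 -- check Γ π returns Just A exactly when Γ ⊢ π : A is derivable.
 check :: Ctx -> Proof -> Maybe Prop
 check ctx (Hyp u)        = lookup u ctx                               -- hyp
 check ctx (Pair p1 p2)   = (:/\:) <$> check ctx p1 <*> check ctx p2   -- ∧I
 check ctx (Fst p)        = case check ctx p of
                              Just (a :/\: _) -> Just a                -- ∧E1
                              _               -> Nothing
 check ctx (Snd p)        = case check ctx p of
                              Just (_ :/\: b) -> Just b                -- ∧E2
                              _               -> Nothing
 check ctx (Lam u a body) = (a :->:) <$> check ((u, a) : ctx) body     -- ⊃I
 check ctx (App p1 p2)    = case (check ctx p1, check ctx p2) of
                              (Just (a :->: b), Just a') | a == a' -> Just b  -- ⊃E
                              _                                    -> Nothing
 
 -- Example: the proof term for A ⊃ (B ⊃ (A ∧ B)) from Example 2 above.
 example :: Maybe Prop
 example = check [] (Lam "u" a (Lam "v" b (Pair (Hyp "u") (Hyp "v"))))
   where a = Atom "A"; b = Atom "B"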


Substitution theorem

: ''If'' Γ ⊢ π1 : ''A'' ''and'' Γ, ''u'':''A'' ⊢ π2 : ''B'', ''then'' Γ ⊢ [π1/''u'']π2 : ''B''.
So far the judgment "Γ ⊢ π : ''A''" has had a purely logical interpretation. In type theory, the logical view is exchanged for a more computational view of objects. Propositions in the logical interpretation are now viewed as ''types'', and proofs as programs in the lambda calculus. Thus the interpretation of "π : ''A''" is "''the program'' π has type ''A''". The logical connectives are also given a different reading: conjunction is viewed as product (×), implication as the function arrow (→), etc. The differences are only cosmetic, however. Type theory has a natural deduction presentation in terms of formation, introduction and elimination rules; in fact, the reader can easily reconstruct what is known as ''simple type theory'' from the previous sections.

The difference between logic and type theory is primarily a shift of focus from the types (propositions) to the programs (proofs). Type theory is chiefly interested in the convertibility or reducibility of programs. For every type, there are canonical programs of that type which are irreducible; these are known as ''canonical forms'' or ''values''. If every program can be reduced to a canonical form, then the type theory is said to be ''normalising'' (or ''weakly normalising''). If the canonical form is unique, then the theory is said to be ''strongly normalising''. Normalisability is a rare feature of most non-trivial type theories, which is a big departure from the logical world. (Recall that almost every logical derivation has an equivalent normal derivation.) To sketch the reason: in type theories that admit recursive definitions, it is possible to write programs that never reduce to a value; such looping programs can generally be given any type. In particular, the looping program has type ⊥, although there is no logical proof of "⊥". For this reason, the ''propositions as types; proofs as programs'' paradigm only works in one direction, if at all: interpreting a type theory as a logic generally gives an inconsistent logic.
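
A tiny Haskell sketch of this last point (illustrative, assuming a GHC-style empty data declaration): a looping program can be assigned any type, including an empty type playing the role of ⊥.

 -- An empty type: no constructors, so no canonical value inhabits it.
 data Empty
 
 -- A looping program never reduces to a value, yet it has every type.
 loop :: a
 loop = loop
 
 -- In particular it "inhabits" Empty, even though there is no logical proof of ⊥.
 bottom :: Empty
 bottom = loop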


Example: Dependent Type Theory

Like logic, type theory has many extensions and variants, including first-order and higher-order versions. One branch, known as dependent type theory, is used in a number of computer-assisted proof systems. Dependent type theory allows quantifiers to range over programs themselves. These quantified types are written as Π and Σ instead of ∀ and ∃, and have the following formation rules:

 Γ ⊢ A type    Γ, x:A ⊢ B type            Γ ⊢ A type    Γ, x:A ⊢ B type
 ────────────────────────────── Π-F        ────────────────────────────── Σ-F
 Γ ⊢ Πx:A. B type                          Γ ⊢ Σx:A. B type

These types are generalisations of the arrow and product types, respectively, as witnessed by their introduction and elimination rules.

 Γ, x:A ⊢ π : B                            Γ ⊢ π1 : Πx:A. B    Γ ⊢ π2 : A
 ──────────────────── ΠI                   ───────────────────────────── ΠE
 Γ ⊢ λx. π : Πx:A. B                       Γ ⊢ π1 π2 : [π2/x]B

 Γ ⊢ π1 : A    Γ, x:A ⊢ π2 : B             Γ ⊢ π : Σx:A. B
 ───────────────────────────── ΣI          ──────────────── ΣE1
 Γ ⊢ (π1, π2) : Σx:A. B                    Γ ⊢ fst π : A

 Γ ⊢ π : Σx:A. B
 ──────────────────────── ΣE2
 Γ ⊢ snd π : [fst π/x]B

Dependent type theory in full generality is very powerful: it is able to express almost any conceivable property of programs directly in the types of the program. This generality comes at a steep price: either typechecking is undecidable (extensional type theory
), or extensional reasoning is more difficult ( intensional type theory). For this reason, some dependent type theories do not allow quantification over arbitrary programs, but rather restrict to programs of a given decidable ''index domain'', for example integers, strings, or linear programs. Since dependent type theories allow types to depend on programs, a natural question to ask is whether it is possible for programs to depend on types, or any other combination. There are many kinds of answers to such questions. A popular approach in type theory is to allow programs to be quantified over types, also known as ''
parametric polymorphism
''; of this there are two main kinds: if types and programs are kept separate, then one obtains a somewhat more well-behaved system called '' predicative polymorphism''; if the distinction between program and type is blurred, one obtains the type-theoretic analogue of higher-order logic, also known as '' impredicative polymorphism''. Various combinations of dependency and polymorphism have been considered in the literature, the most famous being the
lambda cube of Henk Barendregt
. The intersection of logic and type theory is a vast and active research area. New logics are usually formalised in a general type theoretic setting, known as a logical framework. Popular modern logical frameworks such as the
calculus of constructions
and LF are based on higher-order dependent type theory, with various trade-offs in terms of decidability and expressive power. These logical frameworks are themselves always specified as natural deduction systems, which is a testament to the versatility of the natural deduction approach.


Classical and modal logics

For simplicity, the logics presented so far have been intuitionistic. Classical logic extends intuitionistic logic with an additional axiom or principle of excluded middle:
:For any proposition p, the proposition p ∨ ¬p is true.
This statement is not obviously either an introduction or an elimination; indeed, it involves two distinct connectives. Gentzen's original treatment of excluded middle prescribed one of the following three (equivalent) formulations, which were already present in analogous forms in the systems of
Hilbert and Heyting:

 ────────────── XM1          ¬¬A                  ──────── u
    A ∨ ¬A                   ────────── XM2          ¬A
                                 A                    ⋮
                                                      p
                                                    ────── XM3^{u, p}
                                                      A

(XM3 is merely XM2 expressed in terms of ⊥E.) This treatment of excluded middle, in addition to being objectionable from a purist's standpoint, introduces additional complications in the definition of normal forms. A comparatively more satisfactory treatment of classical natural deduction in terms of introduction and elimination rules alone was first proposed by Parigot in 1992 in the form of a classical
lambda calculus
called λμ. The key insight of his approach was to replace a truth-centric judgment ''A'' with a more classical notion, reminiscent of the
sequent calculus
: in localised form, instead of Γ ⊢ ''A'', he used Γ ⊢ Δ, with Δ a collection of propositions similar to Γ. Γ was treated as a conjunction, and Δ as a disjunction. This structure is essentially lifted directly from classical
sequent calculi
, but the innovation in λμ was to give a computational meaning to classical natural deduction proofs in terms of a callcc or a throw/catch mechanism seen in
LISP
and its descendants. (See also: first class control.)

Another important extension was for modal and other logics that need more than just the basic judgment of truth. These were first described, for the alethic modal logics S4 and S5, in a natural deduction style by Prawitz in 1965, and have since accumulated a large body of related work. To give a simple example, the modal logic S4 requires one new judgment, "''A valid''", that is categorical with respect to truth:
:If "A" (is true) under no assumption that "B" (is true), then "A valid".
This categorical judgment is internalised as a unary connective ◻''A'' (read "''necessarily A''") with the following introduction and elimination rules:

 A valid                   ◻ A
 ──────── ◻I              ──────── ◻E
   ◻ A                       A

Note that the premise "''A valid''" has no defining rules; instead, the categorical definition of validity is used in its place. This mode becomes clearer in the localised form when the hypotheses are explicit. We write "Ω;Γ ⊢ ''A''" where Γ contains the true hypotheses as before, and Ω contains valid hypotheses. On the right there is just a single judgment "''A''"; validity is not needed here since "Ω ⊢ ''A valid''" is by definition the same as "Ω;⋅ ⊢ ''A''". The introduction and elimination forms are then:

 Ω;⋅ ⊢ π : A                        Ω;Γ ⊢ π : ◻ A
 ──────────────────── ◻I            ────────────────────── ◻E
 Ω;⋅ ⊢ box π : ◻ A                  Ω;Γ ⊢ unbox π : A

The modal hypotheses have their own version of the hypothesis rule and substitution theorem.

 ─────────────────────────────── valid-hyp
 Ω, u: (A valid) ; Γ ⊢ u : A


Modal substitution theorem

: ''If'' Ω;⋅ ⊢ π1 : ''A'' ''and'' Ω, ''u'': (''A valid'') ; Γ ⊢ π2 : ''C'', ''then'' Ω;Γ ⊢ [π1/''u'']π2 : ''C''.

This framework of separating judgments into distinct collections of hypotheses, also known as ''multi-zoned'' or ''polyadic'' contexts, is very powerful and extensible; it has been applied for many different modal logics, and also for linear and other substructural logics, to give a few examples. However, relatively few systems of modal logic can be formalised directly in natural deduction. To give proof-theoretic characterisations of these systems, extensions such as labelling or systems of deep inference are needed. The addition of labels to formulae permits much finer control of the conditions under which rules apply, allowing the more flexible techniques of analytic tableaux to be applied, as has been done in the case of labelled deduction. Labels also allow the naming of worlds in Kripke semantics; one influential technique converts frame conditions of modal logics in Kripke semantics into inference rules in a natural deduction formalisation of hybrid logic. Surveys exist of the application of many proof theories, such as Avron and Pottinger's hypersequents and Belnap's display logic, to such modal logics as S5 and B.


Comparison with sequent calculus

The sequent calculus is the chief alternative to natural deduction as a foundation of mathematical logic. In natural deduction the flow of information is bi-directional: elimination rules flow information downwards by deconstruction, and introduction rules flow information upwards by assembly. Thus, a natural deduction proof does not have a purely bottom-up or top-down reading, making it unsuitable for automation in proof search. To address this fact, Gentzen in 1935 proposed his sequent calculus, though he initially intended it as a technical device for clarifying the consistency of predicate logic. Kleene, in his seminal 1952 book ''Introduction to Metamathematics'', gave the first formulation of the sequent calculus in the modern style. In the sequent calculus all inference rules have a purely bottom-up reading. Inference rules can apply to elements on both sides of the
turnstile
. (To differentiate from natural deduction, this article uses a double arrow ⇒ instead of the right tack ⊢ for sequents.) The introduction rules of natural deduction are viewed as ''right rules'' in the sequent calculus, and are structurally very similar. The elimination rules on the other hand turn into ''left rules'' in the sequent calculus. To give an example, consider disjunction; the right rules are familiar:

 Γ ⇒ A                     Γ ⇒ B
 ───────── ∨R1             ───────── ∨R2
 Γ ⇒ A ∨ B                 Γ ⇒ A ∨ B

On the left:

 Γ, u:A ⇒ C    Γ, v:B ⇒ C
 ─────────────────────────── ∨L
 Γ, w: (A ∨ B) ⇒ C

Recall the ∨E rule of natural deduction in localised form:

 Γ ⊢ A ∨ B    Γ, u:A ⊢ C    Γ, v:B ⊢ C
 ─────────────────────────────────────── ∨E
 Γ ⊢ C

The proposition ''A ∨ B'', which is the succedent of a premise in ∨E, turns into a hypothesis of the conclusion in the left rule ∨L. Thus, left rules can be seen as a sort of inverted elimination rule. This observation can be illustrated as follows:
 natural deduction:

  ────── hyp
  |
  |  elim. rules
  |
  ↓
  ────────────────────── ↑↓ meet
  ↑
  |
  |  intro. rules
  |
  conclusion

 sequent calculus:

  ─────────────────────────── init
  ↑             ↑
  |             |
  |  left rules |  right rules
  |             |
  conclusion
In the sequent calculus, the left and right rules are performed in lock-step until one reaches the ''initial sequent'', which corresponds to the meeting point of elimination and introduction rules in natural deduction. These initial rules are superficially similar to the hypothesis rule of natural deduction, but in the sequent calculus they describe a ''transposition'' or a ''handshake'' of a left and a right proposition:

 ────────── init
 Γ, u:A ⇒ A

The correspondence between the sequent calculus and natural deduction is a pair of soundness and completeness theorems, which are both provable by means of an inductive argument.
; Soundness of ⇒ wrt. ⊢ : ''If'' Γ ⇒ ''A'', ''then'' Γ ⊢ ''A''.
; Completeness of ⇒ wrt. ⊢ : ''If'' Γ ⊢ ''A'', ''then'' Γ ⇒ ''A''.
It is clear by these theorems that the sequent calculus does not change the notion of truth, because the same collection of propositions remains true. Thus, one can use the same proof objects as before in sequent calculus derivations. As an example, consider the conjunctions. The right rule is virtually identical to the introduction rule:

 sequent calculus                           natural deduction

 Γ ⇒ π1 : A    Γ ⇒ π2 : B                  Γ ⊢ π1 : A    Γ ⊢ π2 : B
 ─────────────────────────── ∧R             ───────────────────────── ∧I
 Γ ⇒ (π1, π2) : A ∧ B                       Γ ⊢ (π1, π2) : A ∧ B

The left rule, however, performs some additional substitutions that are not performed in the corresponding elimination rules.

 sequent calculus                           natural deduction

 Γ, u:A ⇒ π : C                             Γ ⊢ π : A ∧ B
 ──────────────────────────────── ∧L1       ───────────── ∧E1
 Γ, v: (A ∧ B) ⇒ [fst v/u]π : C             Γ ⊢ fst π : A

 Γ, u:B ⇒ π : C                             Γ ⊢ π : A ∧ B
 ──────────────────────────────── ∧L2       ───────────── ∧E2
 Γ, v: (A ∧ B) ⇒ [snd v/u]π : C             Γ ⊢ snd π : B

The kinds of proofs generated in the sequent calculus are therefore rather different from those of natural deduction. The sequent calculus produces proofs in what is known as the ''β-normal η-long'' form, which corresponds to a canonical representation of the normal form of the natural deduction proof. If one attempts to describe these proofs using natural deduction itself, one obtains what is called the ''intercalation calculus'' (first described by John Byrnes), which can be used to formally define the notion of a ''normal form'' for natural deduction. The substitution theorem of natural deduction takes the form of a
structural rule
or structural theorem known as ''cut'' in the sequent calculus.


Cut (substitution)

: ''If'' Γ ⇒ π1 : ''A'' ''and'' Γ, ''u'':''A'' ⇒ π2 : ''C'', ''then'' Γ ⇒ [π1/''u'']π2 : ''C''.
In most well-behaved logics, cut is unnecessary as an inference rule, though it remains provable as a
meta-theorem
; the superfluousness of the cut rule is usually presented as a computational process, known as ''cut elimination''. This has an interesting application for natural deduction: usually it is extremely tedious to prove certain properties directly in natural deduction because of an unbounded number of cases. For example, consider showing that a given proposition is ''not'' provable in natural deduction. A simple inductive argument fails because of rules like ∨E or ⊥E which can introduce arbitrary propositions. However, we know that the sequent calculus is complete with respect to natural deduction, so it is enough to show this unprovability in the sequent calculus. Now, if cut is not available as an inference rule, then all sequent rules either introduce a connective on the right or the left, so the depth of a sequent derivation is fully bounded by the connectives in the final conclusion. Thus, showing unprovability is much easier, because there are only a finite number of cases to consider, and each case is composed entirely of sub-propositions of the conclusion. A simple instance of this is the ''global consistency'' theorem: "⋅ ⊢ ⊥" is not provable. In the sequent calculus version, this is manifestly true because there is no rule that can have "⋅ ⇒ ⊥" as a conclusion. Proof theorists often prefer to work on cut-free sequent calculus formulations because of such properties.

