
Propositional calculus is a branch of logic. It is also called propositional logic, statement logic, sentential calculus, sentential logic, or sometimes
zeroth-order logic
. It deals with
propositions
(which can be true or false) and relations between propositions, including the construction of arguments based on them. Compound propositions are formed by connecting propositions by logical connectives. Propositions that contain no logical connectives are called atomic propositions. Unlike first-order logic, propositional logic does not deal with non-logical objects, predicates about them, or quantifiers. However, all the machinery of propositional logic is included in first-order logic and higher-order logics. In this sense, propositional logic is the foundation of first-order logic and higher-order logic.


Explanation

Logical connectives are found in natural languages. In English, for example, they include "and" (conjunction), "or" (disjunction), "not" (
negation
) and "if" (but only when used to denote material conditional). The following is an example of a very simple inference within the scope of propositional logic: :Premise 1: If it's raining then it's cloudy. :Premise 2: It's raining. :Conclusion: It's cloudy. Both premises and the conclusion are propositions. The premises are taken for granted, and with the application of modus ponens (an
inference rule
), the conclusion follows. As propositional logic is not concerned with the structure of propositions beyond the point where they can't be decomposed any more by logical connectives, this inference can be restated replacing those ''atomic'' statements with statement letters, which are interpreted as variables representing statements: :Premise 1: P \to Q :Premise 2: P :Conclusion: Q The same can be stated succinctly in the following way: :\frac{P \to Q, \ P}{Q} When P is interpreted as "It's raining" and Q as "it's cloudy" the above symbolic expressions can be seen to correspond exactly with the original expression in natural language. Not only that, but they will also correspond with any other inference of this ''form'', which will be valid on the same basis as this inference is. Propositional logic may be studied through a formal system in which formulas of a
formal language
may be interpreted to represent
propositions
. A system of
axiom
s and
inference rules
allows certain formulas to be derived. These derived formulas are called theorems and may be interpreted to be true propositions. A constructed sequence of such formulas is known as a '' derivation'' or ''proof'' and the last formula of the sequence is the theorem. The derivation may be interpreted as proof of the proposition represented by the theorem. When a formal system is used to represent formal logic, only statement letters (usually capital roman letters such as P, Q and R) are represented directly. The natural language propositions that arise when they're interpreted are outside the scope of the system, and the relation between the formal system and its interpretation is likewise outside the formal system itself. In classical truth-functional propositional logic, formulas are interpreted as having precisely one of two possible truth values, the truth value of ''true'' or the truth value of ''false''. The principle of bivalence and the law of excluded middle are upheld. Truth-functional propositional logic defined as such and systems isomorphic to it are considered to be
zeroth-order logic
. However, alternative propositional logics are also possible. For more, see Other logical calculi below.
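As an informal illustration (a minimal Python sketch, not part of the calculus itself; function and variable names here are chosen for the example), the following code checks by truth table that the modus ponens form above is valid: in every row where both premises are true, the conclusion is true as well.

```python
from itertools import product

def implies(p, q):
    return (not p) or q  # material conditional

# Rows where both premises P -> Q and P hold; the conclusion Q must hold in each.
valid = all(
    q
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and p
)
print(valid)  # True
```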


History

Although propositional logic (which is interchangeable with propositional calculus) had been hinted at by earlier philosophers, it was developed into a formal logic (Stoic logic) by Chrysippus in the 3rd century BC and expanded by the Stoics who succeeded him
. The logic was focused on
proposition
s. This advancement was different from the traditional syllogistic logic, which was focused on terms. However, most of the original writings were lost and the propositional logic developed by the Stoics was no longer understood later in antiquity. Consequently, the system was essentially reinvented by Peter Abelard in the 12th century. Propositional logic was eventually refined using symbolic logic. The 17th/18th-century mathematician
Gottfried Wilhelm Leibniz
has been credited with being the founder of symbolic logic for his work with the calculus ratiocinator. Although his work was the first of its kind, it was unknown to the larger logical community. Consequently, many of the advances achieved by Leibniz were recreated by logicians like
George Boole
and Augustus De Morgan—completely independent of Leibniz. Just as propositional logic can be considered an advancement from the earlier syllogistic logic, Gottlob Frege's
predicate logic
can also be considered an advancement from the earlier propositional logic. One author describes predicate logic as combining "the distinctive features of syllogistic logic and propositional logic." Consequently, predicate logic ushered in a new era in logic's history; however, advances in propositional logic were still made after Frege, including natural deduction, truth trees and truth tables. Natural deduction was invented by Gerhard Gentzen and Jan Łukasiewicz. Truth trees were invented by Evert Willem Beth. The invention of truth tables, however, is of uncertain attribution. Within works by Frege
and
Bertrand Russell
are ideas influential to the invention of truth tables. The actual tabular structure (being formatted as a table), itself, is generally credited to either Ludwig Wittgenstein or Emil Post (or both, independently). Besides Frege and Russell, others credited with having ideas preceding truth tables include Philo, Boole,
Charles Sanders Peirce
, and Ernst Schröder. Others credited with the tabular structure include Jan Łukasiewicz, Alfred North Whitehead, William Stanley Jevons, John Venn, and Clarence Irving Lewis. Ultimately, some have concluded, like John Shosky, that "It is far from clear that any one person should be given the title of 'inventor' of truth-tables."


Terminology

In general terms, a calculus is a formal system that consists of a set of syntactic expressions ('' well-formed formulas''), a distinguished subset of these expressions (axioms), plus a set of formal rules that define a specific
binary relation
, intended to be interpreted as logical equivalence, on the space of expressions. When the formal system is intended to be a logical system, the expressions are meant to be interpreted as statements, and the rules, known as ''inference rules'', are typically intended to be truth-preserving. In this setting, the rules, which may include axioms, can then be used to derive ("infer") formulas representing true statements—from given formulas representing true statements. The set of axioms may be empty, a nonempty finite set, or a countably infinite set (see
axiom schema
). A
formal grammar
recursively defines the expressions and well-formed formulas of the
language
. In addition a semantics may be given which defines truth and valuations (or interpretations). The
language
of a propositional calculus consists of # a set of primitive symbols, variously referred to as '' atomic formulas'', ''placeholders'', ''proposition letters'', or ''variables'', and # a set of operator symbols, variously interpreted as '' logical operators'' or ''logical connectives''. A '' well-formed formula'' is any atomic formula, or any formula that can be built up from atomic formulas by means of operator symbols according to the rules of the grammar. Mathematicians sometimes distinguish between propositional constants, propositional variables, and schemata. Propositional constants represent some particular proposition, while propositional variables range over the set of all atomic propositions. Schemata, however, range over all propositions. It is common to represent propositional constants by A, B, and C, propositional variables by P, Q, and R, and schematic letters are often Greek letters, most often \varphi, \psi, and \chi.
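As a rough illustration of the grammar just described (a sketch only; the tuple encoding, the set of atoms and the function name are assumptions of the example), the following Python snippet checks whether an expression is a well-formed formula by the inductive rules above.

```python
# An atom is a string such as "P"; a compound formula is ("not", f)
# or (op, f, g) with op in {"and", "or", "implies", "iff"}.
ATOMS = {"P", "Q", "R"}
UNARY, BINARY = {"not"}, {"and", "or", "implies", "iff"}

def well_formed(f):
    """Inductive check: atoms are formulas, and applying an operator of the
    right arity to formulas yields a formula; nothing else is a formula."""
    if isinstance(f, str):
        return f in ATOMS
    if isinstance(f, tuple) and f and f[0] in UNARY:
        return len(f) == 2 and well_formed(f[1])
    if isinstance(f, tuple) and f and f[0] in BINARY:
        return len(f) == 3 and well_formed(f[1]) and well_formed(f[2])
    return False

print(well_formed(("implies", "P", ("and", "Q", ("not", "R")))))  # True
print(well_formed(("and", "P")))                                  # False: wrong arity
```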


Basic concepts

The following outlines a standard propositional calculus. Many different formulations exist which are all more or less equivalent, but differ in the details of: # their language (i.e., the particular collection of primitive symbols and operator symbols), # the set of axioms, or distinguished formulas, and # the set of inference rules. Any given proposition may be represented with a letter called a 'propositional constant', analogous to representing a number by a letter in mathematics (e.g., a = 5). All propositions require exactly one of two truth-values: true or false. For example, let P be the proposition that it is raining outside. This will be true (P) if it is raining outside, and false otherwise (\neg P). *We then define truth-functional operators, beginning with negation. \neg P represents the negation of P, which can be thought of as the denial of P. In the example above, \neg P expresses that it is not raining outside, or by a more standard reading: "It is not the case that it is raining outside." When P is true, \neg P is false; and when P is false, \neg P is true. As a result, \neg \neg P always has the same truth-value as P. *Conjunction is a truth-functional connective which forms a proposition out of two simpler propositions, for example, P and Q. The conjunction of P and Q is written P \land Q, and expresses that each are true. We read P \land Q as "P and Q". For any two propositions, there are four possible assignments of truth values: *# P is true and Q is true *# P is true and Q is false *# P is false and Q is true *# P is false and Q is false :The conjunction of P and Q is true in case 1, and is false otherwise. Where P is the proposition that it is raining outside and Q is the proposition that a cold-front is over Kansas, P \land Q is true when it is raining outside ''and'' there is a cold-front over Kansas. If it is not raining outside, then P \land Q is false; and if there is no cold-front over Kansas, then P \land Q is also false. *Disjunction resembles conjunction in that it forms a proposition out of two simpler propositions. We write it P \lor Q, and it is read "P or Q". It expresses that either P or Q is true. Thus, in the cases listed above, the disjunction of P with Q is true in all cases—except case 4. Using the example above, the disjunction expresses that it is either raining outside, or there is a cold front over Kansas. (Note, this use of disjunction is supposed to resemble the use of the English word "or". However, it is most like the English inclusive "or", which can be used to express the truth of at least one of two propositions. It is not like the English
exclusive
"or", which expresses the truth of exactly one of two propositions. In other words, the exclusive "or" is false when both and are true (case 1), and similarly is false when both and are false (case 4). An example of the exclusive or is: You may have a bagel or a pastry, but not both. Often in natural language, given the appropriate context, the addendum "but not both" is omitted—but implied. In mathematics, however, "or" is always inclusive or; if exclusive or is meant it will be specified, possibly by "xor".) *Material conditional also joins two simpler propositions, and we write , which is read "if then ". The proposition to the left of the arrow is called the antecedent, and the proposition to the right is called the consequent. (There is no such designation for conjunction or disjunction, since they are commutative operations.) It expresses that is true whenever is true. Thus is true in every case above except case 2, because this is the only case when is true but is not. Using the example, if then expresses that if it is raining outside, then there is a cold-front over Kansas. The material conditional is often confused with physical causation. The material conditional, however, only relates two propositions by their truth-values—which is not the relation of cause and effect. It is contentious in the literature whether the material implication represents logical causation. *Biconditional joins two simpler propositions, and we write , which is read " if and only if ". It expresses that and have the same truth-value, and in cases 1 and 4. ' is true if and only if ' is true, and is false otherwise. It is very helpful to look at the truth tables for these different operators, as well as the method of analytic tableaux.


Closure under operations

Propositional logic is closed under truth-functional connectives. That is to say, for any proposition \varphi, \neg\varphi is also a proposition. Likewise, for any propositions \varphi and \psi, \varphi \land \psi is a proposition, and similarly for disjunction, conditional, and biconditional. This implies that, for instance, \varphi \land \psi is a proposition, and so it can be conjoined with another proposition. In order to represent this, we need to use parentheses to indicate which proposition is conjoined with which. For instance, P \land Q \land R is not a well-formed formula, because we do not know if we are conjoining P \land Q with R or if we are conjoining P with Q \land R. Thus we must write either (P \land Q) \land R to represent the former, or P \land (Q \land R) to represent the latter. By evaluating the truth conditions, we see that both expressions have the same truth conditions (will be true in the same cases), and moreover that any proposition formed by arbitrary conjunctions will have the same truth conditions, regardless of the location of the parentheses. This means that conjunction is associative; however, one should not assume that parentheses never serve a purpose. For instance, the sentence P \land (Q \lor R) does not have the same truth conditions as (P \land Q) \lor R, so they are different sentences distinguished only by the parentheses. One can verify this by the truth-table method referenced above. Note: For any arbitrary number of propositional constants, we can form a finite number of cases which list their possible truth-values. A simple way to generate this is by truth-tables, in which one writes P_1, P_2, \ldots, P_n for any list of n propositional constants—that is to say, any list of propositional constants with n entries. Below this list, one writes 2^n rows, and below P_1 one fills in the first half of the rows with true (or T) and the second half with false (or F). Below P_2 one fills in one-quarter of the rows with T, then one-quarter with F, then one-quarter with T and the last quarter with F. The next column alternates between true and false for each eighth of the rows, then sixteenths, and so on, until the last propositional constant varies between T and F for each row. This will give a complete listing of the 2^n cases or truth-value assignments possible for those propositional constants.
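The row-generation procedure just described can be written out directly. The following Python sketch (function name and output format are illustrative assumptions) produces the 2^n assignments in exactly the halving order given above.

```python
def truth_table_rows(n):
    """Generate the 2**n truth-value assignments for P1..Pn: under P1 the
    first half of the rows are T and the second half F, under P2 the rows
    alternate in quarters, and so on down to single-row alternation."""
    rows = 2 ** n
    table = []
    for r in range(rows):
        row = []
        for k in range(1, n + 1):
            block = rows // (2 ** k)            # size of each T/F block in column k
            row.append((r // block) % 2 == 0)   # True in the first block, then alternate
        table.append(row)
    return table

for row in truth_table_rows(3):  # 8 rows for three propositional constants
    print(["T" if v else "F" for v in row])
```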


Argument

The propositional calculus then defines an ''
argument
'' to be a list of propositions. A valid argument is a list of propositions, the last of which follows from—or is implied by—the rest. All other arguments are invalid. The simplest valid argument is modus ponens, one instance of which is the following list of propositions: : \begin{array}{rl} 1. & P \to Q \\ 2. & P \\ \hline \therefore & Q \end{array} This is a list of three propositions, each line is a proposition, and the last follows from the rest. The first two lines are called premises, and the last line the conclusion. We say that any proposition C follows from any set of propositions (P_1, ..., P_n), if C must be true whenever every member of the set (P_1, ..., P_n) is true. In the argument above, for any P and Q, whenever P \to Q and P are true, necessarily Q is true. Notice that, when P is true, we cannot consider cases 3 and 4 (from the truth table). When P \to Q is true, we cannot consider case 2. This leaves only case 1, in which Q is also true. Thus Q is implied by the premises. This generalizes schematically. Thus, where \varphi and \psi may be any propositions at all, : \begin{array}{rl} 1. & \varphi \to \psi \\ 2. & \varphi \\ \hline \therefore & \psi \end{array} Other argument forms are convenient, but not necessary. Given a complete set of axioms (see below for one such set), modus ponens is sufficient to prove all other argument forms in propositional logic, thus they may be considered to be derivative. Note, this is not true of the extension of propositional logic to other logics like first-order logic. First-order logic requires at least one additional rule of inference in order to obtain completeness. The significance of argument in formal logic is that one may obtain new truths from established truths. In the first example above, given the two premises, the truth of Q is not yet known or stated. After the argument is made, Q is deduced. In this way, we define a deduction system to be a set of all propositions that may be deduced from another set of propositions. For instance, given the set of propositions A = \{ P \lor Q, \ \neg Q \land R, \ (P \lor Q) \to R \}, we can define a deduction system, \Gamma, which is the set of all propositions which follow from A. Reiteration is always assumed, so P \lor Q, \neg Q \land R, (P \lor Q) \to R \in \Gamma. Also, from the first element of A, the last element, as well as modus ponens, R is a consequence, and so R \in \Gamma. Because we have not included sufficiently complete axioms, though, nothing else may be deduced. Thus, even though most deduction systems studied in propositional logic are able to deduce (P \lor Q) \leftrightarrow (\neg P \to Q), this one is too weak to prove such a proposition.
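The semantic notion of "follows from" used above can be checked by brute force for small examples. The following Python sketch (the function names and the lambda encoding of formulas are assumptions of the example) verifies that R follows from the premise set A = { P ∨ Q, ¬Q ∧ R, (P ∨ Q) → R } by enumerating all truth assignments.

```python
from itertools import product

def entails(premises, conclusion, variables=("P", "Q", "R")):
    """True iff every assignment satisfying all premises satisfies the conclusion."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False
    return True

premises = [
    lambda e: e["P"] or e["Q"],                    # P or Q
    lambda e: (not e["Q"]) and e["R"],             # (not Q) and R
    lambda e: (not (e["P"] or e["Q"])) or e["R"],  # (P or Q) -> R
]
print(entails(premises, lambda e: e["R"]))  # True
```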


Generic description of a propositional calculus

A propositional calculus is a formal system \mathcal{L} = \mathcal{L}(\Alpha, \Omega, \Zeta, \Iota), where: * the ''alpha set'' \Alpha is a countably infinite set of propositional symbols (atomic formulas), * the ''omega set'' \Omega is a finite set of operator symbols (logical connectives), partitioned by arity into disjoint subsets \Omega_1, \Omega_2, \ldots, * the ''zeta set'' \Zeta is a finite set of transformation rules (inference rules), and * the ''iota set'' \Iota is a countable set of initial points (axioms). The ''language'' of \mathcal{L}, also known as its set of ''formulas'' or '' well-formed formulas'', is inductively defined by the following rules: # Base: Any element of the alpha set \Alpha is a formula of \mathcal{L}. # If p_1, p_2, \ldots, p_j are formulas and f is in \Omega_j, then \left( f p_1 p_2 \ldots p_j \right) is a formula. # Closed: Nothing else is a formula of \mathcal{L}. Repeated application of these rules permits the construction of complex formulas. For example: * By rule 1, p is a formula. * By rule 2, \neg p is a formula. * By rule 1, q is a formula. * By rule 2, ( \neg p \lor q ) is a formula.


Example 1. Simple axiom system

Let \mathcal{L}_1 = \mathcal{L}(\Alpha,\Omega,\Zeta,\Iota), where \Alpha, \Omega, \Zeta, \Iota are defined as follows: * The set \Alpha, the countably infinite set of symbols that serve to represent logical propositions: *: \Alpha = \{p, q, r, s, t, u, \ldots\}. * The functionally complete set \Omega of logical operators (logical connectives and negation) is as follows. Of the three connectives for conjunction, disjunction, and implication (\wedge, \lor, and \to), one can be taken as primitive and the other two can be defined in terms of it and negation (\neg).Wernick, William (1942) "Complete Sets of Logical Functions," ''Transactions of the American Mathematical Society'' 51, pp. 117–132. Alternatively, all of the logical operators may be defined in terms of a sole sufficient operator, such as the Sheffer stroke (nand). The biconditional (a \leftrightarrow b) can of course be defined in terms of conjunction and implication as (a \to b) \land (b \to a). Adopting negation and implication as the two primitive operations of a propositional calculus is tantamount to having the omega set \Omega = \Omega_1 \cup \Omega_2 partition as follows: *: \Omega_1 = \{\neg\}, *: \Omega_2 = \{\to\}. Then a \lor b is defined as \neg a \to b, and a \land b is defined as \neg(a \to \neg b). * The set \Iota (the set of initial points of logical deduction, i.e., logical axioms) is the axiom system proposed by Jan Łukasiewicz, and used as the propositional-calculus part of a Hilbert system. The axioms are all substitution instances of: ** p \to (q \to p) ** (p \to (q \to r)) \to ((p \to q) \to (p \to r)) ** (\neg p \to \neg q) \to (q \to p) * The set \Zeta of transformation rules (rules of inference) is the sole rule modus ponens (i.e., from any formulas of the form \varphi and (\varphi \to \psi), infer \psi). This system is used in the Metamath set.mm formal proof database.
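The definitions of disjunction and conjunction in terms of implication and negation can be verified by truth table. The short Python sketch below (helper names are illustrative) confirms that a ∨ b agrees with ¬a → b and that a ∧ b agrees with ¬(a → ¬b) on all four assignments.

```python
from itertools import product

def implies(p, q):
    return (not p) or q

or_matches = all(
    (a or b) == implies(not a, b)          # a or b  vs  (not a) -> b
    for a, b in product([True, False], repeat=2)
)
and_matches = all(
    (a and b) == (not implies(a, not b))   # a and b  vs  not (a -> not b)
    for a, b in product([True, False], repeat=2)
)
print(or_matches, and_matches)  # True True
```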


Example 2. Natural deduction system

Let \mathcal{L}_2 = \mathcal{L}(\Alpha, \Omega, \Zeta, \Iota), where \Alpha, \Omega, \Zeta, \Iota are defined as follows: * The alpha set \Alpha is a countably infinite set of symbols, for example: *: \Alpha = \{p, q, r, s, t, u, \ldots\}. * The omega set \Omega = \Omega_1 \cup \Omega_2 partitions as follows: *: \Omega_1 = \{\neg\}, *: \Omega_2 = \{\land, \lor, \to, \leftrightarrow\}. In the following example of a propositional calculus, the transformation rules are intended to be interpreted as the inference rules of a so-called ''
natural deduction system
''. The particular system presented here has no initial points, which means that its interpretation for logical applications derives its theorems from an empty axiom set. * The set of initial points is empty, that is, \Iota = \varnothing. * The set of transformation rules, \Zeta, is described as follows: Our propositional calculus has eleven inference rules. These rules allow us to derive other true formulas given a set of formulas that are assumed to be true. The first ten simply state that we can infer certain well-formed formulas from other well-formed formulas. The last rule however uses hypothetical reasoning in the sense that in the premise of the rule we temporarily assume an (unproven) hypothesis to be part of the set of inferred formulas to see if we can infer a certain other formula. Since the first ten rules don't do this they are usually described as ''non-hypothetical'' rules, and the last one as a ''hypothetical'' rule. In describing the transformation rules, we may introduce a metalanguage symbol \vdash. It is basically a convenient shorthand for saying "infer that". The format is \Gamma \vdash \psi, in which \Gamma is a (possibly empty) set of formulas called premises, and \psi is a formula called the conclusion. The transformation rule \Gamma \vdash \psi means that if every proposition in \Gamma is a theorem (or has the same truth value as the axioms), then \psi is also a theorem. Note that, considering the rule Conjunction introduction below, whenever \Gamma has more than one formula we can always safely reduce it into one formula using conjunction. So, for short, from that time on we may represent \Gamma as one formula instead of a set. Another omission for convenience is when \Gamma is an empty set, in which case \Gamma may not appear. ; Negation introduction: From (p \to q) and (p \to \neg q), infer \neg p. : That is, \{ (p \to q), (p \to \neg q) \} \vdash \neg p. ;
Negation elimination
: From \neg p, infer (p \to r). : That is, \{ \neg p \} \vdash (p \to r). ; Double negation elimination: From \neg \neg p, infer p. : That is, \neg \neg p \vdash p. ; Conjunction introduction: From p and q, infer (p \land q). : That is, \{ p, q \} \vdash (p \land q). ; Conjunction elimination: From (p \land q), infer p. : From (p \land q), infer q. : That is, (p \land q) \vdash p and (p \land q) \vdash q. ; Disjunction introduction: From p, infer (p \lor q). : From q, infer (p \lor q). : That is, p \vdash (p \lor q) and q \vdash (p \lor q). ; Disjunction elimination: From (p \lor q) and (p \to r) and (q \to r), infer r. : That is, \{ (p \lor q), (p \to r), (q \to r) \} \vdash r. ; Biconditional introduction: From (p \to q) and (q \to p), infer (p \leftrightarrow q). : That is, \{ (p \to q), (q \to p) \} \vdash (p \leftrightarrow q). ; Biconditional elimination: From (p \leftrightarrow q), infer (p \to q). : From (p \leftrightarrow q), infer (q \to p). : That is, (p \leftrightarrow q) \vdash (p \to q) and (p \leftrightarrow q) \vdash (q \to p). ; Modus ponens (conditional elimination) : From p and (p \to q), infer q. : That is, \{ p, (p \to q) \} \vdash q. ; Conditional proof (conditional introduction) : From [accepting p allows a proof of q], infer (p \to q). : That is, (p \vdash q) \vdash (p \to q).
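A quick sanity check, separate from any syntactic proof, is that the non-hypothetical rules above are truth-preserving. The following Python sketch (the helper names imp and rule_holds are illustrative) confirms this semantically for two of the rules by enumerating all assignments to p, q, r.

```python
from itertools import product

def imp(p, q):
    return (not p) or q

def rule_holds(premises, conclusion):
    """True iff every assignment making all premises true makes the conclusion true."""
    for p, q, r in product([True, False], repeat=3):
        if all(f(p, q, r) for f in premises) and not conclusion(p, q, r):
            return False
    return True

# Negation introduction: from (p -> q) and (p -> not q), infer not p.
print(rule_holds([lambda p, q, r: imp(p, q), lambda p, q, r: imp(p, not q)],
                 lambda p, q, r: not p))            # True

# Disjunction elimination: from (p or q), (p -> r) and (q -> r), infer r.
print(rule_holds([lambda p, q, r: p or q,
                  lambda p, q, r: imp(p, r),
                  lambda p, q, r: imp(q, r)],
                 lambda p, q, r: r))                # True
```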


Basic and derived argument forms


Proofs in propositional calculus

One of the main uses of a propositional calculus, when interpreted for logical applications, is to determine relations of logical equivalence between propositional formulas. These relationships are determined by means of the available transformation rules, sequences of which are called ''derivations'' or ''proofs''. In the discussion to follow, a proof is presented as a sequence of numbered lines, with each line consisting of a single formula followed by a ''reason'' or ''justification'' for introducing that formula. Each premise of the argument, that is, an assumption introduced as an hypothesis of the argument, is listed at the beginning of the sequence and is marked as a "premise" in lieu of other justification. The conclusion is listed on the last line. A proof is complete if every line follows from the previous ones by the correct application of a transformation rule. (For a contrasting approach, see proof-trees).


Example of a proof in natural deduction system

* To be shown that A \to A. * One possible proof of this (which, though valid, happens to contain more steps than are necessary) may be arranged as follows: # A       (premise) # A \lor A       (from (1) by disjunction introduction) # (A \lor A) \land A       (from (1) and (2) by conjunction introduction) # A       (from (3) by conjunction elimination) # A \vdash A       (summary of (1) through (4)) # \vdash A \to A       (from (5) by conditional proof) Interpret A \vdash A as "Assuming A, infer A". Read \vdash A \to A as "Assuming nothing, infer that A implies A", or "It is a tautology that A implies A", or "It is always true that A implies A".


Example of a proof in a classical propositional calculus system

We now prove the same theorem A \to A in the axiomatic system by Jan Łukasiewicz described above, which is an example of a Hilbert-style deductive system for the classical propositional calculus. The axioms are: :(A1) (p \to (q \to p)) :(A2) ((p \to (q \to r)) \to ((p \to q) \to (p \to r))) :(A3) ((\neg p \to \neg q) \to (q \to p)) And the proof is as follows: # A \to ((B \to A) \to A)       (instance of (A1)) # (A \to ((B \to A) \to A)) \to ((A \to (B \to A)) \to (A \to A))       (instance of (A2)) # (A \to (B \to A)) \to (A \to A)       (from (1) and (2) by modus ponens) # A \to (B \to A)       (instance of (A1)) # A \to A       (from (4) and (3) by modus ponens)
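The five-line proof above can be checked mechanically. The following Python sketch (the tuple encoding of formulas and the helper name mp are assumptions of the example) encodes each line and verifies the two modus ponens steps.

```python
# Formulas are nested tuples ("->", X, Y); "A" and "B" are atoms.
def mp(premise, conditional):
    """If conditional has the shape (premise -> conclusion), return the conclusion."""
    assert conditional[0] == "->" and conditional[1] == premise
    return conditional[2]

A, B = "A", "B"
l1 = ("->", A, ("->", ("->", B, A), A))                          # instance of (A1)
l2 = ("->", l1, ("->", ("->", A, ("->", B, A)), ("->", A, A)))   # instance of (A2)
l3 = mp(l1, l2)                                                  # modus ponens on (1), (2)
l4 = ("->", A, ("->", B, A))                                     # instance of (A1)
l5 = mp(l4, l3)                                                  # modus ponens on (4), (3)
print(l5 == ("->", A, A))  # True: the theorem A -> A
```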


Soundness and completeness of the rules

The crucial properties of this set of rules are that they are ''
sound
'' and ''complete''. Informally this means that the rules are correct and that no other rules are required. These claims can be made more formal as follows. Note that the proofs for the soundness and completeness of the propositional logic are not themselves proofs in propositional logic; these are theorems in ZFC used as a metatheory to prove properties of propositional logic. We define a ''truth assignment'' as a function A that maps propositional variables to true or false. Informally such a truth assignment can be understood as the description of a possible state of affairs (or possible world) where certain statements are true and others are not. The semantics of formulas can then be formalized by defining for which "state of affairs" they are considered to be true, which is what is done by the following definition. We define when such a truth assignment A satisfies a certain well-formed formula with the following rules: * A satisfies the propositional variable P if and only if A(P) = true * A satisfies \neg\varphi if and only if A does not satisfy \varphi * A satisfies (\varphi \land \psi) if and only if A satisfies both \varphi and \psi * A satisfies (\varphi \lor \psi) if and only if A satisfies at least one of either \varphi or \psi * A satisfies (\varphi \to \psi) if and only if it is not the case that A satisfies \varphi but not \psi * A satisfies (\varphi \leftrightarrow \psi) if and only if A satisfies both \varphi and \psi or satisfies neither one of them With this definition we can now formalize what it means for a formula \varphi to be implied by a certain set \Gamma of formulas. Informally this is true if in all worlds that are possible given the set of formulas \Gamma the formula \varphi also holds. This leads to the following formal definition: We say that a set \Gamma of well-formed formulas ''semantically entails'' (or ''implies'') a certain well-formed formula \varphi if all truth assignments that satisfy all the formulas in \Gamma also satisfy \varphi. Finally we define ''syntactical entailment'' such that \varphi is syntactically entailed by \Gamma if and only if we can derive it with the inference rules that were presented above in a finite number of steps. This allows us to formulate exactly what it means for the set of inference rules to be sound and complete: Soundness: If the set of well-formed formulas \Gamma ''syntactically'' entails the well-formed formula \varphi then \Gamma ''semantically'' entails \varphi. Completeness: If the set of well-formed formulas \Gamma ''semantically'' entails the well-formed formula \varphi then \Gamma ''syntactically'' entails \varphi. For the above set of rules this is indeed the case.
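The satisfaction clauses above translate directly into a recursive evaluator. The following Python sketch (the tuple encoding and function names are assumptions of the example, not part of the formal system) implements satisfaction and semantic entailment by enumerating truth assignments.

```python
from itertools import product

def satisfies(A, f):
    """A is a dict from propositional variables to booleans; f is an atom
    (a string) or a nested tuple such as ("->", "P", ("or", "P", "Q"))."""
    if isinstance(f, str):
        return A[f]
    op, *args = f
    if op == "not":
        return not satisfies(A, args[0])
    if op == "and":
        return satisfies(A, args[0]) and satisfies(A, args[1])
    if op == "or":
        return satisfies(A, args[0]) or satisfies(A, args[1])
    if op == "->":
        return not (satisfies(A, args[0]) and not satisfies(A, args[1]))
    if op == "<->":
        return satisfies(A, args[0]) == satisfies(A, args[1])
    raise ValueError(f"unknown connective: {op}")

def semantically_entails(gamma, phi, variables):
    """True iff every assignment satisfying all of gamma satisfies phi."""
    return all(
        satisfies(A, phi)
        for values in product([True, False], repeat=len(variables))
        for A in [dict(zip(variables, values))]
        if all(satisfies(A, g) for g in gamma)
    )

print(semantically_entails([("->", "P", "Q"), "P"], "Q", ["P", "Q"]))  # True
```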


Sketch of a soundness proof

(For most logical systems, this is the comparatively "simple" direction of proof.) Notational conventions: Let \Gamma be a variable ranging over sets of sentences. Let A, B and C range over sentences. For "\Gamma syntactically entails A" we write "\Gamma proves A". For "\Gamma semantically entails A" we write "\Gamma implies A". We want to show: if \Gamma proves A, then \Gamma implies A. We note that "\Gamma proves A" has an inductive definition, and that gives us the immediate resources for demonstrating claims of the form "If \Gamma proves A, then ...". So our proof proceeds by induction. : I. Basis. Show: If A is a member of \Gamma, then \Gamma implies A. : II. Basis. Show: If A is an axiom, then \Gamma implies A. : III. Inductive step (induction on n, the length of the proof): :: a. Assume for arbitrary \Gamma and A that if \Gamma proves A in n or fewer steps, then \Gamma implies A. :: b. For each possible application of a rule of inference at step n + 1, leading to a new sentence B, show that \Gamma implies B. Notice that Basis Step II can be omitted for natural deduction systems because they have no axioms. When used, Step II involves showing that each of the axioms is a (semantic) logical truth. The Basis steps demonstrate that the simplest provable sentences from \Gamma are also implied by \Gamma, for any \Gamma. (The proof is simple, since the semantic fact that a set implies any of its members is also trivial.) The Inductive step will systematically cover all the further sentences that might be provable—by considering each case where we might reach a logical conclusion using an inference rule—and shows that if a new sentence is provable, it is also logically implied. (For example, we might have a rule telling us that from "A" we can derive "A or B". In III.a we assume that if A is provable it is implied. We also know that if A is provable then "A or B" is provable. We have to show that then "A or B" too is implied. We do so by appeal to the semantic definition and the assumption we just made. A is provable from \Gamma, we assume. So it is also implied by \Gamma. So any semantic valuation making all of \Gamma true makes A true. But any valuation making A true makes "A or B" true, by the defined semantics for "or". So any valuation which makes all of \Gamma true makes "A or B" true. So "A or B" is implied.) Generally, the Inductive step will consist of a lengthy but simple case-by-case analysis of all the rules of inference, showing that each "preserves" semantic implication. By the definition of provability, there are no sentences provable other than by being a member of \Gamma, an axiom, or following by a rule; so if all of those are semantically implied, the deduction calculus is sound.


Sketch of completeness proof

(This is usually the much harder direction of proof.) We adopt the same notational conventions as above. We want to show: If \Gamma implies A, then \Gamma proves A. We proceed by contraposition: We show instead that if \Gamma does not prove A then \Gamma does not imply A. If we show that there is a model where A does not hold despite \Gamma being true, then obviously \Gamma does not imply A. The idea is to build such a model out of our very assumption that \Gamma does not prove A. Thus every system that has modus ponens as an inference rule, and proves the following theorems (including substitutions thereof), is complete: * p \to (\neg p \to q) * (p \to q) \to ((\neg p \to q) \to q) * p \to (q \to (p \to q)) * p \to (\neg q \to \neg (p \to q)) * \neg p \to (p \to q) * p \to p * p \to (q \to p) * (p \to (q \to r)) \to ((p \to q) \to (p \to r)) The first five are used for the satisfaction of the five conditions in stage III above, and the last three for proving the deduction theorem.
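A necessary (though not sufficient) check on the eight formulas listed above is that each is a classical tautology. The following Python sketch verifies this by brute force over all assignments to p, q, r; the helper name imp is illustrative, and the check does not by itself establish completeness.

```python
from itertools import product

def imp(a, b):
    return (not a) or b

theorems = [
    lambda p, q, r: imp(p, imp(not p, q)),
    lambda p, q, r: imp(imp(p, q), imp(imp(not p, q), q)),
    lambda p, q, r: imp(p, imp(q, imp(p, q))),
    lambda p, q, r: imp(p, imp(not q, not imp(p, q))),
    lambda p, q, r: imp(not p, imp(p, q)),
    lambda p, q, r: imp(p, p),
    lambda p, q, r: imp(p, imp(q, p)),
    lambda p, q, r: imp(imp(p, imp(q, r)), imp(imp(p, q), imp(p, r))),
]
print(all(t(p, q, r)
          for t in theorems
          for p, q, r in product([True, False], repeat=3)))  # True
```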


Example

As an example, it can be shown that, like any other tautology, the three axioms of the classical propositional calculus system described earlier can be proven in any system that satisfies the above conditions, namely one that has modus ponens as an inference rule and proves the above eight theorems (including substitutions thereof). Out of the eight theorems, the last two are two of the three axioms; the third axiom, (\neg q \to \neg p) \to (p \to q), can be proven as well, as we now show. For the proof we may use the hypothetical syllogism theorem (in the form relevant for this axiomatic system), since it only relies on the two axioms that are already in the above set of eight theorems. The proof then is as follows: # q \to (p \to q)       (instance of the 7th theorem) # (q \to (p \to q)) \to ((\neg q \to \neg p) \to (q \to (p \to q)))       (instance of the 7th theorem) # (\neg q \to \neg p) \to (q \to (p \to q))       (from (1) and (2) by modus ponens) # (\neg p \to (p \to q)) \to ((\neg q \to \neg p) \to (\neg q\to (p\to q)))       (instance of the hypothetical syllogism theorem) # (\neg p \to (p \to q))       (instance of the 5th theorem) # (\neg q \to \neg p) \to (\neg q\to (p\to q))       (from (5) and (4) by modus ponens) # (q \to (p \to q)) \to ((\neg q \to (p \to q)) \to (p \to q))       (instance of the 2nd theorem) # ((q \to (p \to q)) \to ((\neg q \to (p \to q)) \to (p \to q)) ) \to ((\neg q \to \neg p) \to ((q \to (p \to q)) \to ((\neg q \to (p \to q)) \to (p \to q))))       (instance of the 7th theorem) # (\neg q \to \neg p) \to ((q \to (p \to q)) \to ((\neg q \to (p \to q)) \to (p \to q)))       (from (7) and (8) by modus ponens) # ((\neg q \to \neg p) \to ((q \to (p \to q)) \to ((\neg q \to (p \to q)) \to (p \to q)))) \to #:: (((\neg q \to \neg p) \to (q \to (p \to q))) \to ((\neg q \to \neg p) \to ((\neg q \to (p \to q)) \to (p \to q))))       (instance of the 8th theorem) # ((\neg q \to \neg p) \to (q \to (p \to q))) \to ((\neg q \to \neg p) \to ((\neg q \to (p \to q)) \to (p \to q)))       (from (9) and (10) by modus ponens) # (\neg q \to \neg p) \to ((\neg q \to (p \to q)) \to (p \to q))       (from (3) and (11) by modus ponens) # ((\neg q \to \neg p) \to ((\neg q \to (p \to q)) \to (p \to q))) \to (((\neg q \to \neg p) \to (\neg q \to (p \to q))) \to ((\neg q \to \neg p) \to (p \to q)))       (instance of the 8th theorem) # ((\neg q \to \neg p) \to (\neg q \to (p \to q))) \to ((\neg q \to \neg p) \to (p \to q))       (from (12) and (13) by modus ponens) # (\neg q \to \neg p) \to (p \to q)       (from (6) and (14) by modus ponens)


Verifying completeness for the classical propositional calculus system

We now verify that the classical propositional calculus system described earlier can indeed prove the required eight theorems mentioned above. We use several lemmas proven here: : (DN1) \neg \neg p \to p - Double negation (one direction) : (DN2) p \to \neg \neg p - Double negation (another direction) : (HS1) (q \to r) \to ((p \to q) \to (p \to r)) - one form of Hypothetical syllogism : (HS2) (p \to q) \to ((q \to r) \to (p \to r)) - another form of Hypothetical syllogism : (TR1) (p \to q) \to (\neg q \to \neg p) - Transposition :(TR2) (\neg p \to q) \to (\neg q \to p) - another form of transposition. :(L1) p \to ((p \to q) \to q) :(L3) (\neg p \to p) \to p We also use the method of the hypothetical syllogism metatheorem as a shorthand for several proof steps. * p \to (\neg p \to q) - proof: *# p \to (\neg q \to p)       (instance of (A1)) *# (\neg q \to p) \to (\neg p \to \neg\neg q)       (instance of (TR1)) *# p \to (\neg p \to \neg\neg q)       (from (1) and (2) using the hypothetical syllogism metatheorem) *# \neg\neg q \to q       (instance of (DN1)) *# (\neg\neg q \to q) \to ((\neg p \to \neg\neg q) \to (\neg p \to q))       (instance of (HS1)) *# (\neg p \to \neg\neg q) \to (\neg p \to q)       (from (4) and (5) using modus ponens) *# p \to (\neg p \to q)       (from (3) and (6) using the hypothetical syllogism metatheorem) * (p \to q) \to ((\neg p \to q) \to q) - proof: *# (p \to q) \to ((\neg q \to p) \to (\neg q \to q))       (instance of (HS1)) *# (\neg q \to q) \to q       (instance of (L3)) *# ((\neg q \to q) \to q) \to (((\neg q \to p) \to (\neg q \to q)) \to ((\neg q \to p) \to q))       (instance of (HS1)) *# ((\neg q \to p) \to (\neg q \to q)) \to ((\neg q \to p) \to q)       (from (2) and (3) by modus ponens) *# (p \to q) \to ((\neg q \to p) \to q)       (from (1) and (4) using the hypothetical syllogism metatheorem) *# (\neg p \to q) \to (\neg q \to p)       (instance of (TR2)) *# ((\neg p \to q) \to (\neg q \to p)) \to (((\neg q \to p) \to q) \to ((\neg p \to q) \to q))       (instance of (HS2)) *# ((\neg q \to p) \to q) \to ((\neg p \to q) \to q)       (from (6) and (7) using modus ponens) *# (p \to q) \to ((\neg p \to q) \to q)       (from (5) and (8) using the hypothetical syllogism metatheorem) * p \to (q \to (p \to q)) - proof: *# q \to (p \to q)       (instance of (A1)) *# (q \to (p \to q)) \to (p \to (q \to (p \to q)))       (instance of (A1)) *# p \to (q \to (p \to q))       (from (1) and (2) using modus ponens) * p \to (\neg q \to \neg (p \to q)) - proof: *# p \to ((p \to q) \to q)       (instance of (L1)) *# ((p \to q) \to q) \to (\neg q \to \neg (p \to q))       (instance of (TR1)) *# p \to (\neg q \to \neg (p \to q))       (from (1) and (2) using the hypothetical syllogism metatheorem) * \neg p \to (p \to q) - proof: *# \neg p \to (\neg q \to \neg p)       (instance of (A1)) *# (\neg q \to \neg p) \to (p \to q)       (instance of (A3)) *# \neg p \to (p \to q)       (from (1) and (2) using the hypothetical syllogism metatheorem) * p \to p - proof given in the proof example above * p \to (q \to p) - axiom (A1) * (p \to (q \to r)) \to ((p \to q) \to (p \to r)) - axiom (A2)


Another outline for a completeness proof

If a formula S is a tautology, then there is a truth table for it which shows that each valuation yields the value true for the formula. Consider such a valuation. By mathematical induction on the length of the subformulas, show that the truth or falsity of the subformula follows from the truth or falsity (as appropriate for the valuation) of each propositional variable in the subformula. Then combine the lines of the truth table together two at a time by using "(P is true implies S) implies ((P is false implies S) implies S)", where P is a propositional variable of S. Keep repeating this until all dependencies on propositional variables have been eliminated. The result is that we have proved the given tautology. Since every tautology is provable, the logic is complete.


Interpretation of a truth-functional propositional calculus

An interpretation of a truth-functional propositional calculus \mathcal{P} is an assignment to each propositional symbol of \mathcal{P} of one or the other (but not both) of the truth values truth (T) and falsity (F), and an assignment to the connective symbols of \mathcal{P} of their usual truth-functional meanings. An interpretation of a truth-functional propositional calculus may also be expressed in terms of truth tables. For n distinct propositional symbols there are 2^n distinct possible interpretations. For any particular symbol a, for example, there are 2^1=2 possible interpretations: # a is assigned T, or # a is assigned F. For the pair a, b there are 2^2=4 possible interpretations: # both are assigned T, # both are assigned F, # a is assigned T and b is assigned F, or # a is assigned F and b is assigned T. Since \mathcal{P} has \aleph_0, that is, denumerably many propositional symbols, there are 2^{\aleph_0}=\mathfrak{c}, and therefore
uncountably many
distinct possible interpretations of \mathcal{P}.


Interpretation of a sentence of truth-functional propositional logic

If \phi and \psi are formulas of \mathcal{P} and \mathcal{I} is an interpretation of \mathcal{P} then the following definitions apply: * A sentence of propositional logic is ''true under an interpretation'' \mathcal{I} if \mathcal{I} assigns the truth value T to that sentence. If a sentence is true under an interpretation, then that interpretation is called a ''model'' of that sentence. * \phi is ''false under an interpretation'' \mathcal{I} if \phi is not true under \mathcal{I}. * A sentence of propositional logic is ''logically valid'' if it is true under every interpretation. *: \models \phi means that \phi is logically valid. * A sentence \psi of propositional logic is a '' semantic consequence'' of a sentence \phi if there is no interpretation under which \phi is true and \psi is false. * A sentence of propositional logic is '' consistent'' if it is true under at least one interpretation. It is inconsistent if it is not consistent. Some consequences of these definitions: * For any given interpretation a given formula is either true or false. * No formula is both true and false under the same interpretation. * \phi is false for a given interpretation if and only if \neg\phi is true for that interpretation; and \phi is true under an interpretation if and only if \neg\phi is false under that interpretation. * If \phi and (\phi \to \psi) are both true under a given interpretation, then \psi is true under that interpretation. * If \models_{\mathcal{P}} \phi and \models_{\mathcal{P}} (\phi \to \psi), then \models_{\mathcal{P}} \psi. * \neg\phi is true under \mathcal{I} if and only if \phi is not true under \mathcal{I}. * (\phi \to \psi) is true under \mathcal{I} if and only if either \phi is not true under \mathcal{I} or \psi is true under \mathcal{I}. * A sentence \psi of propositional logic is a semantic consequence of a sentence \phi if and only if (\phi \to \psi) is
logically valid
, that is, \phi \models_{\mathcal{P}} \psi if and only if \models_{\mathcal{P}} (\phi \to \psi).


Alternative calculus

It is possible to define another version of propositional calculus, which defines most of the syntax of the logical operators by means of axioms, and which uses only one inference rule.


Axioms

Let \phi, \chi, and \psi stand for well-formed formulas. (The well-formed formulas themselves would not contain any Greek letters, but only capital Roman letters, connective operators, and parentheses.) Then the axioms are as follows: : THEN-1: \phi \to (\chi \to \phi) : THEN-2: (\phi \to (\chi \to \psi)) \to ((\phi \to \chi) \to (\phi \to \psi)) : AND-1: \phi \land \chi \to \phi : AND-2: \phi \land \chi \to \chi : AND-3: \phi \to (\chi \to (\phi \land \chi)) : OR-1: \phi \to \phi \lor \chi : OR-2: \chi \to \phi \lor \chi : OR-3: (\phi \to \psi) \to ((\chi \to \psi) \to (\phi \lor \chi \to \psi)) : NOT-1: (\phi \to \chi) \to ((\phi \to \neg\chi) \to \neg\phi) : NOT-2: \phi \to (\neg\phi \to \chi) : NOT-3: \phi \lor \neg\phi *Axiom THEN-2 may be considered to be a "distributive property of implication with respect to implication." *Axioms AND-1 and AND-2 correspond to "conjunction elimination". The relation between AND-1 and AND-2 reflects the commutativity of the conjunction operator. *Axiom AND-3 corresponds to "conjunction introduction." *Axioms OR-1 and OR-2 correspond to "disjunction introduction." The relation between OR-1 and OR-2 reflects the commutativity of the disjunction operator. *Axiom NOT-1 corresponds to "reductio ad absurdum." *Axiom NOT-2 says that "anything can be deduced from a contradiction." *Axiom NOT-3 is called " tertium non-datur" (Latin: "a third is not given") and reflects the semantic valuation of propositional formulas: a formula can have a truth-value of either true or false. There is no third truth-value, at least not in classical logic. Intuitionistic logicians do not accept the axiom NOT-3.


Inference rule

The inference rule is modus ponens: : \frac{\phi, \ \phi \to \chi}{\chi}.


Meta-inference rule

Let a demonstration be represented by a sequence, with hypotheses to the left of the
turnstile
and the conclusion to the right of the turnstile. Then the deduction theorem can be stated as follows: : ''If the sequence'' :: \phi_1, \ \phi_2, \ ... , \ \phi_n, \ \chi \vdash \psi : ''has been demonstrated, then it is also possible to demonstrate the sequence'' :: \phi_1, \ \phi_2, \ ..., \ \phi_n \vdash \chi \to \psi . This deduction theorem (DT) is not itself formulated with propositional calculus: it is not a theorem of propositional calculus, but a theorem about propositional calculus. In this sense, it is a meta-theorem, comparable to theorems about the soundness or completeness of propositional calculus. On the other hand, DT is so useful for simplifying the syntactical proof process that it can be considered and used as another inference rule, accompanying modus ponens. In this sense, DT corresponds to the natural conditional proof inference rule which is part of the first version of propositional calculus introduced in this article. The converse of DT is also valid: : ''If the sequence'' :: \phi_1, \ \phi_2, \ ..., \ \phi_n \vdash \chi \to \psi : ''has been demonstrated, then it is also possible to demonstrate the sequence'' :: \phi_1, \ \phi_2, \ ... , \ \phi_n, \ \chi \vdash \psi . In fact, the validity of the converse of DT is almost trivial compared to that of DT: : ''If'' :: \phi_1, \ ... , \ \phi_n \vdash \chi \to \psi : ''then'' :: 1: \phi_1, \ ... , \ \phi_n, \ \chi \vdash \chi \to \psi :: 2: \phi_1, \ ... , \ \phi_n, \ \chi \vdash \chi : ''and from (1) and (2) can be deduced'' :: 3: \phi_1, \ ... , \ \phi_n, \ \chi \vdash \psi : ''by means of modus ponens, Q.E.D.'' The converse of DT has powerful implications: it can be used to convert an axiom into an inference rule. For example, by axiom AND-1 we have, : \vdash \phi \wedge \chi \to \phi, which can be transformed by means of the converse of the deduction theorem into : \phi \wedge \chi \vdash \phi, which tells us that the inference rule : \frac{\phi \wedge \chi}{\phi} is admissible. This inference rule is conjunction elimination, one of the ten inference rules used in the first version (in this article) of the propositional calculus.


Example of a proof

The following is an example of a (syntactical) demonstration, involving only axioms THEN-1 and THEN-2: Prove: A \to A (Reflexivity of implication). Proof: # (A \to ((B \to A) \to A)) \to ((A \to (B \to A)) \to (A \to A)) #: Axiom THEN-2 with \phi = A, \chi = B \to A, \psi = A # A \to ((B \to A) \to A) #: Axiom THEN-1 with \phi = A, \chi = B \to A # (A \to (B \to A)) \to (A \to A) #: From (1) and (2) by modus ponens. # A \to (B \to A) #: Axiom THEN-1 with \phi = A, \chi = B # A \to A #: From (3) and (4) by modus ponens.


Equivalence to equational logics

The preceding alternative calculus is an example of a Hilbert-style deduction system. In the case of propositional systems the axioms are terms built with logical connectives and the only inference rule is modus ponens. Equational logic as standardly used informally in high school algebra is a different kind of calculus from Hilbert systems. Its theorems are equations and its inference rules express the properties of equality, namely that it is a congruence on terms that admits substitution. Classical propositional calculus as described above is equivalent to
Boolean algebra
, while
intuitionistic propositional calculus
is equivalent to Heyting algebra. The equivalence is shown by translation in each direction of the theorems of the respective systems. Theorems \phi of classical or intuitionistic propositional calculus are translated as equations \phi = 1 of Boolean or Heyting algebra respectively. Conversely theorems x = y of Boolean or Heyting algebra are translated as theorems (x \to y) \land (y \to x) of classical or intuitionistic calculus respectively, for which x \equiv y is a standard abbreviation. In the case of Boolean algebra x = y can also be translated as (x \land y) \lor (\neg x \land \neg y), but this translation is incorrect intuitionistically. In both Boolean and Heyting algebra, inequality x \le y can be used in place of equality. The equality x = y is expressible as a pair of inequalities x \le y and y \le x. Conversely the inequality x \le y is expressible as the equality x \land y = x, or as x \lor y = y. The significance of inequality for Hilbert-style systems is that it corresponds to the latter's deduction or entailment symbol \vdash. An entailment :: \phi_1, \ \phi_2, \ \dots, \ \phi_n \vdash \psi is translated in the inequality version of the algebraic framework as :: \phi_1\ \land\ \phi_2\ \land\ \dots\ \land \ \phi_n\ \ \le\ \ \psi Conversely the algebraic inequality x \le y is translated as the entailment ::x\ \vdash\ y. The difference between implication x \to y and inequality or entailment x \le y or x\ \vdash\ y is that the former is internal to the logic while the latter is external. Internal implication between two terms is another term of the same kind. Entailment as external implication between two terms expresses a metatruth outside the language of the logic, and is considered part of the metalanguage. Even when the logic under study is intuitionistic, entailment is ordinarily understood classically as two-valued: either the left side entails, or is less-or-equal to, the right side, or it is not. Similar but more complex translations to and from algebraic logics are possible for natural deduction systems as described above and for the sequent calculus. The entailments of the latter can be interpreted as two-valued, but a more insightful interpretation is as a set, the elements of which can be understood as abstract proofs organized as the morphisms of a
category
. In this interpretation the cut rule of the sequent calculus corresponds to composition in the category. Boolean and Heyting algebras enter this picture as special categories having at most one morphism per homset, i.e., one proof per entailment, corresponding to the idea that existence of proofs is all that matters: any proof will do and there is no point in distinguishing them.
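As a rough illustration of the translation described above, restricted to the two-element Boolean algebra {0, 1} (the formula and helper names are chosen for the example): a formula is a classical theorem exactly when it evaluates to 1 under every valuation, and the entailment x ⊢ y corresponds to the inequality x ≤ y, i.e. to the equation x ∧ y = x.

```python
from itertools import product

def imp(a, b):
    return max(1 - a, b)   # Boolean implication as an operation on {0, 1}

# Peirce's law, a classical theorem: ((p -> q) -> p) -> p evaluates to 1 everywhere.
peirce = lambda p, q: imp(imp(imp(p, q), p), p)
print(all(peirce(p, q) == 1 for p, q in product([0, 1], repeat=2)))  # True

# Entailment p ∧ q ⊢ p as the inequality (p ∧ q) ≤ p, i.e. (p ∧ q) ∧ p = p ∧ q.
meet = lambda p, q: min(p, q)   # p ∧ q
print(all(min(meet(p, q), p) == meet(p, q) for p, q in product([0, 1], repeat=2)))  # True
```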


Graphical calculi

It is possible to generalize the definition of a formal language from a set of finite sequences over a finite basis to include many other sets of mathematical structures, so long as they are built up by finitary means from finite materials. What's more, many of these families of formal structures are especially well-suited for use in logic. For example, there are many families of graphs that are close enough analogues of formal languages that the concept of a calculus is quite easily and naturally extended to them. Many species of graphs arise as ''parse graphs'' in the syntactic analysis of the corresponding families of text structures. The exigencies of practical computation on formal languages frequently demand that text strings be converted into pointer-structure renditions of parse graphs, simply as a matter of checking whether strings are well-formed formulas or not. Once this is done, there are many advantages to be gained from developing the graphical analogue of the calculus on strings. The mapping from strings to parse graphs is called ''parsing'' and the inverse mapping from parse graphs to strings is achieved by an operation that is called ''traversing'' the graph.
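As a small illustration of this round trip, the sketch below parses fully parenthesized propositional formulas into a pointer structure (a tree of Python objects) and traverses that tree back to a string. The toy grammar, the connective symbols ~, &, | and ->, and the class names are assumptions made for the example rather than any fixed standard.

 from dataclasses import dataclass
 
 @dataclass
 class Var:
     name: str
 
 @dataclass
 class Not:
     child: object
 
 @dataclass
 class Bin:
     op: str          # one of '&', '|', '->'
     left: object
     right: object
 
 def parse(s):
     # Map a string to a parse tree (pointer structure); reject ill-formed input.
     tree, rest = _formula(s.replace(' ', ''))
     if rest:
         raise ValueError(f"trailing input: {rest!r}")
     return tree
 
 def _formula(s):
     if not s:
         raise ValueError("unexpected end of input")
     if s[0].isalpha():                       # atomic proposition
         return Var(s[0]), s[1:]
     if s[0] == '~':                          # negation
         child, rest = _formula(s[1:])
         return Not(child), rest
     if s[0] == '(':                          # binary connective
         left, rest = _formula(s[1:])
         for op in ('->', '&', '|'):
             if rest.startswith(op):
                 right, rest = _formula(rest[len(op):])
                 if not rest.startswith(')'):
                     raise ValueError("expected ')'")
                 return Bin(op, left, right), rest[1:]
         raise ValueError("expected a connective")
     raise ValueError(f"unexpected character {s[0]!r}")
 
 def traverse(tree):
     # Inverse mapping: walk the parse tree back to a string.
     if isinstance(tree, Var):
         return tree.name
     if isinstance(tree, Not):
         return '~' + traverse(tree.child)
     return f"({traverse(tree.left)} {tree.op} {traverse(tree.right)})"
 
 formula = "(p & (p -> q))"
 assert traverse(parse(formula)) == formula   # parse then traverse round-trips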


Other logical calculi

Propositional calculus is roughly the simplest kind of logical calculus in current use. It can be extended in several ways. (Aristotelian "syllogistic" calculus, which is largely supplanted in modern logic, is in ''some'' ways simpler – but in other ways more complex – than propositional calculus.) The most immediate way to develop a more complex logical calculus is to introduce rules that are sensitive to more fine-grained details of the sentences being used.
First-order logic (a.k.a. first-order predicate logic) results when the "atomic sentences" of propositional logic are broken up into terms, variables, predicates, and quantifiers, all keeping the rules of propositional logic with some new ones introduced. (For example, from "All dogs are mammals" we may infer "If Rover is a dog then Rover is a mammal".) With the tools of first-order logic it is possible to formulate a number of theories, either with explicit axioms or by rules of inference, that can themselves be treated as logical calculi. Arithmetic is the best known of these; others include set theory and mereology.
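One natural way to formalize the Rover example in first-order notation (the predicate and constant names are of course our own choice) is

:: \forall x\, (\mathrm{Dog}(x) \to \mathrm{Mammal}(x)) \ \vdash\ \mathrm{Dog}(\mathrm{Rover}) \to \mathrm{Mammal}(\mathrm{Rover})

which combines universal instantiation with the propositional rules already available.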
Second-order logic and other higher-order logics are formal extensions of first-order logic. Thus, it makes sense to refer to propositional logic as ''"zeroth-order logic"'' when comparing it with these logics.

Modal logic also offers a variety of inferences that cannot be captured in propositional calculus. For example, from "Necessarily p" we may infer that p. From p we may infer "It is possible that p". The translation between modal logics and algebraic logics concerns classical and intuitionistic logics but with the introduction of a unary operator on Boolean or Heyting algebras, different from the Boolean operations, interpreting the possibility modality, and in the case of Heyting algebra a second operator interpreting necessity (for Boolean algebra this is redundant since necessity is the De Morgan dual of possibility). The first operator preserves 0 and disjunction while the second preserves 1 and conjunction; these laws are written out below.

Many-valued logics are those allowing sentences to have values other than ''true'' and ''false''. (For example, ''neither'' and ''both'' are standard "extra values"; "continuum logic" allows each sentence to have any of an infinite number of "degrees of truth" between ''true'' and ''false''.) These logics often require calculational devices quite distinct from propositional calculus. When the values form a Boolean algebra (which may have more than two or even infinitely many values), many-valued logic reduces to classical logic; many-valued logics are therefore only of independent interest when the values form an algebra that is not Boolean.
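Written out algebraically, the modal preservation laws and the Boolean duality mentioned above are (a sketch using \Diamond for possibility and \Box for necessity):

:: \Diamond 0 = 0, \qquad \Diamond(x \lor y) = \Diamond x \lor \Diamond y

:: \Box 1 = 1, \qquad \Box(x \land y) = \Box x \land \Box y

:: \Box x = \neg\,\Diamond\,\neg x \quad \text{(Boolean case only)}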


Solvers

Deciding satisfiability of propositional logic formulas is an NP-complete problem. However, practical methods exist (e.g., DPLL algorithm, 1962; Chaff algorithm, 2001) that are very fast for many useful cases. Recent work has extended the SAT solver algorithms to work with propositions containing arithmetic expressions; these are the SMT solvers.
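To give a flavour of how such procedures work, here is a minimal, unoptimized sketch of the DPLL idea on formulas in conjunctive normal form, with each clause encoded as a set of signed integers (a negative integer standing for a negated variable). It is illustrative only: solvers such as Chaff add conflict-driven clause learning, watched literals, and branching heuristics that this sketch omits.

 def dpll(clauses, assignment=None):
     # Return a satisfying assignment (dict var -> bool) or None if unsatisfiable.
     if assignment is None:
         assignment = {}
     clauses = simplify(clauses, assignment)
     if clauses is None:                      # some clause became empty: conflict
         return None
     if not clauses:                          # every clause satisfied
         return assignment
     # Unit propagation: a one-literal clause forces that literal's value.
     for clause in clauses:
         if len(clause) == 1:
             (lit,) = clause
             return dpll(clauses, {**assignment, abs(lit): lit > 0})
     # Branch on a variable from the first clause, trying True then False.
     lit = next(iter(clauses[0]))
     for value in (True, False):
         result = dpll(clauses, {**assignment, abs(lit): value})
         if result is not None:
             return result
     return None
 
 def simplify(clauses, assignment):
     # Drop satisfied clauses and falsified literals; None signals a conflict.
     out = []
     for clause in clauses:
         reduced = set()
         satisfied = False
         for lit in clause:
             var, want = abs(lit), lit > 0
             if var in assignment:
                 if assignment[var] == want:
                     satisfied = True
                     break
             else:
                 reduced.add(lit)
         if satisfied:
             continue
         if not reduced:
             return None                      # empty clause: current assignment fails
         out.append(frozenset(reduced))
     return out
 
 # (p or q) and (not p or q) and (not q or r): satisfiable, e.g. with q = r = True.
 cnf = [frozenset({1, 2}), frozenset({-1, 2}), frozenset({-2, 3})]
 print(dpll(cnf))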


See also


Higher logical levels

* First-order logic
* Second-order propositional logic
* Second-order logic
* Higher-order logic


Related topics

* Boolean algebra (logic)
* Boolean algebra (structure)
* Boolean algebra topics
* Boolean domain
* Boolean function
* Boolean-valued function
* Categorical logic
* Combinational logic
* Combinatory logic
* Conceptual graph
* Disjunctive syllogism
* Entitative graph
* Equational logic
* Existential graph
* Frege's propositional calculus
* Implicational propositional calculus
* Intuitionistic propositional calculus
* Jean Buridan
* ''Laws of Form''
* List of logic symbols
* Logical graph
* Logical NOR
* Logical value
* Mathematical logic
* Operation (mathematics)
* Paul of Venice
* Peirce's law
* Peter of Spain (author)
* Propositional formula
* Symmetric difference
* Tautology (rule of inference)
* Truth function
* Truth table
* Walter Burley
* William of Sherwood


References


Further reading

* Brown, Frank Markham (2003), ''Boolean Reasoning: The Logic of Boolean Equations'', 1st edition, Kluwer Academic Publishers, Norwell, MA. 2nd edition, Dover Publications, Mineola, NY.
* Chang, C.C. and Keisler, H.J. (1973), ''Model Theory'', North-Holland, Amsterdam, Netherlands.
* Kohavi, Zvi (1978), ''Switching and Finite Automata Theory'', 1st edition, McGraw–Hill, 1970. 2nd edition, McGraw–Hill, 1978.
* Korfhage, Robert R. (1974), ''Discrete Computational Structures'', Academic Press, New York, NY.
* Lambek, J. and Scott, P.J. (1986), ''Introduction to Higher Order Categorical Logic'', Cambridge University Press, Cambridge, UK.
* Mendelson, Elliott (1964), ''Introduction to Mathematical Logic'', D. Van Nostrand Company.



External links

* Klement, Kevin C. (2006), "Propositional Logic", in James Fieser and Bradley Dowden (eds.), ''Internet Encyclopedia of Philosophy''. Eprint.
* Formal Predicate Calculus, contains a systematic formal development along the lines of Alternative calculus.
* ''forall x: an introduction to formal logic'', by P.D. Magnus, covers formal semantics and proof theory for sentential logic.
* Chapter 2 / Propositional Logic, from Logic In Action.
* Propositional sequent calculus prover on Project Nayuki. (''note'': implication can be input in the form !X, Y, and a sequent can be a single formula prefixed with > and having no commas)
* Propositional Logic - A Generative Grammar