First-order logic—also known as predicate logic, quantificational logic, and first-order predicate calculus—is a collection of
formal systems used in
mathematics,
philosophy,
linguistics
, and
computer science
. First-order logic uses
quantified variables over non-logical objects, and allows the use of sentences that contain variables. Thus, rather than propositions such as "Socrates is a man", one can have expressions in the form "there exists ''x'' such that ''x'' is Socrates and ''x'' is a man", where "there exists" is a quantifier, while ''x'' is a variable. This distinguishes it from
propositional logic, which does not use quantifiers or
relations; in this sense, propositional logic is the foundation of first-order logic.
A theory about a topic is usually a first-order logic together with a specified
domain of discourse (over which the quantified variables range), finitely many functions from that domain to itself, finitely many
predicates defined on that domain, and a set of axioms believed to hold about them. Sometimes, "theory" is understood in a more formal sense as just a set of sentences in first-order logic.
The adjective "first-order" distinguishes first-order logic from
higher-order logic, in which there are predicates having predicates or functions as arguments, or in which quantification over predicates, functions, or both is permitted. In first-order theories, predicates are often associated with sets. In interpreted higher-order theories, predicates may be interpreted as sets of sets.
There are many
deductive systems for first-order logic which are both
sound
(i.e., all provable statements are true in all models) and
complete (i.e., all statements that are true in all models are provable). Although the
logical consequence relation is only
semidecidable, much progress has been made in
automated theorem proving in first-order logic. First-order logic also satisfies several
metalogical theorems that make it amenable to analysis in
proof theory, such as the
Löwenheim–Skolem theorem and the
compactness theorem.
First-order logic is the standard for the formalization of mathematics into
axioms, and is studied in the
foundations of mathematics.
Peano arithmetic and
Zermelo–Fraenkel set theory are axiomatizations of
number theory
and
set theory, respectively, into first-order logic.
No first-order theory, however, has the strength to uniquely describe a structure with an infinite domain, such as the
natural numbers or the
real line. Axiom systems that do fully describe these two structures (that is,
categorical axiom systems) can be obtained in stronger logics such as
second-order logic.
The foundations of first-order logic were developed independently by
Gottlob Frege and
Charles Sanders Peirce
. For a history of first-order logic and how it came to dominate formal logic, see José Ferreirós (2001).
Introduction
While
propositional logic deals with simple declarative propositions, first-order logic additionally covers
predicates and
quantification.
A predicate takes an entity or entities in the
domain of discourse and evaluates to
true or
false
. Consider the two sentences "Socrates is a philosopher" and "Plato is a philosopher". In
propositional logic, these sentences are viewed as being unrelated, and might be denoted, for example, by variables such as ''p'' and ''q''. The predicate "is a philosopher" occurs in both sentences, which have a common structure of "''a'' is a philosopher". The variable ''a'' is instantiated as "Socrates" in the first sentence, and is instantiated as "Plato" in the second sentence. While first-order logic allows for the use of predicates, such as "is a philosopher" in this example, propositional logic does not.
Relationships between predicates can be stated using
logical connectives. Consider, for example, the first-order formula "if ''a'' is a philosopher, then ''a'' is a scholar". This formula is a
conditional
statement with "''a'' is a philosopher" as its hypothesis, and "''a'' is a scholar" as its conclusion. The truth of this formula depends on which object is denoted by ''a'', and on the interpretations of the predicates "is a philosopher" and "is a scholar".
Quantifiers can be applied to variables in a formula. The variable ''a'' in the previous formula can be universally quantified, for instance, with the first-order sentence "For every ''a'', if ''a'' is a philosopher, then ''a'' is a scholar". The
universal quantifier
"for every" in this sentence expresses the idea that the claim "if ''a'' is a philosopher, then ''a'' is a scholar" holds for ''all'' choices of ''a''.
The ''
negation
'' of the sentence "For every ''a'', if ''a'' is a philosopher, then ''a'' is a scholar" is logically equivalent to the sentence "There exists ''a'' such that ''a'' is a philosopher and ''a'' is not a scholar". The
existential quantifier "there exists" expresses the idea that the claim "''a'' is a philosopher and ''a'' is not a scholar" holds for ''some'' choice of ''a''.
The predicates "is a philosopher" and "is a scholar" each take a single variable. In general, predicates can take several variables. In the first-order sentence "Socrates is the teacher of Plato", the predicate "is the teacher of" takes two variables.
An interpretation (or model) of a first-order formula specifies what each predicate means, and the entities that can instantiate the variables. These entities form the
domain of discourse or universe, which is usually required to be a nonempty set. For example, in an interpretation with the domain of discourse consisting of all human beings and the predicate "is a philosopher" understood as "was the author of the ''
Republic''", the sentence "There exists ''a'' such that ''a'' is a philosopher" is seen as being true, as witnessed by Plato.
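The evaluation of a sentence over a finite domain of discourse can be sketched directly in code. The following is an illustrative sketch only (the domain and the predicate name are taken from the example above, not from any standard library): the existential sentence is true exactly when some element of the domain satisfies the predicate.

```python
# Domain of discourse: a finite set of human beings (an assumption for
# illustration; the text only requires a nonempty set).
domain = {"Socrates", "Plato", "Aristotle", "Euclid"}

# One interpretation of "is a philosopher": "was the author of the
# Republic", which (as in the text) is true only of Plato.
def phil(a):
    return a == "Plato"

# "There exists a such that a is a philosopher" holds iff some element
# of the domain satisfies the predicate.
exists_phil = any(phil(a) for a in domain)
print(exists_phil)  # True, witnessed by Plato
```

Swapping in a different interpretation of the predicate, or a different domain, can change the truth value of the same sentence, which is the point of keeping interpretation separate from syntax.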
There are two key parts of first-order logic. The
syntax
determines which finite sequences of symbols are well-formed expressions in first-order logic, while the
semantics determines the meanings behind these expressions.
Syntax
Alphabet
Unlike natural languages, such as English, the language of first-order logic is completely formal, so that it can be mechanically determined whether a given expression is well formed. There are two key types of well-formed expressions: ''terms'', which intuitively represent objects, and ''formulas'', which intuitively express statements that can be true or false. The terms and formulas of first-order logic are strings of ''symbols'', where all the symbols together form the ''alphabet'' of the language. As with all
formal languages, the nature of the symbols themselves is outside the scope of formal logic; they are often regarded simply as letters and punctuation symbols.
It is common to divide the symbols of the alphabet into ''logical symbols'', which always have the same meaning, and ''non-logical symbols'', whose meaning varies by interpretation. For example, the logical symbol ∧ always represents "and"; it is never interpreted as "or", which is represented by the logical symbol ∨. However, a non-logical predicate symbol such as Phil(''x'') could be interpreted to mean "''x'' is a philosopher", "''x'' is a man named Philip", or any other unary predicate depending on the interpretation at hand.
Logical symbols
Logical symbols are a set of characters that vary by author, but usually include the following:
* Quantifier symbols: ∀ for universal quantification, and ∃ for existential quantification
* Logical connectives: ∧ for conjunction, ∨ for disjunction, → for implication, ↔ for biconditional, ¬ for negation. Some authors use C''pq'' instead of → and E''pq'' instead of ↔, especially in contexts where → is used for other purposes. Moreover, the horseshoe ⊃ may replace →; the triple-bar ≡ may replace ↔; a tilde (~), N''p'', or F''p'' may replace ¬; a double bar ‖, +, or A''pq'' may replace ∨; and an ampersand &, K''pq'', or the middle dot ⋅ may replace ∧, especially if these symbols are not available for technical reasons. (The aforementioned symbols C''pq'', E''pq'', N''p'', A''pq'', and K''pq'' are used in Polish notation.)
* Parentheses, brackets, and other punctuation symbols. The choice of such symbols varies depending on context.
* An infinite set of ''variables'', often denoted by lowercase letters at the end of the alphabet ''x'', ''y'', ''z'', ... . Subscripts are often used to distinguish variables: ''x''<sub>0</sub>, ''x''<sub>1</sub>, ''x''<sub>2</sub>, ... .
* An ''equality symbol'' (sometimes, ''identity symbol'') = (see below).
Not all of these symbols are required in first-order logic. Either one of the quantifiers along with negation, conjunction (or disjunction), variables, brackets, and equality suffices.
Other logical symbols include the following:
* Truth constants: T, V, or ⊤ for "true" and F, O, or ⊥ for "false" (V and O are from Polish notation). Without any such logical operators of valence 0, these two constants can only be expressed using quantifiers.
* Additional logical connectives such as the
Sheffer stroke, D''pq'' (NAND), and
exclusive or, J''pq''.
Non-logical symbols
Non-logical symbols represent predicates (relations), functions and constants. It used to be standard practice to use a fixed, infinite set of non-logical symbols for all purposes:
* For every integer ''n'' ≥ 0, there is a collection of
''n''-''ary'', or ''n''-''place'', ''
predicate symbols''. Because they represent
relations
between ''n'' elements, they are also called ''relation symbols''. For each arity ''n'', there is an infinite supply of them:
*:''P''<sup>''n''</sup><sub>0</sub>, ''P''<sup>''n''</sup><sub>1</sub>, ''P''<sup>''n''</sup><sub>2</sub>, ''P''<sup>''n''</sup><sub>3</sub>, ...
* For every integer ''n'' ≥ 0, there are infinitely many ''n''-ary ''function symbols'':
*:''f''<sup>''n''</sup><sub>0</sub>, ''f''<sup>''n''</sup><sub>1</sub>, ''f''<sup>''n''</sup><sub>2</sub>, ''f''<sup>''n''</sup><sub>3</sub>, ...
When the arity of a predicate symbol or function symbol is clear from context, the superscript ''n'' is often omitted.
In this traditional approach, there is only one language of first-order logic. This approach is still common, especially in philosophically oriented books.
A more recent practice is to use different non-logical symbols according to the application one has in mind. Therefore, it has become necessary to name the set of all non-logical symbols used in a particular application. This choice is made via a ''
signature''.
Typical signatures in mathematics are {1, ×} or just {×} for
groups, or {0, 1, +, ×, <} for
ordered fields. There are no restrictions on the number of non-logical symbols. The signature can be
empty, finite, or infinite, even
uncountable. Uncountable signatures occur for example in modern proofs of the
Löwenheim–Skolem theorem.
Though signatures might in some cases imply how non-logical symbols are to be interpreted,
interpretation
of the non-logical symbols in the signature is separate (and not necessarily fixed). Signatures concern syntax rather than semantics.
In this approach, every non-logical symbol is of one of the following types:
* A ''predicate symbol'' (or ''relation symbol'') with some ''valence'' (or ''arity'', number of arguments) greater than or equal to 0. These are often denoted by uppercase letters such as ''P'', ''Q'' and ''R''. Examples:
** In ''P''(''x''), ''P'' is a predicate symbol of valence 1. One possible interpretation is "''x'' is a man".
** In ''Q''(''x'',''y''), ''Q'' is a predicate symbol of valence 2. Possible interpretations include "''x'' is greater than ''y''" and "''x'' is the father of ''y''".
** Relations of valence 0 can be identified with
propositional variables, which can stand for any statement. One possible interpretation of ''R'' is "Socrates is a man".
* A ''function symbol'', with some valence greater than or equal to 0. These are often denoted by lowercase
roman letters such as ''f'', ''g'' and ''h''. Examples:
** ''f''(''x'') may be interpreted as "the father of ''x''". In
arithmetic, it may stand for "-x". In
set theory, it may stand for "the
power set of x".
** In arithmetic, ''g''(''x'',''y'') may stand for "''x''+''y''". In set theory, it may stand for "the union of ''x'' and ''y''".
** Function symbols of valence 0 are called ''constant symbols'', and are often denoted by lowercase letters at the beginning of the alphabet such as ''a'', ''b'' and ''c''. The symbol ''a'' may stand for Socrates. In arithmetic, it may stand for 0. In set theory, it may stand for the
empty set.
The traditional approach can be recovered in the modern approach, by simply specifying the "custom" signature to consist of the traditional sequences of non-logical symbols.
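The signature-based approach lends itself to a direct representation in code. The following is a minimal sketch (the class and field names are my own choices, not a standard API): a signature records each non-logical symbol together with its arity, with constants appearing as nullary function symbols.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Signature:
    functions: dict   # function symbol -> arity (constants have arity 0)
    predicates: dict  # predicate (relation) symbol -> arity

# The language of ordered fields, with signature (0, 1, +, ×, <):
ordered_fields = Signature(
    functions={"0": 0, "1": 0, "+": 2, "×": 2},
    predicates={"<": 2},
)
print(ordered_fields.functions["+"])   # 2
print(ordered_fields.predicates["<"])  # 2
```

Note that the signature fixes only syntax: nothing here says what + or < mean. An interpretation is supplied separately, which mirrors the separation described in the text.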
Formation rules
The
formation rules define the terms and formulas of first-order logic. When terms and formulas are represented as strings of symbols, these rules can be used to write a
formal grammar
In formal language theory, a grammar (when the context is not given, often called a formal grammar for clarity) describes how to form strings from a language's alphabet that are valid according to the language's syntax. A grammar does not describe ...
for terms and formulas. These rules are generally
context-free (each production has a single symbol on the left side), except that the set of symbols may be allowed to be infinite and there may be many start symbols, for example the variables in the case of
terms.
Terms
The set of ''
terms'' is
inductively defined by the following rules:
* ''Variables''. Any variable symbol is a term.
* ''Functions''. If ''f'' is an ''n''-ary function symbol, and ''t''<sub>1</sub>, ..., ''t''<sub>''n''</sub> are terms, then ''f''(''t''<sub>1</sub>, ..., ''t''<sub>''n''</sub>) is a term. In particular, symbols denoting individual constants are nullary function symbols, and thus are terms.
Only expressions which can be obtained by finitely many applications of rules 1 and 2 are terms. For example, no expression involving a predicate symbol is a term.
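The two rules above form an inductive definition, which translates naturally into a recursive datatype plus a recursive check. The sketch below is illustrative (the class names and the arity map are my own): a term is either a variable or a function symbol applied to the right number of smaller terms.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:          # rule 1: any variable symbol is a term
    name: str

@dataclass(frozen=True)
class Func:         # rule 2: f(t1, ..., tn) for an n-ary function symbol f
    symbol: str
    args: tuple     # a tuple of terms

def is_term(t, arities):
    """Check well-formedness against a map from function symbols to arities."""
    if isinstance(t, Var):
        return True
    if isinstance(t, Func):
        return (arities.get(t.symbol) == len(t.args)
                and all(is_term(a, arities) for a in t.args))
    return False

# With f unary and c a constant (a nullary function symbol), f(f(c)) is a term:
arities = {"f": 1, "c": 0}
t = Func("f", (Func("f", (Func("c", ()),)),))
print(is_term(t, arities))  # True
```

Because the recursion only descends into strictly smaller subterms, the check terminates, which reflects the requirement that a term arise from finitely many applications of the rules.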
Formulas
The set of ''
formulas'' (also called ''
well-formed formulas
'' or ''WFFs'') is inductively defined by the following rules:
* ''Predicate symbols''. If ''P'' is an ''n''-ary predicate symbol and ''t''<sub>1</sub>, ..., ''t''<sub>''n''</sub> are terms then ''P''(''t''<sub>1</sub>, ..., ''t''<sub>''n''</sub>) is a formula.
* ''Equality''. If the equality symbol is considered part of logic, and ''t''<sub>1</sub> and ''t''<sub>2</sub> are terms, then ''t''<sub>1</sub> = ''t''<sub>2</sub> is a formula.
* ''Negation''. If ''φ'' is a formula, then ¬''φ'' is a formula.
* ''Binary connectives''. If ''φ'' and ''ψ'' are formulas, then (''φ'' → ''ψ'') is a formula. Similar rules apply to other binary logical connectives.
* ''Quantifiers''. If ''φ'' is a formula and ''x'' is a variable, then ∀''x'' ''φ'' (for all ''x'', ''φ'' holds) and ∃''x'' ''φ'' (there exists ''x'' such that ''φ'') are formulas.
Only expressions which can be obtained by finitely many applications of rules 1–5 are formulas. The formulas obtained from the first two rules are said to be ''
atomic formulas''.
For example,
:∀''x''∀''y'' (''P''(''f''(''x'')) → ¬(''P''(''x'') → ''Q''(''f''(''y''), ''x'', ''z'')))
is a formula, if ''f'' is a unary function symbol, ''P'' a unary predicate symbol, and ''Q'' a ternary predicate symbol. However, ∀''x'' ''x'' → is not a formula, although it is a string of symbols from the alphabet.
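Rules 1–5 can likewise be mirrored as a small family of constructors, one per rule. This is a sketch under my own naming choices (the equality rule is omitted for brevity, and terms are kept as plain strings rather than structured objects):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Pred:      # rule 1: an n-ary predicate symbol applied to n terms
    symbol: str
    args: tuple  # terms, represented as plain strings here for brevity

@dataclass(frozen=True)
class Not:       # rule 3: negation
    body: object

@dataclass(frozen=True)
class Implies:   # rule 4: one of the binary connectives
    left: object
    right: object

@dataclass(frozen=True)
class ForAll:    # rule 5: universal quantification
    var: str
    body: object

# The example formula from the text, with f unary, P unary, and Q ternary:
# forall x forall y ( P(f(x)) -> not ( P(x) -> Q(f(y), x, z) ) )
phi = ForAll("x", ForAll("y", Implies(
    Pred("P", ("f(x)",)),
    Not(Implies(Pred("P", ("x",)),
                Pred("Q", ("f(y)", "x", "z")))))))
print(type(phi).__name__)  # ForAll
```

Because each constructor records exactly which rule built the formula, the nested object is its own parse tree; the unique-readability property discussed below corresponds to there being exactly one such tree per formula.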
The role of the parentheses in the definition is to ensure that any formula can only be obtained in one way—by following the inductive definition (i.e., there is a unique
parse tree for each formula). This property is known as ''unique readability'' of formulas. There are many conventions for where parentheses are used in formulas. For example, some authors use colons or full stops instead of parentheses, or change the places in which parentheses are inserted. Each author's particular definition must be accompanied by a proof of unique readability.
This definition of a formula does not support defining an if-then-else function
ite(c, a, b)
, where "c" is a condition expressed as a formula, that would return "a" if c is true, and "b" if it is false. This is because both predicates and functions can only accept terms as parameters, but the first parameter is a formula. Some languages built on first-order logic, such as SMT-LIB 2.0, add this.
Notational conventions
For convenience, conventions have been developed about the precedence of the logical operators, to avoid the need to write parentheses in some cases. These rules are similar to the
order of operations in arithmetic. A common convention is:
* ¬ is evaluated first
* ∧ and ∨ are evaluated next
* Quantifiers are evaluated next
* → is evaluated last.
Moreover, extra punctuation not required by the definition may be inserted to make formulas easier to read. Thus the formula
:¬∀''x'' ''P''(''x'') → ∃''x'' ¬''P''(''x'')
might be written as
:(¬[∀''x'' ''P''(''x'')]) → ∃''x'' [¬''P''(''x'')].
In some fields, it is common to use infix notation for binary relations and functions, instead of the prefix notation defined above. For example, in arithmetic, one typically writes "2 + 2 = 4" instead of "=(+(2,2),4)". It is common to regard formulas in infix notation as abbreviations for the corresponding formulas in prefix notation, cf. also
term structure vs. representation.
The definitions above use infix notation for binary connectives such as →. A less common convention is
Polish notation, in which one writes ¬, ∧, ∨, and so on in front of their arguments rather than between them. This convention is advantageous in that it allows all punctuation symbols to be discarded. As such, Polish notation is compact and elegant, but rarely used in practice because it is hard for humans to read. In Polish notation, the formula
:∀''x''∀''y'' (''P''(''f''(''x'')) → ¬(''P''(''x'') → ''Q''(''f''(''y''), ''x'', ''z'')))
becomes "ΠxΠyCPfxNCPxQfyxz".
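The conversion to Polish notation is a straightforward recursive traversal. The sketch below covers only the propositional connectives (quantifiers are omitted, and the tuple encoding and function name are my own), using the C/E/N/A/K letters mentioned earlier:

```python
# Map each connective to its Polish-notation letter (see the list of
# logical symbols above: C = implication, E = biconditional, N = negation,
# A = disjunction, K = conjunction).
PREFIX = {"->": "C", "<->": "E", "not": "N", "or": "A", "and": "K"}

def polish(phi):
    """Render a formula, given as nested tuples, in Polish notation."""
    if isinstance(phi, str):          # an atomic proposition such as "p"
        return phi
    op, *args = phi
    return PREFIX[op] + "".join(polish(a) for a in args)

# (p -> q) and (not p)   becomes   KCpqNp  -- no parentheses needed.
print(polish(("and", ("->", "p", "q"), ("not", "p"))))  # KCpqNp
```

Since every connective has a fixed arity, the prefix string can be decoded unambiguously without any punctuation, which is exactly why the parentheses can be discarded.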
Free and bound variables
In a formula, a variable may occur ''free'' or ''bound'' (or both). Intuitively, a variable occurrence is free in a formula if it is not quantified: in ∀''y'' ''P''(''x'', ''y''), the sole occurrence of variable ''x'' is free while that of ''y'' is bound. The free and bound variable occurrences in a formula are defined inductively as follows.
; Atomic formulas : If ''φ'' is an atomic formula, then ''x'' occurs free in ''φ'' if and only if ''x'' occurs in ''φ''. Moreover, there are no bound variables in any atomic formula.
; Negation : ''x'' occurs free in ¬''φ'' if and only if ''x'' occurs free in ''φ''. ''x'' occurs bound in ¬''φ'' if and only if ''x'' occurs bound in ''φ''.
; Binary connectives : ''x'' occurs free in (''φ'' → ''ψ'') if and only if ''x'' occurs free in either ''φ'' or ''ψ''. ''x'' occurs bound in (''φ'' → ''ψ'') if and only if ''x'' occurs bound in either ''φ'' or ''ψ''. The same rule applies to any other binary connective in place of →.
; Quantifiers : ''x'' occurs free in ∀''y'' ''φ'' if and only if ''x'' occurs free in ''φ'' and ''x'' is a different symbol from ''y''. Also, ''x'' occurs bound in ∀''y'' ''φ'' if and only if ''x'' is ''y'' or ''x'' occurs bound in ''φ''. The same rule holds with ∃ in place of ∀.
For example, in ∀''x''∀''y'' (''P''(''x'') → ''Q''(''x'', ''f''(''x''), ''z'')), ''x'' and ''y'' occur only bound, ''z'' occurs only free, and ''w'' is neither because it does not occur in the formula.
Free and bound variables of a formula need not be disjoint sets: in the formula ''P''(''x'') → ∀''x'' ''Q''(''x''), the first occurrence of ''x'', as argument of ''P'', is free while the second one, as argument of ''Q'', is bound.
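The inductive rules above amount to a recursive computation of the set of free variables. A minimal sketch (the tuple encoding is my own): atomic formulas contribute all their variables, connectives take unions, and a quantifier removes its bound variable.

```python
def free_vars(phi):
    """Compute the set of variables occurring free in a formula,
    represented as nested tuples."""
    op = phi[0]
    if op == "pred":                       # atomic: every variable occurs free
        return set(phi[2])                 # phi = ("pred", symbol, variables)
    if op == "not":                        # negation: same free variables
        return free_vars(phi[1])
    if op in ("->", "and", "or", "<->"):   # binary connectives: union
        return free_vars(phi[1]) | free_vars(phi[2])
    if op in ("forall", "exists"):         # a quantifier binds its variable
        return free_vars(phi[2]) - {phi[1]}
    raise ValueError(f"unknown constructor: {op}")

# P(x) -> forall x Q(x): the first x is free, the quantified one is bound.
phi = ("->", ("pred", "P", ("x",)), ("forall", "x", ("pred", "Q", ("x",))))
print(sorted(free_vars(phi)))  # ['x']
```

Note this computes free ''variables'', not occurrences: as in the example above, ''x'' is reported free because at least one of its occurrences is free, even though another occurrence is bound.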
A formula in first-order logic with no free variable occurrences is called a ''first-order
sentence''. These are the formulas that will have well-defined
truth values under an interpretation. For example, whether a formula such as Phil(''x'') is true must depend on what ''x'' represents. But the sentence ∃''x'' Phil(''x'') will be either true or false in a given interpretation.
Example: ordered abelian groups
In mathematics, the language of ordered
abelian groups has one constant symbol 0, one unary function symbol −, one binary function symbol +, and one binary relation symbol ≤. Then:
*The expressions +(''x'', ''y'') and +(''x'', +(''y'', −(''z''))) are ''terms''. These are usually written as ''x'' + ''y'' and ''x'' + ''y'' − ''z''.
*The expressions +(''x'', ''y'') = 0 and ≤(+(''x'', +(''y'', −(''z''))), +(''x'', ''y'')) are ''atomic formulas''. These are usually written as ''x'' + ''y'' = 0 and ''x'' + ''y'' − ''z'' ≤ ''x'' + ''y''.
*The expression