In formal language theory, a grammar (when the context is not given, often called a formal grammar for clarity) describes how to form strings from a language's alphabet that are valid according to the language's syntax. A grammar does not describe the meaning of the strings or what can be done with them in whatever context—only their form. A formal grammar is defined as a set of production rules for such strings in a formal language.

Formal language theory, the discipline that studies formal grammars and languages, is a branch of applied mathematics. Its applications are found in theoretical computer science, theoretical linguistics, formal semantics, mathematical logic, and other areas.

A formal grammar is a set of rules for rewriting strings, along with a "start symbol" from which rewriting starts. Therefore, a grammar is usually thought of as a language generator. However, it can also sometimes be used as the basis for a "recognizer"—a function in computing that determines whether a given string belongs to the language or is grammatically incorrect. To describe such recognizers, formal language theory uses separate formalisms, known as automata theory. One of the interesting results of automata theory is that it is not possible to design a recognizer for certain formal languages.

Parsing is the process of recognizing an utterance (a string in natural languages) by breaking it down to a set of symbols and analyzing each one against the grammar of the language. Most languages have the meanings of their utterances structured according to their syntax—a practice known as compositional semantics. As a result, the first step to describing the meaning of an utterance in language is to break it down part by part and look at its analyzed form (known as its parse tree in computer science, and as its deep structure in generative grammar).


History

Pāṇini's treatise ''Aṣṭādhyāyī'' gives formal production rules and definitions to describe the formal grammar of Sanskrit. There are different uses of "form" and "formalism", which have changed over time, depending on the fields the relevant author was in contact with. A historical overview of the concept is given in


Introductory example

A grammar mainly consists of a set of ''production rules'', rewriting rules for transforming strings. Each rule specifies a replacement of a particular string (its ''left-hand side'') with another (its ''right-hand side''). A rule can be applied to each string that contains its left-hand side and produces a string in which an occurrence of that left-hand side has been replaced with its right-hand side.

Unlike a semi-Thue system, which is wholly defined by these rules, a grammar further distinguishes between two kinds of symbols: ''nonterminal'' and ''terminal'' symbols; each left-hand side must contain at least one nonterminal symbol. It also distinguishes a special nonterminal symbol, called the ''start symbol''. The language generated by the grammar is defined to be the set of all strings without any nonterminal symbols that can be generated from the string consisting of a single start symbol by (possibly repeated) application of its rules in whatever way possible. If there are essentially different ways of generating the same single string, the grammar is said to be ambiguous.

In the following examples, the terminal symbols are ''a'' and ''b'', and the start symbol is ''S''.


Example 1

Suppose we have the following production rules:

: 1. S \rightarrow aSb
: 2. S \rightarrow ba

Then we start with ''S'', and can choose a rule to apply to it. If we choose rule 1, we obtain the string ''aSb''. If we then choose rule 1 again, we replace ''S'' with ''aSb'' and obtain the string ''aaSbb''. If we now choose rule 2, we replace ''S'' with ''ba'' and obtain the string ''aababb'', and are done. We can write this series of choices more briefly, using symbols: S \Rightarrow aSb \Rightarrow aaSbb \Rightarrow aababb. The language of the grammar is the infinite set \left\{ a^n b a b^n \mid n \ge 0 \right\} = \left\{ ba, abab, aababb, aaababbb, \dotsc \right\}, where a^k denotes ''a'' repeated k times (and n in particular represents the number of times production rule 1 has been applied). This grammar is context-free (only single nonterminals appear as left-hand sides) and unambiguous.
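As an illustration, this derivation can be replayed mechanically: starting from the start symbol, each step replaces one occurrence of a rule's left-hand side with its right-hand side. The following minimal Python sketch does exactly that for the two rules above (the names RULES and derive are invented for this illustration).

```python
# Replay a derivation in the grammar S -> aSb (rule 1), S -> ba (rule 2).
RULES = {1: ("S", "aSb"), 2: ("S", "ba")}

def derive(choices, start="S"):
    """Apply the chosen rules in order, rewriting the first occurrence each time."""
    string = start
    for choice in choices:
        lhs, rhs = RULES[choice]
        string = string.replace(lhs, rhs, 1)  # replace one occurrence of the LHS
    return string

print(derive([1, 1, 2]))  # prints 'aababb', matching S => aSb => aaSbb => aababb
```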


Examples 2 and 3

Suppose the rules are these instead:

: 1. S \rightarrow a
: 2. S \rightarrow SS
: 3. aSa \rightarrow b

This grammar is not context-free due to rule 3, and it is ambiguous due to the multiple ways in which rule 2 can be used to generate sequences of ''S''s. However, the language it generates is simply the set of all nonempty strings consisting of ''a''s and/or ''b''s. This is easy to see: to generate a ''b'' from an ''S'', use rule 2 twice to generate ''SSS'', then rule 1 twice and rule 3 once to produce ''b''. This means we can generate arbitrary nonempty sequences of ''S''s and then replace each of them with ''a'' or ''b'' as we please. The same language can alternatively be generated by a context-free, unambiguous grammar; for instance, the regular grammar with rules

: 1. S \rightarrow aS
: 2. S \rightarrow bS
: 3. S \rightarrow a
: 4. S \rightarrow b
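The regular grammar above can likewise be explored mechanically. The rough Python sketch below (the names RULES and sentences are chosen for this example only) enumerates sentential forms breadth-first and yields the sentences, that is, the strings containing no nonterminal; the first few outputs are exactly the shortest nonempty strings over {a, b}.

```python
from collections import deque
from itertools import islice

# Rules of the regular grammar S -> aS | bS | a | b.
RULES = [("S", "aS"), ("S", "bS"), ("S", "a"), ("S", "b")]

def sentences():
    """Breadth-first search over sentential forms, yielding terminal strings."""
    queue = deque(["S"])
    while queue:
        form = queue.popleft()
        if "S" not in form:          # no nonterminals left: this is a sentence
            yield form
            continue
        for lhs, rhs in RULES:
            queue.append(form.replace(lhs, rhs, 1))

print(list(islice(sentences(), 8)))  # first sentences: 'a', 'b', 'aa', 'ab', 'ba', 'bb', ...
```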


Formal definition


The syntax of grammars

In the classic formalization of generative grammars first proposed by Noam Chomsky in the 1950s, a grammar ''G'' consists of the following components:

* A finite set ''N'' of ''nonterminal symbols'', that is disjoint with the strings formed from ''G''.
* A finite set \Sigma of ''terminal symbols'' that is disjoint from ''N''.
* A finite set ''P'' of ''production rules'', each rule of the form
:: (\Sigma \cup N)^* N (\Sigma \cup N)^* \rightarrow (\Sigma \cup N)^*
: where {}^* is the Kleene star operator and \cup denotes set union. That is, each production rule maps from one string of symbols to another, where the first string (the "head") contains an arbitrary number of symbols provided at least one of them is a nonterminal. In the case that the second string (the "body") consists solely of the empty string—i.e., that it contains no symbols at all—it may be denoted with a special notation (often \Lambda, ''e'' or \epsilon) in order to avoid confusion.
* A distinguished symbol S \in N that is the ''start symbol'', also called the ''sentence symbol''.

A grammar is formally defined as the tuple (N, \Sigma, P, S). Such a formal grammar is often called a rewriting system or a phrase structure grammar in the literature.


Some mathematical constructs regarding formal grammars

The operation of a grammar can be defined in terms of relations on strings:

* Given a grammar G = (N, \Sigma, P, S), the binary relation \underset{G}{\Rightarrow} (pronounced as "G derives in one step") on strings in (\Sigma \cup N)^* is defined by:
*: x \underset{G}{\Rightarrow} y \iff \exists u, v, p, q \in (\Sigma \cup N)^*: (x = upv) \wedge (p \rightarrow q \in P) \wedge (y = uqv)
* The relation \overset{*}{\underset{G}{\Rightarrow}} (pronounced as "G derives in zero or more steps") is defined as the reflexive transitive closure of \underset{G}{\Rightarrow}.
* A ''sentential form'' is a member of (\Sigma \cup N)^* that can be derived in a finite number of steps from the start symbol S; that is, a sentential form is a member of \left\{ w \in (\Sigma \cup N)^* \mid S \overset{*}{\underset{G}{\Rightarrow}} w \right\}. A sentential form that contains no nonterminal symbols (i.e. is a member of \Sigma^*) is called a ''sentence''.
* The ''language'' of G, denoted as \boldsymbol{L}(G), is defined as the set of sentences built by G.

Note that the grammar G = (N, \Sigma, P, S) is effectively the semi-Thue system (N \cup \Sigma, P), rewriting strings in exactly the same way; the only difference is that we distinguish specific ''nonterminal'' symbols, which must be rewritten in rewrite rules, and that we are only interested in rewritings from the designated start symbol S to strings without nonterminal symbols.
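To make these definitions concrete, here is a small Python sketch (the class and function names are ad hoc, invented for this illustration) that represents a grammar as the tuple (N, \Sigma, P, S) and tests the one-step derivation relation by searching for a decomposition x = upv, y = uqv with p \rightarrow q in P.

```python
from typing import NamedTuple

class Grammar(NamedTuple):
    nonterminals: frozenset   # N
    terminals: frozenset      # Sigma, disjoint from N
    productions: tuple        # P: pairs (p, q), each meaning p -> q
    start: str                # S, a member of N

def derives_in_one_step(g, x, y):
    """True iff x = upv and y = uqv for some production p -> q in g.productions."""
    for p, q in g.productions:
        i = x.find(p)
        while i != -1:
            if x[:i] + q + x[i + len(p):] == y:
                return True
            i = x.find(p, i + 1)
    return False

g = Grammar(frozenset("S"), frozenset("ab"), (("S", "aSb"), ("S", "ba")), "S")
print(derives_in_one_step(g, "aSb", "aaSbb"))    # True: u='a', v='b', rule S -> aSb
print(derives_in_one_step(g, "aSb", "aababb"))   # False: that needs two steps, not one
```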


Example

''For these examples, formal languages are specified using set-builder notation.''

Consider the grammar G where N = \left\{ S, B \right\}, \Sigma = \left\{ a, b, c \right\}, S is the start symbol, and P consists of the following production rules:

: 1. S \rightarrow aBSc
: 2. S \rightarrow abc
: 3. Ba \rightarrow aB
: 4. Bb \rightarrow bb

This grammar defines the language L(G) = \left\{ a^n b^n c^n \mid n \ge 1 \right\}, where a^n denotes a string of ''n'' consecutive a's. Thus, the language is the set of strings that consist of 1 or more a's, followed by the same number of b's, followed by the same number of c's.

Some examples of the derivation of strings in L(G) are:

: S \underset{2}{\Rightarrow} abc
: S \underset{1}{\Rightarrow} aBSc \underset{2}{\Rightarrow} aBabcc \underset{3}{\Rightarrow} aaBbcc \underset{4}{\Rightarrow} aabbcc

(Note on notation: P \underset{i}{\Rightarrow} Q reads "string P generates string Q by means of production i".)
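The derivation of ''aabbcc'' shown above can also be replayed step by step; the short Python sketch below (illustrative only, with an invented RULES table) applies rules 1, 2, 3 and 4 in that order, rewriting the first occurrence of each rule's left-hand side.

```python
# Rules of the grammar G: S -> aBSc, S -> abc, Ba -> aB, Bb -> bb.
RULES = {1: ("S", "aBSc"), 2: ("S", "abc"), 3: ("Ba", "aB"), 4: ("Bb", "bb")}

form = "S"
for rule in (1, 2, 3, 4):
    lhs, rhs = RULES[rule]
    assert lhs in form, f"rule {rule} is not applicable to {form!r}"
    form = form.replace(lhs, rhs, 1)
    print(rule, form)
# Output:
# 1 aBSc
# 2 aBabcc
# 3 aaBbcc
# 4 aabbcc
```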


The Chomsky hierarchy

When Noam Chomsky first formalized generative grammars in 1956, he classified them into types now known as the Chomsky hierarchy. The difference between these types is that they have increasingly strict production rules and can therefore express fewer formal languages. Two important types are ''context-free grammars'' (Type 2) and ''regular grammars'' (Type 3). The languages that can be described with such a grammar are called ''context-free languages'' and ''regular languages'', respectively. Although much less powerful than unrestricted grammars (Type 0), which can in fact express any language that can be accepted by a Turing machine, these two restricted types of grammars are most often used because parsers for them can be efficiently implemented. (Grune, Dick & Jacobs, Ceriel H., ''Parsing Techniques – A Practical Guide'', Ellis Horwood, England, 1990.) For example, all regular languages can be recognized by a finite-state machine, and for useful subsets of context-free grammars there are well-known algorithms to generate efficient LL parsers and LR parsers to recognize the corresponding languages those grammars generate.


Context-free grammars

A ''context-free grammar'' is a grammar in which the left-hand side of each production rule consists of only a single nonterminal symbol. This restriction is non-trivial; not all languages can be generated by context-free grammars. Those that can are called ''context-free languages''.

The language L(G) = \left\{ a^n b^n c^n \mid n \ge 1 \right\} defined above is not a context-free language, and this can be strictly proven using the pumping lemma for context-free languages, but for example the language \left\{ a^n b^n \mid n \ge 1 \right\} (at least 1 a followed by the same number of b's) is context-free, as it can be defined by the grammar G_2 with N = \left\{ S \right\}, \Sigma = \left\{ a, b \right\}, S the start symbol, and the following production rules:

: 1. S \rightarrow aSb
: 2. S \rightarrow ab

A context-free language can be recognized in O(n^3) time (''see'' Big O notation) by an algorithm such as Earley's recogniser. That is, for every context-free language, a machine can be built that takes a string as input and determines in O(n^3) time whether the string is a member of the language, where n is the length of the string. (Earley, Jay, "An Efficient Context-Free Parsing Algorithm," ''Communications of the ACM'', Vol. 13 No. 2, pp. 94-102, February 1970.) Deterministic context-free languages are a subset of context-free languages that can be recognized in linear time. There exist various algorithms that target either this set of languages or some subset of it.
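The grammar G_2 is simple enough that a recognizer can be read off its rules directly. The following Python sketch illustrates that idea for this particular language only (it is not a general context-free parsing algorithm); it runs in linear time, consistent with the language being deterministic context-free.

```python
def generated_by_G2(s):
    """True iff s can be derived from S using S -> aSb (rule 1) or S -> ab (rule 2)."""
    if s == "ab":                              # rule 2 ends the derivation
        return True
    if len(s) >= 4 and s[0] == "a" and s[-1] == "b":
        return generated_by_G2(s[1:-1])        # rule 1 added the outer a ... b pair
    return False

print([generated_by_G2(w) for w in ("ab", "aaabbb", "aab", "ba")])
# [True, True, False, False]
```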


Regular grammars

In regular grammars, the left-hand side is again only a single nonterminal symbol, but now the right-hand side is also restricted. The right side may be the empty string, or a single terminal symbol, or a single terminal symbol followed by a nonterminal symbol, but nothing else. (Sometimes a broader definition is used: one can allow longer strings of terminals or single nonterminals without anything else, making languages easier to denote while still defining the same class of languages.)

The language \left\{ a^n b^n \mid n \ge 1 \right\} defined above is not regular, but the language \left\{ a^n b^m \mid m, n \ge 1 \right\} (at least 1 a followed by at least 1 b, where the numbers may be different) is, as it can be defined by the grammar G_3 with N = \left\{ S, A, B \right\}, \Sigma = \left\{ a, b \right\}, S the start symbol, and the following production rules:

: 1. S \rightarrow aA
: 2. A \rightarrow aA
: 3. A \rightarrow bB
: 4. B \rightarrow bB
: 5. B \rightarrow \epsilon

All languages generated by a regular grammar can be recognized in O(n) time by a finite-state machine. In practice, regular grammars are commonly expressed using regular expressions, although some forms of regular expression used in practice do not strictly generate the regular languages and do not show linear recognition performance due to those deviations.
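The correspondence between G_3 and a finite-state machine can be made explicit: each nonterminal becomes a state, each rule of the form X \rightarrow aY becomes a transition, and the rule B \rightarrow \epsilon makes B an accepting state. The following Python sketch (an illustration, with names invented here) recognizes the language in a single left-to-right pass.

```python
# States S, A, B mirror the nonterminals of the regular grammar G_3.
TRANSITIONS = {
    ("S", "a"): "A",   # S -> aA
    ("A", "a"): "A",   # A -> aA
    ("A", "b"): "B",   # A -> bB
    ("B", "b"): "B",   # B -> bB
}
ACCEPTING = {"B"}      # B -> epsilon

def accepts(word):
    """Run the finite-state machine over the word, one symbol at a time."""
    state = "S"
    for symbol in word:
        state = TRANSITIONS.get((state, symbol))
        if state is None:          # no transition: reject immediately
            return False
    return state in ACCEPTING

print([accepts(w) for w in ("ab", "aaabb", "ba", "a")])  # [True, True, False, False]
```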


Other forms of generative grammars

Many extensions and variations on Chomsky's original hierarchy of formal grammars have been developed, both by linguists and by computer scientists, usually either in order to increase their expressive power or in order to make them easier to analyze or parse. Some forms of grammars developed include:

* Tree-adjoining grammars increase the expressiveness of conventional generative grammars by allowing rewrite rules to operate on parse trees instead of just strings. (Joshi, Aravind K., ''et al.'', "Tree Adjunct Grammars," ''Journal of Computer Systems Science'', Vol. 10 No. 1, pp. 136-163, 1975.)
* Affix grammars (Koster, Cornelis H. A., "Affix Grammars," in ''ALGOL 68 Implementation'', North Holland Publishing Company, Amsterdam, pp. 95-109, 1971.) and attribute grammars (Knuth, Donald E., "Semantics of Context-Free Languages," ''Mathematical Systems Theory'', Vol. 2 No. 2, pp. 127-145, 1968; Knuth, Donald E., "Semantics of Context-Free Languages (correction)," ''Mathematical Systems Theory'', Vol. 5 No. 1, pp. 95-96, 1971.) allow rewrite rules to be augmented with semantic attributes and operations, useful both for increasing grammar expressiveness and for constructing practical language translation tools.


Recursive grammars

A recursive grammar is a grammar that contains production rules that are recursive. For example, a grammar for a context-free language is left-recursive if there exists a nonterminal symbol ''A'' that can be put through the production rules to produce a string with ''A'' as the leftmost symbol. An example of recursion in natural language is a clause nested within a sentence, set off by two commas. All types of grammars in the Chomsky hierarchy can be recursive.
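As a small illustration of why left recursion matters in practice, consider the rules A \rightarrow Aa and A \rightarrow a: the nonterminal ''A'' reappears as the leftmost symbol of its own right-hand side, so a naive top-down parser would call itself at the same input position forever. The right-recursive rules A \rightarrow aA and A \rightarrow a generate the same language \left\{ a^n \mid n \ge 1 \right\} and can be parsed directly, as in this sketch (illustrative names only).

```python
def parse_A(s, i=0):
    """Recognize a^n (n >= 1) with the right-recursive rules A -> aA | a."""
    if i < len(s) and s[i] == "a":
        # Either this 'a' ends the string (rule A -> a) or another A follows (A -> aA).
        return i + 1 == len(s) or parse_A(s, i + 1)
    return False

print(parse_A("aaa"), parse_A("ab"), parse_A(""))  # True False False
```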


Analytic grammars

Though there is a tremendous body of literature on parsing algorithms, most of these algorithms assume that the language to be parsed is initially ''described'' by means of a ''generative'' formal grammar, and that the goal is to transform this generative grammar into a working parser. Strictly speaking, a generative grammar does not in any way correspond to the algorithm used to parse a language, and various algorithms have different restrictions on the form of production rules that are considered well-formed. An alternative approach is to formalize the language in terms of an analytic grammar in the first place, which more directly corresponds to the structure and semantics of a parser for the language. Examples of analytic grammar formalisms include the following:

* The Language Machine directly implements unrestricted analytic grammars. Substitution rules are used to transform an input to produce outputs and behaviour. The system can also produce a diagram which shows what happens when the rules of an unrestricted analytic grammar are being applied.
* Top-down parsing language (TDPL): a highly minimalist analytic grammar formalism developed in the early 1970s to study the behavior of top-down parsers. (Birman, Alexander, ''The TMG Recognition Schema'', Doctoral thesis, Princeton University, Dept. of Electrical Engineering, February 1970.)
* Link grammars: a form of analytic grammar designed for linguistics, which derives syntactic structure by examining the positional relationships between pairs of words. (Sleator, Daniel D. & Temperley, Davy, "Parsing English with a Link Grammar," Technical Report CMU-CS-91-196, Carnegie Mellon University Computer Science, 1991; Sleator, Daniel D. & Temperley, Davy, "Parsing English with a Link Grammar," ''Third International Workshop on Parsing Technologies'', 1993, a revised version of the above report.)
* Parsing expression grammars (PEGs): a more recent generalization of TDPL designed around the practical expressiveness needs of programming language and compiler writers. (Ford, Bryan, ''Packrat Parsing: a Practical Linear-Time Algorithm with Backtracking'', Master's thesis, Massachusetts Institute of Technology, Sept. 2002.)
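To give a flavour of the analytic style, the following Python sketch mimics a PEG-like rule S \leftarrow "a" S "b" / "ab" (the rule and function names are invented for this example): each alternative is tried in order, the second is attempted only if the first fails, and a successful match reports how much input it consumed, which is how a parser, rather than a generator, views a grammar.

```python
def parse_S(s, i=0):
    """Return the index just after a match of S starting at i, or None on failure."""
    # First alternative: 'a' S 'b'
    if i < len(s) and s[i] == "a":
        j = parse_S(s, i + 1)
        if j is not None and j < len(s) and s[j] == "b":
            return j + 1
    # Second (ordered) alternative: the literal 'ab'
    if s[i:i + 2] == "ab":
        return i + 2
    return None

print(parse_S("aabb"))  # 4: the whole input is consumed
print(parse_S("aab"))   # None: neither alternative succeeds at position 0
```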


See also

* Abstract syntax tree
* Adaptive grammar
* Ambiguous grammar
* Backus–Naur form (BNF)
* Categorial grammar
* Concrete syntax tree
* Extended Backus–Naur form (EBNF)
* Grammar framework
* L-system
* Lojban
* Post canonical system
* Shape grammar
* Well-formed formula


References

